Terminology Nerd War: APM, Log Analysis & More
by Chris Riley

It turns out your background is important to your interpretation of the DevOps lingo.

Just the other day I was hanging out with my developer buddy. We got into what we thought would be an interesting conversation about how you cannot call an environment "DevOps" without analytics.

But we soon found ourselves in a nerd war over what a term meant. Yes, this is what I talk about in my free time.

In the thick of it, we both used the term "Server Monitoring," but neither of us meant the same thing. I was referring to log management and analysis; he was referring to application performance monitoring (APM). No wonder the DevOps market is confused. The good news is that once we realized our mistake, we agreed that both APM and log analysis are critical and beneficial to the DevOps practice.

It turns out your background shapes your interpretation of the DevOps lingo. There are basically four points of view: front-end developers, back-end developers, QA, and IT. They all speak the same language, but with different dialects, and the developer dialect is the furthest from IT's. This is where the differences between APM and log analysis are confusing, but also where they become clearer.

Log Analysis vs. APM?

Developers
Developers are all about the software layer, so when they think analytics, they think mostly of analytics for the application. The "server" to them is the web server, not the VM the stack is running on. When you talk about analysis, even if the tool in use is a log analysis platform rather than APM, what matters most to them is data about the application's operation, its users, and its functionality.
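
To make that developer view concrete, here is a minimal sketch of the kind of application-level event a developer means by "server monitoring": who did what in the application, and how it behaved. The function name, fields, and sample values are all hypothetical.

```python
import json
import logging
import time

logger = logging.getLogger("app.events")
logging.basicConfig(level=logging.INFO, format="%(message)s")

def log_app_event(user_id: str, action: str, duration_ms: float, success: bool) -> None:
    """Emit one structured application event for later analysis."""
    logger.info(json.dumps({
        "ts": time.time(),
        "user": user_id,
        "action": action,          # e.g. "checkout", "search"
        "duration_ms": duration_ms,
        "success": success,
    }))

# The developer's "server" data: application behavior, not VM health.
log_app_event("u-123", "checkout", 412.7, True)
```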

IT Operations
To IT the "server," is at a minimum, the hypervisor, but could even be bare metal. IT keeps the VMs up and running, and their perception of the application is from the server up. They tend to think of server monitoring as OS events and system level performance. They want to know about processes, bandwidth, pegged disks, etc. As they work up the stack to the application layer, their interest is mostly focused on how activity and functionality will impact servers and uptime.

Both are right and wrong at the same time. But mostly both are wrong because they do not take the time to understand each other, which leads to some interesting conversations that go nowhere. The result is a common vice committed across entire teams: choosing one tool for the job. From IT's perspective this is actually feasible, because log analysis has the unique ability to monitor and analyze across all layers of the application and all the processes that support it.

APM
As we already alluded to, APM covers the application layer only. In its simplest terms it breaks down to measuring the time for each HTTP(S) request or POST, and who made it. But it goes further, to a higher-level, abstracted view of how application functionality degrades or improves performance, and of user and application data over long periods of time. What is even more confusing is when you add in the term "load testing," which is not APM either: load testing is focused on the pre-release stages of development and is executed by simulating connections to the application, something APM does not do but can monitor. Generally APM has not expanded to look at application data in the earlier stages of the pipeline, such as QA and continuous integration or delivery (deployment is a different story).
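
Stripped to that simplest definition, the core APM measurement can be sketched like this. The decorator and handler below are hypothetical; a real APM agent instruments transactions, traces, and long-term trends far more deeply.

```python
import functools
import logging
import time

logger = logging.getLogger("apm.sketch")
logging.basicConfig(level=logging.INFO, format="%(message)s")

def timed_request(handler):
    """Record how long each request took and who made it."""
    @functools.wraps(handler)
    def wrapper(user, path, *args, **kwargs):
        start = time.perf_counter()
        try:
            return handler(user, path, *args, **kwargs)
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000
            logger.info("request path=%s user=%s duration_ms=%.1f", path, user, elapsed_ms)
    return wrapper

@timed_request
def get_profile(user, path):
    time.sleep(0.05)   # stand-in for real application work
    return {"user": user, "path": path}

get_profile("u-123", "/profile")
```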

Log Analysis
The great thing about log analysis is that it is like a warm blanket for the entire DevOps process. You can log anything and everything: from bare metal (used in software-defined data centers), to the hypervisor, to the virtual machine (the most common case), to the application. And not just your application, but all the component applications around the entire process as well, including release management, the IDE, and the other services contributing to the creation of the application.
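
A minimal sketch of that "log everything" idea, assuming a central syslog collector at a placeholder address: every layer, and even the tooling around the pipeline, writes to the same place with the same vocabulary.

```python
import logging
import logging.handlers

# Placeholder collector address; in practice this is your log analysis endpoint.
handler = logging.handlers.SysLogHandler(address=("logs.example.internal", 514))
handler.setFormatter(logging.Formatter("%(name)s %(levelname)s %(message)s"))

root = logging.getLogger()
root.setLevel(logging.INFO)
root.addHandler(handler)

# Different layers, one destination.
logging.getLogger("hypervisor.agent").info("vm=web-01 state=running")
logging.getLogger("vm.web-01.kernel").warning("disk /dev/sda1 91% full")
logging.getLogger("app.checkout").info("user=u-123 action=checkout duration_ms=412")
logging.getLogger("ci.pipeline").info("build=1842 stage=deploy result=success")
```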

The other nice part about log analysis is that the more data you log, the more consistent your language is for talking about the environment.

The problem with getting the two to work together in sync is not just the confusion around them; it is also sharing. Neither IT nor developers like to share. So while IT might set up APM, the developers hoard it, and, vice versa, IT might hoard log analysis. This commonly results in both teams getting the tools, but just so they can each have their own castle.

Some organizations are more progressive. IT might deliver data from, or provide access to, the log analysis platform. Developers then find they can get, and share, pretty much all the data they need from the application layer, as well as its relationship to server data.

My vote is to have both. But I also have the perspective that you should do what you can to get them in the same place. Otherwise having two platforms means that the language barrier is carried forward and communication still doesn't improve. DevOps is about breaking down walls, not building them up.

Integrations like the one between Logentries and New Relic are magic. No matter where you are, the data is consistent and shared, and the language is unified.
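
The mechanics of that unification are simple to sketch: if every log line carries the same request or trace identifier the APM agent uses, either tool can be the starting point and the data stays joined. The trace_id propagation below is purely illustrative and not any vendor's actual API.

```python
import logging
import uuid

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s trace=%(trace_id)s %(message)s")
logger = logging.getLogger("app")

def handle_request(user: str) -> None:
    trace_id = uuid.uuid4().hex           # in practice, taken from the APM agent
    extra = {"trace_id": trace_id}
    logger.info("request started user=%s", user, extra=extra)
    # ... application work, timed by the APM agent under the same trace id ...
    logger.info("request finished user=%s", user, extra=extra)

handle_request("u-123")
```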

And once it is set up, there really is no question of where information lives, how it is communicated, or who owns what.

The next time you start throwing around DevOps terminology, make sure you are talking about the same thing. And when it comes to server monitoring, let log analysis be the system of record for all data, and let APM do what it does best: understanding your users.

