By AppDynamics Blog
April 2, 2014 10:30 AM EDT
Data is dumb but we can't seem to get enough of it these days. An entire industry has evolved to make sense of the massive amounts of data being generated every day. Massive data collection by itself does not guarantee the context required to solve business and IT problems... If we are smart about data, however, it will lead us in the right direction.
Today's businesses are defined by the software they run on, enabling them to innovate and create new services faster. Apple re-invented the music industry through software; Netflix changed the way we consume movies through software; grocery shopping, taxi-hire, parcel-logistics, car rental, travel booking... these are all industries that have been completely disrupted by software.
As more and more organisations come to depend on the software they run on, consumer demand for superior services, faster innovation, and improved performance continues to accelerate. As a result, the software required to compete in this unforgiving market only grows in complexity.
Only a few short years ago, when I wanted to book my family vacation, we all made the trip down to the local travel agent and sat around a desk while the agent tapped away on her green-screen, searching for available flights and hotels. The process of booking our hotels has obviously changed dramatically, and so have the applications that support it.
For example, below is a screenshot of the Business Transaction flow through a global travel site when a user makes a simple search for a hotel. All the user does is select a destination, some dates, and maybe one or two preferences, but the resulting transaction kicks off more than 200 service requests before the results are returned to the user so they can select their preferred hotel.
This rise in application complexity means we generate ever more structured and unstructured data from our applications: log files, emails, alerts, infrastructure stats, network stats, and so on. Close to 200 billion emails will be sent in 2014; I wonder how many of those are false alerts from monitoring tools?
I have been helping companies build monitoring strategies for nearly a decade, and the problem I used to regularly face was dealing with organisations that just didn't have enough data. IT departments didn't have the required information to be able to quickly diagnose and remediate problems when they occurred.
As organisations continue to have a greater dependency on the software they run on and application complexity continues to increase, there is a danger we swing the other way by collecting and storing too much data to be able to make sense of it in a timely manner. My colleague Jim Hirschauer often refers to this as the "home hoarders effect" drawing parallels with people who simply never throw anything out from their homes. The bigger the piles get the harder it is to find what you need, and at some point that pile is going to topple over.
Keep what you need
There are lots of good reasons to keep data, but "just in case we need it" is not one of them. It is important to think about why the data is being captured and stored, and how it will help solve problems in the future. By keeping the relevant data and throwing away the clutter, you become more efficient and effective in troubleshooting and resolving problems.
But data alone isn't good enough to solve your business and IT problems. Data in and of itself is one-dimensional and dumb. Data doesn't tell you when there are problems, it doesn't tell you when business is going well, it doesn't tell you anything meaningful without some help and a lot more context. Let's explore an example to illustrate my point.
Let's say we have a couple of data points about a person. The data points are...
Heart Rate = 150 bpm
Blood Pressure = 200 over 100
Now tell me, is this person performing well? With these data points we have no idea. We need more data. The table below shows a list of some possible data points we can collect.
Notice the last attribute in the table. The activity provides us with the context we need to focus on the proper data points to figure out if the person is performing well or not. Here are some more relevant data points...
Distance Run = 100 meters
Time = 9.58s
Now can we determine whether the person is performing well? I'm not much of a track and field aficionado, so I have no idea whether this is a good performance. I need a point of comparison to determine if 9.58 seconds in the 100-meter dash is good. So here is our baseline, for comparison's sake...
100-Meter World Record Time = 9.69s
Well, it looks like the person was performing really well: they set a world record in the 100-meter dash. None of the data points individually told us anything. We required correlation (context) and analytics (comparison to a baseline) to turn data into information. I like to refer to this concept as creating Smart Data.
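The runner example above can be sketched in a few lines of code. This is a minimal, illustrative sketch of the idea (the function and field names are my own, not from any real monitoring API): context selects which metric matters, and a baseline turns the raw number into a conclusion.

```python
# Turning raw data points into "smart data": correlate measurements with
# context, then compare the relevant measurement against a baseline.

def assess(data_points, context, baselines):
    """Return a verdict only when context tells us which baseline applies."""
    metric = context.get("activity")        # context picks the relevant metric
    if metric is None or metric not in baselines:
        return "unknown"                    # raw data alone is one-dimensional
    value = data_points[metric]
    return "exceptional" if value < baselines[metric] else "normal"

data_points = {"100m_time_s": 9.58, "heart_rate_bpm": 150}
context = {"activity": "100m_time_s"}       # the person just ran a sprint
baselines = {"100m_time_s": 9.69}           # prior world record as the baseline

print(assess(data_points, context, baselines))  # -> exceptional
```

Without the `context` and `baselines` inputs, the same function can only answer "unknown" — which is exactly the point: the heart rate and time alone tell us nothing.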
Smart Data Defined
Smart Data is actionable, intelligent information.
Smart Data is created by performing correlation and analytics on data sets. AppDynamics correlates end user business transaction details with completion status (success, error, exception), response times, and all other data points measured at any given time. It automatically analyzes the entire data set to provide information from which to draw conclusions and take the appropriate action. This information is called Smart Data.
Being smart about your monitoring data collection allows you to isolate and resolve problems much faster, and with a much lower cost of ownership and overhead.
I have recently been working with a company to replace their legacy monitoring tool with AppDynamics. The environment they are monitoring is a good size, consisting of about 1,200 servers, and their main application processes approximately 300,000 transactions per minute. They had invested in a monitoring tool to help manage the performance of their applications, one which captured and stored all the data it could "just in case" it was needed. Unfortunately, this approach required an additional 92 servers to be provisioned for the monitoring tool itself, which consumed approximately 80TB of data storage per year. The increasing investments this customer needed to make in hardware, storage, people, and maintenance were too much to manage, and they decided to look for a different approach.
AppDynamics' "smart data" approach to analytics means this particular customer now requires only two reporting servers, and the storage requirement was reduced to just 1TB per year. Collecting only the data required to make smart decisions gave them a 98% reduction in hardware costs and more effective analytics in the process.
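One way to picture where a storage reduction like this can come from — a hypothetical sketch, not AppDynamics' actual implementation — is to keep compact per-minute aggregates plus only the outlier transactions, instead of persisting every raw event:

```python
# Sketch: collapse a minute of raw response times into one small record,
# retaining raw detail only for the transactions that matter (the outliers).
from statistics import mean

def summarize(response_times_ms, slow_threshold_ms=1000):
    outliers = [t for t in response_times_ms if t > slow_threshold_ms]
    return {
        "count": len(response_times_ms),
        "avg_ms": mean(response_times_ms),
        "max_ms": max(response_times_ms),
        "outliers_ms": outliers,   # full detail kept only where it matters
    }

# Thousands of transactions per minute collapse to one small record,
# yet the slow transaction is still individually visible for diagnosis.
minute = [120, 95, 110, 2500, 130]   # hypothetical raw samples (ms)
print(summarize(minute))
```

The trade-off is deliberate: you give up the ability to replay every healthy transaction in exchange for storage that grows with the number of problems, not the number of requests.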
Adding business context
Smart data is not just about resolving problems faster though. In October last year we introduced Real-time Business Metrics and described how AppDynamics customers can use Real-time Business Metrics to extract and present business metrics directly from within their applications. These business metrics provide business context enabling customers to turn smart data into actionable information. Our customers, for example, can see the exact revenue impact of performance problems, the end user experience during an app upgrade, or the real-time impact of marketing campaigns. Smart analytics not only clearly show the business benefits of making immediate improvements, they help direct where limited resources should be invested for further business and application improvement going forward.
AppDynamics is focused on delivering actionable intelligence to solve problems for IT operations and development teams as well as business owners. To learn more about Real-time Business Metrics, read here.
The post A New Breed of Actionable Analytics written by Tom Levey appeared first on Application Performance Monitoring Blog from AppDynamics.