By Rich Collier
March 1, 2014 10:15 AM EST
Application Performance Management (APM) grew out of the movement to better align IT with real business concerns. Instead of monitoring a lot of disparate components, such as servers and switches, APM would provide improved visibility into mission-critical application performance and the user experience. Today, APM solutions help IT track end-to-end application response time and troubleshoot coding errors across application components that have an impact on performance.
APM has a rightful place in the arsenal of monitoring tools that IT uses to keep its applications and systems up and running. However, today's APM solutions have some serious gaps and challenges when it comes to providing IT with the entire application performance picture.
Most APM solutions provide minimal information about the hardware and network components underlying application performance, other than showing which components are involved in each part of the transaction. Those that do a better job usually require users to shift to another screen or monitoring system to get more hardware visibility. As with the blind men touching different parts of an elephant, this approach makes it difficult to correlate hardware performance with all the other components driving the application.
The Virtual, Distributed Environment
Most of today's APM solutions were created before virtualization, the cloud, and complex, composite applications took off in the IT environment. With virtual machines migrating back and forth among physical servers at different times of the day or week, and applications dependent on scores of components and cloud services, APM vendors are hard-pressed to provide visibility into the entire scope of a single application.
As 24/7/365 uptime becomes increasingly critical to business success, enterprises need to be able to predict and address issues before they affect the business, rather than after. APM has had mixed success in this area. A recent survey by TRAC Research found that among organizations deploying APM solutions, 60 percent succeed less than half the time at identifying performance issues before they have an impact on end users.
Enter Predictive Analytics for IT
Filling these gaps is where Big Data and predictive analytics for IT can play a significant, highly beneficial role in IT's efforts to maintain application performance. Today, when IT encounters performance issues, it typically has to pull its server, storage, network, and APM folks into a war room to search through mountains of hardware and APM logs and correlate information manually to isolate the root cause. This resource-intensive process can take hours or even days.
IT has lots of alerts and thresholds to analyze, but those are only as good as the knowledge, experience, and insight of the IT folks who configured them. Just because a server surpassed its CPU utilization threshold doesn't mean that event had anything to do with the root cause of an application issue. Often the real issue is hidden deep in all the delicate interactions among multiple hardware and software components, and may not be reflected in individual thresholds. The same TRAC Research study shows an average of 46.2 hours spent by IT each month in these war rooms searching for root cause. Even more depressing, the root cause is often not found, so IT just reboots everything in the hope that it all works until the same problem rears its ugly head again.
Predictive analytics take over where APM leaves off, harnessing third-generation machine learning and Big Data analysis techniques to efficiently plow through mountains of log data. They discover the behavior patterns and interrelationships among the IT software and hardware components driving today's mission-critical applications. Over several hours or days, the best solutions baseline the normal behavior of all those components, relationships, and events, then use complex algorithms to detect the anomalies that are the early warning signs of developing performance issues. Better yet, because the analytics understand the chain of events involved in a developing anomaly, IT support staff immediately receive not only an alert that something is going wrong, but also the behavior of every component involved. This information can shave hours or even days off those war room scenarios. For example, thanks to a predictive analytics for IT solution, a major retailer was able to trace periodic gift card application outages to a misconfigured VLAN. Similarly, a predictive analytics solution cut the time it took to diagnose a financial content management performance issue from six hours in the war room to ten minutes.
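The baselining idea described above can be sketched in a few lines: learn the statistical "normal" for a metric from historical samples, then flag readings that fall improbably far outside it. This is a toy illustration, not any vendor's implementation (real solutions model many metrics and their interrelationships with far more sophisticated algorithms), and the metric values below are hypothetical.

```python
# Minimal sketch of self-learned baselining: derive "normal" from history,
# then flag statistical outliers instead of using a hand-set threshold.
import statistics

def build_baseline(history):
    """Learn the mean and standard deviation of a window of past readings."""
    return statistics.mean(history), statistics.stdev(history)

def is_anomaly(value, baseline, z_threshold=3.0):
    """Flag a reading more than z_threshold standard deviations from normal."""
    mean, stdev = baseline
    if stdev == 0:
        return value != mean
    return abs(value - mean) / stdev > z_threshold

# Hypothetical CPU-utilization samples (percent) for one server.
history = [41, 43, 40, 44, 42, 45, 41, 43, 42, 44]
baseline = build_baseline(history)

print(is_anomaly(43, baseline))   # within the learned normal range
print(is_anomaly(95, baseline))   # far outside it: an early warning sign
```

Note that nobody had to guess a threshold for this server: the alert boundary adapts to whatever the observed history looks like, which is the property that reduces false positives.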
Another advantage of predictive analytics solutions is that, because they self-learn the normal behavior patterns of underlying components, they drastically reduce the educated guessing that usually goes into IT staff identifying and setting thresholds against key performance indicators. The inflexibility of those static thresholds results in large numbers of false-positive alerts. With predictive analytics, by contrast, highly sophisticated algorithms compute the probability of certain behaviors and can therefore generate much more accurate alerts. Some users of predictive analytics solutions have called them the Donald Rumsfelds of IT management tools because they point IT to infrastructure issues it never even knew existed and never looked for. Rumsfeld called these the "unknown unknowns."
However, it is in their ability to be "predictive" that these advanced analytics solutions really shine. By detecting small anomalies early, predictive analytics can alert IT to performance issues and provide enough information to address their root cause before IT or application users even notice them. This can have a dramatic effect on application uptime and performance, and a direct impact on user satisfaction and even enterprise revenue. In the case of the financial content management application, predictive analytics discovered a developing performance issue, and its root cause, the night before Monday-morning load would have exposed it to users.
APM tools have their place in the enterprise, but predictive analytics solutions for IT can kick the effectiveness of those and other IT monitoring tools up a notch by detecting, tracing, and predicting performance issues and their root cause long before any IT war room can.
- TRAC Research, March 4, 2013: "2013 Application Performance Management Spectrum" report.