
What Customers Expect in a New-Generation APM (2.0) Solution

In the last blog, I discussed the challenges with an APM 1.0 solution. 

 

As an application owner or a member of the application support team, you want to

  • Exceed service levels and avoid costly, reputation-damaging application failures through improved visibility into the end-user experience
  • Ensure reliable, high-performing applications by detecting problems faster and prioritizing issues based on service levels and impacted users
  • Improve time to market with new applications, features, and technologies, such as virtualization, acceleration, and cloud-based services

 

APM 2.0 products enable you to manage application performance, leading with real-user activity monitoring. The following are some of the top capabilities they provide to help you achieve your business objectives.

Visibility into real users and end-user-driven diagnostics

  • APM 2.0 solutions provide visibility into end-to-end application performance as experienced by real end users and help application support focus on the critical issues affecting those users.

 

As an example, the dashboard shown in Figure 1 provides real-time visibility into application performance as experienced by users.

 

Figure 1: Application performance as experienced by real users, in real time (image001.png)

 

 

  • As an application owner, you probably care about which users are impacted, which pages they are navigating, and what kinds of errors they are getting. You want your APM product to improve MTTR by identifying what is causing the latency or failure, e.g. the network, a load balancer, an application delivery network (ADN) such as Akamai, SSL, or the application tier itself. Figure 2 shows a specific user session, the pages the user navigated, and identifies the application tier as the cause.

Figure 2: A specific user session, the pages navigated, and the tier responsible for the slowdown (image003.png)

  • The “details” link in Figure 2 allows application support personnel to drill down further into which application tier is the culprit for the slow or failed transaction, in the context of the specific user. This allows application support personnel to track an end-user request down to the line of code; a sketch of the underlying idea follows.
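
To make “tracking a request down to the line of code” concrete: deep-dive agents typically tag each inbound request with a correlation ID and propagate it across tiers, so every log line, SQL call, and stack trace can be joined back to the originating user request. Below is a minimal sketch of that idea, assuming a servlet container; the header name and class are illustrative, not any particular vendor's API.

```java
import java.io.IOException;
import java.util.UUID;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

/**
 * Illustrative filter: tags every inbound request with a correlation ID so
 * downstream tiers, log lines, and stack traces can be joined back to the
 * originating user request. The header name is hypothetical.
 */
public class CorrelationIdFilter implements Filter {

    public static final String HEADER = "X-Correlation-Id";

    // Lets application code and logging read the ID anywhere on the
    // request-handling thread.
    private static final ThreadLocal<String> CURRENT_ID = new ThreadLocal<>();

    public static String currentId() {
        return CURRENT_ID.get();
    }

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;

        // Reuse the ID if an upstream tier already assigned one;
        // otherwise mint a fresh one at the edge.
        String id = request.getHeader(HEADER);
        if (id == null || id.isEmpty()) {
            id = UUID.randomUUID().toString();
        }

        CURRENT_ID.set(id);
        response.setHeader(HEADER, id); // echo back for the browser/RUM side
        try {
            chain.doFilter(req, res);
        } finally {
            CURRENT_ID.remove(); // don't leak IDs across pooled threads
        }
    }

    @Override
    public void init(FilterConfig config) {
        // no configuration needed for this sketch
    }

    @Override
    public void destroy() {
    }
}
```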

Ease of use and superior time-to-value

You want a product that is simple for your application support and operations teams to use.

  • A modern APM solution does not require manual definition of instrumentation policies.
  • It should not require manual changes, such as JavaScript injection, for visibility into the end user (see the sketch after this list).
  • APM 2.0 tools let you drill down from the end user to deep-dive diagnostics, and drill up from deep-dive data to identify the impacted user and the transaction context, without manual correlation or jumping between consoles.
  • Agent installation is typically a 5-10 minute process with modern deep-dive APM tools.
  • An APM 2.0 deep-dive solution automatically detects application servers, business transactions, frameworks, and so on.
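
For context on the JavaScript-injection point above: APM 1.0 real-user monitoring typically required pasting a beacon script tag into every page template by hand, while APM 2.0 agents do the equivalent automatically, for example by rewriting the HTML response in flight. Here is a minimal sketch of that injection step; the beacon URL and class name are made up for illustration.

```java
import java.util.Locale;

/**
 * Illustrative sketch of automatic RUM beacon injection: insert a
 * monitoring <script> tag before </head> while the HTML response is
 * streamed out. The beacon URL is a made-up placeholder.
 */
public final class BeaconInjector {

    private static final String BEACON_TAG =
            "<script src=\"https://apm.example.com/rum.js\" async></script>";

    /** Returns the page with the beacon injected, or unchanged if no <head>. */
    public static String inject(String html) {
        int i = html.toLowerCase(Locale.ROOT).indexOf("</head>");
        if (i < 0) {
            return html; // not an HTML document we can safely rewrite
        }
        return html.substring(0, i) + BEACON_TAG + html.substring(i);
    }

    public static void main(String[] args) {
        String page = "<html><head><title>Shop</title></head><body>...</body></html>";
        System.out.println(inject(page));
    }
}
```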

 

Figure 3 shows a specific user transaction request and its latency broken down by tier, along with the SQL statements executed and their latencies.

 

Figure 3: Transaction latency broken down by tier, with SQL statements and latencies (image005.png)
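
As a rough illustration of how a deep-dive agent obtains the SQL latencies in Figure 3: instrumentation wraps JDBC calls and records the statement text and elapsed time. This sketch does the wrapping by hand; a real agent hooks the driver via bytecode instrumentation, and the JDBC URL here is a placeholder.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

/**
 * Illustrative helper that times SQL execution the way a deep-dive agent's
 * JDBC instrumentation would, recording statement text and latency.
 */
public final class TimedQuery {

    public static ResultSet executeTimed(PreparedStatement stmt, String sql)
            throws SQLException {
        long start = System.nanoTime();
        try {
            return stmt.executeQuery();
        } finally {
            long micros = (System.nanoTime() - start) / 1_000;
            // A real agent would attach this to the current transaction
            // trace; the sketch just prints it.
            System.out.printf("SQL [%s] took %d us%n", sql, micros);
        }
    }

    public static void main(String[] args) throws SQLException {
        // Placeholder URL; needs the H2 driver on the classpath to run.
        try (Connection c = DriverManager.getConnection("jdbc:h2:mem:demo");
             PreparedStatement ps = c.prepareStatement("SELECT 1")) {
            executeTimed(ps, "SELECT 1").close();
        }
    }
}
```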

 

Suitable for production deployment

  • The real-user monitoring tool should be non-invasive and should add no measurable overhead to application response time.
  • You should be able to deploy an always-on, deep-dive monitoring and diagnostics solution for your production enterprise and cloud-based applications.
  • It should work in an agile environment without requiring new instrumentation policies to be configured for each application release.
  • It should scale to large production deployments of thousands of application servers.

 

Operations-ready product that enables DevOps collaboration

APM 1.0 products were originally built for developers, so they were not very intuitive for operations use. APM 2.0 products are operations-friendly, and you should also expect some of them to enable DevOps collaboration through intelligent escalation to development.

  • Most application support personnel do not know which frameworks or application technologies an application uses. The majority of deep-dive tools on the market jump straight from a transaction view to the line of code, which provides little value to the operations team.

 

For example, Figure 4 shows the transaction breakdown by the specific technologies the transaction uses. It also provides baselines for the different tiers, along with system resource usage per tier, to support intelligent decisions. Figure 3 shows an application flow map for a specific transaction and the time spent in each SQL statement or remote web service call, without having to drill down to the line of code.

Figure 4: Transaction breakdown by technology, with baselines per tier (image007.png)
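
The tier baselines in Figure 4 can be approximated with something as simple as an exponentially weighted moving average plus a deviation band. Real APM baselining also handles seasonality and percentiles; this sketch only shows the core idea, with made-up sample values.

```java
/**
 * Illustrative per-tier latency baseline in the spirit of Figure 4: an
 * exponentially weighted moving average plus a deviation band.
 */
public final class LatencyBaseline {

    private final double alpha; // smoothing factor, e.g. 0.1
    private double mean = -1;   // -1 means no samples yet
    private double meanDev = 0; // smoothed absolute deviation

    public LatencyBaseline(double alpha) {
        this.alpha = alpha;
    }

    /** Feeds one latency sample (ms); returns true if it breaches the band. */
    public boolean addSample(double latencyMs) {
        if (mean < 0) {
            mean = latencyMs; // first sample seeds the baseline
            return false;
        }
        double dev = Math.abs(latencyMs - mean);
        // Flag samples far outside the learned band; the 20%-of-mean floor
        // keeps a quiet tier from alerting on trivial jitter.
        boolean anomalous = dev > Math.max(3 * meanDev, 0.2 * mean);
        mean = alpha * latencyMs + (1 - alpha) * mean;
        meanDev = alpha * dev + (1 - alpha) * meanDev;
        return anomalous;
    }

    public static void main(String[] args) {
        LatencyBaseline appTier = new LatencyBaseline(0.1);
        double[] samplesMs = {120, 130, 125, 118, 122, 480, 127};
        for (double s : samplesMs) {
            System.out.printf("%.0f ms -> %s%n", s,
                    appTier.addSample(s) ? "ANOMALY" : "normal");
        }
    }
}
```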

 

  • There are many situations where the operations team needs to escalate problems to developers. The tool should allow application support personnel to escalate to Tier 3/development for diagnostics by sending a direct link to the diagnostic instance. However, in many organizations developers do not have access to the production environment; as shown in Figure 5, the solution from BMC allows exporting the diagnostic call tree, with latencies, parameters, and so on, in HTML format.

 

Figure 5: Diagnostic call tree exported as HTML (image009.png)
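
As a rough sketch of the kind of HTML export shown in Figure 5: a captured call tree (method, latency, parameters) is rendered as nested lists that a developer can read without production access. The node fields and layout here are illustrative, not BMC's actual format.

```java
import java.util.ArrayList;
import java.util.List;

/** Illustrative call-tree-to-HTML export in the spirit of Figure 5. */
public final class CallTreeHtmlExport {

    static final class Node {
        final String method;
        final long latencyMs;
        final String params;
        final List<Node> children = new ArrayList<>();

        Node(String method, long latencyMs, String params) {
            this.method = method;
            this.latencyMs = latencyMs;
            this.params = params;
        }
    }

    /** Renders the tree as nested <ul> lists; escaping keeps params safe. */
    static String toHtml(Node n) {
        StringBuilder sb = new StringBuilder();
        sb.append("<li>").append(escape(n.method))
          .append(" (").append(n.latencyMs).append(" ms)");
        if (!n.params.isEmpty()) {
            sb.append(" <code>").append(escape(n.params)).append("</code>");
        }
        if (!n.children.isEmpty()) {
            sb.append("<ul>");
            for (Node c : n.children) {
                sb.append(toHtml(c));
            }
            sb.append("</ul>");
        }
        return sb.append("</li>").toString();
    }

    static String escape(String s) {
        return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
    }

    public static void main(String[] args) {
        Node root = new Node("CheckoutServlet.doPost", 1240, "cartId=123");
        Node svc = new Node("OrderService.place", 1100, "");
        Node sql = new Node("JDBC: INSERT INTO orders", 950, "3 rows");
        root.children.add(svc);
        svc.children.add(sql);
        System.out.println("<ul>" + toHtml(root) + "</ul>");
    }
}
```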

Adaptive to virtualization and cloud environments

The new APM 2.0 products are purpose-built and architected for cloud and virtualized environments. 

  • APM 2.0 product components and agents communicate over a firewall-friendly protocol that can be encrypted and secured (see the sketch after this list).
  • They support virtualized, dynamic environments without generating floods of false alerts.
  • They support modern cloud frameworks and Big Data platforms such as Hadoop.
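
On the firewall-friendly point above: in practice this usually means the agent makes only outbound HTTPS calls to the collector, so no inbound ports or custom protocols are needed, and TLS provides the encryption. Below is a minimal sketch using Java 11's HttpClient; the collector URL and payload shape are placeholders.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

/**
 * Illustrative agent-to-collector call: outbound HTTPS on port 443 is
 * "firewall-friendly" (no inbound ports, no custom protocol) and is
 * encrypted by TLS. The collector URL and payload are placeholders.
 */
public final class AgentReporter {

    public static void main(String[] args) throws Exception {
        String payload = "{\"host\":\"app-01\",\"tier\":\"app\",\"avgLatencyMs\":123}";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://apm-collector.example.com/metrics"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        // Standard HTTPS means proxies and firewalls treat the agent
        // like any other outbound web client.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("collector replied: " + response.statusCode());
    }
}
```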

 

Conclusion

An APM 2.0 solution provides the functionality you need to manage your applications, helping you exceed business expectations and increase customer loyalty. These tools improve time to market and help you understand how application performance affects user behavior, and how that behavior impacts the bottom line. You can leverage an APM 2.0 solution like BMC Application Performance Management to improve your application performance and meet your business objectives.


More Stories By Debu Panda

Debu Panda is a Director of Product Management at Oracle Corporation. He is the lead author of EJB 3 in Action (Manning Publications) and Middleware Management (Packt). He has more than 20 years of experience in the IT industry, has published numerous articles on enterprise Java technologies, and has presented at many conferences. Debu maintains an active blog on enterprise Java at http://debupanda.blogspot.com.
