
Detecting J2EE Problems Before They Happen

A runtime abstract application model derived automatically from an application server using stored knowledge of Java EE construction

This article introduces a new form of analysis for Java EE applications: a runtime abstract application model derived automatically from an application server using stored knowledge of Java EE construction. The model is used dynamically to do extensive automatic checks for a range of construction errors that could produce poor performance or unreliability. The model also lets server behavior be dynamically visualized in real-time or retrospectively.

There has been a lot of attention given lately to the topic of Model Driven Architecture (MDA), which aims to create working systems by generating source code from successively transformed high-level component models. While doubts have been cast on the real-world robustness of this idea - and previous code-generation solutions haven't been a big success - there's no doubt that the possibility of working with software at a more abstract level holds a strong appeal for engineers.

Although the inauspicious history of CASE tools suggests that making a project dependent on model-driven code generation could be limiting, the central tenet of MDA - the ability to view and analyze our application at an abstract level - is a powerful and attractive goal. Even if our application grew beyond an initial set of predefined patterns and code templates we'd still like to be able to validate and understand it based on a design-level description of its operation.

Derived Model Analysis (DMA)
If we don't have a predefined model, how are we going to get one? Well, if you try to describe your application to someone else you'll almost certainly use architecture-level abstractions: the services it uses; the main business and data components and how these relate. So it would be good if similar high-level abstractions could be derived and presented automatically by analyzing and monitoring the execution of your application. Model elements would include application components, the application server services they use, and the data access, transaction management, and calling relationships between them.

Once application model elements were identified they would be updated dynamically during execution. Monitoring the changing patterns of inter-relationships in the model would automatically detect construction-quality problems by flagging unlikely relationships, unnecessary or duplicated relationships, and undesirable model entity states. Instead of trying to spot problems in the clutter of source code, we could see the key abstractions directly in the model.

eoLogic terms this form of indirect application monitoring Derived Model Analysis (DMA): tools analyze Java EE applications both statically and during server execution to derive an abstract model that includes both application components and Java EE services. Subsequent changes to the model form a dynamic event sequence that can be used to (a) track and validate application execution and (b) visualize the model. Lower-level application execution details can be recorded in the context of the sequence of model changes.

Note that DMA is not a profiling technique - it doesn't aim to identify current code hotspots; instead, it analyzes how services have been constructed and are being used. The idea is to identify places where hotspots or unreliability may occur under load. This deeper form of analysis can be used to find problems before they manifest themselves and without the application having to be placed under load during testing. These problems include incorrect or inefficient transaction grouping, inefficient database access, unreliable sequences of inter-component communication, and failure to control service lifecycles correctly. There's no need to drive the application to a point at which it exhibits slowdown, and the results need little interpretation.

Deriving a DMA Model
To generate and validate an abstract model of an application, a tool must be able to monitor events in the server and interpret them in light of the relevant stored knowledge.

This includes definitions of the main abstract entities we're interested in (transaction manager, transaction resources, transactions, EJB containers, JMS destinations, etc.), the possible relationships between these entities, and valid and invalid patterns of relationships and states. DMA forms these into an abstract Entity-Relationship-Attribute (ERA) model as the system executes, with model changes triggering checks against annotated definitions of problem states.
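
To make the shape of such a model concrete, here is a minimal sketch of what runtime ERA model elements and rule hooks might look like in Java. The class names and structure are purely illustrative assumptions - the article doesn't describe eoSense's internal representation - but they show how each model update can trigger checks against stored rules.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch only: a runtime Entity-Relationship-Attribute (ERA) model
// whose every change is checked against stored rules. Not eoSense's implementation.
class Entity {
    final String type;                          // e.g. "Transaction", "JMS Session", "Servlet"
    final Map<String, Object> attributes = new ConcurrentHashMap<>();
    Entity(String type) { this.type = type; }
}

class Relationship {
    final String name;                          // e.g. "Initiated By", "Has_Called"
    final Entity from;
    final Entity to;
    Relationship(String name, Entity from, Entity to) {
        this.name = name; this.from = from; this.to = to;
    }
}

interface ModelRule {
    void check(EraModel model, Relationship latestChange);   // may raise an alert
}

class EraModel {
    private final List<Entity> entities = new ArrayList<>();
    private final List<Relationship> relationships = new ArrayList<>();
    private final List<ModelRule> rules = new ArrayList<>();

    synchronized void addEntity(Entity e) { entities.add(e); }
    synchronized void addRule(ModelRule rule) { rules.add(rule); }

    // Each model change triggers the stored rules, so problems are detected
    // as the relationships appear rather than by later inspection.
    synchronized void addRelationship(Relationship r) {
        relationships.add(r);
        for (ModelRule rule : rules) {
            rule.check(this, r);
        }
    }

    synchronized List<Relationship> relationshipsOf(Entity e) {
        List<Relationship> result = new ArrayList<>();
        for (Relationship r : relationships) {
            if (r.from == e || r.to == e) {
                result.add(r);
            }
        }
        return result;
    }
}
```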

Relationship to JMX
This model sounds a lot like Java Management Extensions (JMX), which essentially defines a form of abstract model for managing and monitoring Java applications, and it suggests that DMA could perhaps be layered on top of the information available from JMX MBeans. In detail, what characteristics does a DMA model require?

  • It must be an accurate and complete abstract model of an application, linking static (source) and runtime application components.
  • It must be able to be updated in real-time as the server executes, generating meaningful sequential event flows.
  • It must support a wide range of relationship types including application-level call relationships.
  • It must be able to be intimately combined with knowledge about valid and potentially invalid model forms.
  • It must be possible to relate model-level information easily back to application source.
  • It must be easily filtered to focus on different aspects of server operation.
  • And it must be easily and intuitively understood.
JMX goes some way towards what is needed: it provides an abstract model of an application for both its static and dynamic aspects; it allows easy selection of MBeans; many MBeans relate directly to easily understood aspects of server operation; there's a notification system for attribute changes; and there's even an MBean relation service.
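
For comparison, this is roughly how a monitoring tool can subscribe to MBean notifications through the standard JMX remote API; the service URL and ObjectName below are illustrative placeholders rather than values from the article's example.

```java
import javax.management.MBeanServerConnection;
import javax.management.Notification;
import javax.management.NotificationListener;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class JmxNotificationWatcher {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; real values depend on the server configuration.
        JMXServiceURL url =
                new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:9999/jmxrmi");
        JMXConnector connector = JMXConnectorFactory.connect(url);
        try {
            MBeanServerConnection mbsc = connector.getMBeanServerConnection();
            ObjectName name = new ObjectName("com.example:type=OrderService"); // illustrative MBean

            // Print any notifications (e.g. attribute changes) the MBean emits.
            NotificationListener listener = (Notification n, Object handback) ->
                    System.out.println(n.getType() + ": " + n.getMessage());
            mbsc.addNotificationListener(name, listener, null, null);

            Thread.sleep(60_000);   // keep the connection open while listening
        } finally {
            connector.close();
        }
    }
}
```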

However, for our purposes it also has some serious limitations. Many of the relationships we have to monitor are based on calling sequences and application component relationships. Designed primarily for system management and threshold monitoring, JMX doesn't provide the source-level monitoring and mapping that the detection and (especially) the explanation of application construction errors require. Also, its level of coverage is generally not detailed enough to provide a coherent execution model for visualization purposes. And if we want to investigate problems by freezing the server at the point of problem detection and extracting stack and related data, JMX isn't precise enough.

So the approach that we adopted is to create a more detailed runtime ERA model specialized for the following purposes:

  • Representing sequences of server operation precisely and clearly
  • Detecting construction errors based on component interrelations, including call sequences and transaction membership
  • Explaining construction errors by relating model entities and relationships to precise source references
  • Providing an intuitive visual model of sequential server operation
  • Supporting model tracing and playback
  • Supporting integrated debugging
This specialized model then provides the structure for attaching knowledge about model entity roles and valid and invalid patterns of model relationships and attributes, together with details on problem descriptions and suggested fixes.

The need for detailed tracking of calls and object states means that the DMA engine moves away from the realm of JMX and towards an application of Aspect-Oriented Programming (AOP), combining the planned abstraction of JMX with the detailed and flexible monitoring and intervention of AOP. Having said this, it would be wasteful not to exploit the JMX information provided by a server. Some JMX MBeans serve as important internal DMA monitoring and access points, but they are augmented with additional monitoring and updating points in the server.
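
As an illustration of the kind of AOP-style interception this implies - not a description of eoSense's actual instrumentation - an AspectJ aspect could observe JMS sends and feed them into a model-building engine:

```java
import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;

// Illustrative aspect: observe every JMS send made by application code so a
// model-building engine could record a "Has_Called" relationship. This is a
// sketch of the technique, not eoLogic's actual mechanism.
@Aspect
public class JmsSendMonitor {

    @Before("call(* javax.jms.MessageProducer.send(..))")
    public void recordSend(JoinPoint jp) {
        Object target = jp.getTarget();       // the MessageProducer being called
        Object caller = jp.getThis();         // the application object making the call
        String callerName = (caller != null) ? caller.getClass().getName() : "(static context)";
        // A real engine would update its ERA model here, for example:
        // model.addRelationship(new Relationship("Has_Called", entityFor(caller), entityFor(target)));
        System.out.println(callerName + " -> send() on " + target.getClass().getName());
    }
}
```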

DMA Error Detection
As shown in Figure 1, DMA abstracts from the underlying framework and application objects to a conceptual ERA model. Queries against this model then provide the means for problem recognition.

The abstraction stage is usually one of selection, as key objects are monitored, but it can also require composing elements from more than one underlying object.
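
Continuing the hypothetical sketch from earlier, a construction-quality check can then be phrased as a query over the model. The rule below reuses the sketch's ModelRule and EraModel types, and its attribute and relationship names are again assumptions; it flags the kind of mixed-transaction pattern discussed later in the use case.

```java
// Illustrative rule, building on the EraModel/ModelRule sketch shown earlier: flag a
// JMS send made from a locally transacted Session while a JTA transaction is already
// active for the caller.
class MixedTransactionsRule implements ModelRule {
    @Override
    public void check(EraModel model, Relationship latestChange) {
        if (!"Has_Called".equals(latestChange.name)) {
            return;                                   // only interested in call relationships
        }
        Entity caller = latestChange.from;
        Entity sender = latestChange.to;

        boolean senderIsLocallyTransacted =
                Boolean.TRUE.equals(sender.attributes.get("transactedSession"));
        boolean callerHasJtaTransaction = model.relationshipsOf(caller).stream()
                .anyMatch(r -> "Initiated By".equals(r.name));

        if (senderIsLocallyTransacted && callerHasJtaTransaction) {
            System.err.println("ALERT: Mixed Transactions - JMS send from a transacted "
                    + "Session while a JTA transaction is active on the caller");
        }
    }
}
```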

DMA Use Case
To see how the abstraction mechanisms of DMA allow construction problems to be detected and explained, let's watch it operate on a sample application. We'll show how a pattern of application and framework components that indicates a problem can be identified, and then how the problem can be visualized and explained back to the source level by exploring the model at the point of detection.

Example Application
Figure 2 shows a simple Web-based order-processing example that accepts orders and processes them in the following way:

  • An order invoice is created and queued via JMS to an existing invoice service, which creates and processes the invoice.
  • The order details are queued via JMS to an order-processing system, which processes and delivers the order separately.
However, there's a problem: the invoices don't arrive at the invoice-processing application, although the order entries are processed correctly.
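
The article doesn't include the example's source, but a plausible sketch of the servlet's order handling is shown below. All class, queue, and JNDI names are invented, and the comments mark one way the described symptom could arise.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.naming.InitialContext;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.transaction.UserTransaction;

// Hypothetical reconstruction of the example servlet; every name here is invented.
public class OrderProcessingServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) {
        try {
            InitialContext ctx = new InitialContext();
            UserTransaction utx = (UserTransaction) ctx.lookup("java:comp/UserTransaction");
            ConnectionFactory cf =                     // note: a non-XA connection factory
                    (ConnectionFactory) ctx.lookup("jms/OrderConnectionFactory");
            Queue invoiceQueue = (Queue) ctx.lookup("jms/InvoiceQueue");
            Queue orderQueue = (Queue) ctx.lookup("jms/OrderQueue");

            utx.begin();                               // JTA transaction for the order
            Connection con = cf.createConnection();

            // Invoice sent from a *locally transacted* JMS session while the JTA
            // transaction is active. If the local session transaction is never
            // committed, the invoice is never delivered - one plausible cause of
            // the symptom described above.
            Session invoiceSession = con.createSession(true, Session.AUTO_ACKNOWLEDGE);
            MessageProducer invoiceSender = invoiceSession.createProducer(invoiceQueue);
            invoiceSender.send(invoiceSession.createTextMessage(req.getParameter("order")));

            // Order details sent from a non-transacted session - these do arrive.
            Session orderSession = con.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer orderSender = orderSession.createProducer(orderQueue);
            orderSender.send(orderSession.createTextMessage(req.getParameter("order")));

            utx.commit();
            con.close();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```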

Monitoring the Application
To monitor the sample application we'll run the WebLogic server with our DMA analyzer, eoSense, which comprises a server agent and a client. The agent constructs and checks the abstract model as WebLogic executes. When a problem is detected, the agent signals an alert to the client.

Transaction-Related Alerts
Running the example application raises the alerts shown in Figure 3 (preceded by several less serious alerts).

Looking at the alerts in more detail:

  • JMS Message sent inside a JTA Transaction using a non-XA Connection - A JMS Connection created from a non-XA Factory was used to create a JMS session and sender. The sender was then used within the context of a JTA transaction. This may indicate that an XA JMS Connection should have been used instead (a corrected sketch follows this list).
  • Mixed Transactions - The JMS sender has been used from a JMS Session marked as transacted, but there's already a JTA transaction active on the current thread.
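
Following the hint in the first alert, a hedged sketch of a corrected version of the invoice-sending portion of the earlier servlet sketch might look like this, assuming the connection factory is reconfigured as XA-capable and container-managed in the server so that the send enlists in the surrounding JTA transaction:

```java
// Hypothetical corrected fragment, replacing the invoice-sending portion of the
// earlier servlet sketch. The factory is assumed to be configured as XA-capable in
// the server, so the send enlists in the surrounding JTA transaction instead of
// opening a separate, never-committed local JMS transaction.
ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/XAOrderConnectionFactory");

utx.begin();
Connection con = cf.createConnection();

// Non-transacted session: with an XA-capable, container-managed factory the send
// becomes part of the active JTA transaction.
Session invoiceSession = con.createSession(false, Session.AUTO_ACKNOWLEDGE);
MessageProducer invoiceSender = invoiceSession.createProducer(invoiceQueue);
invoiceSender.send(invoiceSession.createTextMessage(req.getParameter("order")));

// ... order details sent as before ...

utx.commit();   // both sends now commit or roll back with the JTA transaction
con.close();
```
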
DMA Visualization
When the Mixed Transactions alert is recorded, a diagram of the ERA model allows the context of the problem to be understood. In eoSense this is called the Server View, and an image of it is shown in Figure 4.

We can see that there are two active transactions, one linked to the Order Processing Servlet and the other linked to a JMS Session. We can also see that the Order Processing Servlet has communicated with two JMS Senders. Figure 5 shows diagrammatically the named key entities and relationships from Figure 4.

  • There are two in-flight transactions held by the Transaction Manager
  • There is an "Initiated By" relationship between the OrderProcessingExample Servlet and Transaction 1
  • There is an "Initiated By" relationship between JMS Session 1 and Transaction 2
  • The OrderProcessingExample Servlet has sent two Messages: there is a "Has_Called" relationship to JMS Sender 1, which is attached to the Order JMS Destination, and a "Has_Called" relationship to JMS Sender 2, which is attached to the Invoice JMS Destination.

More Stories By Alan West

Alan West is CTO of eoLogic (http://www.eologic.com), responsible for all product development. He was previously a founder of Object Software Technology Ltd, and has over 20 years of experience in software tool design and architecting large software systems.

More Stories By Gordon Cruickshank

Gordon Cruickshank is co-founder of eoLogic (http://www.eologic.com), a software tools company created to develop innovative testing and debugging solutions. He was previously development manager at Wind River Systems and Objective Software Technology, building C++ debugging and object visualization tools.
