
Ensuring Website Performance

Why measure, what to measure, and how to measure accurately and automatically

End user response time is critical for business success. The faster web pages are perceived to load, the longer users tend to stay on them, and the more money they spend, which drives business.

To ensure that end user response times are acceptable at all times, it is necessary to measure time the way the end user perceives it. Measuring and monitoring your live system is important to identify problems early, before they affect too many end users. To make sure that web pages are fast from the start, it is equally important to measure web page performance constantly and continuously throughout development and testing. Two questions need to be answered:

  • What is the time the user actually perceives as web response time?
  • How to measure it accurately and in an automated way?

What Time to Measure? Technical Response Time vs. Perceived Response Time
Technically, the response time of a web page is the time from the first byte the browser sends to request the initial document until the last byte of all embedded objects (images, JavaScript files, style sheets, ...) has been received. Network analysis tools like HttpWatch or Fiddler visualize the individual downloads in a timeline view. The following illustration shows the network timeline when accessing Google Maps (http://maps.google.com) with an empty browser cache, captured with Fiddler:

Network Timeline showing Network Requests but no Browser Activities


The initial document request returned after 1.6s. Embedded objects are downloaded after the initial document has been retrieved. It turns out there are 2 additional HTML documents, a list of images and some JavaScript files. After 5 seconds (when main.js was downloaded) we see a small gap before the remaining requests are downloaded. We can assume that the gap represents JavaScript execution time that delay-loaded some other objects, but we cannot be fully sure about that.

From this analysis alone it is hard to tell what the perceived end user response time really is. Is it 1.6 seconds, because that is when the browser could start rendering the initial content of the HTML document? Is it roughly 5 seconds, when the first batch of embedded objects was fully downloaded? Or is it 8 seconds, the time until the last request completed? Or is the truth somewhere in between?
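The ambiguity can be made concrete with a small sketch. The request records below are illustrative, only loosely modeled on the Google Maps trace described above; they are not actual measured values. Each candidate "response time" is just a different cut through the same captured network data:

```javascript
// Illustrative network requests (times in seconds), loosely based on the
// Google Maps trace discussed in the text -- not real measurements.
const requests = [
  { url: '/maps',      start: 0.0, end: 1.6 }, // initial HTML document
  { url: '/main.js',   start: 1.8, end: 5.0 }, // last object of the first batch
  { url: '/tile1.png', start: 5.4, end: 8.0 }, // delay-loaded object
];

// Candidate 1: time until the initial document was retrieved.
const initialDocumentTime = requests[0].end;

// Candidate 2: time until the very last request completed.
const lastRequestTime = Math.max(...requests.map(r => r.end));

console.log(initialDocumentTime, lastRequestTime);
```

Both numbers are technically correct answers to "how long did the page take?", which is exactly why network data alone cannot settle the question of perceived response time.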

There is more than meets the "HTTP Traffic" Eye

The browser does much more than just download resources from the server. The DOM (Document Object Model) is built and maintained for the downloaded document. Styles are applied to DOM elements based on the definitions in style sheets. JavaScript gets executed at different points in time, triggered by certain events, e.g., onload or onclick. Finally, the DOM and all the images it contains are rendered to the screen.

Using a tool like dynaTrace AJAX Edition we get all this additional activity information, showing where and when additional time is spent in the browser for JavaScript execution, rendering, or waiting for asynchronous network requests. We also see page events like onLoad or onError:

Timeline of all Browser Activities


Looking at this timeline view of the same Google Maps request now tells us that the browser started rendering the initial HTML document after 2 seconds. Throughout the download of the embedded objects the browser rendered additional content. The onLoad event was triggered after 4.8 seconds; this is the time when the browser completed building the initial DOM of the web page, including all referenced objects (images, css, ...). The execution of main.js, the last JavaScript file to be downloaded, caused roughly 2 seconds of JavaScript execution time, high CPU usage in the browser, additional network downloads and DOM manipulations. The high CPU utilization indicates that the browser was not very responsive to mouse or keyboard input, as JavaScript almost exclusively consumed the processor. DOM manipulations executed by JavaScript were rendered after JavaScript execution completed (after 7.5s and 8s).

So what is the perceived end user performance?

I believe there are different stages of perceived performance and perceived response time.

The First Impression of speed is the time it takes to see something in the browser's window (Time To First Visual). We can measure it by looking at the first rendering (drawing) activity. A detailed description of browser rendering and the inner workings of the rendering engine can be found in Alois's blog entry about Understanding Browser Rendering.

The Second Impression is when the initial page is fully loaded (Time To OnLoad). This can be measured via the onLoad event, which the browser triggers when the DOM is fully loaded, meaning the initial document and all embedded objects have been loaded.

The Third Impression is when the web site actually becomes interactive for the user (Time To Interactivity). Heavy JavaScript execution that manipulates the DOM can make the web page non-interactive for the end user. This is very often seen when expensive CSS selector lookups are used (check out the blogs about jQuery and Prototype CSS Selector Performance) or when using dynamic elements like JavaScript menus (check out the blog about dynamic JavaScript menus).
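The three stages above can be sketched as a computation over a captured activity timeline. The activity records and times below are made up for illustration (roughly matching the shape of the retail-store example discussed next), and the rule "interactive once the last long blocking script finishes" is a simplifying assumption, not the exact heuristic any particular tool uses:

```javascript
// Illustrative browser activities (times in seconds) -- invented values.
const activities = [
  { type: 'render', start: 0.9,  end: 1.0  }, // first drawing activity
  { type: 'script', start: 3.0,  end: 13.0 }, // long blocking JavaScript
  { type: 'onload', start: 13.0, end: 13.0 }, // onLoad event fired
];

// First Impression: first rendering activity on screen.
const timeToFirstVisual = activities.find(a => a.type === 'render').start;

// Second Impression: the browser's onLoad event.
const timeToOnLoad = activities.find(a => a.type === 'onload').start;

// Third Impression (simplified): the page is interactive once the last
// long-running blocking script has finished executing.
const timeToInteractivity = Math.max(
  ...activities.filter(a => a.type === 'script').map(a => a.end));
```

Deriving all three numbers from one recorded timeline is what makes the measurement repeatable across test runs.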

Let's look at a second example and identify the different impression stages. The following image shows a page request to a product page on a very popular online retail store:

3 Impression Phases


The initial page content is downloaded rather quickly and rendered to the screen within the first second (First Impression). It takes a total of about 3 seconds for some of the initial images that make up the page's initial content to load (Second Impression). Heavy JavaScript that manipulates the DOM then makes the page unresponsive for about 10 seconds, also delaying the onLoad event, at which point the page delay-loads most of its images. In this case the user sees some of the content early on (mostly text from the initial HTML), but then needs to wait another 10 seconds until the remaining images are delay-loaded and rendered by the browser (Third Impression). Due to the high CPU usage and DOM manipulations the page is also not very interactive during that time, causing a bad end user perception of the page's performance.

How to Measure? Stop Watch Measuring vs. Tool Supported Measuring
The idea for this blog post came from talking with performance testing engineers at one of our clients. I introduced them to the dynaTrace AJAX Edition and was puzzled by a small gadget they had on their table: a stop-watch.

Their task was to measure end-user response time for every build of their new website, in order to verify that the times were within defined performance thresholds and to identify regressions from build to build. They used the stop-watch to measure the time it took to load each single page and the time until the page became responsive. The "manually" measured numbers were put into a spreadsheet, which allowed them to verify their performance values.

Do you see the problems in this approach?

Not only is this method of measuring time very inaccurate, especially when we talk about precise timings in tenths of seconds; every performance engineer also has a slightly different perception of what it means for the site to be interactive. And it involves additional manual effort, as the timing can only be taken during manual tests.

Automate measuring and measure accurately

The solution to this problem is rather easy. With tools like dynaTrace AJAX Edition we capture performance measures like JavaScript execution, rendering time, CPU utilization, asynchronous requests and network requests. This is possible not only for manual tests but also in an automated test environment. Letting a tool do the job eliminates the inaccuracy of manual time-taking and the subjective perception of performance.

When using the dynaTrace AJAX Edition, as seen in the examples above, all performance-relevant browser activities are automatically captured, enabling us to determine the time of the 3 Impression Stages. The blog article "Automate Testing with Watir" shows how to use dynaTrace AJAX Edition in combination with automated testing tools. The tool also provides the ability to export captured data to XML or to spreadsheet applications like Excel, supporting the use case of automated regression analysis across different website versions/builds.
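The exported per-build numbers can then feed a simple automated threshold check, replacing the stopwatch-and-spreadsheet workflow. The metric names, threshold values and build numbers below are hypothetical; in practice the measured values would come from the exported tool report:

```javascript
// Hypothetical per-build thresholds and measurements (seconds).
const thresholds = { timeToFirstVisual: 1.5, timeToOnLoad: 5.0 };
const build42    = { timeToFirstVisual: 1.2, timeToOnLoad: 6.3 };

// Return the names of all metrics that exceed their threshold.
function regressions(measured, limits) {
  return Object.keys(limits).filter(metric => measured[metric] > limits[metric]);
}

console.log(regressions(build42, thresholds)); // prints [ 'timeToOnLoad' ]
```

Running such a check on every build turns "the page feels slow" into a reproducible pass/fail signal, which is exactly what manual stopwatch timing cannot provide.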

Using tools like dynaTrace AJAX Edition for Internet Explorer, YSlow or PageSpeed for Firefox, or the DevTools for Chrome enables automated website performance measurement in both manual and automated test environments. Continuously measuring website performance in the browser allows you to always focus on end user performance, which in the end determines how successful your website will be.


More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi
