
You Only Control One-Third of Your Page Load Performance!

You can’t rely on big third-party providers to always deliver high performance

You don't agree? Have you ever looked at the details of your page load time and analyzed what really drives it? Let me show you with a real-life example why, in most cases, you only control about one-third of the time required to load a page, while the rest is consumed by third-party content that is outside your control.

Be Aware of Third-Party Content
When analyzing web page load times we can use tools such as dynaTrace, Firebug or PageSpeed. The following two screenshots show timeline views from dynaTrace AJAX Edition. The timelines show all network downloads, rendering activities and JavaScript executions that happen while loading two almost identical pages. The question is: where does the huge difference come from?

Timeline of web page with and without third-party content showing a difference of 8 seconds in total page load time

The two screenshots below show these two pages as rendered by the browser. From your own application perspective it is the exact same page - the only difference is the additional third-party content. The screenshot on the left side refers to the first timeline, the screenshot on the right to the second timeline. To make the differences easier to see I have marked them with red boxes.

Screenshot of the page without and with highlighted third-party content

The left screenshot shows the page with content delivered by your application. That's all the business-relevant content you want to deliver to your users, e.g., information about travel offers. Over time this page got enriched with third-party content such as Tracking Pixels, Ads, Facebook Connect, Twitter and Google Maps. These third-party components make the difference between the two page loads. Everyone can easily see that this "enrichment" has an impact on page load performance and therefore affects the user experience. The super-fast page that finishes the download of all necessary resources after a little over two seconds is slowed down by eight seconds. Table 1 shows five KPIs (Key Performance Indicators) that represent the impact of the third-party content.

                                 With Third-Party Content    Without Third-Party Content
# of Domains                     -                           -
# of Resources                   176                         59
Total Bytes                      2856 KB                     897 KB
DNS [ms]                         1286                        -
Connect [ms]                     1176                        -

Table 1: Comparison of KPIs with and without third-party content
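
If you want to collect similar KPIs for your own pages without a dedicated tool, the browser's standard Resource Timing API exposes most of the raw data. The following is only a minimal sketch in plain JavaScript (run it from the browser console); the first-party host name passed in at the bottom is a placeholder you would replace with your own domain:

// Minimal sketch: aggregate Resource Timing entries into the KPIs of Table 1.
// Caveat: cross-origin (third-party) resources only report sizes and detailed
// timings if the provider sends a Timing-Allow-Origin header; otherwise
// transferSize and the DNS/connect phases show up as zero.
function collectKpis(firstPartyHost) {
  const entries = performance.getEntriesByType("resource");
  const kpis = { domains: new Set(), resources: 0, bytes: 0, dnsMs: 0, connectMs: 0 };

  for (const entry of entries) {
    const host = new URL(entry.name).hostname;
    kpis.domains.add(host);
    kpis.resources += 1;
    kpis.bytes += entry.transferSize || 0;
    kpis.dnsMs += entry.domainLookupEnd - entry.domainLookupStart;
    kpis.connectMs += entry.connectEnd - entry.connectStart;
  }

  return {
    "# of Domains": kpis.domains.size,
    "# of Resources": kpis.resources,
    "Total KB": Math.round(kpis.bytes / 1024),
    "DNS [ms]": Math.round(kpis.dnsMs),
    "Connect [ms]": Math.round(kpis.connectMs),
    "Third-party resources": entries.filter(
      (e) => !new URL(e.name).hostname.endsWith(firstPartyHost)
    ).length,
  };
}

console.table(collectKpis("your-own-domain.com")); // placeholder host name

Such a quick check immediately tells you how much of the page is actually under your control.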

Four Typical Problems with Third-Party Content
Let me explain four typical problems that come with adding third-party content and why they impact page load time.

Problem 1: Number of Resources
With every new third-party "feature" we add new resources that the browser has to download. In this example the number of resources increased by 117. Compare that with the SpeedOfTheWeb baseline for the shopping industry: even the best shopping page loads at least 72 resources. If we stuck with our original page, we would lead this category with just 59 resources.

Besides the 117 extra roundtrips needed to download these resources, the total download size of the page also grows significantly. Downloading the extra ~2 MB from the servers of the third-party content providers costs your customer additional time. How much depends on bandwidth and latency, and over a mobile connection it can be very time-consuming.
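
To get a feeling for the order of magnitude, here is a rough back-of-the-envelope sketch. The bandwidth, latency and parallelism figures are assumptions for a typical mobile connection, not measurements taken from this page:

// Rough sketch: estimate the extra time added by the third-party content.
// All connection figures below are assumptions (roughly 3G-class), not measurements.
const extraBytes = 2 * 1024 * 1024;        // ~2 MB of additional third-party content
const extraRequests = 117;                 // additional resources (see Table 1)
const bandwidthBytesPerSec = 2000000 / 8;  // assumed 2 Mbit/s downlink
const rttSeconds = 0.2;                    // assumed 200 ms round-trip time
const parallelism = 6;                     // assumed effective request parallelism

const transferTime = extraBytes / bandwidthBytesPerSec;             // ~8.4 s
const requestOverhead = (extraRequests / parallelism) * rttSeconds; // ~3.9 s

console.log(`~${(transferTime + requestOverhead).toFixed(1)} s of extra load time`);

Even under these optimistic assumptions the additional content costs well over ten seconds on a mobile connection, which is in the same order of magnitude as the eight-second difference seen in the timelines above.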

Problem 2: Connection Usage
Domain sharding is a good way to enable older browsers to download resources in parallel, but on modern websites it is often used too aggressively. How can you do too much domain sharding? Table 2 lists all the domains from which we download only one or two resources: 17 domains for just 23 resources - domain sharding at its best!

What about the connection management overhead? For each domain the browser has to make a DNS lookup to find out which server to connect to, and setting up each connection takes additional time. Our example needed 1286 ms for DNS lookups and another 1176 ms for establishing connections. Because almost every one of these domains belongs to a third-party content provider, you have no control over them and cannot reduce this overhead.

Table 2: Download domains from which fewer than three resources are downloaded
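
If you want to find these one-or-two-resource domains on your own pages, the same Resource Timing data can be grouped by host name. Again only a minimal sketch, and again the DNS and connect phases of third-party resources are only visible if the provider sends a Timing-Allow-Origin header:

// Minimal sketch: list domains from which fewer than three resources are loaded,
// together with the DNS and connect time they cost.
function sparseDomains(maxResources = 2) {
  const perDomain = new Map();

  for (const entry of performance.getEntriesByType("resource")) {
    const host = new URL(entry.name).hostname;
    const stats = perDomain.get(host) || { resources: 0, dnsMs: 0, connectMs: 0 };
    stats.resources += 1;
    stats.dnsMs += entry.domainLookupEnd - entry.domainLookupStart;
    stats.connectMs += entry.connectEnd - entry.connectStart;
    perDomain.set(host, stats);
  }

  return [...perDomain.entries()]
    .filter(([, stats]) => stats.resources <= maxResources)
    .map(([host, stats]) => ({ host, ...stats }));
}

console.table(sparseDomains());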

Problem 3: Unminified Resources
You try to reduce the download size of your page as much as possible. You have put a lot of effort into your CI process to automatically minify your JavaScript, CSS and images, and then you are forced to put, for example, ads on your pages. On our example page we find an ad provider that does not minify its JavaScript. The screenshot below shows part of the uncompressed JavaScript file.

Uncompressed JavaScript code of third-party content provider

I put the whole file through a minification tool and its size could be reduced by 20%. Again, you can't do anything about it.
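
For the resources you serve yourself, this is easy to automate. A minimal sketch of a Node.js build step using the terser npm package could look like the following; the file names are placeholders, and your build tool of choice may already do this for you:

// Minimal sketch: minify a JavaScript file as part of a CI/build step using terser.
// (npm install terser; the file names below are placeholders.)
const { minify } = require("terser");
const fs = require("fs");

async function minifyFile(inputPath, outputPath) {
  const source = fs.readFileSync(inputPath, "utf8");
  const result = await minify(source, { compress: true, mangle: true });

  fs.writeFileSync(outputPath, result.code);
  const saved = 100 - Math.round((result.code.length / source.length) * 100);
  console.log(`${inputPath}: ${saved}% smaller after minification`);
}

minifyFile("app.js", "app.min.js"); // placeholder file names

Of course this only helps for files you serve yourself; the unminified ad script shown above stays out of your reach.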

Problem 4: Awareness of Bad Response Times of Third-Party Content Providers
Within your data center you monitor the response times of incoming requests, and you get alerted when they degrade. You know when something goes wrong and you can do something about it. What about third-party content? Do Facebook, Google, etc., send you alerts when they are experiencing bad performance? You may say that these big providers will never have bad response times, but take a look at the following two examples.

Timeline with slow Facebook request

This timeline shows a very long-running resource request. You will never see this 10,698 ms request in your data center monitoring environment, because the resource is provided by Facebook, one of the third-party content providers of this page.

Timeline with slow Facebook and Google+ requests

The second example shows the timeline of a different page with the same problem. Here not only Facebook is slow but also Google+. The slow requests last between 1.6 and 3.5 seconds and have a big impact on your users' experience. The real problem: users don't attribute the bad experience to the third-party content provider but to YOU!
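
Even if the provider does not alert you, the browser can. Below is a minimal sketch using the standard PerformanceObserver API that flags slow third-party requests on the client and reports them; "/rum-beacon" is a hypothetical endpoint of your own monitoring backend, and the threshold is an arbitrary example value:

// Minimal sketch: flag slow third-party requests in the browser and report them.
// "/rum-beacon" is a hypothetical endpoint of your own monitoring backend.
const FIRST_PARTY = location.hostname;   // simplistic: everything else counts as third party
const SLOW_THRESHOLD_MS = 1500;          // arbitrary example threshold

const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const host = new URL(entry.name).hostname;
    if (host !== FIRST_PARTY && entry.duration > SLOW_THRESHOLD_MS) {
      navigator.sendBeacon(
        "/rum-beacon",
        JSON.stringify({ host, url: entry.name, durationMs: Math.round(entry.duration) })
      );
    }
  }
});

observer.observe({ type: "resource", buffered: true });

With data like this you at least know when a third-party provider is hurting your users, even if you cannot fix the provider itself.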

We have seen that third-party content has a big impact on user experience, and you can't rely on big third-party providers to always deliver high performance. Be aware of the problems that can occur when you put third-party content on your page, and be prepared to take action. In this article I have highlighted several issues you face with third-party content. What you can do to prevent these types of problems will be discussed in my next blog - Third-Party Content Management!

More Stories By Klaus Enzenhofer

Klaus Enzenhofer has several years of experience and expertise in the field of Web Performance Optimization and User Experience Management. He works as Technical Strategist in the Center of Excellence Team at dynaTrace Software. In this role he influences the development of the dynaTrace Application Performance Management Solution and the Web Performance Optimization Tool dynaTrace AJAX Edition. He mainly gathered his experience in web and performance by developing and running large-scale web portals at Tiscover GmbH.

