
You Only Control One-Third of Your Page Load Performance!

You can’t rely on big third-party providers to always deliver high performance

You don't agree? Have you ever looked at the details of your page load time and analyzed what really drives it? Let me show you with a real-life example why, in most cases, you control only about one-third of the time required to load a page; the rest is consumed by third-party content that is outside your control.

Be Aware of Third-Party Content
When analyzing web page load times we can use tools such as dynaTrace AJAX Edition, Firebug or PageSpeed. The two screenshots below show timeline views from dynaTrace AJAX Edition. The timelines capture all network downloads, rendering activities and JavaScript executions that happen while loading almost exactly the same page. The question is: where does the huge difference come from?

Timeline of web page with and without third-party content showing a difference of 8 seconds in total page load time

The two screenshots below show these two pages as rendered by the browser. From your application's perspective it is the exact same page; the only difference is the additional third-party content. The screenshot on the left corresponds to the first timeline, the one on the right to the second. To make the differences easier to see I have marked them with red boxes.

Screenshots of the page without and with third-party content (differences highlighted)

The left screenshot shows the page with content delivered by your application. That's all the business-relevant content you want to deliver to your users, e.g., information about travel offers. Over time this page got enriched with third-party content such as Tracking Pixels, Ads, Facebook Connect, Twitter and Google Maps. These third-party components make the difference between the two page loads. Everyone can easily see that this "enrichment" has an impact on page load performance and therefore affects the user experience. The super-fast page that finishes the download of all necessary resources after a little over two seconds is slowed down by eight seconds. Table 1 shows five KPIs (Key Performance Indicators) that represent the impact of the third-party content.

|                              | # of Domains | # of Resources | Total Bytes | DNS [ms] | Connect [ms] |
|------------------------------|--------------|----------------|-------------|----------|--------------|
| With Third-Party Content     |              | 176            | 2856 KB     | 1286     | 1176         |
| Without Third-Party Content  |              | 59             | 897 KB      |          |              |

Table 1: Comparison of KPIs with and without third-party content
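If you want to pull comparable KPIs from your own pages without a dedicated tool, the browser's Resource Timing API exposes most of them. The following is a minimal sketch (TypeScript, run in the page or the browser console), not how dynaTrace collects its data; note that cross-origin resources hide sizes and timings unless the third party sends a Timing-Allow-Origin header, so the third-party share is usually underreported.

```typescript
// Rough KPI collection via the Resource Timing API (a sketch, not dynaTrace's method).
// Cross-origin entries report 0 for transferSize and DNS/connect timings unless the
// third party sends a Timing-Allow-Origin header, so treat the totals as a lower bound.
const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];

const domains = new Set(entries.map((e) => new URL(e.name).hostname));
const totalBytes = entries.reduce((sum, e) => sum + (e.transferSize || 0), 0);
const dnsMs = entries.reduce((sum, e) => sum + (e.domainLookupEnd - e.domainLookupStart), 0);
const connectMs = entries.reduce((sum, e) => sum + (e.connectEnd - e.connectStart), 0);

console.table({
  "# of Domains": domains.size,
  "# of Resources": entries.length,
  "Total KB": Math.round(totalBytes / 1024),
  "DNS [ms]": Math.round(dnsMs),
  "Connect [ms]": Math.round(connectMs),
});
```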

Four Typical Problems with Third-Party Content
Let me explain four typical problems that come with adding third-party content and why each of them impacts page load time.

Problem 1: Number of Resources
With every new third-party "feature" we add new resources that the browser has to download. In this example the number of resources increased by 117. Let's compare that with the SpeedOfTheWeb baseline for the shopping industry, where even the best shopping pages load at least 72 resources. If we stuck with our original page, we would lead this category with just 59 resources.

Beyond the 117 extra roundtrips needed to download these resources, the total download size of the page also grows significantly. Downloading the extra ~2 MB from the third-party providers' servers costs your customers additional time. Depending on bandwidth and latency this download time varies, and over a mobile connection it can become really time-consuming.
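To get a feeling for what the extra payload alone costs, here is a back-of-envelope sketch; the connection types and bandwidth figures are illustrative assumptions, not measurements from this page, and latency, connection setup and parallel downloads are ignored.

```typescript
// Back-of-envelope: pure transfer time for ~2 MB of extra third-party payload.
// Bandwidth figures are illustrative assumptions; latency and parallelism are ignored.
const extraBytes = 2 * 1024 * 1024; // the ~2 MB added by third-party content
const bandwidthsKbitPerS: Record<string, number> = {
  "Mobile 3G (~1 Mbit/s)": 1_000,
  "DSL (~8 Mbit/s)": 8_000,
  "Cable (~50 Mbit/s)": 50_000,
};

for (const [connection, kbit] of Object.entries(bandwidthsKbitPerS)) {
  const seconds = (extraBytes * 8) / (kbit * 1000);
  console.log(`${connection}: ~${seconds.toFixed(1)} s just to transfer the extra bytes`);
}
```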

Problem 2: Connection Usage
Domain sharding is a good way to let older browsers download resources in parallel, but on many modern websites it is used too aggressively. How can there be too much domain sharding? Table 2 lists all the domains from which we download only one or two resources. There are 17 domains serving just 23 resources - domain sharding at its best!

What about connection-management overhead? For each domain the browser has to perform a DNS lookup to find out which server to connect to, and setting up each connection takes time as well. Our example spent 1286 ms on DNS lookups and another 1176 ms on establishing connections to the servers. Since almost every one of these domains delivers third-party content, you have no control over them and cannot reduce this overhead. The sketch after Table 2 shows how to break the overhead down per domain.

Table 2: Domains from which fewer than three resources are downloaded
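If you want to reproduce this kind of per-domain breakdown on your own pages, the Resource Timing API can approximate it. The sketch below is an approximation, not the tool's analysis: it groups resources by host, sums DNS and connect times, and lists the hosts that deliver fewer than three resources. Cross-origin hosts without a Timing-Allow-Origin header report those timings as zero, so the real third-party overhead is usually higher than what you see.

```typescript
// Sketch: group resources by domain and flag domains that serve fewer than three
// resources - pure connection overhead. Cross-origin domains report 0 ms for DNS and
// connect unless they send Timing-Allow-Origin, so real overhead is usually higher.
type DomainStats = { resources: number; dnsMs: number; connectMs: number };
const byDomain = new Map<string, DomainStats>();

for (const e of performance.getEntriesByType("resource") as PerformanceResourceTiming[]) {
  const host = new URL(e.name).hostname;
  const stats = byDomain.get(host) ?? { resources: 0, dnsMs: 0, connectMs: 0 };
  stats.resources += 1;
  stats.dnsMs += e.domainLookupEnd - e.domainLookupStart;
  stats.connectMs += e.connectEnd - e.connectStart;
  byDomain.set(host, stats);
}

const shardedDomains = [...byDomain].filter(([, stats]) => stats.resources < 3);
console.table(Object.fromEntries(shardedDomains));
```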

Problem 3: Unminified Resources
You try to reduce the download size of your page as much as possible, and you have put a lot of effort into your CI process to automatically minify your JavaScript, CSS and images. Then you are forced to put ads on your pages, for example. On our example page we find an ad provider that does not minify its JavaScript. The screenshot below shows part of the uncompressed JavaScript file.

Uncompressed JavaScript code of third-party content provider

I ran the whole file through a compressor tool and its size could be reduced by 20%. Again, there is nothing you can do about it.
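You can run that kind of check yourself on any third-party script you are forced to embed. Here is a hedged sketch in Node.js, assuming the terser npm package and a locally saved copy of the ad script (the file name is hypothetical):

```typescript
// Sketch (Node.js, assumes the "terser" package): estimate how much an unminified
// third-party script would shrink if its provider minified it.
import { readFile } from "node:fs/promises";
import { minify } from "terser";

async function minificationSavings(path: string): Promise<void> {
  const original = await readFile(path, "utf8");
  const result = await minify(original); // default compress + mangle settings
  const minified = result.code ?? original;
  const savedPercent = 100 * (1 - minified.length / original.length);
  console.log(
    `${path}: ${original.length} -> ${minified.length} bytes (${savedPercent.toFixed(1)}% smaller)`
  );
}

// Hypothetical local copy of the ad provider's JavaScript file.
minificationSavings("./third-party-ad.js").catch(console.error);
```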

Problem 4: Awareness of Bad Response Times of Third-Party Content Providers
Within your data center you monitor the response times of incoming requests and get alerted when they degrade; you know when something goes wrong and you can act on it. What about third-party content? Do Facebook, Google, etc., send you alerts when they are experiencing bad performance? You may object that these big providers never have bad response times, but take a look at the following two examples.

Timeline with slow Facebook request

This timeline shows a very long-running resource request. You will never see this 10,698 ms request in your data-center monitoring environment, because the resource is delivered by Facebook, one of the third-party content providers of this page.

Timeline with slow Facebook and Google+ requests

The second example shows the timeline of a different page with the same problem. Here not only Facebook is slow but also Google+. The slow requests last between 1.6 and 3.5 seconds and have a big impact on the experience of your users. The worst part: users do not attribute the bad experience to the third-party content provider but to YOU!
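Since no alert will arrive from the provider, the only way to notice such slow third-party requests is to measure them in the browser and report them to your own monitoring. The following is a minimal sketch; the 3000 ms threshold and the /rum/slow-third-party endpoint are invented for illustration.

```typescript
// Sketch: flag slow third-party resource requests in the browser and ship them to
// your own backend. Threshold and endpoint are illustrative assumptions.
const SLOW_MS = 3000;
const firstPartyHost = location.hostname;

new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as PerformanceResourceTiming[]) {
    const host = new URL(entry.name).hostname;
    if (host !== firstPartyHost && entry.duration > SLOW_MS) {
      // Hypothetical reporting endpoint in your own data center.
      navigator.sendBeacon(
        "/rum/slow-third-party",
        JSON.stringify({ url: entry.name, host, durationMs: Math.round(entry.duration) })
      );
    }
  }
}).observe({ type: "resource", buffered: true });
```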

What we have seen is that third-party content has a big impact on user experience. You can't rely on big third-party providers to always deliver high performance. You should be aware of the problems that can occur when you put third-party content on your page, and you really have to take action. In this article I have highlighted several issues you face with third-party content. What you can do to prevent these types of problems will be discussed in my next blog: Third-Party Content Management!

More Stories By Klaus Enzenhofer

Klaus Enzenhofer has several years of experience and expertise in the field of Web Performance Optimization and User Experience Management. He works as Technical Strategist in the Center of Excellence Team at dynaTrace Software. In this role he influences the development of the dynaTrace Application Performance Management Solution and the Web Performance Optimization Tool dynaTrace AJAX Edition. He mainly gathered his experience in web and performance by developing and running large-scale web portals at Tiscover GmbH.
