
Third-Party Content Management Applied

Four steps to gain control of your Page Load Performance

Today's websites are often cluttered with third-party content that slows down page load and rendering times, hampering the user experience. In my first blog post, I discussed how third-party content impacts your website's performance and identified common problems with its integration. Today I want to share the experience I have had as a developer and consultant with managing third-party content. In the following, I will show you best practices for integrating third-party content and for convincing your business that it will benefit from establishing third-party management.

First the bad news: as a developer, you have to get the commitment to establish third-party management and change the integration of third-party content from the highest business management level possible - ideally the CEO level. Otherwise you will run into problems trying to implement improvements. The good news is that, in my experience, this is an achievable goal - you just have to bring the problems up the right way, with hard facts. Let's start our journey toward third-party content management from two possible starting points that I've seen in the past. The first is triggered when someone from the business has a bad user experience and wants to find out who is responsible for the slow pages. The second is that you as the developer know that your page is slow. No matter where you start, the first step is to gather the correct hard facts.

Step 1: Detailed third-party content impact analysis
For a developer this isn't really difficult. The only thing we have to do is use the Web Performance Optimization Tool of our choice and take a look at the page load timing. What we get is a picture like the screenshot below. We as developers immediately recognize that we have a problem - but for the business this is a diagram that requires an explanation.

As we want to convince them, we should make it easy for them to understand. Something that in my experience works well is to take the time and implement a URL parameter that turns off all third-party content for a webpage. Then we can capture a second timeline from the same page without the third-party requests (see Figure 2). Everybody can now easily see that there are huge differences:
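Such a switch can be sketched as follows. This is a minimal example assuming all third-party snippets are injected from one central loader; the parameter name "tpc" and the script URLs are hypothetical:

```javascript
// Kill switch for third-party content, driven by a URL parameter.
// Assumption: all third-party snippets are injected through this one
// central loader; the parameter name "tpc" and the URLs are hypothetical.

function thirdPartyEnabled(pageUrl) {
  // e.g. https://example.com/product?tpc=off disables all third-party content
  return new URL(pageUrl).searchParams.get("tpc") !== "off";
}

function loadThirdPartyScript(src) {
  const script = document.createElement("script");
  script.src = src;
  script.async = true;
  document.head.appendChild(script);
}

// Browser-only entry point: inject nothing when the switch is off.
if (typeof window !== "undefined" && thirdPartyEnabled(window.location.href)) {
  loadThirdPartyScript("https://tracking.example.com/pixel.js");
  loadThirdPartyScript("https://social.example.com/buttons.js");
}
```

With this in place, capturing the "without TPC" timeline is just a matter of appending ?tpc=off to the page URL in the performance tool.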

We can present these timelines to the business as well but we still have to explain what all the boxes, timings, etc., mean. We should invest some more time and create a table like Table 1, where we compare some main Key Performance Indicators (KPI).

KPI                      Page with TPC    Page without TPC
First Impression Time
Onload Time
Total Load Time
JavaScript Time
Number of Domains
Number of Resources
Total Page Size
As a single page is not representative, we prepare similar tables for the five most important pages. Which pages these are depends on your website; landing pages, product pages and pages on the "money path" are potentially interesting. Our web analytics tool can help us find them.

Step 2: Inform business about the impact
During step 1 we discovered that the impact is significant, we collected facts, and we still think we have to improve the performance of our application. Now it's time to present the results of the first step to the business. From my experience the best way to do this is a face-to-face meeting with the high-level business executives. CEOs, CTOs and other business unit executives are the appropriate attendees.

The presentation we give during this meeting should cover the following three major topics:

  • Case Study facts from other companies
  • The hard facts we have collected
  • Recommendations for improvements

Google, Bing, Amazon, etc., have done some case studies that show the impact of slow pages on the revenue and the users' interaction with the website. Amazon, for example, found out that a 100 ms slower page reduces revenue by 1%. I have attached an example presentation to this blog, which should provide some guidance for our presentation and contains some more examples.

After this general information we can show the hard facts about our system, and as the business is now aware of the relationship between performance and revenue they normally listen carefully. Now we're no longer talking about time but money.

At the end of our presentation we make some recommendations on how to improve the integration of third-party content. Don't be shy - no third-party content plugin is untouchable at this point. Some of the recommendations can only be decided by the business, not by development. Our goals for this meeting are to get the commitment to proceed, to get support from the executives when discussing implementation alternatives, and to schedule a follow-up meeting with the same attendees to show improvements. It would be nice to also get the executives' consent to the recommended improvements, but in my experience they seldom commit on the spot.

Step 3: Check third-party content implementation alternatives
Now that we have the commitments, we can start thinking about integration alternatives. If we stick to the standard implementation the provider recommends, we won't be able to make any improvements. We have to be creative and always try to create win-win situations. In this article I want to talk about four best practices I have encountered in the past.

Best Practice 1: Remove it
Every developer will now say "Okay, that's easy," and every businessperson will say "That's not possible because we need it!" But do you really need it? Let's take a closer look at social media plugins, tracking pixels and ads.

A lot of websites have integrated social media plugins like Twitter or Facebook buttons. Have you ever checked how often your users actually use them? A customer of ours had integrated five plugins. After six months they checked how often each of them was used and found that only one was used by anyone other than the QA department, which verifies after each release that all of them still work. With a little investigation they concluded that four of the five plugins could be removed because nobody used them.

What about tracking pixels? I have seen a lot of pages that have integrated not just one tracking pixel but five, seven or even more. Again, the question is: do we really need all of them? No matter whom we ask, we will always get a good explanation as to why a particular pixel is needed, but stick to the goal of reducing them down to one. Find out which one can deliver most or even all of the data that each department needs and remove the others. Problems we might run into are user privileges and business objectives that are defined for teams and individuals based on specific statistics. It takes some time to sort this out but, in the end, things will get easier because we have only one source for our numbers; we will stop arguing about which statistic delivers the correct values, as there is only one left. At one customer we removed five tracking pixels in one stroke. As this led to remarkable performance improvements, their marketing department even announced it to let customers know they care about their experience - a textbook example of the win-win situation mentioned earlier.

Other third-party content that is a candidate for removal is banner ads. Businessmen will now say this guy is crazy to make such a recommendation, but if your main business is not earning money by displaying ads, it might be worth taking a look. Take Amazon's finding that 100 ms of additional page load time reduces revenue by one percent, and think of the example page where ads consume about 1000 ms of page load time - 10 times as much. This would mean that we lose 10 × 1% = 10% of our revenue just because of ads. The question now is: "Are you really earning 10% or more of your total revenue with ads?" If not, you should consider removing ads from your page.

Best Practice 2: Move resource loading to after the onload event
As we have now removed all unnecessary third-party content, we still have some left. For the user experience, apart from the total load time, the first impression time and the onload time are the most important timings. To improve these timings we can implement lazy loading, where parts of the page are loaded after the onload event via JavaScript; several libraries are available that help you implement this. There are two things you should be aware of: first, you are only moving the starting point for the download of the resources, so you are not reducing the download size of your page or the number of requests. Second, lazy loading only works when JavaScript is available in the user's browser, so you have to make sure that your page is usable without JavaScript. Candidates for moving the download starting point back are plugins that only work if JavaScript is available or that are not vital to the usage of the page. Ads, social media plugins, maps, etc., are in most cases such candidates.
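The deferral can be sketched like this. The decision logic is split into a small pure function so it can be reasoned about outside a browser; the script URL is a hypothetical placeholder:

```javascript
// Sketch: defer a third-party widget until after the window "load" event.
// The page has finished loading once document.readyState is "complete".
function shouldInjectNow(readyState) {
  return readyState === "complete";
}

function injectScript(src) {
  const script = document.createElement("script");
  script.src = src;
  script.async = true;
  document.head.appendChild(script);
}

function loadAfterOnload(src) {
  if (shouldInjectNow(document.readyState)) {
    injectScript(src); // the load event has already fired
  } else {
    window.addEventListener("load", () => injectScript(src));
  }
}

// Browser-only entry point (skipped outside a browser environment).
if (typeof document !== "undefined") {
  loadAfterOnload("https://ads.example.com/banner.js");
}
```

Note that the onload event fires earlier with this pattern simply because the deferred resources no longer count toward it; the bytes still travel later.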

Best Practice 3: Load on user click
This is an interesting option if you want to integrate a social media plugin. The standard implementation of such a plugin looks like the figure below: it consists of a button to trigger the like/tweet action and the number of likes/tweets.

To improve this, the question that has to be answered is: Do the users really need to know how often the page was tweeted, liked, etc.? If the answer is no, we can save several requests and download volume. All we have to do is deliver a link that looks like the action button and, if the user clicks on the link, we can open a popup window or an overlay where the user can perform the necessary actions.
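A sketch of this load-on-click pattern, using Twitter's public web intent URL as the share target (the CSS class "tweet-button" and the popup dimensions are hypothetical choices):

```javascript
// Sketch: a plain link styled like a tweet button; the third-party page
// is opened only when the user actually clicks. No third-party JavaScript
// is downloaded up front.

function shareUrl(pageUrl, text) {
  // Twitter web intent URL; other providers offer similar share endpoints.
  return "https://twitter.com/intent/tweet" +
    "?url=" + encodeURIComponent(pageUrl) +
    "&text=" + encodeURIComponent(text);
}

function attachShareHandler(link) {
  link.addEventListener("click", (event) => {
    event.preventDefault();
    // Open the share dialog in a small popup window on demand.
    window.open(link.href, "share", "width=550,height=420");
  });
}

// Browser-only wiring (selector is hypothetical).
if (typeof document !== "undefined") {
  const link = document.querySelector("a.tweet-button");
  if (link) {
    link.href = shareUrl(window.location.href, document.title);
    attachShareHandler(link);
  }
}
```

Because the href is a real link, the button still works without JavaScript - the share page simply opens in a new tab instead of a popup.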

Best Practice 4: Maps vs. static Maps
This practice focuses on the integration of maps like Google Maps or Bing Maps on our page. All around the Web you can see map integrations where the maps are very small and only used to give the user a hint as to where the point of interest is located. To show the user this hint, several JavaScript files and images have to be downloaded. In most cases the user doesn't need to zoom or reposition the map, and as the map is small it's also hard to use. Why not use the static map implementation that Bing Maps and Google Maps offer? To figure out the advantages of the static implementation, I have created two HTML pages that show the same map: one uses the standard implementation and the other the static implementation.

After capturing the timings we get the following results:

KPI                      Standard Google Maps    Static Google Maps    Difference in %
First Impression Time    493 ms                  324 ms                -34%
Onload Time              473 ms                  368 ms                -22%
Total Load Time          1801 ms                 700 ms                -61%
JavaScript Time          563 ms                  0 ms                  -100%
Number of Domains        6                       1                     -83%
Number of Resources      43                      2                     -95%
Total Page Size          636 KB                  77 KB                 -88%

When we take a closer look at the KPIs we can see that every KPI is better for the static Google Maps implementation. The timing KPIs in particular improve: the first impression time by 34% and the onload time by 22%. The total load time decreases by roughly 1.1 seconds, which is 61% less, and this really has a big impact on the user's experience.

Some people will argue that this approach is not applicable because they want to offer the map controls to their customers. But remember Best Practice 3 - load on user click: as soon as the user signals the intention of interacting with the map by clicking on it, we can offer a bigger and easier-to-use map by opening a popup, an overlay or a new page. The only thing development has to do is surround the static image with a link tag.
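The static-image-plus-link combination can be sketched as follows. The static map URL follows Google's documented Static Maps API format, but the coordinates, image size and API key are placeholders:

```javascript
// Sketch: a static map image wrapped in a link to the full interactive map.
// One <img> request replaces dozens of JavaScript and tile requests; the
// interactive map is opened only on demand (Best Practice 3).

function staticMapUrl(lat, lng, zoom, width, height, apiKey) {
  // Google Static Maps API URL format.
  return "https://maps.googleapis.com/maps/api/staticmap" +
    "?center=" + lat + "," + lng +
    "&zoom=" + zoom +
    "&size=" + width + "x" + height +
    "&key=" + apiKey;
}

function interactiveMapUrl(lat, lng, zoom) {
  // Link target for the full Google Maps view of the same location.
  return "https://www.google.com/maps/@" + lat + "," + lng + "," + zoom + "z";
}

function staticMapMarkup(lat, lng, zoom, apiKey) {
  return '<a href="' + interactiveMapUrl(lat, lng, zoom) + '">' +
    '<img src="' + staticMapUrl(lat, lng, zoom, 300, 200, apiKey) +
    '" alt="Map of the point of interest" width="300" height="200"></a>';
}
```

The markup degrades gracefully: without JavaScript the link still opens the full map in a new page.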

Step 4: Monitor the performance of your web application / third-party content
As we need to show improvements in our follow-up meeting with business executives, it's important to monitor how the performance of our website evolves over time. There are three things that should be monitored by business, operations and development:

  1. Third-party content usage by customers and generated business value - business monitoring
  2. The impact of newly added third-party content - development monitoring
  3. The performance of third-party content in the client browser - operations monitoring

Business Monitoring
An essential part of the business monitoring should be a check as to whether the requested third-party features contribute to the business value. Is the feature used by the customer, or does it help us increase our revenue? We have to ask this question again and again - not only once at the beginning of development, but every time business, development and operations meet to discuss web application performance. If we ever conclude that a feature is not adding value, we should remove it as soon as possible.

Operations Monitoring
There are only a few tools that help us monitor the impact of third-party content for our users. What we need is either a synthetic monitoring system like Gomez and Keynote provide, or a monitoring tool that really sits in our users' browsers and collects the data there like dynaTrace UEM.

Synthetic monitoring tools allow us to monitor the performance from specified locations all over the world. The only downside is that we are not getting data from our real users. With dynaTrace UEM we can monitor the third-party content performance of all our users wherever they are situated, and we get the timings they really experienced. The figure below shows a dashboard from dynaTrace UEM that contains all the important data from the operations point of view. The pie chart and the table below it indicate which third-party content provider has the biggest impact on the page load time and how that impact is distributed. The three line charts on the right show the request trend, the total page load time and onload time, and the average time that third-party content contributes to your page performance.

Development Monitoring
It is very important that development is able to compare the KPIs between two releases and the differences between the pages with and without third-party content. We have already established functional web tests that integrate with a web performance optimization tool and deliver the necessary values for the KPIs. We just have to reuse the switch we established during Step 1 and run automatic tests on the pages we have identified as the most important. From this moment on we will always be able to automatically find regressions caused by third-party content.
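Such a regression check can be sketched as a small comparison function. The KPI names mirror Table 1, and the 50% overhead threshold is a hypothetical example:

```javascript
// Sketch of a development-monitoring check: compare the KPIs captured for a
// page with and without third-party content (using the switch from Step 1)
// and flag any KPI whose overhead exceeds a tolerated percentage.

function findRegressions(withTpc, withoutTpc, maxOverheadPct) {
  const regressions = [];
  for (const kpi of Object.keys(withoutTpc)) {
    const base = withoutTpc[kpi];
    const actual = withTpc[kpi];
    const overheadPct = base === 0 ? 0 : ((actual - base) / base) * 100;
    if (overheadPct > maxOverheadPct) {
      regressions.push({ kpi, overheadPct: Math.round(overheadPct) });
    }
  }
  return regressions;
}

// Example run with made-up measurements (milliseconds):
const result = findRegressions(
  { onloadTime: 1800, totalLoadTime: 4200 }, // page with TPC
  { onloadTime: 1200, totalLoadTime: 2000 }, // page without TPC
  50 // fail the check if third-party content adds more than 50%
);
// onloadTime overhead is exactly 50% (not flagged); totalLoadTime is 110% (flagged)
```

Wired into the nightly functional web tests, a non-empty result list can fail the build and point directly at the offending KPI.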

We may also consider enhancing our switch and making each of the third-party plugins individually switchable. This allows us to check the overhead a new plugin adds to our page. It also helps us decide which plugin we want to turn on if there are two or more similar plugins.

Last but not least, now that business, operations and development have all the necessary data to improve the user experience, we should meet regularly to check the performance trend of our pages and find solutions to upcoming performance challenges.

Conclusion
It is not a big deal to start improving the third-party content integration. If we want to succeed it's necessary that business, development and operations work together. We have to be creative, we have to make compromises and we have to be ready to use different methods of integration - never stop aiming for a top performing website. If we take things seriously, we can improve the experience of our users and therefore increase our business.

More Stories By Klaus Enzenhofer

Klaus Enzenhofer has several years of experience and expertise in the field of Web Performance Optimization and User Experience Management. He works as Technical Strategist in the Center of Excellence Team at dynaTrace Software. In this role he influences the development of the dynaTrace Application Performance Management Solution and the Web Performance Optimization Tool dynaTrace AJAX Edition. He mainly gathered his experience in web and performance by developing and running large-scale web portals at Tiscover GmbH.

