Three Tips to Successfully Load Test Adobe Flex Applications

It has become increasingly important to performance test these web applications under different loads

The world has changed dramatically for organizations that use web-enabled business applications in enterprise environments. The complexities of modern applications, which include multi-tiered, globally distributed architectures, SOA and a host of other new technologies, have forced major change in the way enterprises develop, test and manage web-enabled applications. Specifically, Rich Internet Application (RIA) technologies such as Adobe Flex allow a faster, more engaging, interactive experience with browser-enabled applications and services. While an attractive option for developing expanded application capabilities, these RIA technologies bring new challenges to organizations that need to deploy application features and functions in a compressed time frame and at a lower cost.

To prepare for this technology shift, companies must adapt their testing approach as well. Given the way new technologies behave when deployed in Internet/intranet applications, it's important to be careful in approaching the testing of Flex applications. When rolling out next-generation applications on a mission-critical enterprise application platform, you need to be confident that your systems will perform. As load testing is one of the last crucial tasks before launching a web application, it's usually done under pressure and with tight time constraints. Before we discuss tips to make you successful when load testing Adobe Flex applications, let's first look at what makes Adobe Flex applications unique.

Adobe Flex applications may be different from applications you've worked with before. For classic HTML web applications, the server does all of the processing, and rendering the HTML requires full page refreshes. Flex applications are different because content is delivered without having to reload the page. These applications download the Flash client and run in the browser, or, for Adobe AIR applications, on the desktop. The client is then updated by the server continuously and asynchronously. For this reason, Adobe Flex applications present a very different server load profile.

We all know it is important to measure transaction response times when performance testing web applications. But Flex technology can cause significant increases in the number of browser-to-server HTTP calls made in the background, and this increase in traffic can have a profound effect on performance. While users might not be aware of the round-trips between the browser and a distant server, they will definitely notice a performance problem if the application is slow or doesn't work because of the increased load. Knowing the scalability limitations of your deployments is crucial.

Innovative RIA technologies like Flex present new challenges with respect to emulating a realistic load. If you're developing or deploying applications in this newer technology, successfully addressing these challenges requires thinking about load testing in new ways. Let's look at the new challenges in load testing Flex applications and three tips to improve application performance and enrich the user experience.

Tip 1: Make sure you can decode AMF and process internal identifiers
Adobe's Flex AMF (Action Message Format) protocol is used to exchange data between an Adobe Flash application and a server, using remote procedure calls. Since the data is in a compressed binary format, communication is faster and more efficient than a web service. Older load testing solutions claim they can handle "Flex" technology, but have difficulty doing so in the real world. Many load testing tools rely on data being sent over HTTP and don't tap into the AMF protocol. Therefore, they don't go deep enough to understand what is being exchanged between client and server. For optimum testing, make sure your load testing tool decodes these objects so that you can parameterize the requests, extract data from the response and validate the response. Users should look for automated variable extraction and parameterization, automated comparison features to show changes from one code base to the next, and automated documentation of results so test output is immediately actionable by all stakeholders.
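To see why AMF traffic can't be treated as opaque text, consider how AMF3 encodes even a simple integer. The sketch below (plain Python, no load testing tool assumed) decodes an AMF3 U29 variable-length integer, the encoding AMF3 uses for lengths and reference indices:

```python
def decode_u29(data: bytes) -> tuple[int, int]:
    """Decode an AMF3 U29 variable-length integer.

    The first three bytes contribute 7 bits each, with the high bit
    acting as a continuation flag; a fourth byte, if present,
    contributes all 8 bits. Returns (value, bytes_consumed).
    """
    value = 0
    for i in range(4):
        byte = data[i]
        if i < 3:
            value = (value << 7) | (byte & 0x7F)
            if not (byte & 0x80):        # continuation flag clear: done
                return value, i + 1
        else:
            value = (value << 8) | byte  # fourth byte uses all 8 bits
            return value, 4
    raise ValueError("unreachable")
```

A tool that only sees HTTP bodies as text cannot recognize, let alone parameterize, values packed this way; it must decode the binary stream first.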

In addition, AMF message protocol uses internal identifiers (clientId, DSId, etc.) to maintain the AMF session. Parameters in an AMF session are dynamically generated and are numerous. To avoid painful manual configuration of these parameters, make sure that your load testing tool automatically processes serialized objects and identifiers.
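The correlation step can be sketched as follows, assuming the responses have already been decoded into Python dicts. The field names `clientId` and `DSId` come from the AMF message headers described above; the helper itself is hypothetical and stands in for the automatic processing a good tool performs:

```python
SESSION_IDS = ("clientId", "DSId")

def correlate(decoded_response: dict, next_request: dict) -> dict:
    """Copy the dynamic session identifiers from a decoded AMF
    response into the next request, replacing the stale values
    captured at recording time."""
    updated = dict(next_request)
    for key in SESSION_IDS:
        if key in decoded_response:
            updated[key] = decoded_response[key]
    return updated
```

Doing this by hand for every identifier in every request is exactly the painful manual configuration the tool should spare you.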

Tip 2: Make sure you can support customized external messages
AMF3 compact binary format helps integrate binary data exchange into all server platforms. The protocol improves the quality of all remote and messaging solutions for Flex and optimizes data exchange. AMF3 reduces the quantity of exchanged information and avoids redundant messages. In particular, AMF3 externalizable messages allow developers to customize the serialization of objects exchanged in AMF.

When selecting a load testing tool, make sure it allows you to integrate custom code into it. For optimum testing, the customized code should be loaded and run by the testing tool to correctly replay custom messages.
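One plausible shape for such an integration point is a registry mapping an externalizable class alias to user-supplied read logic. The sketch below is hypothetical, standing in for whatever plug-in mechanism your particular tool provides:

```python
from typing import Callable

_readers: dict[str, Callable[[bytes], object]] = {}

def register_reader(class_alias: str, reader: Callable[[bytes], object]) -> None:
    """Register custom deserialization code for one externalizable class."""
    _readers[class_alias] = reader

def read_externalizable(class_alias: str, payload: bytes) -> object:
    """Replay-time hook: dispatch the raw bytes of an externalizable
    object to the custom reader supplied by the tester."""
    try:
        return _readers[class_alias](payload)
    except KeyError:
        raise ValueError(f"no custom reader for {class_alias!r}") from None
```

Without a hook like this, the tool has no way to make sense of bytes whose layout is defined only in the application's own code.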

Tip 3: Make sure you can support polling and streaming
Many Flex applications require that some data are regularly pushed from the server to the client. Servers using AMF protocol update client data in two ways:

  • The polling method: This is when the browser queries the server at regular intervals. Technically simple to implement, the method's downside is that it needlessly overloads the server and is not very responsive. When these applications are recorded with a load testing tool, a high number of browser-to-server requests is recorded along with other, more significant requests. It's important to make sure your tool can automatically simplify your recording to get a clear, maintainable script.
  • The streaming method: In this case, the client sends a single request to the server and the server responds when pertinent information is available, without closing the request. The server can send information to the client again using the same connection, without having to wait for a new request. Using this method, client data can be rapidly updated while network traffic is kept to a minimum. The testing tool must be able to keep the stream open when it plays a streaming request. This request will block the connection so the tool has to use a second connection for classic requests.
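The difference in server load between the two methods can be illustrated with a toy simulation (pure Python, with a hypothetical in-memory "server" holding updates that become available at given ticks):

```python
from collections import deque

def run_polling(updates: deque, ticks: int):
    """Polling: the client queries once per tick; most queries find
    nothing. updates holds (ready_tick, payload) pairs in order."""
    received, empty_queries = [], 0
    for tick in range(ticks):
        if updates and updates[0][0] <= tick:
            received.append(updates.popleft()[1])
        else:
            empty_queries += 1          # a wasted round-trip
    return received, empty_queries

def run_streaming(updates: deque):
    """Streaming: one open request; the server pushes only real
    updates, so no query is ever wasted."""
    return [payload for _, payload in updates]
```

For two updates over ten polling intervals, polling costs eight empty round-trips that streaming avoids entirely; under thousands of virtual users, that overhead is what distorts the load profile.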

In both cases, typical HTTP response times are not useful. For regular HTTP requests, load testing tools measure the time between when the request is sent and when the response is received. It's important that your load testing tool can measure what's really interesting: the time between when the server updates the client and when the client receives the update. Not only should the tool measure this information but this function should be automated so that testing can be done frequently and earlier in the lifecycle.
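That measurement can be sketched as below, assuming each pushed message carries an id and that send and receive timestamps have been captured on the server and client sides (all names are illustrative):

```python
def push_latencies(sent, received):
    """Match server-side send timestamps to client-side receive
    timestamps by message id; the difference is the push latency the
    user actually experiences, not the HTTP request/response time."""
    received_at = dict(received)        # message id -> receive time
    return {msg_id: received_at[msg_id] - sent_at
            for msg_id, sent_at in sent
            if msg_id in received_at}
```

In practice the two timestamps must come from synchronized clocks, or both be taken by the tool itself, for the difference to be meaningful.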

As more and more companies deploy mission-critical applications on the Internet, it has become increasingly important to performance test these web applications under different loads before they go live. While Adobe Flex provides expanded capabilities to develop powerful new applications in support of organizational needs, the technology differs from classic web applications. As a result, a new approach to load testing is required. To understand how your applications will behave in the Adobe Flex production environment, and to optimize them for higher performance, you need to be able to simulate not just the actions of one user, but the actions of many. By implementing the tips in this article, companies can quickly adapt their testing methodology to handle Flex applications and get to market faster and more easily than ever before, while improving productivity and saving money.

More Stories By Christophe Marton

Christophe Marton is co-founder of Neotys and co-creator of the load testing tool NeoLoad. Previously, he worked at Calendra, where he developed his expertise in the performance and load testing space and ultimately formed a vision for a next-generation solution. At Calendra he developed software solutions for the identity management market; the company was acquired by BMC Software in 2005. During his six-year tenure at Calendra, he developed the skills required to create, develop and take cutting-edge software to market. Marton is a computer engineering graduate of the ESIL engineering school in Marseille.
