
Combining the Cloud with the Computing: Application Delivery Networks

What new challenges does Cloud Computing present for the enterprise?

IT executives are increasingly being asked to evaluate new cloud-based services that promise to improve business agility while lowering operating and capital costs within the enterprise. Yet often very little is known about the “cloud” itself. How does it work? What new challenges does it present for the enterprise?

While cloud vendors continue to roll out new technology to capture the imagination of application development and IT organizations, one area remains noticeably cloudy and overlooked: the cloud itself.

The first of the two words in “cloud computing” is often not well understood. In diagrams the cloud is almost always drawn tiny, dwarfed by the virtualized server farms providing on-demand computing power, as if the cloud were secondary and worked in a simple way: something goes in one side and shows up instantaneously on the other. Or perhaps it’s a control issue; after all, the cloud is seemingly outside the data-center, beyond direct control of IT...or is it?

For cloud computing to realize its full potential and become commonplace for a range of business processes and applications within the enterprise, the cloud itself needs to be treated as equally important as the computing aspect. The two must go hand in hand. For decades, enterprises have grown accustomed to private IP-VPN services such as MPLS offered by network providers. Such services offer high degrees of uptime, low-latency and packet-loss guarantees, and a single point of escalation for problem resolution. Yet the on-demand accessibility promised by cloud computing services is best fulfilled when any type of user can access applications anywhere in the world, at any time, through a common interface such as a Web browser. And it simply isn’t possible to run private IP-VPN services everywhere application users have access to a Web browser. As a result, the Internet is more often than not the de facto cloud used to fulfill the ubiquitous reach and economies of scale demanded by on-demand cloud applications.

Herein lies the challenge. The Internet cloud is not like a private network offered by a service provider. The Internet is a network of networks, consisting of over ten thousand individual network providers. And unlike traffic carried within a private WAN, not all networks are economically motivated to carry the bulk of Internet traffic generated by an on-demand cloud computing service. The first-mile provider offering bandwidth for the data-center and the last-mile access provider are the only two providers directly paid to connect the user to the application. All other Internet network providers have little economic incentive to exchange and deliver traffic, relying instead on sub-optimal, unreliable relationships called peering. Peering relationships manifest themselves as extra round-trip latency and packet loss by way of the Border Gateway Protocol (BGP), which is used to route application requests through the cloud between application users and the infrastructure [1]. Yet any latency or service interruption, whether caused by the computing infrastructure or by the cloud, degrades user experience and can damage customer satisfaction, resulting in abandonment and low adoption of cloud computing services.

To make matters worse, other protocols governing Web application delivery, such as the chatty TCP transport protocol and HTTP at the application layer, introduce further delivery bottlenecks for distributed users of on-demand cloud-based applications. Users far away from the computing infrastructure will experience slower response times and worse availability than users close to the resources. And the Internet opens new security vulnerabilities, ranging from Domain Name System (DNS) and distributed denial-of-service (DDoS) attacks to more advanced malicious activity exploiting application-specific vulnerabilities.
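The compounding effect of chatty protocols can be illustrated with a back-of-envelope model: because DNS resolution, the TCP handshake, and each sequential request/response all cost one round trip, total response time grows roughly linearly with round-trip time (RTT). The model below is a sketch for illustration only; the round-trip counts and server time are assumed, not measured.

```python
# Illustrative model (not a measurement): total fetch time for a small
# dynamic page grows linearly with RTT, because TCP and HTTP each require
# multiple sequential round trips before the page is complete.

def fetch_time_ms(rtt_ms, app_round_trips=10, server_think_ms=100):
    """Estimate response time: DNS lookup + TCP 3-way handshake +
    the sequential request/response round trips the application needs."""
    dns = rtt_ms             # one round trip to resolve the name (cache miss)
    tcp_handshake = rtt_ms   # SYN / SYN-ACK before any data flows
    transfers = app_round_trips * rtt_ms  # sequential object fetches
    return dns + tcp_handshake + transfers + server_think_ms

# A nearby user (20 ms RTT) vs. an intercontinental user (250 ms RTT):
near = fetch_time_ms(20)    # 340 ms
far = fetch_time_ms(250)    # 3100 ms
print(near, far, round(far / near, 1))
```

Under these assumed numbers the distant user sees roughly a 9x slowdown from distance alone, consistent with the 5-10x gap described below.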

The challenges associated with the Internet cloud are very real. What happens to application adoption when one user gets a 5-10x slower application response time than another, merely because of their greater distance from the computing servers? What if applications are unavailable due to issues with the Internet itself, such as congestion, de-peering, cable cuts or earthquakes? What happens if your in-cloud application is attacked by Internet hackers? As evidenced by a recent State of the Internet report [2], attack traffic on the Internet originated in over 139 unique countries. Over 400 unique ports were attacked, a twentyfold increase from just the prior quarter. DDoS attacks continued to exploit tactics identified years ago, alongside numerous high-visibility DNS hijackings. Network and routing outages remain commonplace. And Website and application hacks, such as SQL injection and cross-site scripting (XSS) attacks, have infected hundreds of thousands of Web properties. It is clear the Internet must transform into a predictable, reliable application delivery platform suitable for business use if it is to fulfill the promise of cloud computing within the enterprise.

Cloud computing providers need a strategy for optimizing the cloud for their on-demand applications and computing services on a global scale, while remaining as cost-effective as possible, in order to survive what is undoubtedly becoming an increasingly competitive environment. At the same time, they are pressured to ensure their infrastructure can cope with a rapidly escalating volume of data and to shield users from in-the-cloud bottlenecks outside of the data-center. For this reason, they are increasingly reliant on proven third-party providers for the reliable and cost-effective delivery of on-demand content and applications in the cloud, in order to solidify their position in this rapidly evolving and promising market.

One way of optimizing delivery over the Internet cloud has been through next-generation content delivery network (CDN) providers. To enable on-demand cloud computing services, however, such providers must go far beyond traditional CDN capabilities, because rich interactive websites and enterprise applications generally aren’t cacheable the way a large media file or image is. Dynamic content requires new application delivery optimizations that address the routing, transport and application-layer protocol inefficiencies introduced by the Internet cloud. Such optimizations allow globally distributed users to feel as though they are close to centralized computing resources, regardless of their distance from the infrastructure, while addressing other key availability, security and scalability bottlenecks associated with Internet-based application delivery.

Next-generation CDN providers deploy tens of thousands of distributed computing servers across the globe at the edge of the Internet, within one network hop of both application infrastructure and the vast majority of the world’s Internet users. In essence, this creates a distributed global “overlay” of the Internet that serves as the foundation for a better Internet experience. Through software written on this platform, a sophisticated set of algorithms and knowledge of real-time Internet conditions are applied to accelerating content, going well beyond static caching and traditional CDN capabilities to resolve the delivery bottlenecks of fully dynamic, on-demand applications. Essentially, these services use their own optimized protocols to overcome the distance-induced performance and availability challenges introduced by the BGP, TCP and HTTP protocols. Next-generation CDN services, often referred to as “Application Delivery Networks” (ADNs), improve the delivery of dynamic content in the Internet cloud without any additional hardware, new software or application code changes for users accessing an application over the Internet cloud. The operation of an ADN is described and illustrated in Figure 1.

1. A dynamic mapping system based on DNS directs user requests for secure application content to an optimal edge server.

2. Malicious activity can be blocked at the edge of the Internet, outside the data-center, through a set of configurable rules.

3. Dynamic route optimization technology identifies the fastest and most reliable path back to the origin infrastructure to retrieve dynamic application content.

4. A high-performance transport protocol transparently optimizes communication throughput between the edge server and the origin, improving performance and reliability.

5. The edge server retrieves the requested application content and returns it to the user over secure, optimized connections. Static and pre-fetched content leverages edge proximity to speed delivery when possible.

Figure 1 – How an Application Delivery Network (ADN) works
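Step 1 above, the DNS-based mapping of a user to an optimal edge server, can be sketched in miniature. Real ADN mapping systems weigh live latency, loss, and load across thousands of servers; the edge names, RTT figures, and scoring weights below are hypothetical, chosen only to illustrate the idea.

```python
# Minimal sketch of DNS-based edge mapping: score each candidate edge by
# measured round-trip time plus a penalty for load, and return the best one,
# mimicking how a mapping system answers a DNS query. All data is hypothetical.

EDGE_SERVERS = {  # edge name -> (RTT to user in ms, current load 0..1)
    "frankfurt-edge": (12, 0.80),
    "london-edge":    (25, 0.30),
    "paris-edge":     (18, 0.95),
}

def map_user_to_edge(servers, load_penalty_ms=50):
    """Pick the edge with the best combined latency/load score."""
    def score(item):
        rtt, load = item[1]
        return rtt + load * load_penalty_ms  # penalize heavily loaded edges
    return min(servers.items(), key=score)[0]

print(map_user_to_edge(EDGE_SERVERS))  # london-edge: slower link, but lightly loaded
```

Note the design point this illustrates: the nearest edge is not always the optimal one once load is considered, which is why mapping decisions are recomputed continuously from live measurements.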

Providers of on-demand computing resources and applications that leverage ADN technologies benefit by keeping data-center build-out to a minimum while simultaneously addressing Internet delivery issues. ADN services are provided as a convenient managed service with no capital expenditure. The result is higher application availability, better performance, improved security, and significantly improved scalability and operations. Cloud computing providers can focus on their core strength – developing innovative hosting services, application development platforms and off-the-shelf software applications – while benefiting from a scalable and robust delivery platform that works on a global scale.

Figure 2 – Response times across 25 geographies to complete a 4-step dynamic transaction for a Web-based customer service portal hosted as a single instance in the eastern United States. Prior to the use of an ADN, users in some cities such as Madrid, Singapore and Sydney experienced response times of over 40 seconds. With an ADN, no city exhibited response times above 17 seconds – a user in Singapore would “feel” as though they were located in Los Angeles.

Some large cloud computing providers will opt to build out a multitude of big regionalized data-centers, often spending tens or hundreds of millions of dollars on these investments. While this will undoubtedly place on-demand infrastructure in closer proximity to application users, the approach has architectural limitations.

On-demand browser applications are accessible on a global scale, which means that if an application resides in a single data-center, some portion of the user community will always be much farther away. Do you run your application in a North American, European or Asia-Pacific data-center? And replicating instances of a single application across multiple data-centers may often not be desirable, or even possible, due to considerations such as management, cost, integration, performance, regulatory compliance and security.

For those applications that can be replicated in multiple instances, however, the big data-center approach remains flawed, because the majority of application users are most likely not buying their Internet connectivity from the same provider servicing the regional data-center. In fact, measurements show the ten largest networks in the Internet provide last-mile subscriptions to approximately 30% of overall Internet users [3], and no single network provides more than 10% of access traffic. So even if application instances were replicated in large data-centers residing within the world’s 30 largest networks, the average distance from an application user to a data-center would still exceed 1,500 miles. Moreover, unless the data-center sits in the same service provider network as the application user, the user remains at the mercy of Internet delivery bottlenecks.
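The 1,500-mile figure matters because physics alone puts a floor under round-trip time: light in fiber travels at roughly two-thirds the speed of light in vacuum, and Internet paths are rarely straight lines. The path-stretch factor below is an assumption for illustration, not a measured value.

```python
# Rough lower bound on RTT from distance alone, before any routing,
# peering, or queuing delay is added. The 1.5x path-stretch factor is an
# illustrative assumption (Internet routes are rarely geodesic).

SPEED_OF_LIGHT_KM_S = 299_792   # vacuum, km/s
FIBER_FACTOR = 2 / 3            # light propagates at ~2/3 c in glass fiber
MILES_TO_KM = 1.609344

def min_rtt_ms(distance_miles, path_stretch=1.5):
    """Lower bound on round-trip time over fiber for a given distance."""
    km = distance_miles * MILES_TO_KM * path_stretch
    one_way_s = km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000  # milliseconds, out and back

print(round(min_rtt_ms(1500)))  # ~36 ms floor at 1,500 miles
```

Under these assumptions, 1,500 miles imposes roughly a 36 ms RTT floor per round trip; multiplied by the many round trips of a chatty dynamic transaction, distance alone becomes a material share of response time.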

IP traceroute measurements make it easy to observe how users are sometimes routed outside their country, and even their continent, to reach data-center infrastructure. Even with infrastructure in the same city as the end user, but not in the same service provider, applications can be subject to substantial latency. As a result, despite pre-existing data-center build-out, an ADN remains highly beneficial for optimizing the path from the application user to a nearby data-center.

Number of ISPs Crossed from Application User to Data-Center (Intra-City)

City          1      2      3      4      5
Frankfurt     5%    21%    30%    28%    16%
Singapore    19%    19%    25%    31%     6%
Chicago      10%    59%    31%     0%     0%
Seattle       3%    17%    47%    27%     6%

Table 1: It is very common for Internet routing to go outside the city, and even the country, when connecting application users to nearby data-centers. For example, based on a sample of IP traceroutes, an application user in Frankfurt would traverse 3 or more ISPs 74% of the time to connect to application infrastructure also located in Frankfurt.
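A measurement like Table 1 can be derived by mapping each traceroute hop to the network (autonomous system, or ASN) that originates it, then counting the distinct networks crossed. The sketch below is illustrative: the hop-to-ASN data is hypothetical, and a real study would resolve hop addresses through a WHOIS or IP-to-ASN lookup service.

```python
# Illustrative sketch of deriving a Table 1-style statistic: count the
# distinct networks (ASNs) a traceroute path crosses, skipping hops whose
# network could not be resolved (None). Hop data below is hypothetical.

def count_networks(hop_asns):
    """Count distinct networks along a traceroute path,
    ignoring unresolvable hops."""
    seen_in_order = []
    for asn in hop_asns:
        if asn is not None and (not seen_in_order or seen_in_order[-1] != asn):
            seen_in_order.append(asn)
    return len(set(seen_in_order))

# Hypothetical intra-city path: access ISP -> transit provider -> origin ISP.
path = [3320, 3320, None, 1299, 1299, 64512, 64512]
print(count_networks(path))  # 3 networks crossed
```

Aggregating this count over many sampled paths per city yields the percentage distribution shown in the table.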

Leveraging CDNs for static delivery of content via the public Internet is well established and understood. The next generation of CDN services – Application Delivery Networks – is already proven and can be equally effective for the transparent delivery of dynamic, on-demand applications developed and delivered within the Internet cloud. For many years now, leading managed service providers have offered advanced services based on highly distributed global platforms that transform the Internet into a reliable, high-performing platform for on-demand application delivery to the global enterprise – for anyone, anywhere, anytime. As an increasing number of applications and business processes move to a cloud-based delivery model – rich interactive Web 2.0 websites, web-enabled business processes such as extranet portals and supply chains, software-as-a-service, and now on-demand cloud computing – optimizing the cloud itself moves to the forefront in meeting the stringent demands of the enterprise.

Globally distributed Application Delivery Networks put the optimal architecture for in-cloud optimization directly into the hands of IT and application development. The Internet cloud is tremendously complicated, and those who apply the same scrutiny to optimizing outside the data-center as inside it will be the ones able to satisfy the stringent demands of bringing cloud-based applications to market.

For those evaluating the use of any cloud-based platform or service... don’t forget the cloud. Ask probing questions to understand what is available to optimize cloud-based application delivery both inside and outside the data-center. Highly distributed Application Delivery Networks, applied to on-demand computing platforms, are a powerful combination for bringing cloud-based services to the enterprise market – and they are readily available today.

Recommended Reading and Viewing:

[1] Akamai – Historical Internet latency & packet loss measurements
http://www.akamai.com/dv2

[2] Akamai – “Quarterly State of the Internet Report”
http://www.akamai.com/stateoftheinternet/

[3] Akamai – “How Will the Internet Scale?”
http://www.akamai.com/dl/whitepapers/How_will_the_internet_scale.pdf

More Stories By Willie M. Tejada

Willie M. Tejada is Vice President, Application and Site Acceleration, at Akamai Technologies, Inc., where he is responsible for the Application and Site Acceleration Business Units targeted at optimizing the delivery of enterprise sites and applications over the Internet. With more than 15 years of marketing, product management, and business development experience, Tejada joined Akamai in March 2007 as part of the Netli acquisition. A seasoned executive, he has held various senior management positions in both start-up and large enterprise companies including Novell, where he led marketing, product management, developer and strategic relations organizations. An accomplished communicator and presenter, he is an inventor listed on US Patent 6,078,924, and also the author of Facilitating Competitive Intelligence: The Next Step in Internet-Based Research published in CRC Press' "Best Practices Series" in Internet Management.

