PaaS 2.0 Adds Standards and Greater Developer Control

PaaS 2.0 should combine the global availability of the hosting ecosystem and the standardization and flexibility of IaaS

Like evolution, constant technological change is unstoppable, whether through improvements or the emergence of entirely new technologies. This is especially true in the hosting industry where a new generation of more flexible and efficient platforms is emerging to take advantage of the cloud.

Historically, hosting services have been very inflexible. For example, a customer looking to swap out servers rented from a hosting provider could wait several days, sometimes long enough to damage credibility and lose business. Amazon, with its infrastructure-as-a-service (IaaS) offering, reduced that operation to minutes, yet still left most server administration tasks to users. Later, first-generation platform-as-a-service (PaaS) offerings, like Google App Engine and Heroku, allowed developers to upload their application code to a preconfigured environment, but with a trade-off: developers often had to rewrite code to run on the vendor's platform and give up control of the execution environment.

PaaS for the cloud needs to evolve to address the shortcomings of earlier platforms. This next generation platform - PaaS 2.0 - should combine the global availability of the hosting ecosystem (hosters, their customers and application developers), the standardization and flexibility of IaaS, and ease of provisioning and scaling.

Proprietary Engines to Standard Execution Environments
The first Java platform-as-a-service offerings, like Google App Engine, implemented proprietary execution environments. Developers had to learn the new environment and its limitations, and change their code to fit it. Some platforms, like Heroku, tried to mitigate this problem by taking an existing open source server and customizing it. While this flattened the learning curve, the limitations imposed on developers, such as the absence of multi-threading and a limited choice of servers, meant that most existing applications could not be deployed "as is" on a PaaS.

Java, a popular and established programming language, is supported by numerous application servers that suit various deployment scenarios. The Wikipedia page on Java Platform Enterprise Edition lists a variety of server options, including Tomcat, GlassFish and Jetty - each with its share of devoted users and advantages in particular scenarios. Developers are not eager to switch between servers and want the exact environment required by their applications.

The same is true for database servers. Older platforms offer either proprietary solutions or support for only one type of database server, such as MySQL, so developers cannot set up the environment topology their applications require. Proprietary storage platforms, such as Google's Bigtable or Amazon SimpleDB, lock developers into a specific service and hold them hostage to changes in the provider's policies. Recent price increases at Google App Engine illustrate the problem.
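The difference between standard and proprietary storage interfaces can be sketched with plain JDBC: application code keeps issuing the same SQL through the same API, and switching database vendors changes only configuration, not code. The `jdbcUrl` helper and the vendor list below are illustrative, not from the article.

```java
import java.util.Map;

// Sketch: with standard JDBC, swapping database vendors changes only
// configuration (the connection URL and driver), not application code.
// With a proprietary storage API, every data-access call would have to
// be rewritten when moving to another provider.
public class JdbcPortability {
    // Connection URL templates for a few common servers (illustrative).
    private static final Map<String, String> URL_TEMPLATES = Map.of(
            "mysql",      "jdbc:mysql://%s:%d/%s",
            "postgresql", "jdbc:postgresql://%s:%d/%s",
            "mariadb",    "jdbc:mariadb://%s:%d/%s");

    public static String jdbcUrl(String vendor, String host, int port, String db) {
        String template = URL_TEMPLATES.get(vendor);
        if (template == null) {
            throw new IllegalArgumentException("Unknown vendor: " + vendor);
        }
        return String.format(template, host, port, db);
    }

    public static void main(String[] args) {
        // The same application code, two different database servers:
        System.out.println(jdbcUrl("mysql", "db.example.com", 3306, "shop"));
        System.out.println(jdbcUrl("postgresql", "db.example.com", 5432, "shop"));
    }
}
```

In practice the application would pass such a URL to `java.sql.DriverManager.getConnection` and never touch vendor-specific classes directly.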

PaaS 2.0 should offer a wide range of application and database servers. This is the only way that developers can achieve 100 percent compatibility with existing open standards, use a wide range of libraries and frameworks, and establish environment topologies according to their tastes, needs and knowledge, without vendor lock-in.

Separation of Platform and Service
Early PaaS offerings lack a separation between platform and service. Google, for example, is the only service provider offering Google App Engine. Amazon is the only one offering Amazon Elastic Beanstalk. Salesforce.com (the owner of Heroku) is the only one offering Heroku, and so on. If you like the platform, but don't like the service - too bad. If the price goes up (as in Google's case) or if you need the service outside the U.S. with a different set of service level agreements (SLA) or certifications - again, too bad. This marriage between platform and service limits consumer choice and stifles innovation.

PaaS needs to evolve to an ecosystem model where platforms are developed by software companies, and the service is available in multiple geographies, across a variety of industries, at attractive price points, and with value-add options from an ecosystem of hosted service providers.

Scalability
Scalability for first-generation PaaS offerings came at a price. Platforms were not able to automatically add resources, such as RAM and CPU, when apps needed them. Instead, developers had to change their apps to use multiple parallel machines, and add machines each time more resources were needed.

While horizontal scaling creates highly scalable and available Internet applications, it imposes huge additional requirements on application developers, making code redesign necessary just to ensure the application has the resources it needs.
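The redesign burden is easiest to see with in-process state: code that is correct on a single machine silently miscounts once a load balancer spreads requests across instances. A minimal sketch (the class and scenario are invented for illustration):

```java
// Sketch: why horizontal scaling forces code redesign. An in-memory
// counter works on one instance but is wrong across several, because
// each instance only sees its own share of the traffic.
public class SessionCounter {
    private int visits = 0;

    public int recordVisit() {
        return ++visits;
    }

    public int total() {
        return visits;
    }

    public static void main(String[] args) {
        // Two "instances" behind a load balancer, traffic split evenly.
        SessionCounter instanceA = new SessionCounter();
        SessionCounter instanceB = new SessionCounter();
        for (int i = 0; i < 10; i++) {
            (i % 2 == 0 ? instanceA : instanceB).recordVisit();
        }
        // 10 visits total, but each instance counted only its own half.
        // Fixing this means moving state to shared storage - a redesign.
        System.out.println("A: " + instanceA.total() + ", B: " + instanceB.total());
    }
}
```

Vertical scaling avoids this class of problem entirely, because the single instance keeps seeing all of the traffic.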

The next-generation PaaS needs to be more flexible in adapting to applications. If a developer chooses to run a single instance of an application (or the application is just not designed to run on multiple machines), the platform needs to automatically give the application the resources it needs when usage goes up. The same is true when usage goes down. If no one is using the system at 1 a.m. and it only needs 128 MB of memory, the PaaS needs to automatically reclaim the unused resources to reduce the bill.
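Hypothetically, the platform-side logic for this kind of vertical scaling could be as simple as rounding current usage up to the next resource tier and releasing everything above it. The tier sizes and headroom factor below are invented for illustration:

```java
// Sketch of automatic vertical scaling: allocate the smallest memory
// tier that covers current usage plus some headroom, and shrink again
// when usage drops (e.g. back to 128 MB at 1 a.m.).
public class VerticalScaler {
    // Illustrative memory tiers in MB; a real platform defines its own.
    private static final int[] TIERS_MB = {128, 256, 512, 1024, 2048, 4096};
    private static final double HEADROOM = 1.25; // keep 25% spare capacity

    public static int recommendedTierMb(int usedMb) {
        int needed = (int) Math.ceil(usedMb * HEADROOM);
        for (int tier : TIERS_MB) {
            if (tier >= needed) {
                return tier;
            }
        }
        // Usage exceeds the largest tier: cap it; at this point
        // scaling horizontally becomes the remaining option.
        return TIERS_MB[TIERS_MB.length - 1];
    }

    public static void main(String[] args) {
        System.out.println(recommendedTierMb(90));  // quiet night
        System.out.println(recommendedTierMb(700)); // busy day
    }
}
```

Because billing follows the allocated tier rather than a fixed machine size, the 1 a.m. scenario above costs the 128 MB rate instead of the daytime peak.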

PaaS 2.0 shouldn't force developers to change their applications and behaviors, but should support both scaling scenarios without imposing limitations.

Control over the Environment
Just as early PaaS restricted the choice of application server, it also gave developers very little ability to extend and control the execution environment. If an application needed an extra library, the developer could use only the frameworks made available by the vendor. If the developer needed to fine-tune server configuration files or add new ones, and the application did not fit within the platform's limits, they were out of luck. PaaS 2.0 needs to allow developers to use all of the libraries and configurations currently available.

Graphical User Interface
Usability is another key requirement for PaaS technology to go mainstream. First-generation PaaS offerings often required obscure command-line utilities to get anything done, steepening the learning curve and limiting adoption to enthusiasts. PaaS 2.0 needs to integrate easily into the tools developers already know and use, with a highly intuitive graphical user interface of its own.

More Stories By Ruslan Synytsky

Ruslan Synytsky is CTO and co-founder of Jelastic, Inc, the first company to deliver Platform-as-Infrastructure, combining the flexibility of IaaS and the ease of use of PaaS within a single turnkey platform. With over 15 years in the IT industry, Ruslan is an expert in large-scale distributed Java applications and enterprise platforms. Before starting Jelastic in 2011, Ruslan led engineering and software architecture teams at iQueLab, SolovatSoft and Datamesh. He was also one of the key engineering leads at the National Space Agency, Ukraine.

Most Recent Comments
StevenDDeacon 12/07/11 01:09:00 PM EST

For more information on cloud computing Integrated Service Management with IBM Cloud Computing Reference Architecture read IBM Whitepaper "Integrated service management and cloud computing" here [PDF].

StevenDDeacon 12/06/11 12:58:00 PM EST

Great Article! Many IT IaaS providers are now emerging offering many of the features your article describes required for PaaS 2.0. I've come across a few with interesting offerings. I currently write a blog "Information Technology Infrastructure Logistics". I hope to see more articles on this subject as PaaS offerings mature.
