Mainframe: A Resilient Model for the Modern Cloud

The emerging cloud-based model of computing requires systems that can provide fast response times to huge volumes of requests

Technology is moving at a blistering pace. In today's data-centric, complex environments, where the lines between business and technology are increasingly blurred, organizations are moving beyond virtualization to cloud computing to meet new challenges and keep up with the pace of change. Critical investments are needed to keep companies competitive, and chief among them is cloud computing: Gartner expects cloud computing to account for the bulk of new IT expenditure by 2016. The bottom line: if you're not already treating cloud as an essential investment, you're risking your survival in the next era of computing.

The emerging cloud-based model of computing requires systems that can provide very fast response times to huge volumes of requests. And mission-critical services in industries such as healthcare, finance, transportation, and public utilities demand very high levels of availability, security, and other industrial-strength capabilities. These attributes and requirements make the mainframe an ideal platform for mission-critical cloud-based workloads.

Cloud computing is a modern extension of a concept first developed nearly 50 years ago with the mainframe. The spirit behind mainframe-based computing was to serve users in remote locations simultaneously, on a pay-as-you-go basis. The mainframe was introduced as the most robust, scalable system ever built, and through continued innovation it has remained a platform of choice for today's complex workloads, including sophisticated public, private and hybrid cloud environments. At its core, the mainframe was designed around three key traits - virtualization, standardization and provisioning. Not coincidentally, these are the foundational requirements for a true cloud implementation.
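The pay-as-you-go idea at the heart of that original model is easy to make concrete. The sketch below is a minimal illustration in Python; the `UsageMeter` class, the user names, and the per-CPU-second rate are all invented for the example, not any real IBM billing scheme:

```python
# Toy sketch of pay-as-you-go metering: usage is recorded per user,
# and each user is billed only for what was actually consumed.
# Names and the rate are hypothetical, for illustration only.
from collections import defaultdict

RATE_PER_CPU_SECOND = 0.002  # illustrative price, in dollars

class UsageMeter:
    def __init__(self):
        # Accumulated CPU-seconds per user; unknown users default to 0.
        self._cpu_seconds = defaultdict(float)

    def record(self, user, cpu_seconds):
        self._cpu_seconds[user] += cpu_seconds

    def bill(self, user):
        # Charge is proportional to consumption -- no flat fee.
        return round(self._cpu_seconds[user] * RATE_PER_CPU_SECOND, 2)

meter = UsageMeter()
meter.record("branch-office-a", 1200.0)
meter.record("branch-office-a", 300.0)
meter.record("branch-office-b", 50.0)
print(meter.bill("branch-office-a"))  # 3.0  (1500 CPU-seconds * rate)
print(meter.bill("branch-office-b"))  # 0.1
```

The same metering-and-settlement idea underlies modern cloud billing, just with far more resource dimensions than CPU time.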

Most enterprises today started their cloud journey with low-risk applications that have high agility requirements. This approach lets customers ease into cloud computing, learn and adjust how they manage the cloud, and build the confidence to introduce more demanding applications. These applications tend to use web technologies and architectures that can be scaled on commodity infrastructure using load balancing and service cloning. Batch workloads that fit commodity infrastructure are another popular cloud workload.
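The commodity scale-out pattern mentioned above - cloning a stateless service and spreading requests across the clones with a load balancer - can be sketched as follows. This is a minimal illustration in Python; the `RoundRobinBalancer` class and instance names are hypothetical, not taken from any particular cloud product:

```python
# Minimal sketch of scale-out via service cloning and round-robin load
# balancing. Classes and names are illustrative only.
from itertools import cycle

class ServiceInstance:
    """One clone of a stateless web service."""
    def __init__(self, name):
        self.name = name

    def handle(self, request):
        return f"{self.name} handled {request}"

class RoundRobinBalancer:
    """Distributes requests evenly across identical clones."""
    def __init__(self, instances):
        self._instances = list(instances)
        self._next = cycle(self._instances)

    def dispatch(self, request):
        return next(self._next).handle(request)

    def scale_up(self, instance):
        # Adding a clone requires no change to the service itself --
        # the point of the commodity scale-out model described above.
        self._instances.append(instance)
        self._next = cycle(self._instances)

balancer = RoundRobinBalancer([ServiceInstance("web-1"),
                               ServiceInstance("web-2")])
print(balancer.dispatch("GET /quote"))  # web-1 handled GET /quote
print(balancer.dispatch("GET /quote"))  # web-2 handled GET /quote
```

Because the clones hold no state, capacity can be added or removed by changing only the balancer's instance list, which is why this style of workload was a natural first step into the cloud.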

For private, public or hybrid clouds, the mainframe can meet the following key requirements:

  • Scalability - users can scale quickly and efficiently, both up and down, with complete confidence and no loss of availability.
  • Reliability - a cloud environment that is always accessible, with guaranteed application performance, little to no downtime, and provisions for rapid recovery from failure.
  • Multi-tenancy - multiple users can access software applications on the same system, concurrently and securely. This is critical for cloud service providers hosting many organizations on a single cloud infrastructure, and for enterprises whose private clouds must host multiple companies, for example after acquisitions.
  • Cost efficiency - consolidating a distributed x86 cloud environment onto one mainframe creates a simpler, more efficient environment, with reduced floor space and power requirements and a higher return on investment over the life of the platform.
  • Security - the mainframe offers unmatched system security, with assured isolation and protection of each virtual server environment.
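Multi-tenancy in particular comes down to strict per-tenant isolation: every data access is scoped to the calling tenant, with no code path that crosses tenant boundaries. A toy sketch in Python makes the idea concrete (the `MultiTenantStore` class and tenant names are hypothetical, invented for this example):

```python
# Illustrative sketch of multi-tenant isolation: each tenant's data lives
# in its own namespace, and every access is scoped to the calling tenant.
# Class, method, and tenant names are hypothetical.
class MultiTenantStore:
    def __init__(self):
        self._data = {}  # tenant_id -> that tenant's private key/value space

    def put(self, tenant_id, key, value):
        self._data.setdefault(tenant_id, {})[key] = value

    def get(self, tenant_id, key):
        # A tenant can only ever see its own namespace; there is no
        # operation that reads across tenant boundaries.
        return self._data.get(tenant_id, {}).get(key)

store = MultiTenantStore()
store.put("acme", "policy-42", "active")
store.put("globex", "policy-42", "lapsed")
print(store.get("acme", "policy-42"))    # active
print(store.get("globex", "policy-42"))  # lapsed
```

Real multi-tenant platforms enforce this boundary in hardware and the hypervisor rather than in application code, but the invariant is the same: a tenant's requests can only reach that tenant's data.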

Companies across various industries are gaining these advantages and efficiencies by consolidating cloud environments on a mainframe:

Nationwide Insurance replaced thousands of standalone servers with a mainframe-based private cloud for daily business activities such as policy verification, claims processing, and customer quotations, cutting energy and facility costs by 80%. The consolidation saved the company roughly $15 million over three years and will continue to keep costs down. The solution also gives Nationwide the capacity, processing speed and reliability to increase the pace of innovation across its products and channels as it continues to grow.

By leveraging the cloud capabilities of the mainframe, Marist College extended its business analytics technology to its academic community, including researchers and students, while extracting even more value from its IT investments. Delivering analytics via the cloud has let the college expose these tools to a wide variety of programs - not only technical disciplines but also business, liberal arts and communications - so students learn to apply analytics to their fields of study. Marist has also realized significant financial benefits, saving roughly $350,000 by using the cloud to support the college's ERP system.

The mainframe, with its shared platform, integration, and secure design attributes combined with continuous innovation, has enabled organizations to stay ahead of changing market dynamics with a solution that embodies efficiency, economics and agility - a resilient solution for today's cloud environment.

More Stories By Jose Castano

Jose Castano is the Director of System z Growth Initiatives in IBM's Systems & Technology Group. He has over 25 years of experience at IBM and has held multiple key positions in System z during his tenure.

Jose has worldwide responsibility for driving new workloads on System z, including cloud, analytics, mobile, and security. He sets the business and technical strategy and direction for the System z platform, and drives coordination and collaboration across the System z ecosystem - marketing, sales, business partners, consultants, and most importantly customers - leading the platform through an evolution that maintains leadership and meets customer and industry requirements.

Jose leads a team of workload and industry architects (who focus on business trends and market and industry requirements, and develop solutions and offerings), offering managers (who are responsible for the go-to-market for those solutions and offerings), and ISV managers (who work with the ecosystem to support new and existing workloads). Together, these teams are responsible for researching, designing, building and maintaining the new-workload strategy and roadmap for IBM System z, driving the plans for the next three to five years.


