The Next Technology Boom is Already Underway at Cisco, F5 Networks, Riverbed and VMware

Clouds, Virtualization and IT Diseconomies

Back to the Clouds and Virtualization

Cloud computing promises dynamic computing power on a massive scale, delivering new economies for IT services and applications. The business case lies in the gap between those economies and the prices enterprises are already paying for their own services, alongside operations, sales, marketing and new infrastructure requirements.

As much as cloud computing has rallied behind the prospect of electricity and real estate savings, the business case still feels like a dotcom hangover in some cases. Virtualization is still a bit hamstrung in the enterprise by the disconnect between static infrastructure and moving, state-changing VMs; and labor is the largest cost component of server TCO (IDC findings) and a significant component of network TCO (as suggested by the Computerworld findings). So just how much will real estate and electricity savings offset other diseconomies and barriers in the cloud game? I think cloud computing will also have to innovate in areas like automation and connectivity intelligence.

For the network to be dynamic, for example, it needs continuous, dynamic connectivity at the core network services level. Network, endpoint and application intelligence will all depend upon connectivity intelligence in order to evolve into dynamic, automated systems that don't require escalating manual intervention in the face of network expansion and rising system and endpoint demands.
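To make that concrete, here is a minimal sketch of connectivity intelligence at the core network services layer: when a workload moves, its DNS and IPAM records follow it automatically instead of waiting on a ticket. The class and function names are hypothetical stand-ins, not any vendor's API.

from dataclasses import dataclass, field


@dataclass
class CoreNetworkServices:
    """Stand-in for an authoritative DNS/IPAM system (hypothetical, not a vendor API)."""
    dns_a_records: dict[str, str] = field(default_factory=dict)  # hostname -> IP
    ipam_leases: dict[str, str] = field(default_factory=dict)    # IP -> owning workload

    def register(self, hostname: str, ip: str, owner: str) -> None:
        # Release any address the host previously held, then record the new one,
        # so name resolution and address assignment stay in step automatically.
        old_ip = self.dns_a_records.get(hostname)
        if old_ip:
            self.ipam_leases.pop(old_ip, None)
        self.dns_a_records[hostname] = ip
        self.ipam_leases[ip] = owner


def on_vm_migrated(services: CoreNetworkServices, vm_name: str, new_ip: str) -> None:
    # Event hook: connectivity follows the moving workload without manual intervention.
    services.register(vm_name, new_ip, owner=vm_name)
    print(f"{vm_name} now resolves to {new_ip}")


if __name__ == "__main__":
    core = CoreNetworkServices()
    on_vm_migrated(core, "app01.example.com", "10.0.1.15")  # initial placement
    on_vm_migrated(core, "app01.example.com", "10.0.2.40")  # live migration

The point of the sketch is the event hook: the infrastructure reacts to change itself, rather than escalating it to an operator.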


Getting beyond Infrastructure1.0's Zero Sum Game

Whether you "cloudsource" or upsize your network to address any of a number of high level business initiatives the requirements for infrastructure2.0 will be the same. You can certainly get to virtualization and cloud (or consolidation or VoIP, etc) with a static infrastructure; you'll just need more "operators", more spreadsheets and other forms of manual labor. That means less flexibility, more downtime and higher TCO; and you'll be going against the collective wisdom of decades of technologists and innovations.

This recession-proof dynamic gives the leaders in TCP/IP, netsec and traffic optimization an inherent advantage, if they can get the connectivity intelligence necessary to deliver dynamic services. They have demonstrated the expertise to build intelligence into their gear; they just haven't had the connectivity intelligence to deliver a dynamic infrastructure. Yet that is inevitable.


The Potential Leaders in Infrastructure2.0

Cisco is the leader in TCP/IP and has the most successful track record of execution in the enterprise IT market. Cisco has kept pace with major innovations in security and traffic management as well, and it is likely to become a leader in Infrastructure2.0 as enterprises seek to boost productivity and their networks continue to become strategic to business advantage in an uncertain world economy.

F5 Networks has become the leader in application-layer traffic management and optimization, thanks to its uncanny ability to monetize the enterprise web, that is, the enterprise initiative to deliver its core applications over the WAN and Internet. Its ability to merge load balancing with sophisticated application intelligence positions it to play an important role in the development of dynamic infrastructure.

Riverbed has come on the scene thanks to its ability to optimize a vast array of network protocols so that its customers can empower their branch offices like never before. While many tech leaders focused on the new data center, Riverbed achieved stellar growth by focusing on the branch office boom enabled by breakthroughs in traffic management and optimization. It was a smart call that has positioned Riverbed to be a leader in the emerging dynamic network.

Infoblox is the least known of the potential I2.0 leaders. It is a private company that already counts more than 20% of the Fortune 500 as customers. Its solutions automate core network services (including IPAM), enabling dynamic connectivity intelligence for TCP/IP networks. (Disclosure: I left virtualization security leader Blue Lane Technologies in July to join Infoblox, largely because of its record of revenue growth, sizable customer base and the promise of core network service automation.) Infoblox's founder and CTO is also behind the IF-MAP standard, a new I2.0 protocol that holds promise as a key element for enabling dynamic exchange of intelligence among infrastructure, applications and endpoints (think MySpace for your infrastructure).
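The appeal of IF-MAP is the publish/subscribe exchange of metadata among infrastructure elements. The toy broker below sketches that idea only; the real standard is an XML/SOAP protocol spoken to a Metadata Access Point server, and the class and method names here are illustrative, not the actual specification.

from collections import defaultdict
from typing import Callable

Metadata = dict[str, str]
Subscriber = Callable[[str, Metadata], None]


class MetadataBroker:
    """Toy MAP-style broker: devices publish metadata about identifiers, others subscribe."""

    def __init__(self) -> None:
        self._store: dict[str, list[Metadata]] = defaultdict(list)
        self._subscribers: dict[str, list[Subscriber]] = defaultdict(list)

    def publish(self, identifier: str, metadata: Metadata) -> None:
        # Record the metadata and notify every party interested in this identifier.
        self._store[identifier].append(metadata)
        for callback in self._subscribers[identifier]:
            callback(identifier, metadata)

    def subscribe(self, identifier: str, callback: Subscriber) -> None:
        self._subscribers[identifier].append(callback)

    def search(self, identifier: str) -> list[Metadata]:
        return list(self._store[identifier])


if __name__ == "__main__":
    broker = MetadataBroker()
    # A firewall subscribes to events about one endpoint...
    broker.subscribe("10.0.2.40", lambda ident, md: print("firewall sees", ident, md))
    # ...and a DHCP server publishes a new lease for it.
    broker.publish("10.0.2.40", {"event": "lease-granted", "mac": "00:11:22:33:44:55"})

The "MySpace for your infrastructure" quip maps onto this: each element advertises what it knows, and any other element can follow the identifiers it cares about.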

VMware is executing on the promise of production virtualization and clearly has the most experience in addressing the challenges of integrating dynamic processing power with static infrastructure. I think VMware's biggest question will be how much it has to build or acquire in order to address those challenges. Not all of its technology partners are adequately prepared for the network demands of dynamic systems and endpoints. VMsafe was a big step forward on the marketing front, but partners have been slow to deliver virtsec-ready products.

Google has no doubt benefited from the hype surrounding cloud computing. It has been investing in cloudplexes and new pre-enterprise cloud applications. While I do have reservations about its depth of infrastructure experience (versus the Nicholas Carr prediction of the eventual decline of enterprise IT), I think one would be hard-pressed not to include Google as a player driving requirements for a more dynamic infrastructure.

Microsoft has recently become more vocal on both the virtualization and cloud fronts and has tremendous assets to force innovation in infrastructure, in the same way that its more powerful applications have influenced endpoint and server processing requirements. It is likely to play a similar role as the network becomes more strategic to the cloud.

There are no doubt other players (both public and private) that promise to play a strategic role in this next technology revolution, including those delivering more power, automation and specialization around network, endpoint and application intelligence as well as enabling more movement and control in virtual and cloud environments. All are welcome to join the conversation.

These leaders are well positioned to play a substantial part in the race to deliver Infrastructure2.0, and strategic enterprise networks promise to be big winners. The dynamic infrastructure will change the economics of the network by automating previously manual tasks and will unleash new potential for application, endpoint and network intelligence. It will also play a major part in the success or failure of many leading networking and virtualization players, as well as enterprise IT initiatives, during periods of economic weakness and beyond. Infrastructure2.0 is the next technology boom. It is already underway.

More Stories By Greg Ness

Gregory Ness is the VP of Marketing of Vidder and has over 30 years of experience in marketing technology, B2B and consumer products and services. Prior to Vidder, he was VP of Marketing at cloud migration pioneer CloudVelox. Before CloudVelox he held marketing leadership positions at Vantage Data Centers, Infoblox (BLOX), BlueLane Technologies (VMW), Redline Networks (JNPR), IntruVert (INTC) and ShoreTel (SHOR). He has a BA from Reed College and an MA from The University of Texas at Austin. He has spoken on virtualization, networking, security and cloud computing topics at numerous conferences including CiscoLive, Interop and Future in Review.
