By Lori MacVittie
April 11, 2013 09:15 AM EDT
When all you have is a hypervisor, everything looks like it should be virtualized.
Yes, I'm about to say something that's on the order of heresy in the church of virtualization. But it has to be said and I'm willing to say it because, well, as General Patton said, "If everyone is thinking the same... someone isn't thinking."
The original NFV white paper cited in "NFV and SDN: What’s the Difference?", an excellent overview of the relationship between SDN and NFV, describes essentially two problems it attempts to solve: rapid provisioning and operational costs.
The reason commodity hardware is always associated with NFV and with SDN is that, even if there existed a rainbows-and-unicorns, industry-wide standard for managing network hardware, significant time would still be required to acquire and deploy said hardware. One does not generally have extra firewalls, routers, switches, and application network service hardware lying around idle. One might, however, have commodity (cheap) compute available on which such services could be deployed.
Software, as we've seen, has readily adapted to distribution and deployment in a digital form factor. It wasn't always so, after all. We started with floppies, moved to CD-ROM, then DVD and, finally, to neat little packages served up by application stores and centralized repositories (RPM, npm, etc.).
Virtualization arrived just as we were moving from the physical to digital methods of distribution and it afforded us the commonality (abstraction) necessary to enable using commodity hardware for systems that might not otherwise be deployable on that hardware due to a lack of support by the operating system or the application itself. With the exposure of APIs and management via centralized platforms, the issue of provisioning speed was quickly addressed. Thus, virtualization is the easy answer to data center problems up and down the network stack.
But it isn't the only answer, and as SDN has shown there are other models that provide the same agility and cost benefits as virtualization without the potential downsides (performance being the most obvious with respect to the network).
ABSTRACT the ABSTRACTION
Let's abstract the abstraction for a moment. What is it virtualization offers that a similar, software-defined solution would not? If you're going to use raw compute, what is it that virtualization provides that makes it so appealing?
Hardware agnosticism comes to mind as a significant characteristic that leads everyone to choose virtualization as nearly a deus ex machina solution. The idea that one can start with bare metal (raw compute) and within minutes have any of a number of very different systems up and running is compelling. Because there are hardware-specific drivers and configuration required at the OS level, however, that vision isn't easily realized. Enter virtualization, which provides a consistent, targetable layer for the operating system and applications.
Sure, it's software, but is standardizing on a hypervisor platform all that different from standardizing on a hardware platform?
We've turned the hypervisor into our common platform. It is what we target, what we've used as the "base" for deployment. It has eliminated the need to be concerned about hundreds of potential board-level components requiring support and provided us a simple base platform upon which to deploy. But it hasn't eliminated dependencies; you can't deploy a VM packaged for VMware on a KVM system or vice versa. There's still some virtual diaspora in the market that requires different targeted packages. But at least we're down to half a dozen from the hundreds of possible combinations at the hardware level.
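That packaging dependency can be sketched as a simple compatibility check. The hypervisor names and disk-image formats below are illustrative assumptions (real tooling and formats vary by product and version), not a definitive mapping:

```python
# Illustrative sketch: an image packaged for one hypervisor generally
# cannot be deployed on another without conversion. The format mapping
# here is simplified for illustration.
PACKAGE_FORMATS = {
    "vmware": "vmdk",   # VMware disk image
    "kvm": "qcow2",     # QEMU/KVM copy-on-write image
    "hyper-v": "vhdx",  # Microsoft Hyper-V virtual disk
}

def can_deploy(image_format: str, hypervisor: str) -> bool:
    """Return True only when the image format matches the target
    hypervisor's native packaging -- the 'half a dozen targets'
    problem in miniature."""
    return PACKAGE_FORMATS.get(hypervisor) == image_format

# A VMware-packaged image won't deploy directly on KVM:
assert can_deploy("vmdk", "vmware")
assert not can_deploy("vmdk", "kvm")
```

The point is only that the dependency moved up a layer; it didn't disappear.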
But is it really virtualization that enables this magical deployment paradigm, or is it the ability it offers to deploy on common hardware that's important? I'd say it's the latter. It's the ability to deploy on commodity hardware that makes virtualization appealing. The hardware, however, still must exist. It must be racked and ready, available for that deployment. In terms of compute, we still have traditional roadblocks around ensuring compute capacity availability. The value of virtualization up the operational process stack, as it were, suddenly becomes more about readiness: the ability to rapidly provision X or Y or Z because it's pre-packaged for the virtualization platform. In other words, it's the readiness factor that's key to rapid deployment. If there is sufficient compute (hardware) available and if the application/service/whatever is pre-packaged for the target virtualization platform, then rapid deployment ensues.
Otherwise, you're sitting in the same place you were before virtualization.
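That readiness condition reduces to a simple predicate: rapid deployment happens only when both spare compute and a pre-packaged image are on hand. The names below are hypothetical, purely to make the two-part condition concrete:

```python
from dataclasses import dataclass

@dataclass
class Datacenter:
    spare_compute_units: int  # racked, idle commodity servers
    packaged_images: set      # workloads pre-packaged for our platform

def can_rapidly_deploy(dc: Datacenter, workload: str, units_needed: int = 1) -> bool:
    """Rapid deployment requires BOTH free hardware and a pre-built
    package; missing either puts you back where you were before
    virtualization."""
    return (dc.spare_compute_units >= units_needed
            and workload in dc.packaged_images)

dc = Datacenter(spare_compute_units=4, packaged_images={"firewall", "load-balancer"})
assert can_rapidly_deploy(dc, "firewall")
assert not can_rapidly_deploy(dc, "router")  # not pre-packaged: back to manual provisioning
```

Neither half of the condition is supplied by the hypervisor itself; both are the product of planning.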
So there's significant planning that goes into being able to take advantage of virtualization's commoditization of compute to enable rapid deployment. And if we abstract what it is that enables virtualization to be the goodness that it is we find that it's about pre-packaging and a very finite targeted platform upon which services and applications can be deployed.
The question is, is that the only way to enable that capability?
Obviously I don't think so or I wouldn't be writing this post.
COMPLACENCY is the GREAT INHIBITOR of INNOVATION
What if we could remove the layer of virtualization, replacing it instead with a more robust and agile operating system capable of managing a bare metal deployment with the same (or even more) alacrity than a comparable virtualized system?
It seems that eliminating yet another layer of abstraction between the network function and, well, the network would be a good thing. Network functions at layers 2-3 are I/O bound; they're heavily reliant on fast input and output, and that includes traversing the hardware up through the OS, up through the hypervisor, up through the... The more paths (and thus internal bus and lane traversals) a packet must travel in the system, the higher the latency. Eliminating as many of these paths as possible is one of the keys*** to continued performance improvements on commodity hardware, improvements that are nearing the performance of dedicated network hardware.
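The cost of each extra traversal can be modeled crudely as additive per-layer latency. The figures below are placeholders invented for illustration, not measurements; the only claim is directional:

```python
# Crude additive model of per-packet path latency (microseconds).
# The numbers are made up for illustration -- the point is simply
# that every extra layer a packet crosses adds to the total.
LAYER_COSTS_US = {
    "nic": 5.0,
    "os_network_stack": 10.0,
    "hypervisor_vswitch": 15.0,  # the layer a bare-metal deployment removes
    "vm_guest_stack": 10.0,
}

def path_latency(layers) -> float:
    """Sum the (assumed) cost of each traversal on the packet's path."""
    return sum(LAYER_COSTS_US[layer] for layer in layers)

virtualized = path_latency(["nic", "os_network_stack", "hypervisor_vswitch", "vm_guest_stack"])
bare_metal = path_latency(["nic", "os_network_stack"])
assert bare_metal < virtualized  # fewer traversals, lower latency
```

Under any positive per-layer cost, removing the hypervisor layers can only shorten the path.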
If one had such a system that met the requirements - pre-packaged, rapid provisioning, able to run on commodity hardware - would you really need the virtual layer?
But when all you have is a hypervisor...
I'm not saying virtualization isn't good technology, or that it doesn't make sense, or that it shouldn't be used. What I am saying is that perhaps we've become too quick to reach for the hammer when confronted with the challenge of rapid provisioning or flexibility. Let's not get complacent. We're far too early in the SDN and NFV game for that.
* Notice I did not say Sisyphean. It's doable, so it's on the order of Herculean. Unfortunately that also implies it's a long, arduous journey.
** That may be a tad hyperbolic, admittedly.
*** The operating system has a lot - a lot - to do with this equation, but that's a treatise for another day.