The Brave New World of Storage Virtualization

Can you put your virtual environment on autopilot?

I recently found myself intrigued by an article by Jon William Toigo on TechTarget titled "Software-defined infrastructure, or how storage becomes software." In it, Toigo poses the question: could a software-defined infrastructure, with software-based controls and policies, be the answer to managing and allocating storage? While I am sure Jon would agree we are all tired of the buzzwords around "software-defined anything," the fact of the matter is that we all use the term anyway for lack of a better description of what's occurring today.

Storage specifically is one of those areas I always say is like organizing your room: everyone has their own way of doing it.  You want your dresser at a certain height.  You want your mattress pointing a certain direction and the light shining through your window onto your desk at just the right time of day.  The truth is, storage is the underpinning of virtualization that everyone wants to architect and manage their own way, and no one is going to tell them otherwise... UNTIL - wait for it - ...software can manage this FOR them.

Where is the robot (I wish the Roomba folks would hurry up with this, but I digress) that keeps your room exactly the way you want it?  The industry constantly tosses this idea around, but no one really seems to know where to find the solution.  The truth is that no matter what quirks people have with their storage, there is one goal everyone has in common: ensuring applications can consume the storage resources they need while preserving priority and business logic, and increasing efficiency without introducing risk.  This is the promise of every storage vendor on the planet trying to sell their latest auto-tiering, de-dupe, and compression solution - all the wonderful bling to trick out their man cave.

The problem, however, is that virtualization blurs the lines around storage, and it becomes intractably complex to manage the macro-level supply and demand of resources across the stack. Traditional management tools - storage- and virtualization-related alike - are running into the limitations of their stats-based, linear approach to managing diverse environments.

An Illustration
As an illustration, let's use a real-life example. Assume we have a NetApp environment supporting VMware. The NetApp deployment consists of four aggregates spread across two filers (a rough sketch of this layout as plain data follows the list):

  • Aggr1 (SATA) and Aggr2 (SATA) are on Controller A, and each aggregate comprises 2 volumes in VMware
  • Aggr3 (SAS) and Aggr4 (SAS) are on Controller B, and each aggregate comprises 2 volumes in VMware
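To keep the moving pieces straight, here is a minimal sketch of that layout as plain data. The dictionary structure and the volume names are illustrative assumptions, not a NetApp or VMware API:

```python
# Hypothetical model of the example environment: two controllers, four
# aggregates, two VMware volumes per aggregate. Names beyond those in the
# article (vol1..vol8) are invented for illustration.
environment = {
    "Controller A": {
        "Aggr1": {"media": "SATA", "volumes": ["vol1", "vol2"]},
        "Aggr2": {"media": "SATA", "volumes": ["vol3", "vol4"]},
    },
    "Controller B": {
        "Aggr3": {"media": "SAS", "volumes": ["vol5", "vol6"]},
        "Aggr4": {"media": "SAS", "volumes": ["vol7", "vol8"]},
    },
}

for controller, aggrs in environment.items():
    for aggr, info in aggrs.items():
        print(f"{controller} / {aggr} ({info['media']}): {info['volumes']}")
```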

Aggr1 sees a spike in IOPS driven by 3 virtual machines demanding IOPS on its 2 volumes.  This results in Aggr1 utilizing 93% of its available IOPS capacity due to several high consumers, or "Bully" VMs (to use a traditional storage vendor's language).  To compound the issue, the high utilization on disk has now manifested itself up the stack and has begun to impact the ability of other workloads to access the storage resources they require on Aggr1.
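A back-of-the-envelope sketch of how a figure like that 93% comes about (the IOPS capacity, per-VM demands, and the 25% "bully" cutoff are invented for illustration; real numbers would come from the array's counters):

```python
# Assumed numbers chosen so that total demand lands at 93% of capacity.
AGGR1_IOPS_CAPACITY = 10_000

vm_iops_demand = {      # hypothetical demand from the three VMs on Aggr1
    "vm-app-01": 4_100,
    "vm-app-02": 3_200,
    "vm-app-03": 2_000,
}

utilization = sum(vm_iops_demand.values()) / AGGR1_IOPS_CAPACITY
print(f"Aggr1 IOPS utilization: {utilization:.0%}")      # -> 93%

# "Bully" VMs: the consumers individually driving an outsized share of capacity.
bullies = [vm for vm, iops in vm_iops_demand.items()
           if iops / AGGR1_IOPS_CAPACITY > 0.25]
print("Bully VMs:", bullies)                             # -> vm-app-01, vm-app-02
```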

[Figure: BraveNew-a]

In this example, a traditional software management system for the storage platform will alert an administrator that Aggr1 has exceeded a tolerable IOPS utilization and that it is time to act.  Similarly, the virtualization vendor (in this case VMware) will generate alarms related to the virtualized components layered on top of the storage platform.

[Figure: BraveNew-b]

The administrator must then sift through the charts and graphs in their storage vendor's tool, or their virtualization management system, with the end goal of making some sort of resource allocation decision: intelligently give storage resources to the applications that need them while confining any quality-of-service impact to low-priority applications.

More likely, the administrator needs to access both of these interfaces to accomplish this.  In this example, the resource decision might be to move the volume to a separate aggregate on Controller A (when in reality this won't do much, given the performance constraints underneath), move the volume itself to faster disk on a separate storage controller, or move the virtual machine to a volume hosted on Controller B.

[Figure: BraveNew-c]

Now the second part of this equation (and arguably the harder one to get right) is: how does the administrator ensure that the domino they decide to push over doesn't create another resource constraint elsewhere in the environment?  Fundamentally, traditional storage vendor software offerings and virtualization management tools are incapable of understanding the impact and outcome of any prospective resolution, because they simply do not analyze the interdependencies of that decision across both (virtualized) compute and storage components.  The best case for operations is a head start on the troubleshooting process after quality of service has already been impacted or is in the process of being degraded.
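What a control system would have to do instead is project the outcome of each candidate move before committing to it. A hedged sketch, with capacities and demands invented for illustration:

```python
# Project source and destination utilization if the busiest volume moves from
# Aggr1 (Controller A, SATA) to Aggr3 (Controller B, SAS). All numbers assumed.
def projected_utilization(current_iops, capacity, delta_iops):
    """Utilization of an aggregate after adding (or removing) delta_iops."""
    return (current_iops + delta_iops) / capacity

aggr1 = {"iops": 9_300, "capacity": 10_000}
aggr3 = {"iops": 4_500, "capacity": 15_000}
volume_load = 4_100                      # load carried by the volume to move

after_source = projected_utilization(aggr1["iops"], aggr1["capacity"], -volume_load)
after_dest   = projected_utilization(aggr3["iops"], aggr3["capacity"], +volume_load)

CEILING = 0.80                           # assumed safe-utilization policy
print(f"Aggr1 after move: {after_source:.0%}, Aggr3 after move: {after_dest:.0%}")
if after_dest > CEILING:
    print("This move just relocates the bottleneck -- reject it.")
```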

Are Things Even at Human Scale Anymore?
To truly achieve a software-defined storage system, there needs to be a new type of management system capable of connecting these two opaque worlds for the purpose of intelligent decision making and resource allocation.

Toigo paints this gap perfectly when he states, "Our storage needs to be managed and allocated by intelligent humans, with software-based controls and policies serving as a more efficient extension of our ability to translate business needs into automation support."

Following this logic, this new system must go above and beyond looking at application issues in isolation: it must determine how to allocate the infrastructure's entire supply of finite storage resources to every virtualized workload and application - at scale.  Inevitably, this means looking across all application resource demands concurrently and then servicing each application's request for the best cost/benefit to the overall platform, allocating the supply of storage intelligently and in the most efficient way.  Ideally, this will be done prescriptively - before quality of service is degraded.
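As a toy illustration of what "looking across all demands concurrently" might mean (the workloads, capacities, and the greedy placement rule are all assumptions, not any vendor's algorithm):

```python
# Greedy sketch: place each workload on whichever aggregate ends up least
# utilized, considering every demand rather than reacting to one alert.
aggregates = {"Aggr1": 10_000, "Aggr2": 10_000, "Aggr3": 15_000, "Aggr4": 15_000}
load = {name: 0 for name in aggregates}
workloads = {"vm-a": 4_100, "vm-b": 3_200, "vm-c": 2_000, "vm-d": 5_000, "vm-e": 1_500}

placement = {}
for vm, iops in sorted(workloads.items(), key=lambda kv: -kv[1]):   # biggest first
    best = min(aggregates, key=lambda a: (load[a] + iops) / aggregates[a])
    load[best] += iops
    placement[vm] = best

for vm, aggr in placement.items():
    print(f"{vm} -> {aggr} ({load[aggr] / aggregates[aggr]:.0%} utilized)")
```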

The second phase of this brave new world will involve incorporating business logic that allows the software-driven control plane to consider business constraints alongside capacity and performance metrics in real time.   If tier 1 applications need priority for faster disk over low-priority applications, then the system should be set-and-forget.  If tier 3 applications must be confined to bronze, or slow, storage, then the constraint should carry over dynamically to any workload matching that criterion provisioned across the lifecycle of the environment.
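One way to picture that kind of set-and-forget business logic is as a placement policy the control plane consults automatically. The tier names and media mappings below are assumptions for illustration:

```python
# Hypothetical tier policy: tier 1 workloads may only land on fast (SAS) disk,
# tier 3 workloads are confined to bronze (SATA).
PLACEMENT_POLICY = {
    "tier1": {"allowed_media": {"SAS"}},
    "tier3": {"allowed_media": {"SATA"}},
}

def allowed_aggregates(tier, environment):
    """Aggregates whose media type satisfies the workload's tier constraint."""
    allowed = PLACEMENT_POLICY[tier]["allowed_media"]
    return [aggr for ctrl in environment.values()
            for aggr, info in ctrl.items() if info["media"] in allowed]

env = {
    "Controller A": {"Aggr1": {"media": "SATA"}, "Aggr2": {"media": "SATA"}},
    "Controller B": {"Aggr3": {"media": "SAS"},  "Aggr4": {"media": "SAS"}},
}
print(allowed_aggregates("tier1", env))   # ['Aggr3', 'Aggr4']
print(allowed_aggregates("tier3", env))   # ['Aggr1', 'Aggr2']
```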

If 20% overhead needs to be maintained across tier 1 storage resources, then the software should be intelligent enough to keep utilization below that level, instead of notifying administrators once it has been crossed and forcing them to bring the infrastructure back from the brink.
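A minimal sketch of that kind of headroom control, assuming the 20% figure from the text and invented IOPS numbers:

```python
# Act before the headroom policy is violated, rather than alerting afterwards.
HEADROOM = 0.20                  # keep 20% of tier 1 capacity free
CEILING = 1.0 - HEADROOM         # i.e. intervene before utilization passes 80%

def needs_rebalance(current_iops, incoming_iops, capacity):
    """True if admitting the incoming demand would eat into required headroom."""
    return (current_iops + incoming_iops) / capacity > CEILING

# Example: an aggregate at 7,200 of 10,000 IOPS with another 1,500 on the way.
if needs_rebalance(7_200, 1_500, 10_000):
    print("Projected 87% utilization -- rebalance before admitting this demand.")
```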

The reality is that everyone has their own idea of how best to trick out their room - in this case, their precious storage. Administrators will never truly be comfortable putting their storage architecture on autopilot until they can rest assured that their policies are enforced and application performance is protected.  Any system developed to tackle this brave new world must meet both of these goals simultaneously - a challenge that Toigo argues is beyond human capacity at scale.

More Stories By Eric Bannon

A passion for econometric analysis and statistical modeling led Eric to… wait for it… software. Eric discovered that by applying algorithms based on the principles of supply and demand, software can solve some of the biggest challenges in infrastructure and cloud management today.

Joining VMTurbo in 2011, Eric now serves as a Solution Architect, where he helps organizations unlock the full value of virtualization through the implementation of software-defined control. He holds a B.S. in Economics and Finance from Bentley University, and still likes to deconstruct James Heckman’s econometric models in his free time.
