By Ranvir Wadera
April 5, 2013 02:15 PM EDT
Today's IT infrastructure is in the midst of a major transformation. In many ways, the data center is a victim of its own success. The growing number of technologies and applications residing in the data center has spawned increasing complexity, which makes IT as a whole less responsive and agile. While businesses are focused on moving faster than ever, large and complex infrastructure is inherently rigid and inefficient.
As a result, IT is moving outside the traditional data center into colocation facilities and cloud infrastructures - essentially Infrastructure Anywhere. The move to Infrastructure Anywhere is driven by the core objective of improving responsiveness and agility and reducing costs. For example, you can scale up resources through the cloud in minutes, not months. But for all of its benefits, this new Infrastructure Anywhere model presents critical challenges.
To make smart decisions about where to run applications and what kind of resources you need, you first must understand your workload: utilization, capacity, and cost. Gaining unified visibility is difficult when your application workloads are distributed across data centers and colocation facilities in different parts of the country or around the world. With limited visibility, how do you accurately align resources and capacity with workloads for efficient processing, cost control, and - most important - the full business value of your IT investment?
It's All About Agility
According to the results of Sentilla's recent survey of data center professionals about their plans for cloud deployments, agility and flexibility are the top drivers behind enterprise IT transformation initiatives such as cloud deployments - followed closely by issues of capacity and cost.
Figure 1: Key drivers for cloud computing initiatives
While agility is the prime motivating factor, the importance of cost as a factor should not be ignored. According to the survey, the major resource limitation experienced by respondents - for all infrastructure initiatives - is budget.
Figure 2: Resource limitations
Note that several of the reported constraints (personnel, storage capacity) are related to the broader issue of budget. In this sense, cost is overwhelmingly the most important constraint on IT initiatives - including cloud initiatives.
2013 Is for Planning, 2014 for Deployment
Of the organizations surveyed, nearly 50 percent plan to deploy cloud initiatives in 2014, and many are in the planning phase now. In general, we can expect cloud computing deployments to increase by 70 percent in 12 months:
Figure 3: Data center cloud initiatives, by year
Similarly, those surveyed expect to gradually migrate more workloads to cloud platforms in the coming years - with 28 percent planning to run more than half of their applications in the cloud by 2014. The barriers to cloud migration are coming down.
Figure 4: Percentage of applications planned to move to the cloud, by year
The Cloud Isn't a Homogeneous Place
Cloud computing can refer to several different deployment models. At a high level, cloud infrastructure alternatives are defined by how they are shared among different organizations.
Figure 5: Where respondents planned to deploy cloud initiatives
Private clouds offer the flexibility of elastic infrastructure shared among an organization's own applications, but never shared with other organizations. Hosted on dedicated equipment, either on-premises or at a colocation provider, a private cloud is the most secure but the least cost-effective cloud model.
Public cloud infrastructure offers similar elasticity and scalability and is shared with many organizations. This model is best suited for businesses that need to manage load spikes and scale to a large number of users without a large capital investment. Amazon Web Services (AWS) is perhaps the most widely deployed example of public cloud infrastructure as a service.
Hybrid cloud offers the dual advantages of secure applications and data hosting on a private cloud and the cost benefits of keeping sharable applications and data on the public cloud. This model is often used for cloud bursting - the migration of workloads between public and private hosting to handle load spikes.
Community cloud is an emerging category in which different organizations with similar needs use a shared cloud computing environment. This new model is taking hold in environments with common regulatory requirements, including healthcare, financial services and government.
The research showed that organizations are evaluating a broad range of different cloud solutions, including Amazon AWS, Microsoft Azure, Google Cloud Platform, and Red Hat Cloud Computing, as well as many solutions based on OpenStack, the open source cloud computing software.
Without planning, ad hoc cloud deployments combined with islands of virtualization will only add complexity to the existing data center infrastructure. The resulting environment is one of physical, virtual and cloud silos with fragmented visibility. While individual tools may deliver insight into specific parts of the infrastructure puzzle (physical infrastructure, server virtualization with VMware, specific infrastructure in a specific cloud provider), IT organizations have little visibility into the total picture. This lack of visibility can impede the IT organization's ability to align infrastructure investments with business needs and cost constraints.
Infrastructure Complexity Is the New Normal
While it aims to bring agility to IT, the process of cloud transformation will only increase infrastructure complexity in the near term. IT organizations must manage a combination of legacy systems with islands of virtualization and cloud technologies.
When asked about where cloud infrastructure will reside, survey respondents indicated that they will be managing a blend of on-premises and outsourced infrastructure, with the balance shifting dramatically from 2013-2014.
Figure 6: Where cloud infrastructure will reside, by year
The Need for Unified Visibility into Complex Infrastructure
As you plan your own cloud initiatives, you must prepare for multiple phases of transformation:
- Deploying new applications to the cloud as part of the broader application portfolio
- Migrating existing applications to cloud infrastructure where possible and appropriate
- Managing the hybrid "Infrastructure Anywhere" environment during the transition and beyond
To support these phases, you need visibility into workloads and capacity across essential application infrastructure - no matter where it resides. From the physical and virtual resources, up through applications and services, you will need insight so you can align IT with business objectives.
Figure 7: The need for infrastructure insight at all levels
Essential Infrastructure Metrics for Right-Sizing Infrastructure
Decisions about which applications to deploy to the cloud and where to deploy them will require visibility into:
- Historical, current and predicted application workload
- Current and predicted capacity requirements of the workload
- Comparative cost of providing that capacity and infrastructure on different platforms
For application migration scenarios, you will need to understand the actual resource consumption of the existing application. Whether it's a new application or a migrated one, you will need to "right-size" the cloud infrastructure to avoid the twin dangers of over-provisioning (wasting financial resources) and under-provisioning (risking outages or performance slowdowns). You will need insight into:
- Memory utilization
- CPU utilization
- Data transfer/bandwidth
- Storage requirements
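As a rough illustration of right-sizing from these metrics, the sketch below sizes to a high percentile of historical utilization plus headroom rather than to the raw peak. All sample numbers, the percentile choice, and the headroom factor are hypothetical assumptions, not figures from the survey.

```python
# Hypothetical right-sizing sketch: size to the 95th percentile of measured
# utilization plus headroom, not the absolute peak. Sample data is illustrative.

def percentile(samples, pct):
    """Return (approximately) the pct-th percentile of numeric samples."""
    ordered = sorted(samples)
    k = max(0, min(len(ordered) - 1, int(round(pct / 100.0 * (len(ordered) - 1)))))
    return ordered[k]

def right_size(cpu_pct_samples, mem_gb_samples, headroom=1.2):
    """Pick capacity targets: p95 of observed demand, padded by a headroom factor,
    so rare spikes don't force you to pay for capacity that sits idle."""
    need_cpu = percentile(cpu_pct_samples, 95) * headroom
    need_mem = percentile(mem_gb_samples, 95) * headroom
    return need_cpu, need_mem

# One week of hourly samples (illustrative numbers only)
cpu = [30, 35, 40, 80, 45, 38, 33] * 24
mem = [3.0, 3.2, 3.1, 5.5, 3.4, 3.3, 3.1] * 24

cpu_target, mem_target = right_size(cpu, mem)
print(f"Provision for ~{cpu_target:.0f}% of one vCPU and {mem_target:.1f} GB RAM")
```

Sizing to a percentile rather than the peak is one common heuristic; latency-sensitive workloads may justify sizing closer to the true peak.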
You will also need good metrics about the cost of running the application in your existing data center, as well as the predicted costs of running that same application on various platforms. These metrics need to factor in the total cost of the application, including:
- Personnel for supporting the application
- Operating system
- Management software
- Cooling and power
- Leased cloud services
- Server and storage hardware
To accurately predict the cost of running the application on cloud-based infrastructure, you will need accurate metrics around the actual, historical resource consumption of the application (storage, memory, CPU, etc.) as it maps to the provider's billable units. By understanding the actual consumption, you can avoid over-provisioning and overpaying for resources from external providers.
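The mapping from measured consumption to billable units can be sketched as a simple rate-card calculation. The rate card and consumption figures below are illustrative placeholders, not any real provider's pricing.

```python
# Hypothetical sketch: map measured resource consumption onto a provider's
# billable units. The rate card below is illustrative, not a real price list.

RATE_CARD = {
    "vcpu_hour": 0.05,        # $ per vCPU-hour of compute
    "gb_ram_hour": 0.01,      # $ per GB-hour of memory
    "gb_storage_month": 0.10, # $ per GB-month of storage
    "gb_transfer": 0.09,      # $ per GB of outbound data transfer
}

def monthly_cloud_cost(vcpus, ram_gb, storage_gb, transfer_gb, hours=730):
    """Estimate one month of cost from measured consumption (~730 hours/month)."""
    return (
        vcpus * hours * RATE_CARD["vcpu_hour"]
        + ram_gb * hours * RATE_CARD["gb_ram_hour"]
        + storage_gb * RATE_CARD["gb_storage_month"]
        + transfer_gb * RATE_CARD["gb_transfer"]
    )

# An app measured at 2 vCPUs, 8 GB RAM, 200 GB storage, 500 GB transfer/month
print(f"Estimated monthly cost: ${monthly_cloud_cost(2, 8, 200, 500):,.2f}")
```

The point of the exercise is that the inputs come from measured history, not from guesses - feeding p95-style measurements in, rather than peak hardware specs, is what prevents overpaying.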
Infrastructure Analytics for Cloud Migration
For any given application that is a candidate for the cloud, you want to be able to compare the total cost of the resources required across different options (public, private and on-premises).
While you could try to manually crunch these numbers in a spreadsheet, the computations are not trivial. These are decisions that you will need to make repeatedly, for each application, and revisit when an infrastructure provider changes its cost model or fee structure. For that reason you'll want a tool that lets you get an accurate and continuous view into current costs and model "what-if" scenarios for different deployments.
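A minimal "what-if" comparison might look like the sketch below: the same workload costed across deployment options, ranked cheapest first. All cost components and figures are hypothetical; a real tool would pull them continuously from metered data and provider fee schedules.

```python
# Hypothetical what-if sketch: compare the same workload's total monthly cost
# across deployment options. All figures are illustrative placeholders.

def total_monthly_cost(components):
    """Sum all monthly cost components for one deployment option."""
    return sum(components.values())

scenarios = {
    "on-premises": {
        "hardware_amortized": 400.0, "power_cooling": 120.0,
        "personnel": 300.0, "software_licenses": 150.0,
    },
    "public cloud": {
        "compute": 350.0, "storage": 60.0,
        "data_transfer": 90.0, "personnel": 150.0,
    },
    "private cloud (colo)": {
        "hardware_amortized": 380.0, "colo_fees": 200.0,
        "personnel": 220.0, "software_licenses": 150.0,
    },
}

# Rank options cheapest first for this one application
for name, parts in sorted(scenarios.items(), key=lambda kv: total_monthly_cost(kv[1])):
    print(f"{name:22s} ${total_monthly_cost(parts):8,.2f}/month")
```

Because provider fee structures change, the real value is in re-running this comparison continuously rather than computing it once in a spreadsheet.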
Continuous Analysis for Continuous Improvement of the "Infrastructure Anywhere" Environment
Before deploying an application, what-if scenarios help you make sound resource decisions and right-size applications. After deploying, continuous analysis is key to ensuring that you are optimizing capacity and using resources most efficiently.
While individual tools may already give you slices of the necessary information, you need integrated insight into the complete infrastructure environment. Again, emerging infrastructure intelligence can assemble necessary information from applications and assets that are both on and off your premises, virtualized and not, in different platforms and locations.
Figure 8: Transformational analytics
The software can provide "single pane of glass" visibility into assets and applications throughout the physical, virtual, and cloud infrastructure, including:
- Application cost/utilization spanning different locations
- True resource requirements of apps (for more accurate provisioning in cloud infrastructure)
- CPU and memory utilization of apps, wherever they reside
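Conceptually, such a unified view is a roll-up of records arriving from separate silos. The sketch below merges per-location utilization and cost records into one per-application summary; the record format, feed names, and numbers are all hypothetical.

```python
# Hypothetical sketch of a unified "single pane" roll-up: merge utilization
# records from separate silos into one per-application view. Data is illustrative.

from collections import defaultdict

# Records as they might arrive from separate tools (physical, virtual, cloud)
records = [
    {"app": "billing", "source": "datacenter-east", "cpu_pct": 62, "cost": 900.0},
    {"app": "billing", "source": "public-cloud",    "cpu_pct": 41, "cost": 350.0},
    {"app": "portal",  "source": "colo-west",       "cpu_pct": 28, "cost": 500.0},
]

def unified_view(records):
    """Roll up total cost and average CPU per application across all locations."""
    acc = defaultdict(lambda: {"locations": 0, "cost": 0.0, "cpu_sum": 0.0})
    for r in records:
        a = acc[r["app"]]
        a["locations"] += 1
        a["cost"] += r["cost"]
        a["cpu_sum"] += r["cpu_pct"]
    return {app: {"locations": a["locations"],
                  "total_cost": a["cost"],
                  "avg_cpu_pct": a["cpu_sum"] / a["locations"]}
            for app, a in acc.items()}

for app, summary in unified_view(records).items():
    print(app, summary)
```

The hard part in practice is not the roll-up itself but normalizing records from tools that each report different metrics in different units.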
By 2014, enterprise computing will look quite different than it does today, yet many legacy systems and infrastructure will still be with us. IT operations, business units and application architects will need to manage applications that reside in infrastructure spanning on-premises and offsite locations, with public, private, hybrid, and community cloud infrastructure. Data centers will be just one part of the total pool of infrastructure that IT manages on behalf of the business.
To manage this transformation, you will need to make smart decisions about where workloads should reside based on specific application and business needs. As these changes roll out, you will need to manage the transforming and hybrid application infrastructure to deliver the necessary performance and service levels, no matter where applications reside.
IT organizations need the insight to make fast, smart and informed decisions about where workloads and data should reside and how to deploy new applications. Rather than isolated silos of metrics and capacity and utilization data, IT needs unified visibility into infrastructure across the virtual computing environment - both on-premises and off. And they need the metrics and continuous analysis to manage the evolving infrastructure in a manner aligned with business objectives.
An emerging category of infrastructure intelligence can provide the continuous and unified analytics necessary to understand and compare your decisions and to manage the data center during the transformation. With broad "infrastructure insight" you can align cloud platforms with business needs and cost requirements - delivering the agility to realize new revenue opportunities with the insight to contain the costs of existing applications.