By John Busch
June 9, 2009 12:45 PM EDT
With the explosive growth of Web-based businesses and applications, datacenter workloads have increased exponentially. IT managers are finding it difficult to meet the accelerating demands for performance, capacity, scalability and reliability, while at the same time meeting budgets, maintaining service-level agreements and driving green initiatives.
New component and middleware technologies - including multi-core processors, low-latency interconnect, flash memory, and highly optimized data access and caching technologies - hold serious promise for accomplishing these goals. But without optimal integration and implementation, these technologies have failed to deliver the benefits that datacenter managers and the business demand. Because Web 2.0 and cloud computing enterprises must focus on their core business, higher-level building blocks are needed that can exploit these advanced technologies to fundamentally solve these costly challenges.
Datacenter Trends and Challenges
With the explosive growth of Web 2.0, software-as-a-service (SaaS), cloud computing and other emerging Web-based applications, datacenter workloads have increased exponentially. The business opportunities that are created from the deployment of these new applications are substantial, but the demands they place on the datacenter are daunting. Challenges include:
- Unprecedented data growth. Recent studies indicate that the amount of data managed by today's datacenters will quadruple every 18 months. To complicate matters, online users are more sophisticated than ever, and response-time expectations are at an all-time high. Yet as data volumes grow, response times at many datacenters are actually getting worse.
- Severe capacity constraints. Datacenter managers are struggling to manage huge increases in rack, power and network utilization. They are constrained by limited datacenter power and space and are seeking cost-effective ways to expand capacity without increasing the datacenter footprint.
- Increasing data complexity. Organizations have too much data to process in a time-sensitive and consistent manner. Information management requires extensive data partitioning and application-level mapping, caching, replication/recovery and load balancing. Existing data management tools are complex, and commodity, non-application-specific hardware is difficult to use and manage.
- Lack of scalability. Current datacenter environments lack the ability to scale effectively to manage peak demand. Even in multi-core server environments, many datacenters have already scaled to the point they are memory and disk bound.
- Underutilized resources. Estimates of current datacenter equipment utilization run between five and 30 percent. Every kilowatt-hour is expensive, and idle hardware consumes power without producing value. Underutilized hardware translates into wasted capex and opex, as well as increased power consumption.
- Severe budget constraints. Budgets are tighter than ever. Only solutions that are able to provide quick return on investment (ROI) are now being approved. Traditional approaches to datacenter expansion are no longer viable.
- Corporate mandates to go green. According to the 2007 EPA Report to Congress on Server and Datacenter Energy Efficiency, the energy consumption of servers and datacenters has doubled in the past five years and is expected to almost double again in the next five years to more than 100 billion kWh, costing about $7.4 billion annually. Datacenter managers are working hard to save energy, reduce datacenter space requirements and protect the environment.
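The scale of these challenges can be checked with back-of-the-envelope arithmetic. A minimal sketch, using only the figures cited above (the quadrupling rate, the EPA's projected consumption and cost, and the utilization estimates):

```python
# Back-of-the-envelope checks on the datacenter statistics cited above.

# "Data quadruples every 18 months" -> implied annual growth factor.
annual_growth = 4 ** (12 / 18)
print(f"Implied annual data growth: {annual_growth:.2f}x")   # ~2.52x per year

# EPA projection: ~100 billion kWh/year costing ~$7.4 billion annually.
kwh_per_year = 100e9
annual_cost = 7.4e9
print(f"Implied electricity rate: ${annual_cost / kwh_per_year:.3f}/kWh")

# Utilization of 5-30% means most provisioned capacity sits idle.
for utilization in (0.05, 0.30):
    print(f"At {utilization:.0%} utilization, {1 - utilization:.0%} "
          f"of provisioned capacity is idle")
```

At roughly 2.5x compound growth per year, a datacenter that does nothing will be an order of magnitude behind its workload in under three years, which is why incremental expansion of the existing footprint rarely keeps pace.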
Limitations of Today's Technology
The ability to effectively scale the database and caching tiers is critical for the success of any growing Web-enabled business. Unfortunately, ordinary server and middleware infrastructure is loosely integrated and minimally optimized. Existing datacenter solutions aren't adequately addressing the performance, capacity, scaling, reliability and power challenges of supporting new Web applications effectively.
Several new technologies have the potential to solve these challenges:
- Multi-core processors offer the promise of improved application performance and increased energy efficiency, but harnessing that potential requires highly parallel software and careful synchronization across cores, which adds significant complexity.
- Enterprise-class flash memory has several advantages over traditional hard disk drives (HDDs), including 100 times faster access time, increased reliability due to no moving parts, and significantly reduced power. Space and power savings, for the same server workload, translate into a tangible reduction in total cost of ownership (TCO).
- Low-latency interconnect technologies provide the ability to send a message in a millionth of a second - more than an order of magnitude improvement over existing solutions.
- Optimized data access and caching applications can speed access to data. But these applications need to be deeply integrated with the operating environment and hardware to fully utilize the benefits of multi-core processor, flash memory and low-latency interconnect technologies.
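To make the caching idea concrete, here is a minimal in-process least-recently-used (LRU) cache sketch. It is a toy illustration of the eviction principle behind the caching tier described above, not any vendor's implementation; the appliances discussed in this article integrate caching with flash and low-latency interconnect hardware, but the core mechanism is the same:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache: an illustrative stand-in
    for the data-access caching tier described above."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None                      # cache miss: caller fetches from disk/DB
        self._data.move_to_end(key)          # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)   # evict the least recently used entry

cache = LRUCache(capacity=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")            # touch "a" so "b" becomes the LRU entry
cache.put("c", 3)         # capacity exceeded: evicts "b"
print(cache.get("b"))     # None: "b" was evicted
print(cache.get("a"))     # 1: still cached
```

Every cache hit avoids a disk or database round trip; the integration challenge the article describes is keeping such a cache consistent, replicated and fast across many cores and nodes rather than in a single process.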
The unfortunate reality is that each customer's IT team must become its own integration house, developing software in an attempt to optimize these advanced technologies - a highly complex and error-prone process that can take years to implement. This approach not only diverts valuable IT resources from the enterprise's core business, it results in unmanageable application and administrative complexity, low-resource utilization and high TCO.
Challenges of Utilizing New Technologies
The new generation of commodity multi-core processors, flash memory and low-latency interconnects offers tremendous potential in Web 2.0 and cloud computing datacenters, but in practice the extensive integration work required limits the benefits realized. The effort needed to apply these technologies to today's severe performance, power, space and TCO challenges is significant: IT teams must develop highly parallel middleware applications, run them on a high-performance operating environment, and build and optimize numerous specialized configurations.
Adapting or inventing new deployment architectures to take advantage of the new technologies is a major undertaking, with large development and support costs that are not the core value of the businesses. Fortunately new, higher-level building blocks are now being introduced that address these challenges.
New products combine advances in flash memory, multi-core processors, low-latency interconnect, and optimized data access and caching technologies into fully integrated, optimized, multi-node appliances for faster data access. In the first half of 2009, several vendors introduced all-new, fully integrated appliances that leverage some or all of these technology advances.
By using these fully optimized data access appliances, enterprises can leverage higher-level building blocks that eliminate the need for complex integration projects. These appliances enable IT to meet corporate goals of increasing capacity, scalability and reliability, while at the same time slashing costs and dramatically curtailing energy consumption across the datacenter. The net result is not just more efficient datacenter operations, but the creation of new revenue-producing business opportunities based on rapid access to terabyte-scale data.