Evolving the Cloud

Use what’s available

Although often misunderstood, cloud computing ultimately relies on the same technological underpinnings as traditional server and storage options. While software, platforms and even infrastructure are farmed out to third-party providers, their ability to operate efficiently is constrained by the same physical laws that govern local server stacks. IT professionals and service providers therefore both have a vested interest in making the best use of the physical hardware available, and that means thinking outside the power box.

Keeping Costs Down
One of the most-touted benefits of cloud computing is reduced cost. By offloading server management to a public or hybrid provider, admins can avoid the price of hardware upgrades and bypass local energy costs. This can add up to significant savings over time, but it represents a transfer of responsibilities rather than a re-imagining: the cost of running those servers still exists; it is simply split among multiple users.
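To make that point concrete, here is a minimal sketch, with entirely made-up figures, of how a provider's fixed server cost might simply be divided across tenants (plus a margin) rather than eliminated:

```python
# Illustrative only: all figures are assumptions, not real pricing data.

def per_tenant_monthly_cost(total_monthly_server_cost: float,
                            tenants: int,
                            provider_margin: float = 0.20) -> float:
    """Split a fixed server operating cost across tenants, plus a margin.

    The underlying cost does not disappear; it is divided among users
    and topped up with the provider's profit.
    """
    return total_monthly_server_cost * (1 + provider_margin) / tenants

# Hypothetical numbers: $12,000/month to run a rack, shared by 40 tenants.
print(f"${per_tenant_monthly_cost(12_000, 40):,.2f} per tenant per month")
```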

For IT professionals, trading regular capital expenditures (CapEx) for ongoing operating expenditures (OpEx) is often worth it, since it eliminates the need for server migrations and lowers the risk of legacy system conflicts. For cloud providers, however, and for admins who choose private cloud alternatives, this isn't enough. Fortunately, an environmental evolution is at hand.
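As a rough illustration of why that trade-off can be attractive, the sketch below compares a hypothetical amortized hardware refresh against a hypothetical monthly subscription; every figure is an assumption chosen to show the shape of the comparison, not real pricing:

```python
# Hypothetical comparison of CapEx (buy and refresh hardware) vs. OpEx (subscribe).
# All figures are assumptions for illustration, not benchmarks.

def capex_monthly_equivalent(hardware_cost: float,
                             refresh_years: int,
                             monthly_power_and_cooling: float) -> float:
    """Amortize a hardware purchase over its refresh cycle and add local running costs."""
    return hardware_cost / (refresh_years * 12) + monthly_power_and_cooling

def opex_monthly(subscription_per_month: float) -> float:
    """Cloud-style recurring cost: no upfront purchase, no local energy bill."""
    return subscription_per_month

local = capex_monthly_equivalent(hardware_cost=60_000, refresh_years=4,
                                 monthly_power_and_cooling=900)
cloud = opex_monthly(subscription_per_month=1_600)
print(f"Local (amortized): ${local:,.2f}/month  vs  Cloud: ${cloud:,.2f}/month")
```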

Is it Cold in Here?
One of the largest costs associated with any server stack is cooling. Hot systems run slowly, hot systems fail and, in some cases, hot systems self-destruct. As a result, companies spend enormous amounts of money to cool their servers, often using a combination of chilled water and cold air. Steve Hammond of the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) says this process is "like putting your beverage on your kitchen table and then going outside to turn up the air-conditioner to get your drink cold." In other words, it is hardly the most energy-efficient approach.
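One common way to quantify that cooling overhead is power usage effectiveness (PUE), the ratio of total facility power to the power drawn by the IT equipment itself. The sketch below estimates annual overhead cost from an assumed IT load, PUE and electricity rate; none of the numbers describe any particular facility:

```python
# Estimate annual overhead (cooling and other non-IT) cost from PUE.
# PUE = total facility power / IT equipment power, so overhead = IT load * (PUE - 1).
# All inputs below are assumed values for illustration.

HOURS_PER_YEAR = 8_760

def annual_overhead_cost(it_load_kw: float, pue: float, usd_per_kwh: float) -> float:
    overhead_kw = it_load_kw * (pue - 1.0)  # power spent beyond the IT gear itself
    return overhead_kw * HOURS_PER_YEAR * usd_per_kwh

# A 500 kW IT load at a legacy-ish PUE of 1.8 vs. an efficient 1.1, at $0.10/kWh.
for pue in (1.8, 1.1):
    print(f"PUE {pue}: ~${annual_overhead_cost(500, pue, 0.10):,.0f} per year in overhead")
```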

There are several tech evolutions currently underway to help address this cooling challenge. The first is siting large data centers in locations with naturally low (but not freezing) temperatures, which significantly reduces the need for additional system cooling. Dublin, Ireland, for example, is a popular option, since its average temperature is low enough that companies can use fresh air to cool servers for free; recent research demonstrates that air cooling, even at temperatures well above what you'd find in a meat locker, has minimal impact on hardware failure rates.
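A first-order way to gauge how much a mild climate helps is to count the hours in a year cool enough to run on outside air alone. The sketch below does exactly that; the 24°C threshold and the synthetic temperature data are stand-ins for real equipment tolerances and real weather records:

```python
# Rough estimate of "free cooling" availability from hourly outdoor temperatures.
# The threshold and the synthetic data are assumptions for illustration.
import random

FREE_COOLING_MAX_C = 24.0   # assumed upper bound for cooling with outside air alone

def free_cooling_fraction(hourly_temps_c: list[float]) -> float:
    """Fraction of hours cool enough to skip mechanical chilling."""
    usable = sum(1 for t in hourly_temps_c if t <= FREE_COOLING_MAX_C)
    return usable / len(hourly_temps_c)

# Stand-in for a year of hourly readings in a mild maritime climate.
random.seed(0)
temps = [random.gauss(mu=11.0, sigma=5.0) for _ in range(8_760)]
print(f"Free cooling available ~{free_cooling_fraction(temps):.0%} of the year")
```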

NREL is also working on a way to cool its massive server stack using water, but not the typically chilled variety. Instead, the lab is going to pipe in water at 75 degrees Fahrenheit, which will increase to a balmy 100 degrees by the time it leaves the system. Not only will the plan help cool NREL's servers, but the energy transferred will be used to heat nearby offices. Ultimately, the lab could save upwards of $800,000 annually by using lukewarm water to cool its system.
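The amount of heat that lukewarm water can carry away, and hand off to nearby offices, follows from the basic relation Q = ṁ · c_p · ΔT. The sketch below applies it to the 75- and 100-degree figures above; the flow rate is an assumed value, not an NREL specification:

```python
# Heat carried away by warm-water cooling: Q = mass_flow * specific_heat * delta_T.
# Inlet/outlet temperatures come from the article; the flow rate is an assumption.

WATER_SPECIFIC_HEAT_J_PER_KG_K = 4_186   # c_p of liquid water

def fahrenheit_to_kelvin_delta(delta_f: float) -> float:
    """Convert a temperature *difference* from °F to K (scale factor only)."""
    return delta_f * 5.0 / 9.0

def heat_recovered_kw(mass_flow_kg_per_s: float, inlet_f: float, outlet_f: float) -> float:
    delta_k = fahrenheit_to_kelvin_delta(outlet_f - inlet_f)
    return mass_flow_kg_per_s * WATER_SPECIFIC_HEAT_J_PER_KG_K * delta_k / 1_000

# Assumed flow of 20 kg/s of water warmed from 75 °F to 100 °F.
print(f"~{heat_recovered_kw(20, 75, 100):,.0f} kW of heat available for reuse")
```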

Cloud computing excels by leveraging natural interconnections between technologies; now, companies are seeking the same kind of built-in advantages from their physical environment.

 

About the Author: Doug Bonderud is a freelance writer, cloud proponent, business technology analyst and a contributor on the Dataprise Cloud Services website.
