Many people in the software industry have heard the buzzword cloud computing by now. It is currently swirling around IT water coolers everywhere and is getting computer geeks animated from Redmond to Silicon Valley to Boston and beyond. Cloud computing promises to revolutionize how computing is done and huge investments are being made by the major players in the industry to stake a claim on this gold mine.
So What is Cloud Computing?
Cloud computing basically means that computing resources are provided as a web service, in a form that is transparent to the user and in which the underlying infrastructure and complexity are hidden from them. The cloud is dynamically scalable, and within it computing services are virtualized, including even the actual servers themselves, which become virtual servers. The cloud is a very powerful paradigm, and it is one that is catching on big time. It is important that it grow in an energy-efficient manner, which is what I mean by the green cloud.
Datacenters Consume a Great Deal of Power
Behind this virtual cloud there are very real machines living in massive data centers, tied together by a vast network of systems: a veritable IT army of rows upon rows of densely packed racks of virtual machines, all running on top of real hardware somewhere in the cloud. Not surprisingly, this extended infrastructure underlying the cloud is a massive energy consumer. Datacenters consume considerable amounts of power. In 2005, for example, if you factor in all the energy needed to keep them cool, datacenters used around 1.2% of all electricity consumed in the United States. The total power requirements of a datacenter extend well beyond the actual power consumption of the machines themselves. Jon Koomey, a staff scientist at Berkeley National Laboratory in Berkeley, Calif., has noted that every kilowatt burned by servers in a datacenter requires another 1 to 1.5 kW to cool and support them.
The power needs of datacenters are exploding, in part because more and more services are being moved into the cloud, which ultimately lives inside these datacenters and in the physical infrastructure of the internet itself. If a business-as-usual attitude prevails, some estimates place the dollar figure for datacenter energy usage at $250 billion by 2012. As the powerful notion of cloud computing catches on (and it will catch on), these power-hungry beasts will proliferate at an even faster rate to service the growing cloud's needs for computing horsepower and data storage.
A Green Cloud is Needed to Make Data Centers Energy Efficient
What is needed is for the concept of a green cloud to also catch on within the industry. A green cloud is cloud computing that uses intelligent power management and cooling to minimize its energy needs for a given level of service.
The green cloud relies on common-sense design and operation of data centers to increase the efficiency of cooling systems. It also reduces the need for cooling and delivers cooling precisely where it is needed, rather than wasting it on areas that are not overheating. If you have ever been in one of these data centers, you know how cold they keep those vast rooms. One example would be the simple act of doing away with raised floors in modern data centers, as they are no longer required; this significantly reduces the volume of air that needs to be kept cool. Relying on ambient air, instead of expensively chilled air, to do the job whenever possible also offers great potential to reduce energy demands. A data center sited in San Francisco, for instance, could rely on ambient air to provide its cooling needs for almost half the hours in any given year.
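The free-cooling claim can be sketched numerically: given hourly ambient temperatures for a year, count the hours at or below the temperature where outside air alone can do the cooling. Everything here (the 18 °C cutoff and the synthetic temperature curve) is an illustrative assumption, not real San Francisco weather data:

```python
import math

def free_cooling_hours(hourly_temps_c, cutoff_c=18.0):
    """Count the hours in which ambient air is cool enough to use
    directly, i.e. at or below the economizer cutoff temperature."""
    return sum(1 for t in hourly_temps_c if t <= cutoff_c)

# Crude synthetic year: a seasonal plus a daily sinusoid around a
# mild coastal mean. Real analysis would use measured weather data.
temps = [14 + 6 * math.sin(2 * math.pi * h / 8760)
            + 4 * math.sin(2 * math.pi * h / 24)
         for h in range(8760)]
hours = free_cooling_hours(temps)
print(f"{hours} of 8760 hours ({hours / 8760:.0%}) suitable for ambient cooling")
```

With a mild coastal climate like the one sketched above, a large share of the year's hours fall under the cutoff, which is the intuition behind the almost-half figure.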
It has also been found that data centers could operate at higher temperatures than they currently do without compromising the equipment and servers or the quality of service. Accurate thermal mapping of these facilities has shown that heat is not evenly distributed and that isolated, often transient hot spots occur. In a cloud scenario, this can be mitigated by dynamically shifting resources to other physical areas in the cloud infrastructure; for example, virtual servers can be moved away from hot spots to cooler regions in the cloud. It should be noted that this is not as easy in dedicated co-location scenarios, in which a given client is tied to specific physical hardware.
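In a virtualized cloud, that shifting can be as simple as a greedy heuristic: move virtual servers off any rack whose temperature exceeds a threshold onto the coolest rack with spare capacity. A toy sketch (the rack names, threshold, capacities, and per-migration heat reduction are all invented for illustration):

```python
def plan_migrations(racks, hot_c=32.0):
    """Greedy plan: for each rack above the hot-spot threshold,
    move one VM at a time to the coolest rack with spare capacity.
    `racks` maps rack id -> dict with 'temp_c', 'vms', 'capacity'."""
    moves = []
    for rid, rack in racks.items():
        while rack['temp_c'] > hot_c and rack['vms']:
            # Candidate targets: other racks below threshold with headroom.
            targets = [(t['temp_c'], tid) for tid, t in racks.items()
                       if tid != rid and t['temp_c'] <= hot_c
                       and len(t['vms']) < t['capacity']]
            if not targets:
                break
            _, dest = min(targets)            # coolest eligible rack
            vm = rack['vms'].pop()
            racks[dest]['vms'].append(vm)
            moves.append((vm, rid, dest))
            rack['temp_c'] -= 1.5             # assumed cooling per migration
    return moves

racks = {
    'A': {'temp_c': 35.0, 'vms': ['vm1', 'vm2', 'vm3'], 'capacity': 4},
    'B': {'temp_c': 24.0, 'vms': ['vm4'], 'capacity': 4},
    'C': {'temp_c': 27.0, 'vms': [], 'capacity': 2},
}
for vm, src, dst in plan_migrations(racks):
    print(f"migrate {vm}: rack {src} -> rack {dst}")
```

A real scheduler would weigh migration cost and service-level constraints as well, but the core idea is the same: the workload chases the cool air, rather than the cooling chasing the workload.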
A lot of this green cloud energy management relies on software coupled with a sophisticated sensor network. The Networked Embedded Computing group at Microsoft Research, along with other research groups at large computer and software companies, is developing new architectures, models, and tools for organizing and programming these systems. They rely on real-time thermal data to build a dynamic picture of energy usage and hot spots in the data center, and respond to this changing picture by redirecting usage to cooler areas.
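The first step in building that dynamic picture is reducing a stream of raw sensor readings to a per-zone summary the management software can act on. A minimal sketch (the zone names and temperatures are invented for illustration):

```python
from collections import defaultdict

def thermal_map(readings):
    """Aggregate raw (zone, temp_c) sensor readings into a per-zone
    average temperature: the simplest form of the dynamic thermal
    picture that hot-spot detection and migration decisions run on."""
    sums = defaultdict(lambda: [0.0, 0])
    for zone, temp in readings:
        sums[zone][0] += temp
        sums[zone][1] += 1
    return {zone: total / n for zone, (total, n) in sums.items()}

readings = [('row1', 31.0), ('row1', 33.0), ('row2', 24.0), ('row2', 26.0)]
print(thermal_map(readings))   # {'row1': 32.0, 'row2': 25.0}
```

In practice the picture is rebuilt continuously as fresh readings arrive, so the map tracks the transient hot spots described above.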
In short, much can be done, and is being done, to make sure that the future of cloud computing will be a green one (that the cloud will be a green cloud). It is an understatement to say that there are huge opportunities for software and hardware developers and vendors, for datacenter builders and operators, and for chip and sensor manufacturers in building the green cloud. This has the potential to become a huge business.
© 2009, Chris de Morsella. All rights reserved. Do not republish.