Grid Computing Takes Off in the Enterprise
Start thinking about computing power like you do about electrical power: Pay only for how much you use.
by Mike Ellsworth

Right now, your computer and every other computer in your enterprise are wasting resources. Millions of CPU cycles are doing nothing more than warming the air. Megabytes of disk space sit unused. Your network connection idles as you read downloaded Web pages. On a daily basis, you and everyone you work with typically use only a small fraction of your computers' capacity.

What if you could put all these underemployed resources to work? And what if you could solve the big computing problems your organization has without spending an extra dime on new hardware? That's the promise of grid computing, a method of breaking a big computing problem into smaller chunks and farming them out to dozens or hundreds of computers to work on.
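To make that chunk-and-farm-out idea concrete, here is a minimal Python sketch. The squared-sum workload, the chunking helper, and the use of a local process pool as a stand-in for remote grid nodes are all assumptions made for illustration; they are not part of any particular grid product.

# A minimal sketch of the scatter/gather idea behind grid computing.
# The problem (here, summing squares over a huge range) is split into
# independent chunks; each chunk is handed to a separate worker.
# A real grid would ship chunks to other machines over the network;
# this stand-in uses local worker processes so the example runs anywhere.
from concurrent.futures import ProcessPoolExecutor

def process_chunk(bounds):
    """Do the 'big computation' for one chunk of the problem."""
    start, end = bounds
    return sum(x * x for x in range(start, end))

def split_into_chunks(total, chunk_size):
    """Break the full problem into independent work units."""
    return [(i, min(i + chunk_size, total)) for i in range(0, total, chunk_size)]

if __name__ == "__main__":
    chunks = split_into_chunks(total=10_000_000, chunk_size=1_000_000)
    with ProcessPoolExecutor() as pool:  # stand-in for many grid nodes
        partial_results = pool.map(process_chunk, chunks)
        print("combined result:", sum(partial_results))

The same pattern scales from a handful of local processes to hundreds of networked machines: only the transport changes, not the split-compute-combine structure.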

Whether you steal idle cycles from your organization's desktop PCs or link dedicated machines together to form a supercomputer, the result is a lot more computer power to devote to massive computations like modeling weather systems, oil deposits, or viruses; computing complex financial derivatives or insurance risk portfolios; or simulating car crashes. And this computer power comes at a comparatively small incremental investment, usually minuscule compared to the cost of buying new hardware.

So what's not to love? You get more resources for your big problems, you get better ROI on your computing investment, and you spend very little. That's why more and more enterprises are looking into adopting grid computing to fill at least some of their computing needs.

Grid is not a panacea, however, and the solutions available are not typically going to help you with scaling Exchange servers or improving online transaction speed, for example. Current grid solutions typically attack more strategic, more massive computing problems, and are not yet suitable for some standard enterprise applications, especially those with a low computing-to-I/O ratio.

Old Concept, New Implementation
Grid computing is not a new concept. In fact, researchers at Xerox's Palo Alto Research Center first explored distributed computing in the 1970s. Along the way, several high-profile grid projects have gained recognition, including the rendering of the Toy Story movies on a server farm and the cracking, through a grid network in 1997, of Netscape's 56-bit encryption key.

One of the earliest prominent grid computing efforts was the SETI@Home project, which links more than four million Internet-connected computers into a massive supercomputer. The Search for Extraterrestrial Intelligence project sends each participating computer a chunk of radio telescope data to process. When done, the computers send back the results and request more. Now four years old, SETI@Home has 3.3 million users in 226 countries, has accumulated 1.4 million years of CPU time, runs across 136 operating system/hardware combinations, and has processed more than 800 million results. The capability of this grid averages more than 48 teraflops, not bad for an effort that has cost only a few hundred thousand dollars.
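For readers curious about the mechanics, here is a hypothetical Python sketch of the request-process-report cycle described above. The in-memory "server," the analyze function, and the volunteer_loop name are invented for illustration and do not reflect the actual SETI@Home client or protocol.

# A hypothetical sketch of the pull-style cycle the article describes:
# a volunteer machine asks the project for a work unit, crunches it
# locally, reports the result, and asks for more. The "server" here is
# simulated in memory so the example is self-contained.
from collections import deque

def make_fake_server(num_units):
    """Stand-in for the project server holding chunks of telescope data."""
    work = deque(range(num_units))
    results = []
    return work, results

def analyze(unit):
    """Stand-in for the client's signal-processing pass on one chunk."""
    return unit * unit

def volunteer_loop(work, results):
    """Request work while any remains, compute, report the result, repeat."""
    while work:
        unit = work.popleft()           # request the next work unit
        results.append(analyze(unit))   # compute with spare cycles, report back

if __name__ == "__main__":
    work, results = make_fake_server(num_units=5)
    volunteer_loop(work, results)
    print("results reported:", results)

The pull model is what lets such projects scale so cheaply: idle machines ask for work when they have spare capacity, so the central server never has to track or schedule individual volunteers.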
