Utility Computing

Plug and Pay

April 21, 2003
By Fred Hapgood

Local Utility

One way of bringing utility computing inside - given the huge incompatibilities that exist in most established networks - is to dedicate a special computer to the task. Several companies, such as Hewlett-Packard, Inkra and Opsware, sell software that will partition a computer (often a mainframe) into several perfectly interoperable environments, keep track of resource use on a per-transaction basis and bill accordingly. If a transaction requires, say, an unusual operating system, that OS can boot in its partition to support just that transaction. Cognigen, a data analysis and consulting firm serving the biotech and health-care industries, recently bought utility computing software from Sun, the Sun Grid Engine, that performs this seeming magic.
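For illustration only, here is a minimal sketch in Python of the kind of per-transaction metering and billing that partitioning software of this sort is described as doing. Every class, field and rate in it is hypothetical; it is not any vendor's actual interface, just a toy model of the idea.

```python
# Conceptual sketch only: a toy per-transaction metering model of the kind the
# partitioning products described above might implement internally. All names
# and rates here are hypothetical, not any vendor's actual API.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Transaction:
    partition: str      # which isolated environment ran the work
    cpu_seconds: float  # processor time consumed by this transaction

@dataclass
class Meter:
    rate_per_cpu_second: float
    transactions: list[Transaction] = field(default_factory=list)

    def record(self, tx: Transaction) -> None:
        # Resource use is tracked transaction by transaction ...
        self.transactions.append(tx)

    def bill(self, partition: str) -> float:
        # ... and billed per partition, so each environment pays only
        # for what it actually used.
        used = sum(t.cpu_seconds for t in self.transactions
                   if t.partition == partition)
        return used * self.rate_per_cpu_second

meter = Meter(rate_per_cpu_second=0.01)
meter.record(Transaction("linux-env", 42.0))
meter.record(Transaction("solaris-env", 10.0))
print(f"linux-env owes ${meter.bill('linux-env'):.2f}")
```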

According to Darcy Foit, director of IS at Cognigen, the problem that inspired the purchase was the need to optimize execution of a critical program that did not share processor time well. The Grid Engine gave Cognigen's scientists a running view of and access to all processors on their LAN, letting them monitor and schedule their tasks more efficiently. "Since implementation," Foit says, "each scientist has had an average of an extra hour of work time." (Previously that much time was wasted waiting for processors to free up.)
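As a rough sketch of what that workflow can look like, the snippet below drives a standard Sun Grid Engine installation from Python using the stock SGE command-line tools (qsub, qstat, qhost). The job script name is made up, and this is an assumption about a generic Grid Engine cluster, not a description of Cognigen's actual setup.

```python
# Rough illustration, assuming a standard Sun Grid Engine installation:
# a scientist submits a job to the scheduler and checks cluster load,
# rather than waiting for a processor to free up.
import subprocess

def show_cluster_load() -> None:
    # qhost lists every execution host and its current load - the
    # "running view of all processors" on the LAN.
    subprocess.run(["qhost"], check=True)

def submit_job(script_path: str) -> None:
    # qsub hands the job script to the Grid Engine scheduler, which
    # places it on a free processor.
    subprocess.run(["qsub", script_path], check=True)

def show_my_jobs() -> None:
    # qstat lists queued and running jobs so progress can be monitored.
    subprocess.run(["qstat"], check=True)

if __name__ == "__main__":
    show_cluster_load()
    submit_job("analysis_run.sh")  # hypothetical job script name
    show_my_jobs()
```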

Foit says he is now thinking of taking the natural next step: using the Grid Engine to offer a specialized virtual computing service to external clients. Unlike Gateway, which will talk with anyone, Foit plans to stay within bioinformatics. "Bio companies often need to do validation runs on their computing work," he says, "and perhaps validation by its nature is done best by an independent company."

Daunting Challenges

Those cases might seem like baby steps set against the utility computing utopia - in which any operation has access to any resource - but even they are not without problems. The primary issue for most CIOs will be how much control they lose when renting or borrowing resources instead of owning them, says AmEx's Salow. He notes that he has some concern that either the utility computing vendor or the relationship itself will end up influencing the development of a company's network, perhaps by biasing procurement decisions toward the supplying vendor's products. He says, however, that so far the service, which started in March 2001, has not raised any of those flags.

In some companies, moving procurement out of the capital budget and into the operating line item might not be simple either. Many CIOs will worry about the security risk of moving critical data onto external machines, though Inpharmatica's Leach thinks the problem is manageable. "I think the security issue is overstated," he says. "Outsourcing is common practice. The United Devices/Gateway facility is just a step along the same road." He says the issue for many companies will be whether to buy expensive equipment that is completely under their control or use a trusted third party and save money. "I think that many small and midsize companies will choose the latter and be more competitive than their larger, more conservative competitors," he says.
