Something funny happened when the 1816 edition of the Farmer's Almanac, also known as the Old Farmer's Almanac, was being prepared. Due to some kind of fluke, the Almanac forecast a snowstorm in New England for July 13. And that's exactly what happened! It's no wonder, then, that readers of the Almanac, which has been published continuously since 1792, take its weather forecasts seriously. But you don't need that venerable publication to forecast this: For the foreseeable future, you're going to spend a lot more for the electricity to heat and cool your computer.
You heat up your computer with electricity every time you turn it on. Just about all the power the machine consumes turns into heat, although a little bit may become light or sound. If your computer is a laptop, the heat it generates might not be all that much, something in the vicinity of 20 watts when it's working hard, which means it's throwing off heat at roughly 68 BTUs per hour. If you have a workstation with a large CRT display, you're probably burning up 150 watts, maybe more, and kicking out more than 500 BTUs per hour. If you have a big iSeries, such as a 9119-59X, you can pull down more than 20 KW of electricity and kick out more than 77,000 BTUs of heat per hour. IBM's largest mainframe, the new Z9 Danu, uses less power and generates less heat, but it doesn't have internal disks the way the iSeries does. With outboard disks, a Danu mainframe will use more power and give off more heat than the big iSeries.
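The watts-to-BTUs figures above follow from a standard rule of thumb: one watt of continuous draw equals about 3.412 BTUs per hour of heat, since nearly all the electricity a computer uses ends up as heat. A quick sketch of the conversion (vendor-published BTU ratings, like the big iSeries figure, can run higher than this simple rule suggests):

```python
# Back-of-envelope conversion from electrical draw to heat output.
# Nearly all the power a computer consumes becomes heat, so watts map
# almost directly to BTUs per hour: 1 watt is roughly 3.412 BTU/hr.
BTU_PER_WATT_HOUR = 3.412

def heat_btu_per_hour(watts):
    """Heat thrown off by a machine drawing `watts` continuously."""
    return watts * BTU_PER_WATT_HOUR

for label, watts in [("laptop", 20), ("CRT workstation", 150), ("big iSeries", 20_000)]:
    print(f"{label}: about {heat_btu_per_hour(watts):,.0f} BTU/hr")
```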
Most business sites fall between these extremes. They use desktop computers that burn about 100 watts apiece and servers that eat up several kilowatts. A typical site with 200-odd seats could be eating 25 KW and kicking out more than 80,000 BTUs an hour, servers included. And that doesn't count the lighting for the offices where the computers happen to be or the power cost of the HVAC systems that keep the place comfortable.
All the power used for heating the machines and then cooling the rooms they are in has been costly all along. Now, with energy costs half again as high as they were a year ago, companies are going to start looking at the power portion of their physical plant costs . . . and possibly point some fingers at the computers. The good old days, when most companies didn't connect electric bills to computer usage, are gone. It's going to become fashionable to talk about the amount of juice computers use, and in some cases there will be arguments that only add more heat to the environment without shedding much light on possible options.
The people charged with managing computers, which might mean you, poor reader, are going to have to have some answers ready, or at least to be able to explain, coherently, why there might not be any answers right this second. You'll have to explain this to beancounters whose initial take on the situation is about as useful as a weather forecast based on wooly bear caterpillars.
Wooly bear caterpillars, the larvae of Isabella tiger moths, are supposed to know what's coming down the weather channel this winter. According to the folklore, these caterpillars, with black ends and a brown stripe in the middle, have a broader brown stripe if the winter is going to be mild and a narrower one if the winter is going to be a real humdinger.
We don't know a lot about the grubs, but we can tell you this: don't believe a word they say. The caterpillars, generally speaking, change color as they age. They start out mostly black. They molt, maybe half a dozen times, as they grow during the summer and fall. With each molt, if they are eating well and are otherwise in good fettle, the brown stripe gets longer. You might say that the black ends stay the same and the caterpillar gets longer in the middle, but that's not exactly the case. Eventually winter does come, to wooly bears as it does to people, and the caterpillars hide somewhere and enter the insect equivalent of a state of hibernation, a state that may cause your laptop to crash until you go through two or three scary BIOS updates. The caterpillar, unlike your laptop, doesn't crash; it crashes out, sleeping its way through the winter and living off fat it built up when its favorite veggies were in season. It can survive extreme cold, perhaps 50 or more degrees below zero. In the spring it will wake up, pupate, and become a very pretty moth. At that point it has to start a family, because once it gets rid of its fur coat and grows wings it cannot live through a winter.
A similar fate may lie ahead for computer users who plan big applications that depend on powerful client side software to work. When the applications are in their larval stage and can run on skinny workstations, the computer folk can probably ride out an energy crisis that lasts a couple of seasons. Once the full-blown suite of software is deployed, and workstations upgraded to handle all the new code, PC power consumption might jump from 75 watts a seat to 150 (depending on how much bigger the PCs get and how intensively they are run), and cooling costs may rise, too.
Meanwhile, back at the server ranch, those client-heavy sites are still going to need more server power, more storage, and maybe more networking apparatus, all of it burning power and bleeding heat.
If your servers are X86 machines, you might find yourself studying the new Sun Galaxy products, which use AMD CPU chips instead of Intel processors. When it comes to doing more computing per watt, AMD is way ahead of Intel. Intel has vowed to close the gap or maybe even do better than AMD, but you can't plug its plans into your network. If you're using any other kind of server, you're just plain stuck; you won't have any nice answers for the beancounters.
But if, like most companies, you use a lot more power on desktops than in the glass house, you may have options. Moreover, the computer vendors that already have been rolling out workstations that use less juice are going to continue along that path. So, too, are the display makers who want you to dump your CRTs and switch to flat screens that use a lot less power and, incidentally, yield better financial results for their manufacturers.
Of the big vendors, HP seems to have the lead in low power desktop systems that use AMD chips and come in configurations that include fewer power-hungry components, but perversely it has pressed for a complicated solution involving blade PCs and thin clients. Dell, a favorite in the corporate world, is staying loyal to Intel, and as a result the best it can do is come up with some new products that take advantage of laptop technology, where Intel does indeed provide superior low power chips. All the vendors can do a lot more than they have, if they can persuade customers to buy conservation-flavored client machines, which might cost more up front even if they save money over the long haul.
One example is the disk drive in every PC. Smaller drives use less power. A typical drive used in desktop computers, a 120 GB Seagate Barracuda 7200, uses 12.5 watts when it's seeking data. A typical disk drive used in desktop replacement laptops, an 80GB Toshiba MK8026GAX, uses less than 3 watts while seeking. Both drives use much less power when idling.
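The difference those seek-power figures make over a year is easy to reckon. A rough sketch, assuming for illustration eight active hours a day and 250 working days a year (both numbers are guesses, not anything from the drive makers):

```python
# Rough annual energy difference between the desktop and laptop drives
# quoted above: 12.5 W vs. under 3 W while seeking. The usage pattern
# (8 active hours a day, 250 days a year) is an illustrative assumption.
DESKTOP_SEEK_W = 12.5   # Seagate Barracuda 7200 while seeking
LAPTOP_SEEK_W = 3.0     # Toshiba MK8026GAX while seeking (upper bound)
ACTIVE_HOURS_PER_YEAR = 8 * 250

def annual_kwh(watts):
    """Kilowatt-hours consumed at a constant draw over the active hours."""
    return watts * ACTIVE_HOURS_PER_YEAR / 1000

saved = annual_kwh(DESKTOP_SEEK_W) - annual_kwh(LAPTOP_SEEK_W)
print(f"roughly {saved:.0f} kWh saved per seat per year")  # ~19 kWh
```

Multiply that by 200-odd seats and the drive choice alone starts to show up on the electric bill.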
Moreover, client machines in most offices don't need anywhere near this amount of disk capacity. Microsoft says XP shouldn't need more than 1.5 gigabytes of disk space. Office XP in a typical configuration uses another quarter of a gigabyte. So a client with, say, 5 GB of disk storage might be more than ample. A machine with such low disk requirements could even use the 1.8-inch disks that show up in music players and which use less power than laptop drives, although the relatively slow speed of very small disk drives might interfere with user productivity. But wait, there's flash memory. It might cost a lot more than a hard drive, but it hardly uses any power at all. Flash can pull peak currents of more than 2 amps per gigabyte (at 3 volts, which is typical for common flash chips), but it uses that current for only a small fraction of its active cycle; when flash is idling, it draws only leakage current that's barely measurable. Flash could cut the power cost of permanent storage on a 5GB client to levels well below the power levels a disk drive requires. And because it's very fast, it could allow client machines to be built that use less dynamic memory (that can draw 2 or 3 watts per stick) and more swap space on flash.
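The duty-cycle argument can be put in numbers. Taking the peak figures quoted above at face value, and assuming, purely for illustration, that a chip spends one part in a thousand of its time at peak draw (that fraction is our guess, not a manufacturer's figure), the average power is tiny:

```python
# Average flash power from the duty-cycle argument in the text.
# Peak figures come from the article; the duty fraction is an
# illustrative assumption, not a datasheet number.
PEAK_AMPS = 2.0         # per-gigabyte peak current claimed in the text
VOLTS = 3.0             # typical supply for common flash chips
DUTY_FRACTION = 0.001   # assumed fraction of time spent at peak draw

peak_watts = PEAK_AMPS * VOLTS            # 6 W while a chip is busy
avg_watts = peak_watts * DUTY_FRACTION    # averages out to milliwatts
print(f"peak {peak_watts:.0f} W, average {avg_watts:.3f} W")
```

Even if the duty fraction were ten or a hundred times larger, the average would still sit far below what a spinning disk draws.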
BitMicro Networks, Adtron, and M-Systems already make solid-state flash drives that resemble SCSI, IDE, or ATA disks in both 2.5-inch and 3.5-inch form factors. To software, these gadgets look like regular hard disks. They're expensive, though, and this pretty much restricts their appeal to military and other "tough iron" applications.
We would not be surprised if the companies that now sell keychain flash drives also start offering models that live inside PCs, imitate hard drive interfaces, and run a lot faster than they could off a USB port. What we cannot predict, any more than a caterpillar can guess the way a winter will work out, is how the marketplace would react to a PC with no moving parts that consumes less power than a laptop and still provides a nice big display suitable for office work and all the capacity any reasonable software stack requires.
Here's the new math: Computers don't use as much power when they are idle as they do when they are in active use, but it's not hard to reckon that a PC using 100 watts when it's really working can consume 1,000 watt-hours a day, or 365 kilowatt-hours a year. At 10 cents a KWH, that's something like $36, and it's possible to bring that to $60 when related HVAC costs are added in; it can cost nearly as much to get rid of heat as it does to create it in the first place. Now if that same power costs 20 cents per KWH, the direct cost jumps to $72 per PC and maybe $120 or so with HVAC thrown in. In big cities like New York, where the population of computers is very high, juice already costs more than a dime per KWH for residential and commercial customers, and the rates on record are based in part on oil costs that were half what they are today.
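That new math fits in a few lines. A sketch of the reckoning above, with the HVAC overhead set to the ratio implied by the $36-to-$60 figures (about two-thirds again on top of the direct cost):

```python
# The "new math" from the paragraph above: a PC consuming 1,000 Wh a day,
# costed at two electricity prices, with and without HVAC overhead.
WH_PER_DAY = 1000        # a 100 W PC, averaged over active and idle time
DAYS_PER_YEAR = 365
HVAC_MULTIPLIER = 5 / 3  # implied by the article's $36 -> $60 figures

def annual_cost(cents_per_kwh, with_hvac=False):
    """Yearly electricity cost per PC, in dollars."""
    kwh = WH_PER_DAY * DAYS_PER_YEAR / 1000      # 365 kWh/year
    dollars = kwh * cents_per_kwh / 100
    return dollars * (HVAC_MULTIPLIER if with_hvac else 1.0)

print(annual_cost(10))                  # about $36 direct
print(annual_cost(10, with_hvac=True))  # about $60 with cooling
print(annual_cost(20, with_hvac=True))  # about $120 with cooling
```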
If moving to machines that use only 200 watt-hours a day can cut that power cost by 80 percent, which seems possible, then cleverly configured low power PCs, whether they use small disks or flash, could make it into the marketplace. All that's required is a cost of ownership per box similar to the price users pay for the more familiar PCs that use more familiar amounts of electricity. And if these low power PCs were shrunk to bumps on the back of displays, which they could be, they might even be able to sell at a premium to their older cousins in locations where size matters.
The market shift that has made laptop clients as popular and nearly as affordable as desktop models might turn out to be the larval stage of a broader trend, one towards smaller and less power hungry machines. The first signs of a move in that direction would most likely appear in showcase settings, like bank lobbies and open plan offices at trendy firms. Back room operations, even if they have more workstations and servers, won't change as quickly.
If things go the way we think they will, there will be no end of arguments about whether cool running PCs are really practical, or just the PC (politically correct) thing for corporations to do. The bets won't be settled by debates. They'll be settled by the nature of new machines the computer industry offers, the long-term trends in power costs, and by the willingness of the people who lay out computing strategies to examine unfamiliar issues.
You won't have to wait long to see which way things are moving. The way prospects for winter power bills are looking, the idea of ultra low power PCs could, like tiger moths, take off in the spring.
— Hesh Wiener September 2005