Lower-power microprocessors can reduce computer energy use, but the solutions don't all have to be in hardware. PowerEscape, a programming tools company, has unveiled a utility called "Insight" that lets programmers improve the efficiency of their software with an eye towards reducing overall power consumption, according to LinuxDevices.com. The premise is simple: the more data has to be shuffled around, the more power is needed.
In the olden days of slow, limited hardware, programmers had to work to maximize both processor efficiency and memory use; those habits have largely fallen away in the era of desktop supercomputers. It's just possible that the growing need to improve energy efficiency will trigger a return to programming parsimony, a desire to get the maximum possible result out of minimum possible effort.
PowerEscape's utilities are available for Linux, Mac OS X, and Windows XP.
Seems to me that this sort of thing will end up as a compiler option (Do you want to optimise for speed, size,... and/or power?)
Then again, what is the power consumption of CPUs as a fraction of overall PC consumption? As a fraction of overall power use? Any Pareto analyses?
"what is the power consumption of CPU's as a fraction of overall PC consumption?"
I think this misses the point. Bloatware is insidious, and wends its way into all components. For example, bloated software uses more memory, which modern OS VM systems end up caching onto disks, which use more energy when moving heads than when simply spinning -- and which keeps disks from spinning down to save power.
Another example: GUI software that takes longer to do its task keeps screen savers from kicking in earlier, causing monitors to use more energy.
These examples do not even begin to address the huge embedded energy cost of continual upgrades. A computer consumes ten times its weight in petroleum during its manufacture. Computers don't "go bad" or "wear out," they just "get slower" as software becomes more bloated. If more efficient software makes computers "last" just one year longer, that's a tremendous energy savings.
That's one reason I use Apple products -- they last longer, as documented by independent research (IDC, Gartner Group). The average age of the Mac computers in my company is nearly five years -- a five-year-old Windows machine is a doorstop! And we're using them for fairly intensive graphics work. Our webserver, hosting a dozen domains, was discontinued while Bill Clinton was in office!
The problem is systemic. It is not limited to CPU energy use!
Curiously, when initially purchased, my laptop with a reasonably new P4 Celeron could run 3.5 hours on a single charge doing just word processing, but less than 90 minutes doing intense matrix arithmetic (playing 3D video games :P).
So there's an answer of sorts: at least on architectures that are mildly optimized for power scaling, heavier computing raises the draw by roughly a factor of 2.
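Doing the arithmetic on those battery-life numbers (and assuming the same battery capacity in both cases, ignoring the constant draw of screen and disk):

```python
# For a fixed battery capacity, average power draw scales inversely
# with runtime, so the ratio of runtimes gives the ratio of draws.
light_hours = 3.5  # word processing
heavy_hours = 1.5  # 3D games / intense matrix arithmetic
ratio = light_hours / heavy_hours
print(round(ratio, 1))  # → 2.3
```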
I also agree with Jan's comment about the screensaver, but framed differently: if you consistently finish your work 10 minutes earlier each day because your user interface and response times are faster, having the computer on for less time adds up pretty fast too.
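A quick back-of-envelope sketch of that point; the 100 W figure for PC plus monitor and the 250 workdays per year are assumptions, not measurements:

```python
# Energy saved per year by shutting down 10 minutes earlier each workday.
watts = 100                 # assumed PC + monitor draw
minutes_saved_per_day = 10
workdays_per_year = 250     # assumed
kwh_per_year = watts * (minutes_saved_per_day / 60) * workdays_per_year / 1000
print(round(kwh_per_year, 1))  # → 4.2
```

Small per machine, but multiplied across an office it's real.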
It'd be nice to have a setting somewhere on the laptop to force it into low-power mode. Even if there are some spikes in processor demand, my laptop "supercomputer" is orders of magnitude faster than some other machines I've used for basic tasks, and would probably have been called a supercomputer in any other decade last century.
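As an aside, on Linux kernels with cpufreq support such a knob does exist, via frequency-scaling governors; the sysfs path below is the standard location, though availability varies by kernel and hardware:

```shell
# Inspect the current CPU frequency governor, then force low-power
# mode by selecting "powersave" (the second command needs root).
cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
echo powersave > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
```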
(Incidentally, I could swear my max battery life dropped by a third when I installed WinXP SP2... what else do they have running in there? Probably just anecdotal; maybe I should blame my old battery... or not.)