Manufacturing computer chips uses huge amounts of power. Not as much as you use during the lifetime of the computer, but you'd be absolutely shocked how much energy is embodied in those tiny little chips. (You'll get just a small taste below.) University College London may have achieved a breakthrough in lowering the energy-intensity of computer chip manufacturing, by replacing heat with UV light.
Chips require a lot of electrical insulation--first on the surface of the chip itself, where all the transistors are (to keep electrons from sloshing between transistors), and second on all the stacked-up layers of wires that connect those transistors (so electrons don't slosh around between wires). The first layer is always made of silicon dioxide, and many of the upper layers are, too. The upper layers are formed by chemical vapor deposition ("CVD"), which requires temperatures of 300-500 degrees C in a plasma at very low pressures, and is done by machines like this. But the first layer is formed by rapid thermal processing ("RTP"), which doesn't require the low pressures or plasma but uses temperatures up to 1200 degrees C.

In fact, an RTP chamber like Applied Materials's RadiancePlus heats up at over 250 C per SECOND, holding tolerances of +/- 3 C across the entire surface of a wafer as it goes. This is why it's called rapid. It is also insanely energy-intensive, as you might imagine. (If you follow the link, look at the picture and realize that every one of those hundreds of bulbs is a 480-watt heat lamp.) (And keep in mind this is just one step in a process that takes hundreds of steps. Though it's one of the most energy-intensive steps, many others are close on its heels.)

Those temperature tolerances are crucial, too--a small difference in temperature from one side of the wafer to the other can cause warping (meaning goodbye wafer, that's 200 chips that aren't Pentiums anymore), and a larger difference can make the wafer shatter, not only destroying it but bringing production to a halt while you fix the machine.
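Just for a sense of scale, here's a back-of-envelope sketch of what a lamp bank like that draws. The bulb count is my assumption for illustration--the picture only shows "hundreds" of 480-watt bulbs:

```python
# Rough estimate of RTP lamp bank power draw.
# The bulb count is a hypothetical figure; the article only says
# "hundreds of bulbs" at 480 watts each.
bulbs = 400            # assumed count, for illustration
watts_per_bulb = 480
total_kw = bulbs * watts_per_bulb / 1000
print(f"Lamp bank draw: {total_kw:.0f} kW")  # 400 x 480 W = 192 kW
```

That's on the order of a couple hundred kilowatts for a single chamber--for one step out of hundreds.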
But what if you could make insulation on a chip without thousands of watts of power? That's what University College London researchers are trying to do. The BBC reports that they are managing to grow oxide layers at room temperature, simply by shining extremely bright UV light through oxygen above the wafer.
How does this work? Basically, if you took a bare silicon wafer and let it sit out in the air (or, better, a high-purity oxygen environment), the surface of the wafer would oxidize naturally, just as iron rusts. The problem is that this takes far too long. The extreme heat of traditional RTP is one way of speeding up that chemical reaction so it happens in seconds. But heat is not the only way to speed up chemistry--making the reactants themselves more reactive works too. Shining super-bright UV light into oxygen energizes the oxygen molecules so much that they split out of their O2 pairs into single atoms with unpaired electrons--they become free radicals, which are far more chemically reactive than ordinary O2. (Incidentally, this is why health nuts are so big on reducing free radicals in the body--they make oxidation happen faster, which in our bodies means aging and damage accrue faster.) The UV light still requires some energy, but nowhere near as much as the heat lamps of RTP systems. Yahoo News quoted the university's electronic materials department chair Ian Boyd as saying the difference is roughly like "a sun lamp versus a pizza furnace".
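To see why heat speeds up a reaction so dramatically, here's a quick sketch using the standard Arrhenius rate law (rate proportional to exp(-Ea/RT)). The activation energy below is an assumed round number for illustration, not a measured value for silicon oxidation:

```python
import math

# Arrhenius rate law: rate ~ exp(-Ea / (R*T)).
# Ea here is an assumed illustrative value, NOT a measured
# activation energy for silicon oxidation.
R = 8.314        # gas constant, J/(mol*K)
Ea = 120e3       # assumed activation energy, J/mol

def rel_rate(temp_celsius):
    """Relative reaction rate at a given temperature."""
    temp_kelvin = temp_celsius + 273.15
    return math.exp(-Ea / (R * temp_kelvin))

speedup = rel_rate(1200) / rel_rate(25)
print(f"Roughly {speedup:.1e}x faster at 1200 C than at 25 C")
```

The exact number depends entirely on the assumed activation energy, but the point stands: an exponential dependence on temperature is why cranking the chamber to 1200 C turns a reaction that takes ages at room temperature into one that finishes in seconds.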
So far, free-radical oxidation at room temperature is still slower than high-temperature oxidation--Boyd says it grows the oxide layer at a rate of about one angstrom per second. RTP also has the advantage of fixing defects that naturally occur during oxidation. (Variations crop up here and there--an impurity, or a bit of extra-fast or extra-slow oxidation; high temperatures don't exactly melt these defects, but they cause them to diffuse into the surrounding material so they aren't a problem.) There is also a significant history of this technique being tried before, unsuccessfully--the BBC quotes Douglas Paul of Cambridge as saying "These techniques cannot be used for electronics because the defect densities are far too high". However, Boyd says his group is the first to use such a deep UV wavelength (126 nanometers).
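One angstrom per second is slower than it sounds but not hopeless. Here's a quick calculation of how long that rate would take to grow a thin oxide layer--the 20-angstrom target is my own example figure for a thin gate oxide, not a number from the article:

```python
# How long does room-temperature UV oxidation take at the quoted rate?
# The target thickness is an assumed example, not from the article.
rate_angstroms_per_sec = 1.0      # quoted growth rate
thickness_angstroms = 20.0        # assumed thin-oxide target
seconds = thickness_angstroms / rate_angstroms_per_sec
print(f"{seconds:.0f} seconds to grow {thickness_angstroms:.0f} angstroms")
```

For very thin layers, then, the slower rate may already be in a practical range; thicker layers would be where RTP keeps its speed advantage.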
Boyd's process is still in its early stages, but it shows great promise for reducing the energy-intensity of chip manufacturing. If the UV process becomes as fast as RTP with a similarly low number of defects, it will be a clear winner even for those who don't care at all about green issues, because it will never destroy wafers through warping or shattering, and the process will cost less. Perhaps it could be used in combination with existing methods--growing most of the oxide layer at low power, then throwing in a little high-temperature annealing at the end to clean up defects. It looks like Applied Materials is already using the free-radical technique to some extent in their "RadOx" tool, though I don't know any specifics.
Nice follow-on to your green computers article!
"...you'd be absolutely shocked how much energy is embodied in those tiny little chips." It would be great to get a rough embodied-to-operational energy ratio for this and other common goods, to see where it might make sense to particularly focus on using the old stuff longer...and to predict where such opportunities may arise, when and if energy prices rise.
I second Hassan's request for an embodied-to-operational energy ratio/life cycle analysis of a computer.
The best analysis I have seen is this article (.pdf) which states that the life cycle energy use of a computer is dominated by production (81%) as opposed to operation (19%). It also estimates that creating a PC and monitor takes 6,400 megajoules (MJ) which is the energy equivalent of 51 gallons of gasoline. The data for this study was a 2000 PC and CRT monitor, so the information might be outdated by now.
I would really like to know what the numbers would be for a PC and LCD monitor (or laptop) manufactured in 2006. If you have access to such information and can share it, I would be much obliged.
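The conversion in the comment above checks out, assuming the commonly cited figure of roughly 125 MJ of energy per US gallon of gasoline (that per-gallon figure is my assumption, not from the cited study):

```python
# Sanity-check: 6,400 MJ embodied energy vs. "51 gallons of gasoline".
# Energy content of gasoline is an assumed round figure (~125 MJ/gal).
mj_total = 6400
mj_per_gallon = 125          # assumed energy content of gasoline
gallons = mj_total / mj_per_gallon
print(f"About {gallons:.0f} gallons of gasoline equivalent")
```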
I've been using Macintoshes for twenty years and only need to replace them about every six to eight years. Many people who buy cheap PCs replace them every couple of years.