Jeremy Faludi is one of Worldchanging's secret weapons.
Jer's recent Your Stuff: If It Isn't Grown, It Must Be Mined is a must-read, but you also owe it to yourself to check out his terrific and under-read Green Computing Update, which comes in four parts.
1. Data Centers
Data centers (also called server farms) are where companies like Google or Amazon or internet service providers locate the hundreds or thousands of computer servers that provide their online services. As Worldchanging's Joel Makower has written before in a couple of great articles, data centers use massive amounts of electricity; large ones can use megawatts of power, with each square meter using as much power as an entire average US home. According to a Tech Target white paper on green computing, "Data centers consume 1.5 percent of the total electricity used on the planet." [note: this should be 1.5% of U.S. electricity, not global]
It's not just the computers themselves that use all this power: the combined heat output of all these servers, hard drives and network gear is so large that massive air conditioning is required to keep it all from overheating. "Cooling is about 60 percent of the power costs in a data center because of inefficiency," said Hewlett Packard executive Paul Perez in Data Center News. "The way data centers are cooled today is like cutting butter with a chain saw." Cooling capacity is often the limiting factor of how big these systems can be -- I've talked with more than one engineer whose data center facility sat half empty or more; even though there was plenty of room for more servers, the building's air conditioning was maxed out.
The quest for energy efficiency is opening up a whole new discipline in chip design. As an Ars Technica commentator stated in relation to the ISSCC conference, "if I actually understood more than 25 percent of the gory details of how any of the presenters -- Intel, AMD, PA Semi, or Sun -- are cutting down on power consumption right now, then I'd quit writing about these chips and get paid lots of money to design them." But the future holds power-efficiency disciplines that are even further afield, under the rubric of reversible computing, including quantum computing. Reversible computing promises to cut power consumption by perhaps a thousandfold, and quantum computing also promises the capability of solving massively parallel problems that are effectively unsolvable by normal computers, such as large-number factoring for cryptography.
There appears to be little recent news in the field of "normal" reversible computing (although last year American Scientist ran a good in-depth description of what it is and why it would radically improve CPU performance per watt). Surprisingly, however, quantum computing (seemingly the more difficult proposition) had a huge breakthrough last February, when Canadian startup D-Wave demonstrated the first commercial quantum computer. (It even played Sudoku.) Some academics were skeptical, but NASA's Jet Propulsion Laboratory verified D-Wave's claims, as JPL fabricated the quantum computing chips for D-Wave. So far this prototype is limited -- not even as fast as a normal desktop computer, and no doubt much more energy-intensive due to the cooling system needed for the superconductors -- but by the end of 2008 D-Wave hopes to have a model that is 64 times as powerful. Growth will no doubt continue from there.
...The past two years have seen a radical shift in circuit board materials. Most companies have phased out once-ubiquitous lead solder; within two more years probably none at all will be using it. Unfortunately, however, this was not the result of enlightened self-interest; the industry had to be forced into it. The cornerstone of greener materials in electronic components is the Restriction of Hazardous Substances (RoHS), the European Union law that bans cadmium, hexavalent chromium, lead, mercury, polybrominated biphenyls (PBBs), and polybrominated diphenyl ethers (PBDEs). Joel Makower has described RoHS as "the most significant transformation in the manufacturing sector since the banning of ozone-depleting substances in the late 1970's." Companies that ignore the legislation (or are not careful enough about vetting their supply chain) get smacked hard: Sony lost about $150 million when the EU banned its PlayStation gaming console due to excessive cadmium content.
Green buildings took off when people could agree how to measure the greenness of a building with the LEED standards. Until 2006 there was no such standard for computers, but then came EPEAT, the Electronic Product Environmental Assessment Tool:
Much like LEED, which has good/better/best levels of performance, the EPEAT rating system evaluates computers and other electronic products according to three tiers of environmental performance -- Bronze, Silver, and Gold. There are 51 environmental criteria in total: 23 required criteria and 28 optional criteria. To qualify for EPEAT certification, a product must conform to all of the required criteria. Manufacturers may then pick and choose among the optional criteria to boost a product's rating to a higher tier: to qualify as "Silver," a product must meet all 23 required criteria plus at least 14 optional criteria; "Gold" requires at least 21 optional criteria in addition to the required items.
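The tiering rules above are simple enough to sketch in code. Here's a minimal Python illustration of the thresholds as described; the function name and signature are my own, not part of the EPEAT spec:

```python
# Sketch of the EPEAT tier thresholds described above:
# 23 required criteria (all mandatory), 28 optional criteria,
# Silver at >= 14 optional, Gold at >= 21 optional.
def epeat_tier(required_met: int, optional_met: int) -> str:
    """Return the EPEAT tier for a product, given criteria counts."""
    if required_met < 23:
        return "Not rated"   # every required criterion is mandatory
    if optional_met >= 21:
        return "Gold"
    if optional_met >= 14:
        return "Silver"
    return "Bronze"

print(epeat_tier(23, 0))    # Bronze
print(epeat_tier(23, 14))   # Silver
print(epeat_tier(23, 21))   # Gold
```

Note that a product missing even one required criterion gets no rating at all, no matter how many optional criteria it meets.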
Perhaps the best thing about EPEAT is that -- unlike LEED, Cradle to Cradle, and other certification systems that cost tens of thousands of dollars -- it's free. This means the only barrier to adoption is time and effort on the part of the manufacturers. All you need to do to qualify is download the spec and change your operations to meet the requirements. Most of the criteria are directly product-related, but some of them operate on the business level like having a take-back program or a written sustainability policy.
The Australian research group Climate Risk has produced Towards a High-Bandwidth, Low-Carbon Future, assessing communication technology's current and potential impacts. It's the most comprehensive study I've seen: CR cites data not just from Australia, but from the UK, US, and other countries. In this series, I've noted how much environmental impact computers have; but as Climate Risk quotes the UK's Forum for the Future, "An increase in transport intensity is ten times more significant for CO2 emissions than all primary broadband impacts." Furthermore, "[T]he estimated abatement opportunity calculated herein is almost 5% (4.9) of Australia's total national emissions, making the use of telecommunication networks one of the most significant opportunities to reduce the national carbon footprint."
This is a bold statement, but history backs it up. CR notes a remarkable statistic from the dot-boom 1990's in the US:
From 1992 to 1996, US economic growth averaged 3.4% a year while energy use grew by 2.4% a year...[H]owever from 1996 to 2000, Romm reports an apparent anomaly in US energy statistics. The US GDP growth increased to over 4% a year, but during the same period, energy demand only increased by 1.2% a year.
This was the biggest drop in energy intensity since the oil crisis of the 1970's, and we weren't even trying.
It may not sound like much at first, but as we've written before, Rosenfeld's Law (a "Moore's Law for efficiency") shows that if energy use per GDP falls by 2% per year, then 100 years from now, we can have ten billion people on this planet at today's standards of living while using half the energy currently needed to support that standard.
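A quick back-of-envelope check of both figures, assuming the growth rates quoted above compound annually (this is my own arithmetic, not from the Climate Risk report):

```python
# Energy intensity = energy use per unit of GDP; it falls when GDP outgrows energy use.
gdp_growth, energy_growth = 0.040, 0.012      # US figures for 1996-2000, quoted above
intensity_change = (1 + energy_growth) / (1 + gdp_growth) - 1
print(f"Annual energy-intensity change, 1996-2000: {intensity_change:.1%}")  # about -2.7%

# Rosenfeld's Law scenario: intensity falls 2% per year for 100 years.
remaining = 0.98 ** 100
print(f"Energy per unit GDP after a century: {remaining:.1%} of today's")    # about 13%
```

That century-long compounding is what makes a seemingly small 2% annual improvement add up to roughly a sevenfold gain in efficiency.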
Currently there is little more than a trickle-down of green chemistry knowledge between companies, governments, NGOs, and universities. Companies' chemical information is proprietary, and many environmental impacts have never been measured, much less publicized. Some universities and government agencies have data on a few specific chemicals, but lack a centralized clearinghouse of information. MBDC may have the best database of chemical environmental data, but it is private and expensive information. Opening up the faucets of these knowledge flows, and getting them all in one tub big enough to splash in, may be the most important step for the industry right now. Several groups are trying to crank the taps.
Britain's Chemistry Innovation Network has a roadmap for sustainable technologies, including trends and drivers, specific needs of the industry, the business case, a review of technologies, and case studies. These are aimed at everyone in the chemical industry. UC Berkeley's Framework for California Leadership in Green Chemistry Policy recommends policy directions for lawmakers. For consumers, the Ecology Center put together a consumer guide to toxic chemicals in cars, HealthyCar.org. The site ranks over 200 vehicles in terms of indoor air quality, as well as rating child car seats for brominated flame retardants, and explaining what chemicals to be concerned with and why.
Chemists looking to learn should check out the EPA's 2002 textbook, Green Engineering: Environmentally Conscious Design of Chemical Processes. There's also a newer EPA tool, the downloadable Green Chemistry Expert System. It's a piece of software that "allows users to build a green chemical process, design a green chemical, or survey the field of green chemistry." For a less technical introduction, they have a web page listing their Twelve Principles of Green Chemistry:
1. Prevent waste
2. Design safer chemicals and products
3. Design less hazardous chemical syntheses
4. Use renewable feedstocks
5. Use catalysts, not stoichiometric reagents
6. Avoid chemical derivatives
7. Maximize atom economy
8. Use safer solvents and reaction conditions
9. Increase energy efficiency
10. Design chemicals and products to degrade after use
11. Analyze in real time to prevent pollution
12. Minimize the potential for accidents
Designing a recyclable product, and wondering about adhesives? Here is some research on how to make a product recyclable despite glues.
Hey Alex, make sure you take good care of your secret weapon! Jer's a real treasure. I first met him at Stanford after working with Michael Braungart on my C2C journey. Jeremy has very cool insights and does incredible detective work.
The q-bit works like powers of 2. This means that a 28 q-bit machine spans 2^28, or about 268,000,000, states.
A 512 q-bit machine spans 2^512, which equals 1.34 times 10 to the power of 154, or 134 with an additional 152 zeros after the number.
The visible universe has only about 10 to the power of 92 subatomic particles in it, if we assume about a trillion galaxies, each with 100 to 500 billion stars, plus all the interstellar hydrogen between stars and galaxies, and all the photon radiation as well.
So...a 512 q-bit quantum computer is like a 1.34E154 bit normal computer.
A 1024 q-bit is like a 1.8E308-bit normal (classical, non-quantum) computer.
In other words, a 1024 q-bit machine is not 64 times as powerful as a 28 q-bit machine.
A 1024 q-bit is 6.7E299 times as powerful.
Many orders of magnitude difference.
But one 1024 q-bit computer could replace all the data centers on this Earth, as well as on parallel Earths in 1E100 other universes, with plenty of spare capacity!
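The state-count arithmetic in this comment is easy to verify with Python's arbitrary-precision integers (bearing in mind that "equivalent classical bits" is the commenter's analogy, not a rigorous complexity claim):

```python
# Verify the state counts claimed above. Python ints have unlimited precision,
# so 2**1024 is exact even though it overflows a 64-bit float.
def sci(n: int) -> str:
    """Render an arbitrarily large positive integer in scientific notation."""
    exp = len(str(n)) - 1
    return f"{n / 10 ** exp:.2f}e{exp}"

print(2 ** 28)                      # 268435456 -- about 268 million states
print(sci(2 ** 512))                # 1.34e154
print(sci(2 ** 1024))               # 1.80e308
print(sci(2 ** 1024 // 2 ** 28))    # ratio of state counts: 6.70e299
```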