The 2004 report “Burning Plasma: Bringing a Star to Earth,” from the U.S. National Research Council, sold Washington on the International Thermonuclear Experimental Reactor (ITER), a massive R&D project that proponents predict will deliver the breakthrough for fusion energy. In its fiscal 2008 budget, however, Congress drove the United States’ role in ITER right into the ground, slashing the US $160 million promised for this year to $10.7 million. That has some wondering whether fusion research, regarded since the 1960s as one of the great long shots for a sustainable and relatively clean energy supply, has run out of time.
What makes fusion a long shot? Like most fusion experiments to date, ITER proposes to use formidable electric currents and magnetic fields to induce fusion in isotopes of hydrogen (deuterium and tritium) and to contain the resulting burning plasma, which is akin to a tiny star and exceeds 100 million °C. Existing fusion reactors have produced heat equivalent to just a few megawatts of power, less than the power required to induce the reaction, and only for fractions of a second. ITER should put out ten times as much power as it consumes, but still for just a matter of minutes.
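The power comparison above boils down to a single figure of merit, the plasma gain Q (fusion power out divided by heating power in). A minimal sketch of that arithmetic, assuming illustrative numbers (the 500 MW / 50 MW pairing is ITER's widely cited design target; the sub-breakeven figures stand in for existing experiments and are not from this article):

```python
def fusion_gain(power_out_mw: float, power_in_mw: float) -> float:
    """Plasma gain Q: fusion power produced per unit of heating power supplied."""
    return power_out_mw / power_in_mw

# Existing experiments: a few megawatts out for more power in, so Q < 1
# (illustrative numbers, not measurements reported in this article).
q_existing = fusion_gain(16.0, 24.0)

# ITER's stated goal: ten times as much power out as in.
q_iter = fusion_gain(500.0, 50.0)

print(q_existing < 1.0)  # True: below breakeven
print(q_iter)            # 10.0
```

Breakeven is Q = 1; a commercial plant would need to sustain Q well above 10 for far longer than ITER's minutes-long pulses.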
And even that level of performance will require a 27-meter-high magnetic confinement chamber that will take a decade to build and cost an estimated $2.76 billion. Including design, administration, and 20 years of operation, the project’s total expenses will be nearly $15 billion.
For more on fusion’s troubled poster child, see my story at Spectrum Online.