Success in producing usable energy from nuclear fusion is a Holy Grail that has mesmerized physicists since the Manhattan Project. Finally, after nearly 50 years of travail by researchers all over the world, scientists working on the Joint European Torus project in Oxfordshire, England, have something to crow about.
Their aggressive, multinational effort became the first ever to introduce tritium, a heavy isotope of hydrogen, into a controlled fusion reaction. The reward was impressive: almost 2 million watts of power, sustained for two seconds, a long period in nuclear terms. Earlier efforts had yielded only thousands of watts, for fractions of a second.
Americans working on hot fusion, notably those at Princeton University's Plasma Physics Laboratory, had planned to reach this level of power, a major marker on the way to practical fusion, long ago. What intervened was a lag in funding. During the 1980s, the United States cut back its support for hydrogen fusion research in a series of decisions culminating in the scrapping of a long-planned Compact Ignition Tokamak, or Burning Plasma, experiment. The $1-billion-plus cost of building this reactor, with its massive electromagnets, hefty electric power bill and complex construction, could not be sustained in a steadily worsening recessionary fiscal climate.