It was announced recently that the JET laboratory in the UK successfully fused deuterium and tritium hydrogen atoms to produce helium and produced a small amount of electrical power. Eleven megawatts of power doesn’t sound like much now, but the radiation produced by cloud chambers a century ago wasn’t very large either.
The benefit of fusion reactors is that the amount of radiation byproduct is (or may be) insignificant. Theoretically, a hydrogen or helium fusion reactor would produce little to no radiation. Unlike fission reactors, a major fusion reactor accident likely wouldn’t hurt or kill anyone except those working at the reactor, if any at all.
BBC: Major breakthrough on nuclear fusion energy
The UK-based JET laboratory has smashed its own world record for the amount of energy it can extract by squeezing together two forms of hydrogen.
If nuclear fusion can be successfully recreated on Earth it holds out the potential of virtually unlimited supplies of low-carbon, low-radiation energy.
The experiments produced 59 megajoules of energy over five seconds (11 megawatts of power).
This is more than double what was achieved in similar tests back in 1997.
It’s not a massive energy output – only enough to boil about 60 kettles’ worth of water. But the significance is that it validates design choices that have been made for an even bigger fusion reactor now being constructed in France.
The question becomes, can this be developed and built fast enough to solve our energy and environmental problems?
Patrick Slattery says
From what I understand the real progress is being made at MIT with the first 20 Tesla magnet for their forthcoming SPARC reactor:
https://news.mit.edu/2021/MIT-CFS-major-advance-toward-fusion-energy-0908
For comparison, the JET reactor magnets are just 3.4 teslas, and the ITER reactor magnets are 13 teslas.
rsmith says
The article doesn’t say, but I wonder how much power was used to enable the production of 11 MW? Many times more, I suspect. We’re not even close to break-even at this point.
One of the major problems is that 80% of the energy produced by D-T fusion is in the form of neutrons. So the “little to no radiation” argument doesn’t really make sense to me. The neutron flux of a commercial fusion reactor will be much higher than that of a fission reactor.
By interacting with materials, the neutron flux can create other forms of radiation as well;
one of the ways to detect neutron flux is to place a sheet of aluminium in it and detect the resulting gamma radiation.
It’s difficult to efficiently convert neutrons into usable electric power. You have to rely on the neutrons being captured by the reactor walls, heating them up to the point that they can boil water into steam to drive a turbine. That limits the efficiency to that of conventional power stations.
The neutron flux can also lead to neutron activation of the materials making up the reactor and the people around it.
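The 80% figure follows from simple two-body kinematics of the D-T reaction. A minimal sketch (masses rounded to whole atomic mass units; the variable names are mine):

```python
# Sanity check of the "80% of D-T fusion energy is carried by neutrons" claim.
# D + T -> He-4 + n releases about 17.6 MeV; momentum conservation splits the
# energy inversely to the product masses, so the lighter neutron gets more.

E_TOTAL_MEV = 17.6      # total energy released per D-T fusion event
M_ALPHA = 4.0           # He-4 mass, atomic mass units (approximate)
M_NEUTRON = 1.0         # neutron mass, atomic mass units (approximate)

# Non-relativistic two-body split: E_n / E_total = m_alpha / (m_alpha + m_n)
e_neutron = E_TOTAL_MEV * M_ALPHA / (M_ALPHA + M_NEUTRON)
e_alpha = E_TOTAL_MEV - e_neutron

print(f"neutron: {e_neutron:.1f} MeV ({e_neutron / E_TOTAL_MEV:.0%})")
print(f"alpha:   {e_alpha:.1f} MeV ({e_alpha / E_TOTAL_MEV:.0%})")
```

This gives about 14.1 MeV (80%) to the neutron and 3.5 MeV (20%) to the alpha particle, consistent with the figure above.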
lorn says
Perhaps.
Not to be negative but as the saying goes: ‘fusion is the energy of the future, and always will be’.
I’ll be more enthusiastic when I can plug in my percolator and make coffee.
rsmith says
BTW, there are several videos on YouTube from people producing actual neutron flux (and lots of gamma radiation) from running “Farnsworth–Hirsch fusor” fusion reactors in their basement.
Approximate quote from one of the videos, showing the glowing insides of a fusor:
Pierce R. Butler says
… a major fusion reactor accident likely wouldn’t hurt or kill anyone except those working at the reactor, if any at all.
Eh wot? A process that amounts to a slow-motion H-bomb can get very hinky, even harsh, very quickly over a *wide* area.
It’s been decades since I worked with energy calcs, and that was on a strictly amateur basis, but some of the figures given here confuse me.
… 59 megajoules of energy over five seconds (11 megawatts of power) … – only enough to boil about 60 kettles’ worth of water.
Thus, the heat from ~100,000 100-watt incandescent light bulbs (usually too hot to touch with bare fingers) would only boil 60 (gallons? liters? what is this “kettle” of which you speak?) small pots of water (starting from what temp?)? (Why do they assign scientifically illiterate reporters to science/engineering stories anyway?)
I smell industrial-strength hype.
Ridana says
We need a new name for that unit of measurement. Sadly, the Muskrat has forever tainted the present one.
Jazzlet says
Pierce R. Butler @5
The standard size kettle in the UK, i.e. where the BBC is based, is 1.7 litres. The story is directed at the general population of the UK, so using a unit that they understand is reasonable. The BBC does directly employ or commission people with science backgrounds for its scientific coverage; however, their job is not to write scientific articles, but to write articles for the general public. If you want a general take more relevant to your locale, read something published about it in your country’s media. If you want a more scientific take, read the relevant journal article.
moarscienceplz says
@#5 Pierce R. Butler
The very little formal physics instruction I have had was back when we rode dinosaurs to school, so terms like joules and newtons make my head swim, but Wikipedia tells me a joule is one watt-second, so I can work with that.
I have an electric kettle that is rated at 1500 watts. I just now filled it with 1.5 liters of water and timed how long it took to boil. It was 7 minutes 21 seconds. Let’s round that up to 450 seconds. So 1500 W times 450 s = 675,000 Ws or 675 kJ. Multiplying that by 60 (kettles) gives me 40.5 MJ, less than 59 MJ but definitely in the right ballpark. As you say, they don’t really define what a “kettle” is; if they are assuming 2 liters of water per kettle, the two results are almost a match, so I say the 60 kettles figure is about right.
OTOH, “59 megajoules of energy over five seconds (11 megawatts of power)” is written wrong. First, since a joule is equal to a watt-second, the “5 seconds” time denominator is already built into the 59 MJ figure. Second, the “11 megawatts of power” statement, while technically correct, is misleading in this context. The article is concerned with how much energy fusion technology can currently provide, not with how much instantaneous power this particular setup delivers. If you turn on ten 100-watt light bulbs in your house for one hour, how much power do you need from the electrical grid? Answer: one kilowatt. If you leave the lights on for two hours, how much power do you need? One kilowatt. How about if you leave the lights on for 24 hours? Answer: one kilowatt. Now, how much energy will the electric company charge you for, in each of those three cases? Answers: 1 kilowatt-hour, 2 kilowatt-hours, and 24 kilowatt-hours. So, “59 megajoules of energy over five seconds (11 megawatts of power)” should have been written, “59 megajoules of energy (about 11 megawatts of power over 5 seconds).”
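The kettle arithmetic above, and the joules-versus-watts distinction, can be checked in a few lines (kettle power and boil time are the commenter’s measurements, not official figures):

```python
# A joule is a watt-second, so energy = power x time, and power = energy / time.

KETTLE_POWER_W = 1500        # rated power of the commenter's kettle
BOIL_TIME_S = 450            # ~7 min 21 s to boil 1.5 L, rounded up

energy_per_kettle_j = KETTLE_POWER_W * BOIL_TIME_S
print(f"one kettle:  {energy_per_kettle_j / 1e3:.0f} kJ")
print(f"60 kettles:  {60 * energy_per_kettle_j / 1e6:.1f} MJ")  # vs JET's 59 MJ

# The JET figure itself: 59 MJ delivered over 5 seconds
jet_energy_j = 59e6
jet_power_w = jet_energy_j / 5
print(f"JET average power: {jet_power_w / 1e6:.1f} MW")
```

The 60-kettle total (40.5 MJ) lands in the same ballpark as JET’s 59 MJ, and dividing the energy by the five-second pulse gives the roughly 11 MW quoted by the BBC.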
kronk says
“It was announced recently that the JET laboratory in the UK successfully fused deuterium and tritium hydrogen atoms to produce helium and produced a small amount of electrical power.”
Reports are that it was only 59 megajoules of heat energy. Hardly worth bothering to convert into electricity.
“Theoretically, a hydrogen or helium fusion reactor would produce little to no radiation.”
The low-radiation version is proton-boron11 fusion-fission. There is considerable neutron flux with hydrogen and helium fuels. Some people have even looked into combining it with thorium fission–which could really use the extra neutrons, and which would also produce tritium that is undesirable for the fission reactor, but could be fuel for the fusion reactor.
“Unlike fission reactors, a major fusion reactor accident likely wouldn’t hurt or kill anyone”
There are a lot of different ways to do fission. TRIGA fission reactors have been pulsed to much higher power levels (33 gigawatts in one case) without any significant danger to anyone.
“The question becomes, can this be developed and built fast enough to solve our energy and environmental problems?”
Well, that’s double the power from only 25 years ago, so with such a zippy development curve, I can see why there is so much excitement about its potential in that regard. And I expect it will be simple and cheap to generate sun-scale temperatures inside super-cold magnetic coils, and to figure out how to get the heat out of the bottle, and to manage the neutron flux and fuel breeding inside the bottle. Much better than, say, the Elysium fission reactor approach, which would involve pumping a hot fluid through a core canister. As opponents are quick to point out, that sounds super-complicated and expensive.
Pierce R. Butler says
Jazzlet @ # 7: … their job is not to write scientific articles, but to write articles for the general public.
Their job is to write science articles for the general public. (See below)
moarscienceplz @ # 8 – I went through a bunch of those calculations myself, before realizing I was falling for a hustle. This research involved a reactor which produced gross energy of ferocious amounts of heat and pressure (though net energy, as rsmith reminds us @ # 2, surely in large negative numbers) equivalent to 59 MJ. Howsomeverotherwise, as rsmith also pointed out, that’s not functionally the same as megawatts; it’s just heat: by the time you’ve hooked it to a boiler and fed that into a turbine and connected that to a generator, your actual output in watt-hours drops significantly at each step, and you’d be lucky to have 1/3 of the 1/2 of average US daily household consumption falsely implied by “11 megawatts in 5 seconds”.
This experiment did NOT generate electricity, and the choice of energy-equivalent was NOT chosen honestly or innocently. A technically-knowledgeable writer would’ve caught that.
kronk @ # 9 – Gerrard, izzat you? Sounds like you…
Pierce R. Butler says
@ my # 10 – pls disregard the part about “average US daily household consumption” – I did those numbers late last night, and now realize I made some boo-boos…
kronk says
@#8 moarscienceplz
“since a joule is equal to a watt-second, the “5 seconds” time denominator is already built into the 59 MJ figure.”
No, they got it right. Watt-seconds over seconds = watts. Written out it would look like:
59 MW-sec. / 5 sec. = 11.8 MW
rsmith says
Pierce R. Butler@10
A Rankine cycle (steam turbine with condenser etc) can in theory be up to 64% efficient, IIRC. The Carnot efficiency is directly determined by the temperature difference between the high and low temperature reservoirs. The high temperature is in practice limited by the performance of the materials that the installation is made of.
So in practice, combined with generator efficiency it’s probably more like 40-50%.
So a 100 MW(thermal) power plant will get out between 40-50 MW of electrical power. The remainder is low temperature heat that typically goes up the cooling towers.
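The Carnot bound and the 40–50% practical figure above can be made concrete. A quick sketch, with the reservoir temperatures as illustrative assumptions of mine (not from the comment):

```python
# Carnot limit for a heat engine between two reservoirs, plus the practical
# thermal-to-electrical conversion figure quoted in the comment.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum possible efficiency of a heat engine between two reservoirs."""
    return 1.0 - t_cold_k / t_hot_k

t_hot = 600 + 273.15     # assumed steam temperature, kelvin
t_cold = 30 + 273.15     # assumed condenser temperature, kelvin

eta_carnot = carnot_efficiency(t_hot, t_cold)
eta_real = 0.45          # mid-point of the 40-50% practical range quoted

thermal_mw = 100.0
print(f"Carnot limit: {eta_carnot:.0%}")  # ~65%
print(f"electrical output: {thermal_mw * eta_real:.0f} MW of {thermal_mw:.0f} MW thermal")
```

With these assumed temperatures the theoretical ceiling comes out near the ~64% figure mentioned, and the practical 45% gives the 40–50 MW electrical from 100 MW thermal described above.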
If you see higher figures it’s either a combined cycle plant (CCGT) or the waste heat is used as process heat in industry or for residential heating. For example, a power plant in my part of the Netherlands feeds waste heat and CO2 to a lot of greenhouses around it.
Pierce R. Butler says
rsmith @ # 13 – Thanks for those numbers.
I suspect we need to add in some energy loss between the putative fusion-reaction chamber and the steam system driving the turbine, and some for whatever it takes to launch and manage the fusion process. If getting really persnickety – that is, drawing up the cost-benefit tables for a prospective actual power machine – we’d also need to factor in the overhead and distribution demands for a centralized system, probably in most cases a notch or two above those for decentralized designs.
Using waste heat for productive purposes strikes me as somewhere between brilliant and necessary, but clustering homes and production facilities around anything which might conceivably go BLOOOEY! raises a lot of valid counterarguments.
rsmith says
Pierce R. Butler # 14
A couple of points:
With proper insulation, I don’t think that the heat loss will be that bad. Especially since most fusion reactor designs need superconducting magnets. Those need very good insulation anyway.
Honestly, I don’t know how much of the generated power a fusion plant will consume to keep itself running. But I expect it to be a significant fraction of the power produced.
Centralized generation is a lot more efficient than decentralized. Electricity grids tend to be widely connected. At least in Europe, and I think in most of the USA as well. (Except Texas; and it has already been demonstrated how well that worked when the sh*t hits the fan.)
(The efficiency of a small home generator is really bad. E.g. the WEN 56200i 1600W generator promises 6 hours of half-load runtime on 1 gallon of gasoline. 6 hr × 3600 s/hr × 800 J/s = 17.3 MJ. 1 gallon = 3.7854118 dm³ × 0.755 kg/dm³ = 2.858 kg × 46.7 MJ/kg = 133 MJ. Efficiency ≈ 13%. Yikes.)
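The generator-efficiency estimate in that parenthetical can be reproduced directly (all constants copied from the comment; the runtime and load figures are the manufacturer’s claims as quoted there):

```python
# Redoing the small-generator efficiency estimate: electrical energy out
# versus the chemical energy in one gallon of gasoline.

GALLON_L = 3.7854118                 # litres per US gallon
GASOLINE_DENSITY_KG_PER_L = 0.755    # approximate density of gasoline
GASOLINE_ENERGY_MJ_PER_KG = 46.7     # approximate energy content

# Electrical energy out: 6 hours at half load (800 W)
energy_out_mj = 6 * 3600 * 800 / 1e6

# Chemical energy in: one gallon of gasoline
fuel_kg = GALLON_L * GASOLINE_DENSITY_KG_PER_L
energy_in_mj = fuel_kg * GASOLINE_ENERGY_MJ_PER_KG

print(f"out: {energy_out_mj:.1f} MJ, in: {energy_in_mj:.1f} MJ")
print(f"efficiency: {energy_out_mj / energy_in_mj:.0%}")
```

The numbers come out to roughly 17.3 MJ out of about 133 MJ in, i.e. around 13%, matching the comment’s “Yikes.”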
We should recognize that a fusion plant will not be free of radiation and radioactivity. The containment vessel and maybe other components will become radioactive over time due to neutron activation. And the reaction will generate fast neutrons and from that gamma rays.
But I don’t expect a fusion reactor to be capable of a runaway reaction such as is possible in a fission plant. The reason is that it takes significant power to keep up plasma containment in a fusion reactor. If the containment fails or is switched off, fusion stops. I’d expect a commercial reactor to be capable of generating hundreds of MW of thermal power; it will have a cooling system to match. So any remaining heat in the installation can be quickly carried away.
Pierce R. Butler says
rsmith @ # 15 – Thanks again for useful numbers and a reasonable overview.
Centralized generation is a lot more efficient than decentralized.
That sent me to a web search, and I was surprised to find “Energy lost in transmission and distribution: About 6% – 2% in transmission and 4% in distribution – or 69 trillion Btus in the U.S. in 2013…” and “The U.S. Energy Information Administration (EIA) estimates that electricity transmission and distribution (T&D) losses equaled about 5% of the electricity transmitted and distributed in the United States in 2016 through 2020.”
Years ago I’d seen figures much higher – either somebody got the numbers wrong or transmission technology has greatly improved.
Nonetheless, I still think juice going from your rooftop PV straight to your PC has less loss than does current from a power plant 100 miles away.
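The EIA loss figures quoted above compound multiplicatively rather than adding. A quick sketch (the 2%/4% transmission/distribution split is from the quoted search result; the function name and the 1000 W example are mine):

```python
# Successive fractional losses compound multiplicatively: delivering power
# through stages with losses l1, l2, ... leaves (1-l1)*(1-l2)*... of it.

def after_losses(power_w: float, *loss_fractions: float) -> float:
    """Apply successive fractional losses to a power figure."""
    for loss in loss_fractions:
        power_w *= (1.0 - loss)
    return power_w

delivered = after_losses(1000.0, 0.02, 0.04)   # 2% transmission, 4% distribution
print(f"1000 W at the plant -> {delivered:.0f} W delivered ({delivered / 1000:.1%})")
```

The combined loss works out to about 5.9%, which is why the quoted sources round it to “about 6%” or “about 5%”.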
If the containment fails or is switched off, fusion stops.
Howsomever, if the boo-boo goes the other way and the power surges for a millisecond, ka-blooey. 100,000,000°C worth of tritium-flavored kablooey.
See this article by a retired Princeton Plasma Physics Lab researcher for more drawbacks to fusion power: https://thebulletin.org/2017/04/fusion-reactors-not-what-theyre-cracked-up-to-be/
rsmith says
The inverter alone would lose about 10%. Unless you replace all your electrics with DC.
But of course one could argue that this doesn’t matter since sunshine is free (the equipment is not, though). And there’s the problem of supply and demand not being in sync for PV and wind, which means energy storage and lots of it. Batteries and chargers aren’t 100% efficient either. On a large scale, I think this means using PV energy (abundant in summer) to generate a storable fuel. Maybe hydrogen, maybe ammonia or synthesized hydrocarbons from CO2 filtered from the air. Maybe all of them plus others I haven’t thought of. Anhydrous ammonia would be a lot easier to store than pure hydrogen (but still pretty nasty in some ways). And it wouldn’t require concentrating CO2 from the atmosphere. All of these things tend to be more efficient at scale though.
You’ll get no argument from me that nuclear reactions aren’t potentially very dangerous, be it fission or fusion. And I’d fully agree that the best source of nuclear power is about 8 light-minutes from Earth. But in the end we need to realize that *all* of our energy is nuclear in origin. Even geothermal is basically powered by the decay of radioactive isotopes.
Pierce R. Butler says
rsmith @ # 17 – I can’t and don’t disagree with any of that, but I do have to quibble with the penultimate sentence.
Neither of the reactors you cite, the one 500 light-seconds up nor the one 4000 miles down, presents any waste-disposal or biosphere-contamination problems. The former one does present a major nuisance potential ~7B years down the line, but nothing we do to take advantage of it between now and then makes that any worse.
Pierce R. Butler says
rsmith @ # 17: Even geothermal is basically powered by the decay of radioactive isotopes.
Secondarily.
GerrardOfTitanServer says
Note: Any real fusion reactor is going to have a very high neutron flux. That neutron flux is going to transmute nearby materials into radioactive materials. The total radioactive inventory should be many orders of magnitude less than a comparable fission reactor.
Also, there are nuclear, chemical, and physical differences that would limit dispersal – in a conventional fission reactor, you have an uncontrollable source of heat – the decay heat from the fission products – and you have radioactive elements in a chemically unstable form, and all of this is next to very high pressure water. Without proper heat dissipation, the water will come into contact with the chemically-reactive radioactive elements, and then it goes airborne.
By contrast, consider a fusion reactor. In a fusion reactor, there is no source of heat after you turn it off. There is no very high pressure water. The radioactive material inventory will be much smaller than a comparable fission reactor. The radioactive material inventory of a fusion reactor will mostly be chemically-inert instead of chemically reactive AFAIK. All of that means that there’s no pathway for widespread dispersal in case of an accident.
Also, because of the much smaller inventory of radioactive material, disposal is easier and cheaper. Also, the radioactive nature is different, and AFAIK, it will be dangerously radioactive for a much shorter time. Having said that, you shouldn’t worry about leaks from proper disposal of fission reactors either – the dangers are vastly exaggerated by liars in Greenpeace et al.