News U.S. Injects $112M into Supercomputing to Enable Fusion Future

Status
Not open for further replies.
322MJ to pump the lasers to deliver 2MJ of energy to the ignition point and get only 3MJ back doesn't look like an energy gain to me... and that doesn't even count the energy required to produce the fuel pellet, set up the ignition chamber and clean up after half of the test fixture gets destroyed by the ignition event.

They are going to need a lot more than AI and supercomputers to achieve the 120+X efficiency increase needed to just break even. If anything useful comes out of this, it will be things like improved lasers and optical amplifiers, not the fusion method itself.
 
We need to look at different tech instead of just building bigger and bigger versions of things that have never gotten anywhere near real output. Not to mention the issues with their fuel source and the damage high-energy particles do to their systems.

I think going with a Helion-type pulsed heating system makes a lot more sense when you dig into the long-term operation and fuel procurement side of things. All the current giant laser or tokamak reactors are really science experiments to learn from. I just don't see them ever becoming fully functional production reactors.
 
All the current giant laser or tokamak reactors are really science experiments to learn from. I just don't see them ever becoming fully functional production reactors.
I think the stellarator is the most likely one to become economically viable for GW-scale baseload power production.

Helion's overall simplicity looks interesting but these things are only projected to produce about 50MWe a pop assuming the Microsoft project is only one reactor and the project is actually successful. At that scale, they may be most suitable as substation stabilizers: operate them at ~50% as the baseline and use their practically instantaneous response time to provide +/-25MW of local buffer against source (renewables) and load fluctuations. These could be handy for substations that feed Tesla v4 supercharger stations and competing EV charging providers once EVs become more commonplace.
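The substation-stabilizer idea above can be sketched as a simple dispatch rule. This is only an illustration of the comment's own projections (50 MWe per reactor, ~50% baseline setpoint, ±25 MW swing); the function and numbers are assumptions for the sketch, not Helion specifications:

```python
# Sketch of the substation-buffer idea: a single ~50 MWe reactor idles at
# half output so it can swing +/-25 MW against local fluctuations.
# Numbers come from the comment's projections, not any real design.

CAPACITY_MW = 50.0             # projected electrical output per reactor
BASELINE_MW = CAPACITY_MW / 2  # ~50% setpoint leaves headroom both ways

def dispatch(net_imbalance_mw: float) -> tuple[float, float]:
    """Return (reactor setpoint, uncovered residual) for a local imbalance,
    where positive imbalance = shortfall the reactor must cover."""
    target = BASELINE_MW + net_imbalance_mw
    setpoint = min(max(target, 0.0), CAPACITY_MW)  # clamp to [0, capacity]
    return setpoint, target - setpoint

print(dispatch(10.0))   # shortfall within the +/-25 MW band → (35.0, 0.0)
print(dispatch(-40.0))  # surplus beyond the band leaves residual → (0.0, -15.0)
```

Anything inside the ±25 MW band is absorbed instantly; beyond it, the residual would have to come from the wider grid.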
 
322MJ to pump the lasers to deliver 2MJ of energy to the ignition point and get only 3MJ back doesn't look like an energy gain to me... and that doesn't even count the energy required to produce the fuel pellet, set up the ignition chamber and clean up after half of the test fixture gets destroyed by the ignition event.

They are going to need a lot more than AI and supercomputers to achieve the 120+X efficiency increase needed to just break even. If anything useful comes out of this, it will be things like improved lasers and optical amplifiers, not the fusion method itself.
Ignoring most of the energy input plus support infrastructure and only counting the direct energy is how they turn less than 2% yield into 150% yield.
 
I think the stellarator is the most likely one to become economically viable for GW-scale baseload power production.

Helion's overall simplicity looks interesting but these things are only projected to produce about 50MWe a pop assuming the Microsoft project is only one reactor and the project is actually successful. At that scale, they may be most suitable as substation stabilizers: operate them at ~50% as the baseline and use their practically instantaneous response time to provide +/-25MW of local buffer against source (renewables) and load fluctuations. These could be handy for substations that feed Tesla v4 supercharger stations and competing EV charging providers once EVs become more commonplace.
That is per reactor, I believe, so the cost of the reactor and its longevity/running cost are the real issues. If those work out, then you scale and refine and just get better and better. The question is how large it can scale and how many you can run safely in a given footprint versus, say, a wind or solar farm. If they are extremely safe, you could spread them out and have a much better grid, but the question is how safe.

Safety is the biggest thing, and why fission is such a hassle. Security and distance requirements make it hard to do, but if these are safe even in the worst possible explosive failure, then you could have distributed power right where you need it.
 
Lawrence Livermore National Laboratory (LLNL) to deliver just that when it comes to cold fusion.
I think somebody still has room-temperature superconductors on the brain. The referenced article said nothing about cold fusion - just regular, hot fusion!

The key thing is to have energy-producing fusion (i.e. that emits more energy than is required to trigger it) as a practical power-generation source.
 
They are going to need a lot more than AI and supercomputers to achieve the 120+X efficiency increase needed to just break even.
Since there are some pretty smart people working on this stuff, I'd guess they're aware of that issue. Perhaps if the reaction could be self-sustaining, the ignition energy wouldn't matter too much. Or, maybe they find better materials that are easier to ignite and provide a better yield.

If anything useful comes out of this, it will be things like improved lasers and optical amplifiers, not the fusion method itself.
The experiment cited in the article represents a milestone, but not on par with the best ideas currently under development. Its significance was reproducing an experiment from a couple years prior, to help prove that it wasn't merely a fluke or bad measurements. Simply confirming others' results is an important part of scientific process.

Apparently, there's enough confidence that big breakthroughs are around the corner that several startup companies have their own fusion techniques under development. In just the past year, 13 fusion energy startups have been founded.

 
If they are extremely safe, you could spread them out and have a much better grid, but the question is how safe.
Safety-wise, the whole spiel about hydrogen fusion is that worst case, you have a deuterium/tritium gas leak which is going to disperse in the atmosphere and rain back down in the water it likely originated from, no big deal. Spreading them out might be an issue with trucking the fuel to however many locations there are, which means that much more fuel storage infrastructure and staff to deal with. For a grid-scale deployment, you'd likely still end up with concentrated deployments near major substations for convenience, management and maintenance efficiency.
 
Since there are some pretty smart people working on this stuff, I'd guess they're aware of that issue. Perhaps if the reaction could be self-sustaining, the ignition energy wouldn't matter too much. Or, maybe they find better materials that are easier to ignite and provide a better yield.
Good luck conserving ignition energy in any system that relies on a fuel pellet/target of some sort. The need to rearm/reload between shots looks like it makes that effectively impossible, since most of the energy is spent annihilating the target.

As for alternate fuels, deuterium and tritium are the most reactive fusion fuel pair known: easier to fuse than plain hydrogen because their extra neutrons reduce the repulsion between protons, and easier than anything heavier, which requires more energy and pressure to ignite.

If we had the stronger superconducting magnets necessary for heavier-element fusion, we should also be able to achieve plain hydrogen fusion, and that would be the most logical fuel to use due to its sheer abundance.
 
"the newly-instated Scientific Discovery through Advanced Computing (SciDAC) program combines the two pre-existing programs from the Department of Defense with the aim of streamlining programs invested"

Scientific laboratories that can turn a "streamlining" program into $112 million of additional public spending already have pretty good experience turning almost nothing into something.
 
Some time ago, the respected British physicist Prof Jim Al-Khalili asked for questions on Twitter. I asked him if he believed we would have commercially viable electricity generation by nuclear fusion by 2050. His answer was an unequivocal "yes".
 
Some time ago, the respected British physicist Prof Jim Al-Khalili asked for questions on Twitter. I asked him if he believed we would have commercially viable electricity generation by nuclear fusion by 2050. His answer was an unequivocal "yes".
60 years ago, they believed we'd have nuclear fusion sorted out by around 2000. The timeline has slipped quite a few times since. Almost every time someone says they expect to reach commercial viability in 10 years, they are either still 10 years away 20 years later, or they vanish.

Fusion is looking like a running gag, especially with supposedly serious groups like LLNL claiming an "energy gain" when their experiment produced less than 1% of the energy needed to make it happen.
 
60 years ago, they believed we'd have nuclear fusion sorted out by around 2000. The timeline has slipped quite a few times since. Almost every time someone says they expect to reach commercial viability in 10 years, they are either still 10 years away 20 years later, or they vanish.
Past performance does not necessarily predict future results. We've made progress and learned much, since those early predictions were made.

I would look at the rationale behind why he thinks the answer is "yes", rather than simply attacking him on the basis that "people have been wrong about this before".

Fusion is looking like a running gag, especially with supposedly serious groups like LLNL claiming an "energy gain" when their experiment produced less than 1% of the energy needed to make it happen.
You talk as if they're trying to deceive someone, but I think it's not so. I think they're a lot more competent than you seem to give them credit for being.
 
Safety-wise, the whole spiel about hydrogen fusion is that worst case, you have a deuterium/tritium gas leak which is going to disperse in the atmosphere and rain back down in the water it likely originated from, no big deal. Spreading them out might be an issue with trucking the fuel to however many locations there are, which means that much more fuel storage infrastructure and staff to deal with. For a grid-scale deployment, you'd likely still end up with concentrated deployments near major substations for convenience, management and maintenance efficiency.
I think there will be a balance. They will probably be more distributed than other systems, but less so than, say, solar panels on the roof. With no poisonous output, there will be a balancing act between the cost to ship fuel, the cost to maintain the grid, and reliability/switching between loads. The real question is how many of these you could run in a given footprint, and whether that output is higher, lower, or the same as the alternatives, i.e. solar, wind, nuclear, coal, or oil. If it is anywhere near any of those, it's a winner hands down.
 
You talk as if they're trying to deceive someone, but I think it's not so. I think they're a lot more competent than you seem to give them credit for being.
Claiming an energy gain when output is only 1% of total input is fundamentally disingenuous and misleading. Nobody wishing to be taken seriously should make that sort of false representation.
 
Claiming an energy gain when output is only 1% of total input is fundamentally disingenuous and misleading. Nobody wishing to be taken seriously should make that sort of false representation.
Not really disingenuous, unless they weren't transparent about it, which I think they were. Focusing on the efficiency of the reaction is reasonable, since that's a threshold that wasn't crossed until this pair of experiments.

Perhaps it's somewhat a matter of perspective. A scientist is preoccupied with the reaction itself, while an engineer is preoccupied with the entire system.

Even so, if you cannot make the reaction net-positive, then it doesn't matter how efficient the rest of the system is - you'll still never have a net-positive system. So, it makes sense to start by achieving a net-positive reaction.
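That chaining argument can be written as a product of stage gains: every downstream efficiency is at most 1, so a reaction gain below 1 can never be rescued by the rest of the system. A minimal sketch with made-up illustrative efficiencies, not figures for any real design:

```python
# Overall plant gain is a product of stage gains, so if the reaction gain
# Q is below 1, no downstream efficiency (each <= 1) can push the
# product above 1. All numbers here are illustrative only.

def plant_gain(q_reaction: float, eta_driver: float, eta_thermal: float) -> float:
    """Net electrical energy out per unit electrical energy in:
    reaction gain x driver efficiency x thermal-conversion efficiency."""
    return q_reaction * eta_driver * eta_thermal

# Even with a physically impossible perfect driver and converter,
# a sub-unity reaction gain keeps the plant net-negative:
print(plant_gain(0.9, 1.0, 1.0))   # → 0.9, still below break-even

# Whereas a reaction gain well above 1 can absorb realistic losses:
print(plant_gain(10.0, 0.3, 0.4))  # → 1.2, marginally net-positive
```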
 
Ignoring most of the energy input plus support infrastructure and only counting the direct energy is how they turn less than 2% yield into 150% yield.
They are not ignoring it; that's probably where 100% of the research money will go: toward needing less energy to trigger that 2 MJ laser blast, or finding materials that reduce the input energy in general.

The first computers were made with vacuum tubes that needed seconds to warm up before they could do anything, and drew ridiculous amounts of power for something a pocket calculator can now do on a solar cell. The first version of everything is always pretty bad.
 
They are not ignoring it; that's probably where 100% of the research money will go: toward needing less energy to trigger that 2 MJ laser blast, or finding materials that reduce the input energy in general.

The first computers were made with vacuum tubes that needed seconds to warm up before they could do anything, and drew ridiculous amounts of power for something a pocket calculator can now do on a solar cell. The first version of everything is always pretty bad.
Yes, but a lot of tech reporting makes it seem like nuclear fusion is here, when in fact there is quite a lot of work left to do, not just in the fusion chamber but also in the support infrastructure and in turning that thermal power in the plasma into useful electricity.
 