To quote the late Admiral Rickover,
"An academic reactor or reactor plant almost always has the following basic characteristics: (1) It is simple. (2) It is small. (3) It is cheap. (4) It is light. (5) It can be built very quickly. (6) It is very flexible in purpose (’omnibus reactor’). (7) Very little development is required. It will use mostly off-the-shelf components. (8) The reactor is in the study phase. It is not being built now.
“On the other hand, a practical reactor plant can be distinguished by the following characteristics: (1) It is being built now. (2) It is behind schedule. (3) It is requiring an immense amount of development on apparently trivial items. Corrosion, in particular, is a problem. (4) It is very expensive. (5) It takes a long time to build because of the engineering development problems. (6) It is large. (7) It is heavy. (8) It is complicated."
From the article,
"The first module is expected to be operational by 2029 with full plant operation the following year."
> In this particular case, what is this quote supposed to imply?
I don't know that it's supposed to imply anything but rather shine a light on the different acceptance criteria for a study reactor v. one that generates power for, in his case, mission critical needs (though power for homes and businesses would be life-critical, so it's up there).
If a study reactor breaks, even though there's a possible risk to life depending on how it fails, you can endure the downtime for a bit while bringing it back up. If a practical reactor breaks, people are far more likely to die.
That's why it's important to have a multitude of energy sources on a competitive power grid. The nice thing about SMRs is you can build enough of them to make the overall output consistent as needed, even with a higher tolerance for failure of any individual reactor.
A study reactor is also designed to break. It is meant to be easily opened up and examined after a fault. It is meant to validate the core process but also explore edge case situations. So it needs to fail gracefully. That is a more complex beast than any final product.
In that case Rickover's observations don't apply at all:
(1) It is simple -> nobody said NuScale's reactor is simple. It took NRC about 6 years to approve it, and in the process NuScale had to produce about half a million pages of documentation
(2) It is small -> not really. It is smaller than full size reactors, but then it delivers only 50 MW, not 1GW. Per unit of electricity delivered, it is most likely somewhat larger than a full size nuclear power plant
(3) It is cheap -> relative to what? it's not that cheap. If anything, see the comments in this thread, it appears to be expensive
(4) It is light -> this does not apply here. Rickover was concerned with submarines, where weight was important. This is not a reactor designed for submarines. I don't know how light it is, but nobody cares about this
(5) It can be built very quickly -> well, if the first one is supposed to get online in 2029, that does not seem to be very quick, does it?
(6) It is very flexible in purpose (’omnibus reactor’) -> NuScale's reactor is designed to generate electricity. That's it. What is flexibility in purpose?
(7) Very little development is required -> NuScale has already worked for one decade on this. It will take until the end of this decade to see one come online. Nobody claimed "very little development is required"
It will use mostly off-the-shelf components -> not really. NuScale will use a Korean manufacturer that is accredited by the NRC to manufacture components for nuclear reactors. There's nothing off-the-shelf about this.
(8) The reactor is in the study phase. It is not being built now. -> it depends what "being built now" means. NuScale can't start building before it has all the approvals. It is working on getting these approvals, if this counts as "being built now", then it's being built now.
More to the point. Rickover was talking about a completely different context. People venturing cheap ideas, while he needed concrete reactors for his submarines. We are in a different world. NRC is extraordinarily stringent. The fact that NuScale got their approval is a phenomenal achievement. This should not be dismissed with the same tired old quote from Rickover that gets posted on HN almost every time we talk about nuclear reactors.
It's not flippant at all. It is the essence of the much longer version and the real lesson contained therein; you asked a question and I answered it. If you see it as flippant then that might reflect on you. Your other comment further down in this thread is flippant, even though it is a much longer one: you reject out of hand what is contained within these words without taking the time to ponder how they do apply to the matter at hand.
> If you see it as flippant then that might reflect on you.
It does reflect on me. It reflects the fact that I can't read minds.
You gave a widely known quote, and in your mind it was crystal clear what you meant. But to other people, who can't read your mind, it just sounds flippant.
It sounds like you think NuScale is a bunch of theoreticians. And that with your remark you are trying to put down their decade-long quest to achieve something.
Maybe it does not sound flippant to you, but I assure you, it is flippant.
The message, to expand on it, is that they are in the earlier stages of their development, and the safe bet is that by the time it is all said and done things will be much more in line with what we've come to expect from the nuclear industry, including any and all of the statements already in line with that today.
So by the time it is done it will be more expensive, likely heavier, likely way late, more complex, and narrower in its possible range of applications. It's not a law; it is an observation made over many nuclear deployments, and to the best of my knowledge there isn't a single project that was an exception, so I expect it to be true this time around as well. Which is why you can safely ignore any of the touted advantages until the product is ready to be fielded in quantity. Assuming it ever will be fielded in quantity: plenty of designs were slated for large numbers of deployments and ended up being one-offs or at best single-digit runs because of unforeseen issues with the design.
But although the most dangerous words in history are "this time is different", there are reasons to believe things could be different this time:
1. The Nuclear Regulatory Commission. NuScale got the design approval from them. That means this design is as good as frozen. The NRC is an extraordinarily conservative organization. If NuScale, or anyone who licenses their technology, tried to diverge from the approved design by even a bit... well, that's not even a possibility, so why think about the consequences. So this design, by the time it is in production, won't be heavier, more complex, or narrower in scope, because it will be this exact design. It may be more expensive, late, and in fewer numbers than expected, however.
2. Conservatism of the design. This reactor is just a pressurized water reactor, the most widely used design today. It's in a small form factor and modular, but otherwise there's nothing revolutionary or radical about it.
3. History. After about 70 years of reactor operation, the NRC has seen lots and lots of failure modes. The number of new failure modes to be discovered is not zero for sure, but it asymptotically approaches zero. The reason for a lot of past construction delays was the moving of the goalposts by the NRC, but that was due to the discovery of new failure modes, not to any malicious intent. The goalposts may still move between now and 2029, but much less.
4. Politics. In the past few decades, at least in the US, the left was against nuclear power, and the right was for it. Now it appears a large number on the left have embraced nuclear power, so some form of bipartisanship has been achieved. This is quite unprecedented. Will it last? Probably not forever, but it might last a few election cycles, and this could be enough for this SMR design to achieve some escape velocity.
Sure, but having the design frozen is one reason why this may never happen in the first place, after all any kind of deviation and it is back to a recertification and that gets expensive quickly. No such thing as 'move fast and break stuff' in the nuclear power business, your design works or you're history.
And, as you tacitly admit, modular is revolutionary, and there are a whole pile of challenges right there. Having a single (large) point of failure may well be much easier to manage than several smaller ones. I never saw the modularity as the problem to solve, because existing (large) reactors are more often than not also created in sets to deal with downtime due to maintenance and overhauls.
Your third point is accurate, but it also may end up being this project's downfall: by changing some core design parameters (such as size), new failure modes may well pop up. MTBF is mostly a function of the number of components: if you increase the number of components, even if the components are smaller, you will probably still see more failures.
Four is true, but: here in NL we currently have the same situation and a new reactor was just greenlit (two, actually, if I understood all of the material right). But this is a decade-plus project, and given the fickle nature of these projects I suspect there is a fair chance it will never see the light of day because of the dark horse of economics: a typical nuclear plant will generate power, but it will never generate any money. Without subsidies these are essentially dead ends.
We use them in nuclear submarines and aircraft carriers, so the tech seems feasible - not sure why commercial use for small reactors is such a difficult stretch.
Nuclear submarines cost on the order of $6B a pop, so I'd hesitate to describe anything about them as "cheap". We use nuclear reactors in them because there's no other power source that's as compact and self-contained.
A submarine (when submerged) is surrounded by a large body of water for heat to dissipate into.
A (previous generation) terrestrial plant already has systems that can dump ludicrous amounts of water onto the core, but they all rely on there being power to operate pumps. A nuclear submarine that can dump in sea water does not need this - the body of water it is in is cold enough (relative to the heat in the core) that it will naturally carry heat away.
Sub and ship reactors are effectively sealed units that are not intended to be refueled on a regular basis. They can be serviced and refueled eventually, but it is a major overhaul to do so.
The problem with a small reactor is that there's a lot of per reactor cost. Things like security, communicating with the grid, and paperwork don't get cheaper with a smaller reactor so the percent of cost going to overhead goes up.
I'm not sure we currently have the supply chain, security, and ability to deal with the geopolitical fallout of using the same fuel for mass domestic use.
I wonder what causes the cost projections to be so far off. Not enough time or research into what the final structure will actually cost? Overly optimistic projections to help ensure the project is approved and started, thus getting the "foot in the door" to complete it?
The big problem is the lack of knowledge and experience. Simply put, almost nobody has experience with building reactors, and especially not that one. And beyond it being nuclear, it's also an incredibly large civil engineering project with incredibly high specifications. The US is not exactly known for being great at executing large civil engineering projects.
The reality is if you want cheap nuclear you need to mass produce it, just like with everything else. But that requires large scale state action and planning.
Or alternatively having a competitive market for smaller nuclear. But for that to happen regulatory approval processes and many other problems are in the way.
It's rule number one of the big physics playbook. Make outrageous claims that funders can't verify cause they ain't good at math. Then let the good times roll!
See the LHC, ITER, James Webb, F35, quantum computing etc.
After SpaceX showed how it is possible to build rockets much cheaper than previous ones, I attribute the size and complexity of current reactors to an outdated mentality among their designers.
Setting goals like "maximum efficiency" and trying to bring cost per MWh too low is a self-defeating exercise
Rockets blow up because that's what rockets do. You can absolutely build reasonably safe test reactors to verify ideas. That is what they did constantly in the 60s, with multiple new reactors every year.
The whole Molten Salt Reactor Experiment was something like $70 million for an amazing project.
And despite that, somehow the world didn't blow up and there are no mutant ninja turtles anywhere.
What helped SpaceX out was the COTS and CRS program, and something like that could absolutely be world changing in nuclear, but nobody at the DoE has that kind of vision sadly. In fact the mastermind behind COTS has been advocating for something like that for many years now.
You obviously don't need to blow up reactors to learn stuff
Build, prototype, and see how parts react (for example, to corrosion) way before an emergency happens. Build a first version, then improve on a second version. Iterations will work better with smaller reactors than with bigger ones.
And you do need some research into materials, but that's part of the process and evolution.
We have seven+ decades of experience now. We are certain of two things. (1) Price of reactor n+1 is not less than n's; (2) every number produced by the nuke industry is a lie.
this could be said about computers back in the day too. they used to take up entire floors; now they fit in your pocket. i suspect something similar applies to tech in general, and to reactors also
Nuclear reactors are almost as old as computers, yet we have seen nothing like the scaling we have in the semiconductor industry. Part of this is because half of the power plant is essentially a steam turbine, just like in most other thermal power plants. That technology is essentially centuries old. Not much scaling to be done there.
If we put a fraction of the R&D into reactors that we have into semiconductors, it would be a VERY different story. Instead we have spent 50 years scared shitless of the nuclear boogeyman. The worst opponents of nuclear are so-called environmentalists whose feelings over facts have caused far more harm to the environment (mining for rare earths, anyone?). If we had continued research into reactors other than light water reactors instead of abandoning it, we'd be burning "nuclear waste" instead of arguing where to bury otherwise usable fuel.
Fission gets billions of dollars of free public money for research every year and had much more spent in the past.
> The worst opponents of nuclear are so-called environmentalists whose feelings over facts have caused far more harm to the environment (mining for rare earths, anyone?).
Modern PV involves less rare earths than a nuclear reactor for the same energy output. DFIG wind generators use none. New batteries use none.
Uranium mining on the other hand poisons thousands of km^2, and is almost never properly remediated. Tailings dams of mines that ran out decades ago still need constant maintenance (often on the public dime) to not spew heavy metal laden dust everywhere.
> If we had continued research into reactors other than light water reactors instead of abandoning it, we'd be burning "nuclear waste" instead of arguing where to bury otherwise usable fuel.
Trillions were spent on 'burning nuclear waste' by which you actually mean creating larger quantities of Pu240 and Pu241 (the worst high level waste) than a burner reactor does and dumping fission products into the ocean. It went nowhere.
The US has tons of money saved up to handle nuclear waste, all reactor operators have been paying in for 5+ decades, but because of gridlock the money is just piling up. Instead of making constructive policy, the idiots dead-locked themselves with this idiocy at Yucca Mountain.
And there has never been a real program to systematically use the civilian nuclear waste. This was of course planned, but the program was stopped in the general anti-nuclear attitude of the 70s. This program would have led to a huge amount of green energy and a reduction in nuclear waste.
And uranium mining would be much less of an issue if we had modern reactors. We are operating at 2% efficiency; a thermal breeder could reach the high 90s. The existing civilian nuclear waste could be enough for a long time. No more mining. And thorium is a waste product of current rare earth mining in absurd quantities. So really, any more mining that we do is just because of bad policy.
And when talking about mining, the reality is solar and wind need far, far more mining, and those mines have tailings too.
> Fission gets billions of dollars of free public money for research every year and had much more spent in the past.
And nuclear was excluded from almost all green energy subsidies for decades, a far larger sum in total. Even more important, the changes in grid pricing simply did not favor long-term base-load investments. Not to mention all the programs that force utilities to adopt a percentage of renewables.
And even today, the places that have stable reactors have low electricity prices because of long term price agreements while everybody else is suffering.
France had decades of low energy prices and even today most in France have low energy prices. And the cost of that was mostly on the books of the utility and has been paid down for many decades now even with the low energy prices.
France has sadly mismanaged its fleet for the last 30 years, with the anti-nuclear state abusing the amazing systems the generation before built for them. And despite all of that abuse and lack of maintenance and investment France still has reliable, low price, green energy.
The idea that environmentalists had any influence on the IFR is a complete fantasy, and the notion that pouring another half a trillion into failed attempts at separation and reprocessing programs would magically make them clean or cheap is laughable. If you want to lament the mismanagement of funding for green energy, then lament that large-scale wind was abandoned in the early 40s because the very first attempt was a little bit more expensive than coal, or that the first ten billion dollars moving down the learning curve for solar to make it viable for the mass market took 5 decades.
Oof, reactor in my pocket is a tasty prospect. Personal computing and personal transport could get super exciting. Basically a grown up way of saying Iron Man suits one day...
What is a car? Is it 4 wheels, a driver seat, passenger seat, three rear seats, and a trunk? Or is it an efficient means for getting from point A to B on paved roadways? If a "car" is the latter, then there have been plenty of advancements. Electric scooters and bikes are two pieces of tech you can very easily take with you.
Your car analogy doesn't really work. It's like saying "desktop computers are mature tech, can I fit one in my pocket?" No, but you can fit a smartphone in your pocket, which does many of the same things a desktop computer can do.
NuScale recently announced large cost increases at the project with UAMPS. The cost per unit of capacity is now on par with the new reactors at Vogtle (~$20/W). This is outside the range at which the project could be competitive.
To put it in other units: a rise from the previous target of $58/MWh to $89/MWh, more than 50%, not including a $30/MWh subsidy (so the true cost is actually $119/MWh).
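To sanity-check those figures, a quick back-of-envelope (all numbers taken from the announcement discussed above, in $/MWh):

```python
# Rough check of the reported UAMPS price change (all values in $/MWh,
# taken from the announcement discussed above).
old_target = 58   # previous target power price
new_target = 89   # revised target power price
subsidy = 30      # per-MWh subsidy

increase_pct = (new_target - old_target) / old_target * 100
unsubsidized = new_target + subsidy

print(f"increase: {increase_pct:.0f}%")        # ~53%, i.e. "more than 50%"
print(f"unsubsidized: ${unsubsidized}/MWh")    # $119/MWh
```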
To be fair, it says the cost increases are mainly due to the rise in construction material prices as well as financing costs; nothing inherent to nuclear power or the novel technology itself.
To provide more context, wind and solar were both in the low $30s/MWh of LCOE (levelized cost of energy) 3 years ago[0], with that number predicted to continue falling rapidly.
Combined cycle (natural gas) is a bit higher[1] than solar and wind, with that number expected to rise over time, and I'm fairly sure the current numbers don't really reflect the substantial cost of the carbon emissions, which we will all have to pay for sooner or later. Either way, the number utilities see is currently much lower than SMRs.
I'm pretty sure every prediction I've ever seen for how quickly the cost of wind and solar will fall has underestimated the speed in retrospect.
That's the kind of thing these reactors have to compete with.
Grids have also repeatedly been shown to handle more renewables than previous predictions allowed, and we haven't hit the limit. At this point, fossil fuel sources are more frequently a source of blackouts than renewables, from everything I've seen, despite certain people blaming renewables at every turn.
What we need is more energy storage, whether that's in the form of traditional batteries or more novel forms of energy storage.
I think nuclear is a fine source of energy if you have it, but evidence over the last several decades shows that it is virtually impossible to build for myriad reasons. The Vogtle nuclear reactors have been one giant boondoggle. New nuclear is not cost competitive, unfortunately.
Sorry but it is very misleading to say that fossil fuels sources are a bigger cause of blackouts than renewables. Without dispatchable generation, like nat gas plants but also hydroelectric, the grid would black out every single night.
Yes dispatchable generation may fail to materialize at times, but renewables “fail” to provide consistent power every single day, when the sun stops shining at night or wind stops blowing. Future battery deployments may be able to smooth these out over long enough timescales, but we are nowhere near that point right now.
>> Grids have also repeatedly been shown to handle more renewables than every previous prediction would make, and we haven't hit the limit.
> Yes dispatchable generation may fail to materialize at times, but renewables “fail” to provide consistent power every single day, when the sun stops shining at night or wind stops blowing.
As I said. You're just repeating the old arguments. People thought that small percentages of renewables would destabilize the grid, then that didn't happen, so then they said a slightly larger percentages would do it, and it didn't. This tired theme has been repeated ad nauseam for the last decade or two.
I agree that you need some amount of Base Load, but renewables haven't been the problem yet, and energy storage is the solution, long term, along with Demand Response. Small amounts of grid energy storage have been shown[0] to have disproportionately high effects on improving grid stability. We might need less than you predict.
As it is, since we are still successfully adding more and more renewables to the grid, and renewables aren't being the source of blackouts, SMRs have to compete with renewables on cost, and they simply don't. SMRs also don't compete with Combined Cycle plants in terms of cost either, so which one are utilities going to choose?
Peak demand for the grid is in the late afternoon / early evening, so the amount of battery storage needed to "shift" solar production by a few hours is not as much as you would think.
Wind power produces more power at night than during the day, and it produces more in winter than summer, which is quite convenient given how solar produces more in the summer and during the day.[1] They make quite a complementary pair of power sources.
There are seasonal concerns, which is where some combined cycle plants come into play, even if they don't operate for most of the year, but that's also not SMRs. Combined Cycle is way cheaper than these nuclear SMRs. If you over-build on wind and solar, you can go a long way even in times of year with "less" wind and less sunshine, and with the low cost of wind and solar... lots of people are looking to overbuild as a partial solution that doesn't require batteries.
The overwhelming majority of electricity demand is base load. Usually on the order of 70-80% [1]. We don't need "some" base load, almost all our demand is base load.
Electricity storage is nowhere near the scale required to make a dent in the electricity grid. To put this in perspective, the US alone uses about 500 GWh of electricity every hour. Worldwide this figure is about 2,500 GWh per hour. The storage facility you linked to was the biggest facility in the world when it was first constructed, and it stored only 129 MWh of electricity.
At our current rate of battery production it'd take us a century of dedicating 100% of our battery output to grid storage to reach 1 day's worth of storage. Battery production is expected to increase, but it's unclear whether raw material inputs can keep up with manufacturing demands [2].
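For what it's worth, the "century" arithmetic depends heavily on which production figure you plug in; here is the back-of-envelope with the assumptions spelled out (the 120 GWh/yr output is an illustrative assumption chosen to match the century claim, not a sourced number):

```python
# Back-of-envelope: years to build one day of US grid storage if ALL
# battery output went to the grid. Round numbers, not authoritative.
us_avg_load_gw = 500                        # ~500 GWh consumed per hour
one_day_storage_gwh = us_avg_load_gw * 24   # 12,000 GWh

annual_battery_output_gwh = 120   # assumed annual output (illustrative)
years_needed = one_day_storage_gwh / annual_battery_output_gwh
print(f"~{years_needed:.0f} years at {annual_battery_output_gwh} GWh/yr")  # ~100 years
```

Plug in a bigger production number and the timeline shrinks proportionally, which is why the growth rate of battery manufacturing matters more than any single year's output.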
Hornsdale Power Reserve was nowhere even close to being the biggest energy storage facility, even when it was constructed. At that time, the biggest facility was the Bath County Pumped Storage Station at 24,000 MWh, which has since been surpassed again.
Dams are geographically limited. You can't just build more of them. A fully decarbonized grid's ability to build renewables is largely determined by the availability of hydroelectricity for dispatchable power. Sweden gets ~40% of its electricity from hydro, another 40 from nuclear, and 20 from intermittent sources.
A fully decarbonized grid's ability to build renewables is, in fact, not at all determined by the availability of (watershed) hydroelectricity.
In particular, hydro storage can be built in hundreds of times as many places as watershed hydro generation. And, there are numerous other practical storage methods.
Ready combined-cycle gas generation capacity is more important, most places. Those will be incrementally converted to consume imported synthetic ammonia.
> Ready combined-cycle gas generation capacity is more important, most places. Those will be incrementally converted to consume imported synthetic ammonia.
Synthetic ammonia that comes from where? Ammonia is currently produced by steam reformation, which emits CO2. You're talking about electrolyzing water to produce hydrogen, and using renewable energy to carry out the Haber process. Nobody is using this strategy of energy storage, so it's priceless in a very literal sense: there is no way to estimate how much such a system would cost. I'd take an expensive solution over a priceless solution any day.
Synthetic ammonia will come from electric synthetic ammonia production plants. A few production-scale plants are already under construction, but of course hundreds more will be needed. Ammonia will not be, primarily, a storage medium, but a transportation medium and fuel, although it stores well under light pressure. Any tropical country can put up a solar farm and begin exporting ammonia to places less blessed with reliable sunshine -- for fertilizer, at first.
Insisting technologies that people are already spending billions of dollars on building out were not first shown to be viable is a peculiar position to take.
> This demand can be met by unvarying power plants,[2] dispatchable generation,[3] or by a collection of smaller intermittent energy sources,[4] depending on which approach has the best mix of cost, availability and reliability in any particular market.
Renewables are baseload. They mainly compete with other traditional "base load" power plants, i.e. coal and nuclear, all of which rely on dispatchable generation for evening out the peaks and troughs. Your calculation also doesn't make sense; you never need storage to cover the whole electricity generation of the US. It's like saying that in an all-nuclear scenario we need to build plants covering twice the peak demand because all plants could be under maintenance at the same time. You don't build with 100% redundancy.
Yes, but in practice Watts are almost always an average over some sample period. Generation capacities for power plants are given in Watts, not Wh/h. Wh/h gives you a hint that the sample period was an hour, but that isn't necessarily the case. I don't think Wh/h is the correct way to describe the sample period.
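To make the unit point concrete, a trivial sketch (the figures are illustrative round numbers): energy divided by the sample period collapses back to power, so "GWh per hour" is just GW, whatever the period was.

```python
# "GWh per hour" is just GW: energy over time is power, regardless of
# how long the sample period was.
energy_gwh = 12_000   # energy consumed over the period (illustrative)
period_h = 24         # sample period in hours

avg_power_gw = energy_gwh / period_h
print(avg_power_gw)   # 500.0 -> an average draw of 500 GW
```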
This is highly geographically dependent, but water displacement “batteries” that pump water to an elevated basin when power is on and then let it run through turbines back down to a lowered basin seem like a really simple, effective solution that can work at scale.
I’ll back out then/leave the debate to those who have the time to sort through all of this. The gist about the enormous amount of space needed in that article seemed accurate/more than I thought.
It says exactly one right thing: you can put a reservoir on a hilltop, and the "head" is the height of the hill, not the depth of the reservoir.
You do not need a mountain. You do not need a high valley. It is cheap to build an earthen dike around the top of a hill, leveling off the hilltop peak for material.
The hill does not need to be steep; a shallow slope just means a longer penstock, which costs more.
A 300-meter hill is high enough for practical use. In most cases only a few hours' storage is plenty; you only need enough to make firing up a gas generator an occasional event.
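To put some rough numbers on the 300-meter case, a minimal sketch using E = mgh (the round-trip efficiency here is an assumed typical value, not a spec):

```python
# Gross energy stored per cubic metre of water raised to a 300 m head,
# via E = m * g * h, converted to kWh (1 kWh = 3.6e6 J).
rho = 1000        # kg per m^3 of water
g = 9.81          # m/s^2
head_m = 300      # height of the hill

joules_per_m3 = rho * g * head_m       # ~2.94e6 J
kwh_per_m3 = joules_per_m3 / 3.6e6     # ~0.82 kWh gross
round_trip = 0.75                      # assumed typical pumped-hydro efficiency

print(f"{kwh_per_m3:.2f} kWh/m^3 gross, {kwh_per_m3 * round_trip:.2f} usable")
```

So a reservoir of 1 km² by 10 m deep at that head holds on the order of 8 GWh gross, which is the scale of "a few hours" for a mid-size regional grid.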
Luckily the author took every opportunity to err on the side of underestimating the amount of water storage needed, to head off such pedantry that wouldn't significantly change the math.
By “at scale” I mean building enough pumped storage facilities to meet current energy demands when the wind isn’t blowing and the sun isn’t shining. It seems much more likely that we could build sufficient pumped storage facilities than the kind of battery farms the poster I was replying to rightly points out are not likely to meet required needs for the foreseeable future, if ever.
My personal pet idea: a big ‘tower’ on the seafloor.
Yes, costs are immense, but scaling would really work well. (Making the ‘well’ or tower twice as big would not make the cost double).
In ascii art:

    ——————|  |——————
          |  |
          |++|

With ‘| |’ the walls of the well or tower, —— the sea level, and ++ the water inside the well. Energy would be gained by letting seawater flow into the well/water tower.
Limited by the displaced water weight. Prone to storm damage. But the general idea could work, and you could even do it entirely under water, saving you from having to dig out a mineshaft: just sink a bunch of caissons to create an underwater tower. As long as they stack and can be reinforced so they don't end up shearing under the pressure of flowing water. Maybe even open up the sides to a lattice to reduce that resistance.
Existing supertankers can displace in the hundreds of thousands of tons. New, they cost under $100M. Scrap, much less.
They are not bothered much by weather in normal operation. With that much weight hanging well below the surface, they would be very stable. (Racked together side by side, more so.) Think of it as a very long oil platform. Those endure any weather with no difficulty.
No need for anything attached to the bottom except anchor chains.
When planning ocean operation, make sure almost everything is at or well above the surface, and everything complicated (motor/generator, winch, power conversion) is well protected from exposure. Ocean deployments that expose expensive stuff to wave action or put it underwater fail, reliably. (Expect tidal generation that puts generators underwater to fail spectacularly.)
Offshore wind has the nacelle well encapsulated and everything that moves far above the waves. Nothing is underwater except the pilings, the end of the support post, and a wire.
Yep, I tried to acknowledge that in the original comment when I said it’s geographically dependent.
It still seems like the only solution that currently exists and actually works on the scale required. I’m not sure what realistic solutions are for regions where hydro pumped storage is not a viable option.
A handful of hours' storage, 60% curtailment (with most of the surplus used to create hydrogen and such for chemical feedstock) and a small amount of HVDC leaves a remainder small enough that meeting it with biogas and existing hydro is possible, meeting it with hydrogen is not costly, or meeting it with LNG is not a huge problem.
The world produces about 3TW of electricity, 2022's battery production was 760GWh and it's growing 30-50% yoy.
Ideally degrowth happens, but while we negotiate that, the battery industry is at the scale required and is rapidly shedding critical mineral requirements (sodium ion is at GWh scale now, and much larger supply chains with all-abundant materials come online in June).
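That growth rate compounds quickly. A minimal sketch, assuming a flat 40% year-on-year rate (the midpoint of the quoted 30-50% range, held constant only for illustration):

```python
# Compound battery-production growth from the comment's 2022 baseline,
# assuming a flat 40% yoy rate (midpoint of the quoted 30-50% range).
production_gwh = 760  # world battery production in 2022, GWh
for year in range(2023, 2031):
    production_gwh *= 1.40
print(f"2030 production at 40% yoy: ~{production_gwh:.0f} GWh/yr")
```

Eight years of 40% growth is roughly a 15x increase, taking annual output into the multi-TWh range.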
True, yes. It just seems to me like conventional battery farms are not going to make much sense in a lot of places where hydro pumped storage will. Another user pointed out that the size inefficiency of hydro pumped storage is severe enough that it is less grid-scale suitable than I thought, so the gap between pumped storage and conventional battery farms is smaller than I assumed. Regardless, energy storage is a huge problem that's going to require lots of deference to practicality and as many simple solutions as possible. Frankly I'm not sure it is possible, which is why I'm a big advocate for nuclear.
We can't use much of it yet because we haven't the renewable generating capacity to charge it from. (Charging it from fossil fuels would be beyond stupid.) Several varieties are getting cheaper very fast, and new, cheaper ones are being invented, so when the time comes it will cost a lot less. Likewise, carbon capture: building that out now would be stupid if it diverted money that could be spent building renewables that displace carbon emission.
It is also why building nukes is stupid: they displace way less carbon emissions, per dollar, than renewables, and first spend a decade displacing none at all. For the price of the coal burned waiting for the nuke to come on line, you could build that much solar, never mind what you are wasting on building the nuke, and it would start displacing carbon emissions almost immediately.
Batteries and pumped hydro are very far from the only practical storage media. But the assertion you read that pumped hydro does not scale is deliberately deceptive.
I would read Tom Murphy's posts from a different light. His message has consistently been 'unbounded exponentials aren't real, why are you pretending they are?'.
In that light 'pumped hydro can't scale' means 'stop trying to use this for Petawatt hours of energy storage you idiot, you'll destroy everything'.
Some of his more recent posts have been worryingly easy to coopt by the 'maek moar fossil fuels and nuclear' crowd, which his previous posts indicate he should be even more strongly opposed to.
The general message of degrowth and steady state economy is positive even if 'pumped hydro can't scale indefinitely' sounds a bit like 'pumped hydro can't decarbonize the current scale of the economy', and he gets the scale required to maintain status quo consumption a bit wrong.
I do wish he'd make this distinction clearer though.
OK if by "highly geographically dependent" you mean "not highly geographically dependent".
You need a hill, but there are a very, very large number of hills. Usually you need to build an earthen dike around the top of the hill, although often a natural feature allows it to be shorter.
The reservoir does not need to be deep because the "head" is from it to the bottom of the hill, or even to the water table well below that.
The reservoir or reservoirs are good places to float solar farms.
I forget what country was trying to build a bunch of these but an article popped up on here about this a while back and they cited lack of suitable locations as one of the main barriers for building more.
The most ideal locations are natural ones that require minimal land reshaping. Some places in the world have a lot of suitable locations like that. Others don’t.
But despite that it still seems much more feasible to build lots of hydro pump stations like that than it does to build other forms of battery farms.
Places with not many hills will of course prefer other storage media.
Places with lots of hills will also prefer other storage media if they turn out to be cheaper. It is far from clear yet how costs will settle out. It is anyway not time yet to build more than just enough storage to shift the few hours from peak generation to peak use.
So it's supposed to be impossible to get the 3 hours or 9TWh (12 years of 2022 production) of battery needed to shift the grid to >99% renewables (including biogas from waste plant matter, planned HVDC, and existing hydro for dispatch), given the much larger-than-current lithium production that is already pipelined, but scaling Uranium production and the entire reactor and enrichment supply chain 10x with no significant pipelined expansion is trivial?
What happens when the known Uranium resources run out 12 years after that?
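The "3 hours or 9TWh (12 years of 2022 production)" arithmetic above can be reproduced in a few lines; the figures are taken straight from the comment, not independently sourced:

```python
# Reproducing "3 hours or 9TWh (12 years of 2022 production)".
world_avg_output_tw = 3        # ~3 TW average world electricity output
storage_hours = 3
storage_twh = world_avg_output_tw * storage_hours     # 9 TWh
production_gwh_2022 = 760                             # 2022 battery output
years_at_2022_rate = storage_twh * 1000 / production_gwh_2022
print(f"{storage_twh} TWh ~= {years_at_2022_rate:.0f} years of 2022 production")
```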
1) not that it would happen, but if it did, they won’t just run out.
2) There are vast Uranium deposits in the Southwest and Canada that were identified and either mined a little, or left untouched after WW2 - because we had so much, we didn’t need them.
Early lithium was ‘hard to find’ until it wasn’t, and now there are vast reserves.
Lithium is harder to extract per power unit though, as Uranium is incredibly power dense.
> Lithium is harder to extract per power unit though, as Uranium is incredibly power dense.
Y'all just love making up lies out of whole cloth. This is incredibly wrong and you didn't even consider the idea of checking before deciding it was true. Stanning for nuclear does involve incredible density but it's not power or energy density.
Weight for weight, the amount of lithium you need for a 1kW renewable system with diurnal storage (80-160g/kWh for 8-12kWh to provide for 1kW) and the amount of Uranium you need for 1kW of nuclear reactor (45-60MWd/kg @ 32% thermal efficiency with a 7.4:1 tails/fuel ratio with a 3-6 year fuel cycle) are about the same: Roughly 1kg.
The lithium battery will last 2-4x as long as the Uranium fuel (12-20 years vs 3-6).
At 1-7% the lithium ore is 1-2 orders of magnitude more concentrated than the 0.01-0.7% of most Uranium deposits (Canada's untapped high yield deposits are about the same as good lithium ore but they are deep underground, unique, and only a tiny fraction of what would be needed). The typical ore mined in a mass expansion scenario (0.01-0.03%) would have an energy density between that of coal and crude oil.
Mining for the uranium would involve around 100-1000x the quantity of mined or leached ore and tens to hundreds of times as much leaching chemical.
The Uranium extraction process doesn't end at the mill. Enrichment is just as involved as brine extraction. The end product of the Uranium fuel costs twice as much as an LFP grid battery and has about the same embodied energy per joule delivered (or double if from an underground or deep open pit mine).
Diurnal lithium battery storage is irrelevant where good pumped hydro is available, and is far more than is needed to reach 85-90% VRE (which can be done with 3 hours).
Early uranium was also hard to find, then tens of billions were spent trying to find more. There are fairly reliable methods of estimating how much hasn't been found based on the rate of finding it vs. the effort spent and the answer is there's not a lot undiscovered at concentrations that make fuel affordable. It's also largely irrelevant because any you find after you open your 3-8TW of nuclear plants (which are somehow built in 12 years) is not going to be developed before they all run out.
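The rough "about 1kg each" mass comparison above can be sketched numerically. The burnup, thermal efficiency, tails ratio, fuel-cycle length and lithium intensity below are the mid-range assumptions stated in the comment, not measured data:

```python
# Mass of natural uranium vs lithium to serve 1 kW(e), using the
# comment's mid-range assumptions (not measured data).
HOURS_PER_DAY = 24

# Uranium side: natural U per 1 kW(e) over one fuel cycle.
burnup_mwd_per_kg = 50     # thermal burnup per kg of enriched fuel (assumed)
thermal_eff = 0.32         # electric output / thermal output
tails_ratio = 7.4          # kg of depleted tails per kg of enriched fuel
fuel_cycle_years = 4.5     # years one fuel load lasts (assumed)

kwh_e_per_kg_fuel = burnup_mwd_per_kg * 1000 * HOURS_PER_DAY * thermal_eff
kwh_e_per_kg_natural = kwh_e_per_kg_fuel / (1 + tails_ratio)
kwh_needed = 1.0 * fuel_cycle_years * 365 * HOURS_PER_DAY
uranium_kg = kwh_needed / kwh_e_per_kg_natural

# Lithium side: Li content of a diurnal buffer backing 1 kW of renewables.
li_g_per_kwh = 100         # grams of lithium per kWh of cells (assumed)
storage_kwh = 10           # diurnal storage sized for 1 kW (assumed)
lithium_kg = li_g_per_kwh * storage_kwh / 1000

print(f"natural U per kW per fuel cycle: {uranium_kg:.2f} kg")
print(f"Li per kW of diurnal storage:    {lithium_kg:.2f} kg")
```

Both come out at roughly 1 kg under these assumptions, which is the point being made.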
Thank you for the detailed response; am I correct in understanding this is for a once-through fuel cycle? I was not able to find figures for total burnup once fuel reprocessing is taken into account. I think analysing a large-scale transition to nuclear only makes sense with reprocessing.
> The typical ore mined in a mass expansion scenario (0.01-0.03%) would have an energy density between that of coal and crude oil.
This is really interesting, because I have seen a lot of hand-wringing about lithium mining, describing it as physically impossible.
We seem to be mining 8.5 billion tons of coal a year, and 100k tons of lithium a year. Assuming 2% lithium content, that puts us at about 5 million tons of ore.
So we need 2000 times less earthmoving equipment to achieve the quantity of lithium we currently consume?
I am assuming here that 'Coal' is the name for stuff that's dug out of the ground, so comparing 'coal' and 'ore' is correct for estimating earthmoving required.
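That back-of-envelope comparison, with the ore grade as an explicit assumption:

```python
# Earthmoving comparison: coal mined per year vs lithium ore, with the
# ore grade as an explicit assumption.
coal_tons_per_year = 8.5e9
lithium_tons_per_year = 1e5
ore_grade = 0.02                                 # assume 2% Li by mass
ore_tons = lithium_tons_per_year / ore_grade     # ~5 million tons of ore
ratio = coal_tons_per_year / ore_tons            # ~1,700x less than coal
print(f"{ore_tons:.0f} t of ore, {ratio:.0f}x less material than coal")
```

So on the order of a couple of thousand times less material moved than coal, at current lithium consumption.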
> Diurnal lithium battery storage is irrelevant where good pumped hydro is available
To me the entire point of this scenario is, what do I do if my country doesn't have it. That's going to be the dilemma facing half the world.
> I have seen a lot of hand-wringing about lithium mining, describing it as physically impossible. While cost remains to be seen, it is clear the amount we need to mine is much lower than that of coal.
Very little mining or extraction is actually physically impossible. Just as safes are rated not as "impossible to crack" but "takes an expert with the best tools 16 hours to open", known measured resources are ranked by economic feasibility: how much effort will it take to produce the end product, and is that cost worth it?
Lithium, Uranium, Copper, etc.: there are pros, cons, issues and problems all the way through any mineral extraction process. The one solution not proposed nearly enough is for populations to just consume less.
Re: Lithium specifically:
> Let us consider, for example, electric cars. To give an idea of this effect, producing a battery weighing 1,100 pounds emits over 70% more carbon dioxide than producing a conventional car in Germany, according to research by the automotive consultancy Berylls Strategy Advisors.
> Furthermore, lithium mining requires a lot of water. To extract one ton of lithium requires about 500,000 liters of water, and can result in the poisoning of reservoirs and related health problems.
Also thank you for maintaining civility and expressing genuine curiosity. It's easy to forget that not everyone who is a fan of nuclear is just using it as a tool to attack anything that threatens fossil fuels, because those ones are the loudest. My distaste towards some of the other commenters caught you in the crossfire, and I apologise.
> So we need 2000 times less earthmoving equipment to achieve the quantity of lithium we currently consume?
Both metals have a variety of mining methods including extraction in liquid form, and both need a leaching step in large amounts of chemicals if dug up whole (with the exception of canadian ore which is high purity) but that's the general gist of it. The Lithium requires between 1 and 3 orders of magnitude less space/industry/chemicals and a bit less energy for the same target use. Nickel, manganese, copper and phosphorus also have significant impact but still about the same total impact as the Uranium for a 90-95% grid decarbonization use case up front (but longer lived and recyclable for the battery ingredients). Building a mix of solar and onshore wind to cycle the battery requires a subset of the ingredients of a reactor like an EPR (with maybe a bit more concrete, zinc and steel depending on wind capacity factor, but much less chromium and a number of other higher impact materials).
Offshore wind is about the only renewable technology with significantly larger mining impact. It is better than fossil fuels or delay but does need to come down (Iron Nitride should help here, but there is still a lot of steel and copper -- although an order of magnitude less than the nuclear and fossil fuel interests will tell you).
> Thank you for the detailed responce, am I correct in understanding this is for once-through fuel cycle? I was not able to find figures for total burnup once fuel reprocessing is taken into account. I think analysing large-scale transition to nuclear only makes sence with reprosessing.
Reprocessing does very little without a positive breeding ratio. Unless you create more Pu239 than you consume U235 it's just a small boost in U235 efficiency. You can verify this by looking at the isotope mix of waste for your reactor of choice and the isotope mix and burnup of MOX. An APR has a breeding ratio around 0.6 and consumes 60% of the bred Pu without reprocessing. A PHWR has the advantage of (in principle) being able to extract the last 5-15% of energy without plutonium extraction (which is incredibly polluting and expensive), but it doesn't result in much of an increase in net output, nor does it reduce the amount of Pu240, Pu241, Am242 and Am241 (the very bad high level long lived alpha emitters) that must be dealt with by more than 20%.
Breeders have a host of technical barriers, and a breeder-heavy strategy is actually impaired by spending money and the fairly finite easily accessed startup fissile material on PWRs. The net result of most proposals is about 10x the burnup (as fertile material is not planned to be recovered, and multiple rounds of breeding and reprocessing result in problems that don't have proposed solutions), but there is no plutonium separation process that is either affordable or environmentally sustainable if scaled to the TW level.
> To me the entire point of this scenario is, what do I do if my country doesn't have it. That's going to be the dilemma facing half the world.
Sodium ion batteries are a commercial technology now at the 1-5GWh level with a massive scale-up being completed in June (100s of GWh/yr), and use all abundant materials (iron, sodium, carbon, water, aluminium). Pumped hydro is also far less limited. Other chemistries in scale-up include Zinc Bromide (there is a process that can retrofit existing lead-acid production being scaled, among others), Iron, and Vanadium (often available as a side product of Uranium at about 1-3 hrs of storage per year of uranium fuel, but generally not valuable enough to extract). The main geopolitical danger is access to silver, and building a local industry.
Also buying lithium on the open market has far fewer opportunities for geopolitical domination than systems dependent on fossil fuels or U235 as there are many low quality deposits and the bottleneck is largely extraction. Your country could spend $3-10k per US-citizen-of-primary-energy on imported batteries or $300-1k on imported lithium once then develop domestic recycling and manufacturing and then you are done for decades (maybe replacing 5% of it per year once the kWh per kg of Li stops improving).
Bwaha, 1kg of uranium produces 8.64 x 10^13 joules of energy when fissioned. That's ~24 MWh per gram, not per kg. You're off by ~3 orders of magnitude.
So what are you talking about? Because your math at the start is so far off, it’s pretty hard to tell here.
You haven't understood his post: he is correct under the worst-case scenario for nuclear. A once-through fuel cycle achieves ~5-6% burnup; that's 4-5 million MJ per kg of fuel, or roughly 50-60MWd/kg.
Enrichment requires that we throw away most of the uranium as depleted tails; that's what he means by the "7.4:1 tails/fuel ratio".
That gives us roughly 15-20 years of heat at 1kW per kg of natural uranium, but we are converting it to electricity, so it really only gives us 6 years or so.
The main flaw with his argument, in my understanding, is that he assumes a once-through fuel cycle, whereas France and others reprocess nuclear fuel. It also discards reactors that run on unenriched uranium, like CANDU.
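The once-through chain can be checked step by step. This sketch uses 5% burnup and 33% thermal efficiency (assumptions consistent with the ranges quoted upthread) and lands near the "6 years or so" figure:

```python
# Once-through fuel cycle, step by step (assumed round figures).
full_fission_mj_per_kg = 8.0e7   # ~80 million MJ/kg if every atom fissioned
burnup_fraction = 0.05           # ~5% of heavy atoms actually fission
tails_ratio = 7.4                # tails per kg of enriched fuel
thermal_eff = 0.33               # electric / thermal

mj_thermal_per_kg_fuel = full_fission_mj_per_kg * burnup_fraction  # ~4e6 MJ
mj_per_kg_natural = mj_thermal_per_kg_fuel / (1 + tails_ratio)
mj_electric = mj_per_kg_natural * thermal_eff

seconds = mj_electric * 1e6 / 1000.0             # drawing 1 kW electric
years = seconds / (365 * 24 * 3600)
print(f"~{years:.1f} years of 1 kW(e) per kg of natural uranium")
```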
I am not sure about the ores, but I am grateful for the detailed writeup.
Reprocessing doesn't turn fertile material into fissile. It mostly just costs money and pours more fission products into the nearest body of water than Fukushima released. It also extracts the dregs of Pu239 and U235 to get another 5-15% of final energy out. CANDU + reprocessing is a bit better than a PWR, but still significantly under 2x the energy in the original U235.
Closed fuel cycles are a myth so there is no need to specify 'once through'.
Just gonna double down on the 'it's so dense' myth huh?
Do it with a real fuel cycle, real turbine efficiencies and stop trying to conflate the fertile content with the fissile.
Here's a hint to get you started: the reactor fleet uses about 67500 tonnes of raw Uranium and produces about 2650TWh each year. New reactors are about 1.8x the average and most SMR proposals are worse. The world produced 760GWh of batteries from around 75,000 tonnes of Lithium.
Also note that 2650TWh / 760GWh is 3500 which is less than the number of cycles an LFP battery will last.
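The cycle-count comparison in the parent comment can be reproduced directly (the LFP cycle life is an assumed round number):

```python
# Cycle-count comparison from the parent comment's figures.
fleet_twh_per_year = 2650    # reactor fleet output, TWh/yr
batt_gwh_2022 = 760          # batteries built in 2022, GWh

equiv_cycles = fleet_twh_per_year * 1000 / batt_gwh_2022   # ~3,500
lfp_cycle_life = 4000        # assumed round number for LFP cycle life
print(f"fleet output = {equiv_cycles:.0f} battery cycles; "
      f"LFP lasts ~{lfp_cycle_life}")
```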
If you’re going to throw out bullshit numbers, don’t be surprised when you get called on it.
If de-rating, show your math. Which you didn’t.
This whole thing is hilarious anyway, as I’m far from a nuclear advocate. Just pointing out you’re not actually telling the truth while going on your rant.
And it’s all clearly apples to oranges with no sense.
Lithium is not destroyed when used in these battery systems, it can be recycled indefinitely if we cared. It also doesn’t produce any actual power, it’s storage.
Uranium is burned/fissioned, and actually is gone. And actually produces power.
Wow. You got triggered hard by that. My comment was in response to the pearl clutching over the imagined requirement of Lithium mining as a component of a system that generates electricity from wind and solar.
If we take diurnal storage being provided by lithium batteries as a given, then a VRE system producing 1kW >95% of the time using existing commercial technology requires roughly 1kg of lithium in 6-12kWh of batteries which needs to be recycled or replaced every 12-20 years.
A system producing 1kW of electricity from fission 85% of the time using existing commercial technology requires fuel made from 1kg of mined Uranium. The Uranium needs replacing at least every 6 years. You can calculate this easily from burnup (25-60MWd/kg thermal), thermal efficiency (30-38%), and the Uranium required to make a unit of fuel (~8kg per kg of fuel).
Hence the pearl clutching is revealed as disingenuous nonsense and the proposed alternative to the terrors of Lithium mining is revealed to be worse, and why you are triggered.
> So it's supposed to be impossible to get the 3 hours or 9TWh (12 years of 2022 production) of battery.. but scaling Uranium production and the entire reactor and enrichment supply chain 10x with no significant pipelined expansion is trivial
Yes, that is exactly right!
Energy density of natural, raw, unenriched uranium is 1,000,000 MJ/KG, and energy density of a lithium battery is 0.46 MJ/KG.
1 kilo of uranium gives you 1,000 times more energy than a kilo of lithium will be able to 'process' over the entire 10-year life expectancy of the battery.
You will need 1,000 times fewer excavators, dump trucks and people involved in mining if you choose uranium.
If price of uranium increases 3x nobody cares, fuel is like 5% of cost for nuclear.
If price of lithium increases 3x it's a disaster, the entire transition to electric vehicles will fail.
We don't even have enough batteries for vehicles, the grid needs another solution, either hydro + hydrogen storage or nuclear.
Now do it again with the electricity output of fuel cycles that actually exist and the amount of lithium in a new battery.
The only density here is that required to mindlessly parrot factoids about the theoretical thermal energy content of fertile material as if they were relevant to the electricity output of the fissile content.
> If price of uranium increases 3x nobody cares, fuel is like 5% of cost for nuclear. If price of lithium increases 3x it's a disaster, the entire transition to electric vehicles will fail.
If the price of Uranium triples, the raw Uranium becomes as expensive as renewables' LCOE in one year rather than 5. If the price of lithium triples, it's still irrelevant because PHES still exists and so do Sodium Ion, Zinc Bromide and Iron batteries.
> We don't even have enough batteries for vehicles, the grid needs another solution, either hydro + hydrogen storage or nuclear.
The amount of batteries required for >90% renewables is tiny compared to the amount needed for overly large EVs everywhere. If you're going to change one of these variables to stop climate change faster, it's far better to unban light electric vehicles in countries with mandatory monster trucks and spend a few percent of that nuclear reactor money on transit, low-speed roads and pedestrianisation.
This model projects that a 98% solar/wind/hydro grid is possible for Australia by building an additional 450GWh of storage. 1.3x snowy 2 (which is 350GWh).
Unfortunately, Snowy 2's storage figures are rather misleading. That 350 GWh cannot be used daily; that's its total storage capacity, which takes a month and a half to refill. The cyclic capacity of Snowy 2 - as in, the storage capacity that I can pump back into the facility using excess renewable power - is only ~40 GWh.
And again, this is geographically limited storage system: pumped hydro requires just the right geography of an upper and lower reservoir spaced not too far apart. Not too bad if you're a sparsely populated country with huge amounts of land per capita. But it isn't a solution for most countries.
"If based on the active storage volume of the ‘lesser reservoir’, Talbingo, the theoretical energy storage capacity is about 240 GWh"
Once again, this is JUST ONE project, already under construction, and it will cover somewhere between 30-40% of the energy storage requirements for the entire country to reach a 98% solar/wind powered grid.
Dismissing the possibility of this order of magnitude of storage as simply impossible as the OP did when it is already under construction is asinine.
>And again, this is geographically limited storage system: pumped hydro requires just the right geography of an upper and lower reservoir spaced not too far apart.
There have been multiple studies on this. Unlike hydro, the geography for pumped storage is not rare throughout most of the world.
This is the most perplexing talking point against pumped storage, and frankly, reminds me of when people used to pick up on pro nuclear/carbon lobbies sneering at solar/wind for being infeasible because it was < 1% of the grid back in 2014.
It will be used where it is a good solution. Where it is not, others will be used. Use of other methods elsewhere does not detract from its usefulness in places suited to it.
Australia is (1) huge and low density, with lots of choice where to place renewables, (2) home to some of the best places for solar in the world, and (3) blessed with good solar production in winter.
Try Austria, and find that solar production falls 5-10x in winter, there is almost no good location for wind, and the problem is much harder.
Indeed, central and eastern Europe are some of the worst places on Earth for renewable energy. They are "nuclear's last stand". What this means, though, is that in a post-fossil world energy intensive industries will simply move elsewhere. Why build your aluminum smelter in an energy ghetto?
> The overwhelming majority of electricity demand is base load. Usually on the order of 70-80% [1]. We don't need "some" base load, almost all our demand is base load.
The article you linked doesn't really back this up, at least in the way this discussion means it. It shows that with flexible production, you can drastically scale back on "traditional" base load power sources, and that is representing a real power grid in Germany. Nothing about the graph actually says "this is as much renewables as you can pack into this power grid".
If you look at the graph closely, you'll notice that solar is big during the day, and wind is big during the night. With greater installed wind production capacity, the fossil fuel lines would drop drastically in the graph. It's that simple. We would still need to have "peaker plants" available until there is enough grid energy storage capacity, but combined cycle natural gas plants work fine for that. We can keep pushing down the time they need to be on by building more renewables, even without batteries.
> The storage facility you linked to was the biggest facility in the world when it was first constructed, and it stored only 129 MWh of electricity.
I specifically linked that one because it talks about how ridiculously profitable it has been, and how much of an impact it has had on the local grid. If it can save money for a traditional grid, then it is a no-brainer for utilities to install bigger and bigger grid batteries. More demand for batteries means more battery production facilities, increasing global production capacity over time.
However, the world is also transitioning to Electric Vehicles, and most EV manufacturers are offering V2G (vehicle to grid) solutions, so millions of EVs can contribute a portion of their battery capacity to the grid in the future, and the grid can compensate them for their contribution.
> Battery production is expected to increase, but it's unclear whether raw material inputs can keep up with manufacturing demands [2].
Lithium is not exactly rare or hard to extract, you can even extract lithium from saltwater, so this argument seems specious. But, various alternative chemistries are being explored which could help in different ways.
> At our current rate of battery production it'd take us a century of dedicating 100% of our battery output to grid storage to reach 1 day's worth of storage.
How did you determine that we need a full day's worth of energy storage? We can drastically decarbonize the grid (and lower electric costs for consumers) with a lot less than that, based on what I've seen, but this is a highly speculative part of the discussion so it's interesting to hear how that number came to be.
We can get a handle on how much storage needed by optimization based on real weather data, minimizing costs based on various assumptions on cost of wind, solar, batteries, and long term storage. This web site lets you do that to obtain "synthetic baseload", the equivalent of what a nuclear plant could provide:
If we do this for Germany with 2030 cost assumptions and 2011 weather data, 6 hours of batteries are needed and 289 hours of hydrogen storage. Hydrogen storage is quite cheap, if nowhere near as efficient. It's very useful here, reducing the optimal cost by nearly a factor of 2.
For the US as a whole, the optimum solution uses 6 hours of batteries again, but 106 hours of hydrogen. For just Texas, 2 hours of batteries and 254 hours of hydrogen. California alone is 16 hours of batteries and 70 hours of hydrogen (likely due to wind optimizing to zero under those assumptions.)
I plugged this in with existing storage technologies and existing energy demand for just the USA (500 GW). It turns out we'll only need... 6,000 GWh of battery storage!
By comparison, the entire world only produces ~400 GWh of batteries each year. So it'd only take a decade and a half of global battery production to satisfy the storage demands of just the USA. The rest of the world would be left with zero EV or electronics production for a decade and a half and no grid storage to show for it.
Thanks for the site: it's a good tool to demonstrate just how unfeasible energy storage really is.
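Back-deriving the figures above (the 12-hour storage window is inferred from the 6,000 GWh and 500 GW numbers, not stated by the tool):

```python
# Back-deriving the comment's figures; the 12 h window is inferred.
us_avg_demand_gw = 500
storage_hours = 12                        # implied by 6,000 GWh / 500 GW
storage_gwh = us_avg_demand_gw * storage_hours          # 6,000 GWh
world_battery_gwh_per_year = 400
years_of_world_output = storage_gwh / world_battery_gwh_per_year  # 15
print(f"{storage_gwh} GWh = {years_of_world_output:.0f} years of output")
```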
It doesn’t matter if battery prices increase some, as you have made that the cornerstone of your argument. Many many comments ago, I linked to the Hornsdale battery. Based on the revenue, you can do the math: the batteries could cost a lot more, and it would still have been profitable. Batteries also don’t make up the entire cost of grid scale storage: the inverters, the transformers, even the cabinets and control computers cost money.
As it is, mining is a lagging indicator. Once mining scales up, the cost of the commodities will naturally go back down. While there is profit to be made, too many people will open mines, which will bring the profitability back down to earth. It’s a tale as old as supply chains.
This same inability to imagine how quickly solar and wind would drop in price led to numerous “experts” making absurd claims about solar and wind being economically infeasible at any scale. The battery supply chain is scaling. There will be price volatility, but the volume is growing by leaps and bounds.
It will take years for the world to transition, but it also takes years to build and install the necessary wind and solar. Battery production won’t remain constant, and it won’t just increase slightly. It has and will increase drastically.
Predictions and actual capability are two vastly different things. Outside the realm of predictions, back here in reality, battery costs are actually increasing rather than declining [1].
"Just scale up mining" is easier said than done. Steel is a widely used commodity. If we could "just scale up mining" and exponentially decrease the cost of materials, why haven't we been able to do this with steel? Care to explain why "just scale up mining" will work for lithium when it hasn't for plenty of other commodities?
That’s an irrelevant question. I hate that I’m still bothering to respond when you don’t seem interested in changing your position.
There is a fixed cost to extract a kilogram of lithium using a given method. When demand spikes, lithium suppliers will charge more and reinvest those profits into increasing mining capacity. If they had their way, they’d keep prices high, but anyone can open a lithium mine. Lithium is not hard to find. Those people will undercut the previous miner in order to attract buyers, which will force all miners to lower prices back to reality.
As long as prices are high, more and more mines will open as it is suddenly an attractive resource to mine. This will happen until prices start to drop to some percentage-over-cost where things stop being attractive, and the market reaches equilibrium again.
If lithium was profitable to mine for $X/kg for many years, then that is the price it will naturally return to. It is a commodity. There is no special value for getting your lithium from Corporation X or Corporation Y. The only value is the material.
That’s how commodities work. The price is determined by how hard it is to extract, and whether demand has recently spiked (or subsided). Gold is expensive because it is scarce and costly to extract.
If this were some rare substance, extraction difficulty would increase noticeably with time, but we’re a long way from a shortage of ways to mine lithium. Some ways might be more expensive, so those mines will only open if the prices stay high, but the prices won’t rise indefinitely this century. That is such an illogical assumption. Perhaps the natural price of high volume lithium production is higher than we have today because those more expensive methods are convenient, but it won’t be enough of a price increase to matter in the grand scheme of things, and this assumption of a higher price is a pretty flimsy one to base predictions on even then.
Eventually, as batteries age, they will be recycled, and the need for lithium mining will likely drop sharply, bankrupting some of these mines until production and (profitable) demand are matched again.
Higher prices would indeed incentivize increased supply. But again, this requires higher prices. Understand that renewable activists are predicting that lithium battery production will not only increase exponentially, it will also exponentially drop in cost per kWh as this happens.
This is not going to happen because, as you point out, lithium mining will only expand if prices climb higher to make otherwise unprofitable reserves profitable. And since raw materials now dominate the cost of batteries, this is going to increase the cost of batteries. There's no having your cake and eating it too: in order to increase lithium mining capacity, battery cost is going to have to grow, not shrink.
You edited your post after I replied, so I'll have to edit in response:
There's no one "difficulty of extraction" factor for each commodity. The reality is that there's a diverse variety of reserves all of which are easier or harder to exploit, even for the same commodity. "The price is determined by how hard it is to extract" is at best a huge simplification.
Higher commodity prices make it viable to extract the more inaccessible reserves, but those reserves will only be profitable so long as prices remain high.
> If lithium was profitable to mine for $X/kg for many years, then that is the price it will naturally return to.
Nope! This is completely wrong. Commodities don't "naturally return" to any price. Price is a result of supply and demand. It could be we find some other battery chemistry that blows lithium out of the water. In that case the cost of lithium will probably collapse well below $X/kg. Conversely, if countries start trying to provision significant amounts of grid battery storage, costs will grow even higher as more and more inaccessible reserves need to be exploited to supply market demand.
There is no "natural" price of commodities, whatsoever. If the demand for lithium is going to rise and keep rising, then the cost will rise and keep rising unless some breakthrough makes it way more efficient to mine. Given the fact that we've been mining for centuries and we have huge demand for minerals other than lithium, I'm not optimistic on a 100x improvement in mining efficiency.
It's interesting because coder is very locked into their view on one price.
I don't know lithium at all, but if you follow natural gas / fracking / oil production, the question of which reserves are economically recoverable is HIGHLY price dependent (and highly variable). At the low end, cost of production is $10-15/barrel or better. The only reason we have any production in the US is because oil costs more than that. If we doubled oil usage, prices would spike incredibly.
Same thing with solar net metering: solar activists demand that solar generation (often during periods when solar curtailments are already in effect) be reimbursed at the same rate as evening electricity (usually peak demand and low solar / wind). Again, depressing how economically nonsensical this all is for a supposedly scientifically grounded effort.
Anyways, rational heads (and economic forces) tend to prevail. I think we will see more TOU rates, and ideally this will at least create some market forces around storage.
The prices have gone up, so mining is expanding. There’s a clear next step to the way commodities work. That next step is not for prices to continue increasing dramatically, things should level off and eventually return to normal, as I explained.
> Understand that renewable activists are predicting that lithium battery production will not only increase exponentially, it will also exponentially drop in cost per KWh as this happens.
Battery prices don’t need to keep dropping for them to take over the world. They’re already cheap. Of course people would love for the prices to drop, and they have historically been dropping slowly, but the basic elements will cost a certain amount, so there is a price floor. I don’t know where that is, but it is lower than we’ve seen, because every step in the battery production chain has been making a profit up to this point, including the miners.
The fact that prices have increased only indicates a mismatch between supply and demand, not that the natural price needs to be this high, or that it needs to continue rising. As I pointed out, everyone was already profitable at a lower battery price than what we have today. If batteries are expensive because of lithium, more lithium mines will open and drop the price. If batteries are expensive because the battery makers are price gouging, someone else will undercut them.
The main concern for something like this is market distortion by patents. If the best way to make batteries is locked behind a patent, that can cause prices to be unnaturally high. This concern does not apply to elemental lithium.
> Higher commodity prices make it viable to extract the more inaccessible reserves, but those reserves will only be profitable so long as prices remain high.
You’re practically quoting my comment to me. I addressed that. It is possible that the price will settle higher than it is today if those methods play a big role, but it won’t matter. You don’t seem to appreciate how insanely cost effective batteries already are today. A modest price increase is fine, but it is still illogical to assume prices will stay high. Lithium is not hard to find. Why would the market settle on expensive extraction methods? It is a strange assumption to start from. We’re not talking about something that’s rare.
Either way, the outcome is unchanged unless batteries increase in price exponentially, as your earlier comments apparently assumed.
For the second time, there is no "natural price". You need to iron this kind of wishful thinking out of your head. Companies want to buy more and more lithium, so unless you've got some breakthrough that makes mining a heck of a lot cheaper the cost will keep going up and up as more and more inaccessible reserves need to be exploited. If I have a reserve that costs $4/kg to operate when the price is $5/kg and another reserve that costs $7/kg, I may open the latter if the price rises to $8/kg. But if the price drops back down to $5/kg it's unprofitable to operate.
Here's a way to articulate this that might better mesh with your mental model. The "natural price" of a commodity isn't static. If demand increases and the only way to meet demand is to use more and more expensive mining operations, then this raises the "natural price". As per the above example, the "natural price" rose to $8. It won't drop back below $8/kg unless either we find a way to make mining cheaper, or demand drops back to levels that can be satisfied by my cheaper-to-operate reserve.
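The two-reserve example above is essentially a merit-order supply curve: the market-clearing price is set by the most expensive mine that has to run to meet demand. A minimal sketch, using the hypothetical $4/kg and $7/kg mines from the comment:

```python
# Merit-order sketch: mines are dispatched cheapest-first; the marginal
# (most expensive) mine needed to cover demand sets the price floor.
# All capacities and costs below are the hypothetical figures from the
# discussion, not real market data.

def clearing_price(mines, demand):
    """mines: list of (capacity, operating_cost_per_kg) tuples.
    Returns the operating cost of the marginal mine that must run."""
    supplied = 0
    for capacity, cost in sorted(mines, key=lambda m: m[1]):
        supplied += capacity
        if supplied >= demand:
            return cost  # this mine is the price-setter
    raise ValueError("demand exceeds total supply")

mines = [(100, 4), (80, 7)]        # cheap reserve at $4/kg, expensive at $7/kg
print(clearing_price(mines, 90))   # low demand: only the $4 mine runs -> 4
print(clearing_price(mines, 150))  # high demand: the $7 mine must run -> 7
```

This is the "natural price isn't static" point in miniature: the price floor moves up or down with demand, depending on which reserves have to operate.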
You have too many assumptions about the availability of lithium and the lack of “easy” reserves. That’s the fundamental disconnect here. Lithium is not inaccessible, and it is super common. The “hard” ways to mine lithium aren’t that hard.
We’re not talking about gold or platinum, where your discussion points would be relevant.
I’m not wishing over here. But anyways, I’ve tried to explain the reason why the market disagrees with you, and why people are rapidly seeking to integrate batteries into the grid and into cars, which would not be possible if you were correct about lithium becoming insanely expensive.
Battery production will keep scaling at a huge rate, and the prices will be fine. A modest price increase is irrelevant to anything discussed here today.
Lithium is far from easy to extract. It also needs to be refined on-site because its raw form is too low-density to transport economically. So most lithium mines require large amounts of water to fill brine pools. The number of sites actually viable for lithium production is not nearly so large.
Again, the market disagrees with you not me. Lithium commodity prices are rising, and so are battery prices. People are indeed seeking to integrate batteries into the grid - I'm not disputing that. I'm pointing out that these batteries are actually becoming more expensive [1], not less and it's unlikely this trend will reverse on a dime.
Commodity prices are rising… causing more mines to open… causing more supply… which will cause prices to drop. Why do you keep pointing to that? Your argument makes no sense. That’s how commodities scale.
I'd check out something like oil which is a large and high dollar market.
The relationship between production and price is well established, and no, production does not move inversely to price. This is because the marginal cost in almost any of these areas is higher per unit of production as production increases. That also of course makes sense logically.
Gold as well, same thing. You start by grabbing gold nuggets off the ground. The effort to mine more gold has not gotten easier; it's gotten harder over time.
> Commodity prices are rising… causing more mines to open…
Yes.
> causing more supply…
Yes
> which will cause prices to drop.
Nope! Remember, the expanded supply is only profitable to operate at higher prices. So the price will drop if demand drops, and then suppliers will fall back to the more efficient extraction sites. But if demand stays high, so will prices.
Obviously production would have to be scaled up. It's dishonest to present this as some sort of insurmountable barrier, for all potential battery chemistries or other storage modes.
Batteries are not transistors. Their input costs are skyrocketing, and sure enough the end costs of batteries are now starting to rise, too [1]. It's dishonest to pretend that continued exponential growth is guaranteed.
The inflation-adjusted cost of an automobile shrank from a million dollars to a hundred thousand in the early 1900s. Assembly line manufacturing shrank this further to $10,000 by 1920. Would it be safe to assume a car would cost $1 by the end of the century, given the past rate of a 10x price drop every two decades?
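The naive extrapolation in that rhetorical question can be run out numerically (assuming, as stated, a 10x price drop every two decades from $10,000 in 1920):

```python
# Naive trend extrapolation: 10x price drop every two decades,
# starting from $10,000 in 1920 (figures from the comment above).
price = 10_000.0                      # $ in 1920
for year in range(1940, 2001, 20):    # 1940, 1960, 1980, 2000
    price /= 10
print(price)                          # the absurd $1 car by the year 2000
```

Which is exactly the point: extrapolating a past cost curve forever gives absurd results once input costs dominate.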
The reality is that most products are not transistors. They don't get better when you make them smaller. A car will always contain a certain mass of metal, and will never be cheaper than the cost of that input. Manufacturing already accounts for under a quarter of a battery's cost. The rest is dominated by cathode and anode material [2]. Battery manufacturing has already become a resource extraction problem.
You could not be more wrong about batteries. You made a huge error about the cost percentage that lithium makes up. It’s possible — just possible — that other people have researched batteries too, and maybe they have a better sense of where the industry is going.
It makes no sense to assume that we’ve reached “peak battery”, at all.
Continuing this argument is pointless, and really does seem like you’re choosing to ignore reality. I’m done with this conversation.
And for those nickel-free battery chemistries, what inputs are the main cost drivers? And as per my response, nickel is also experiencing shortages and cost spikes.
My core assertion remains true regardless of the fact that different chemistries require different inputs: battery production has become a resource extraction problem, rather than a manufacturing problem. Unless we find a way to somehow make mining exponentially more efficient, we're not going to be seeing exponential growth in battery production.
Your core assertion remains false because (1) lithium is not the only battery chemistry, and (2) batteries are not the only, or even cheapest, storage technology.
CATL, the biggest battery maker in the world, is today ramping up sodium battery production capacity. Will you now insist on a sodium supply bottleneck?
Lithium is the only viable battery chemistry we have presently. Lead acid decays after 100-200 cycles. Who knows if sodium will be viable; it'd be a lot better to argue for its efficacy once it's actually on the market.
The other forms of storage have other shortcomings: hydroelectric storage is geographically dependent. Most of the other forms of storage people are mentioning in this thread have never been operated outside of prototypes, like hydrogen electrolysis or compressed air. Again, technological feasibility and market viability are two vastly different things. If compressed air storage works, but it's more expensive than nuclear power, what's the point?
Lithium is not, in fact, the only viable battery chemistry. CATL, the world's largest maker of batteries, is today ramping up production of sodium batteries. Numerous other chemistries are also being fielded, at $billions scale.
Pumped hydro storage is not, in fact, geographically dependent: there are a lot of hills (which you knew). Compressed air is used in production (which you knew). Hydrogen electrolysis is operated in the millions of tons (which you knew).
Relying on falsehoods is a strange way to argue. It depends on your audience remaining ignorant.
I don't care about company hype and marketing. Get back to me when I can actually buy a sodium battery and throw it on a test load to see if it's living up to the promises. It's not available yet; that's the reality. Tech that's 5 years away has a nasty habit of staying 5 years away for a lot longer than that.
I got a real kick when you insisted hydroelectric storage is not geographically dependent and immediately followed that by saying it needs hills. And it requires more than just hills: it needs a hill with a reservoir on top, another reservoir at the base of the hill to collect the water, and another body of water to fill said reservoirs. The conditions for hydroelectric storage are way more specific than "it needs a hill".
I agree, relying on falsehoods is a bad way to argue. That's why it's bad to insist that storage is an easy problem to solve in order to make the case for intermittent sources.
And then there are various thermal storage schemes. Pumped thermal (reversible thermal cycle with hot and cold storage) could have up to a 75% round trip efficiency with cheap materials and no geographical restrictions.
> Thanks for the site: it's a good tool to demonstrate just how unfeasible energy storage really is.
Turns out even good tools can give bad results if intentionally used in bad faith.
If you include a small amount of dispatch to represent long-distance transmission, biogas, or hydro, things change dramatically. Prices are for real finished projects, and are dropping very rapidly for wind, solar, and storage.
At 2.5% even using gas for dispatch would only add 10g of CO2 per kWh to the total. Well within the error margins of estimates from various low carbon sources and significantly less than taking a year longer to build out low carbon generation.
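The ~10 g/kWh figure is straightforward arithmetic, assuming a typical combined-cycle gas plant intensity of roughly 400 gCO2/kWh (that intensity is my assumption, not a figure from the comment):

```python
# Back-of-envelope check of "2.5% gas dispatch adds ~10 g/kWh".
gas_intensity = 400          # gCO2 per kWh of gas generation (assumed figure)
dispatch_fraction = 0.025    # 2.5% of total energy served by gas dispatch

added_intensity = gas_intensity * dispatch_fraction
print(added_intensity)       # 10.0 gCO2 added per kWh of total generation
```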
Restricting scope to subsets of the country:
PNW with no offshore wind, because it has the worst renewable resources:
A little off-topic, but has anyone ever developed a residential hydrogen production / storage solution? What's the smallest-scale hydrogen storage available?
> However, the world is also transitioning to Electric Vehicles, and most EV manufacturers are offering V2G (vehicle to grid) solutions, so millions of EVs can contribute
V2G is idiotic. It cost £10 for a full charge of a Tesla in the UK pre-Ukraine. Half of that money is distribution cost, not energy cost. What will you earn from buying low and selling high with V2G? Let's be generous and say it's £5.
So if the battery lasts 1500 cycles, you are going to make £7,500. And the battery costs £17,000 to replace?
No one is going to wear down their expensive car battery to earn pennies. For anyone to do this, it has to be at least 2x above break-even.
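The back-of-envelope economics in this comment can be written out explicitly (all figures are the commenter's assumed ones, not measured data):

```python
# V2G arbitrage check using the figures above: ~£5 margin per full cycle,
# 1500-cycle battery life, £17,000 replacement cost. All assumptions.
profit_per_cycle = 5.0       # £ per full cycle (the "generous" estimate)
cycles = 1500                # assumed usable cycle life
replacement = 17_000         # £ assumed battery replacement cost

lifetime_earnings = profit_per_cycle * cycles
print(lifetime_earnings)                     # 7500.0
print(lifetime_earnings >= 2 * replacement)  # the 2x break-even test: False
```

Later replies dispute the inputs (replacement cost, residual value, price per kWh), but the arithmetic itself is just this.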
> Lithium is not exactly rare or hard to extract, you can even extract lithium from saltwater, so this argument seems specious.
You can extract anything from seawater, even gold.
"Using electrochemical methods, comparable to those used in electroplating, gold has actually been extracted from the ocean, but unfortunately the cost of the process is five times the value of the gold obtained."
“It's technically possible to extract lithium from seawater,” Cui says. “But it's all about cost. And currently it's too high.”
According to one website that I found, the average residential electricity cost in the UK is $0.482/kWh. Simple math: 1500 cycles * 75kWh * $0.482/kWh = $54,225 in offset electric costs, which would more than cover the hypothetical battery replacement cost. It is well above the 2x break even that you requested.
However, I expect most people do not have electric bills that high, so those people would not allow that much of their battery to be used by the program unless they were participating for a very long time. They would be more than fairly compensated for whatever cycle life the net metering program used. Yes, I recognize that the electricity would have come from the grid at some previous point in time in most cases, so some adjustments would have to be made to the concept of net metering to make this a sensible proposal, but there are ways to make it work if you don't just declare that everyone is a bunch of idiots. Electricity is much more valuable to the grid at certain moments in time than others. "Transmission costs" are irrelevant when the grid needs more power. Any kind of V2G system would involve exporting power when the grid desperately needs it, not at random hours of the day when the grid doesn't even want it. Virtual distributed grid batteries are already being proven.[0] This solution can definitely be scaled up.
Also, that estimate of battery replacement cost is based on the battery magically having zero marginal value at the end of its lifecycle in your car, which is not at all how things work. A battery that people are unwilling to keep using in an EV will still have significant value if it is directly transplanted into a stationary storage application, but oftentimes, it is just a handful of cells that are causing balancing issues, so replacing those for a few dollars can restore a lot of missing capacity. Even after the battery is done in stationary storage, it will still have value in the highly concentrated resources which can easily be recycled into a new battery pack.
So, no, it would not cost that much to replace the battery. Even today, you could throw a used battery pack on eBay as a worst case scenario and get thousands of bucks for it, but in the future as more of these batteries are being cycled through the market, it will be more economically viable to have businesses built around making this as painless as possible.
If you consider V2G idiotic, then you're effectively asserting that any form of grid battery is idiotic because those have to work on very similar economics, and that is a strange position to hold when we have real world examples of grid batteries being profitable even today.
> Half of that money is distribution cost, not energy cost.
I also have no idea where you got the notion that UK electricity costs are 50% transmission fees. It is 3%.[1]
> You can extract anything from seawater, even gold.
I'm not saying seawater is absolutely the best available source of lithium, but some people claim to have proven real success with it.
"The researchers estimate that their system can extract 1 kg of lithium from seawater at a cost of $5 (Energy Environ. Sci. 2021, DOI: 10.1039/d1ee00354b). “Our process is quick, energy efficient, and scalable,” Lai says. “And the system runs continuously and is compact and easy to operate.”" https://cen.acs.org/materials/inorganic-chemistry/Can-seawat...
Using gold in your response the way you did seems intended to be an insult, which is completely unnecessary. I'm not proposing alchemy.
Maybe you know more than these researchers whose literal job is to know about lithium and seawater, and that's fine, but I have never claimed you can extract meaningful amounts of gold from seawater. It is infeasible. Lithium is far more common than gold, though, which makes a huge difference in the feasibility.
> I also have no idea where you got the notion that UK electricity costs are 50% transmission fees. It is 3%.[1]
I expressed myself incorrectly - I meant that the various non-generation costs (transmission, various middlemen, fees, etc.) roughly double the cost of energy between the power plant selling it and it reaching my house.
> Maybe you know more than these researchers whose literal job is to know about lithium and seawater
I am just slightly jaded from seeing many research breakthroughs, especially in batteries, never make it to market, and only really consider something once there has been at least one commercial facility operating. My understanding is that I can't buy lithium from seawater at the moment.
> If you consider V2G idiotic, then you're effectively asserting that any form of grid battery is idiotic
On the contrary, I think if you are buying a used car battery on the cheap, cycling it optimally and you don't need an overpriced Tesla-authorised mechanic to fix it, the results should be much better. Safety is less of a concern too. Or you can buy Lithium iron phosphate.
I think people are very precious about their cars and have many worries about damaging the battery, some of them irrational. There are also logistical challenges, repair is harder, etc.
I think the economics you outlined are the very best-case scenario, with the war in Ukraine causing an energy crisis and nearly free electricity assumed to be available for charging.
"Lithium is not exactly rare or hard to extract, you can even extract lithium from saltwater, so this argument seems specious. But, various alternative chemistries are being explored which could help in different ways."
Lithium is abundant as far as its total share in the Earth's crust goes, but it is my impression that mineable concentrations are rather rare, unless you are willing to spend obscene amounts of energy on purification. That is why only a few countries in the world actually produce lithium commercially.
We have some reserves of lithium in Czechia, but they would only be economically mineable if the bulk price rose significantly (as in, ten times or so), plus the mining methods would introduce a lot of poisons into the environment. Mining is usually seriously dirty.
Until recently, there wasn’t much demand for lithium. That’s the main reason for limited mining, in my opinion.
The cost of lithium was 9% of the cost of a lithium ion battery cell according to one analyst last year. Battery packs cost more than just the sum of the cells.
If lithium rises in price, it will affect battery prices, but a 10x increase would “only” double the cost of the battery cells, and the packs would be slightly less affected than that.
Realistically, there should be plenty of lithium for less than that, but it takes time to ramp, and lithium mining does not have to be destructive. The mining can be as simple as evaporation ponds in the desert[0]. (You’ll need refining either way.)
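The cost-share arithmetic behind the "10x lithium only doubles cell cost" claim is simple to write down (a sketch using the ~9% figure quoted above; the true share varies by year and chemistry):

```python
# If lithium is ~9% of cell cost, a 10x lithium price rise leaves the
# other 91% untouched and multiplies only the lithium portion.
lithium_share = 0.09     # lithium's share of cell cost (2021 analyst figure)
multiplier = 10          # hypothetical 10x rise in lithium price

new_cell_cost = (1 - lithium_share) + lithium_share * multiplier
print(new_cell_cost)     # ~1.81: roughly double the original cell cost
```

The same formula with a 50% share gives 5.5x, which is why the disagreement later in the thread over whether the share is ~13% or ~50% matters so much.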
> The mining can be as simple as evaporation ponds in the desert[0]. (You’ll need refining either way.)
The OP you are replying to lives in Czechia; they are not blessed with an optimal location. Most of the world isn't.
The optimal location is a paradox: it is a desert, yet at the same time it has to have an abundance of water for extraction and evaporation ponds. Very few locations have both. Extracting from non-optimal locations costs more.
like 4 people are trying to explain this to you, but you keep repeating the same thing
I'm pretty sure it has almost exclusively been one person that has been explaining, and they have received numerous rebuttals from other people, but okay.
And no, I did not suggest the person I was replying to should do this in Czechia. Every country doesn't need to mine every resource. Thanks for throwing in a strawman.
In reality it's 50% of the cost, so a 10X increase in input price would amount to a 5x increase in cost. Battery production has become a resource extraction problem.
> In 2021, lithium comprised 9 percent of the cost of a battery cell. Nickel represented 12 percent of the cost. But for 2022, those numbers have risen to 13 percent and 21 percent, respectively.
I had just glanced at the Google summary, so I only saw the first part. Call it 13%, then.
It is not anywhere close to 50%. The cathode is made of more than just lithium. 50% is an incorrect interpretation of the graphic in the article you linked.
My point remains: battery production has become a resource extraction problem. Even completely optimizing manufacturing to the point that it costs nothing would only reduce the cost of batteries by a quarter.
What don't I understand? Ultimately, price is determined by supply and demand. Gold's supply relative to its demand sets its price. If some massive load of gold were dumped on the market, we'd expect a downswing with increased supply and no commensurate increase in demand (although gold being bought by speculators might distort this).
We're seeing a big upswing in lithium demand, mostly for EVs. And supply isn't catching up, leading to higher costs, which will in turn lead to higher prices and consequently lower demand. Unless greater supply is secured or demand is reduced, this increase in price is going to persist.
If the predicate is that mining of a commodity cannot ever expand, then how is building new nuclear supposed to work? Where does the uranium and gadolinium for fuel rods come from?
And tautologies will persist in being tautologies.
Higher prices will result in investment in increased production, shift of demand to lower-priced alternatives, and investment in alternatives. Increased investment results in cheaper and larger production of everything.
Unobtainable lithium is a strange choice of hill to die on.
Increased investment does indeed incentivize the exploitation of less accessible reserves that would not otherwise be profitable. But those higher prices have to be sustained to keep those additional operations profitable. If prices go down, those mines close and the diminished supply pushes prices back up to equilibrium.
It's not unobtainable lithium I'm worried about. It's the difficulty of supplying 50x as much lithium over the next couple of decades without significant price increases.
> But those higher prices have to be sustained to keep those additional operations profitable.
That is not, in fact, the case. A higher price need persist only long enough to pay the capital cost of developing the resource. The millionth ton is routinely much cheaper to extract than the first ton.
And, of course, we don't need 50x lithium production in any case. That is your private fantasy. No, thank you, not buying.
Look at the lowest point of energy demand. That's base load. How big is it relative to the peak of energy demand? Depends on the season, but it's usually 70-80% of peak demand. So, the vast majority of energy demand is in fact base load. I'm really confused about why renewable proponents talk about base load all the time - it's really not relevant to decarbonization of the grid.
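The "70-80% of peak" claim can be checked against a toy demand profile (the hourly numbers below are made up purely for illustration, not real grid data):

```python
# Toy 24-hour demand profile in GW (invented numbers for illustration).
demand = [75, 73, 72, 72, 72, 74, 80, 88, 92, 95, 97, 98,
          98, 97, 96, 95, 96, 99, 100, 98, 92, 86, 80, 76]

base = min(demand)                 # base load = lowest point of demand
peak = max(demand)
base_energy = base * len(demand)   # GWh served at or below base load
total_energy = sum(demand)         # total GWh over the day

print(base / peak)                         # 0.72: base load vs peak
print(round(base_energy / total_energy, 2))  # 0.82: energy share that is base load
```

With a 72 GW floor and 100 GW peak, over 80% of the day's energy is served at or below base load, which is the "vast majority" point.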
The intermittency of wind and solar isn't just daily: you also have longer-term periods of cloudy weather blocking solar and low wind speeds hampering wind power. Actually running a majority-renewable grid requires either hydroelectricity or fossil fuels.
The majority of Germany's electricity comes from fossil fuels [1]. It's not a mostly renewable grid, occasionally supplemented by peaker plants. It's a majority fossil fuel grid supplemented by renewables. By comparison, here's France's electricity production [2]. One of these is a mostly decarbonized grid. The other is a primarily fossil fuel grid, supplemented by renewables here and there.
> However, the world is also transitioning to Electric Vehicles, and most EV manufacturers are offering V2G (vehicle to grid) solutions, so millions of EVs can contribute a portion of their battery capacity to the grid in the future, and the grid can compensate them for their contribution.
This is an idea that no sane grid operator would ever accept. First of all, most of these vehicles actually lose energy in cold weather [3]. And if people leave to go on vacation, then we have blackouts because our energy storage solution drove away for a week? Not to mention, plenty of people drive their cars around during the day doing chores or work and charge them at night. Those people are going to be a net drain on the grid.
> Lithium is not exactly rare or hard to extract, you can even extract lithium from saltwater, so this argument seems specious. But, various chemistries are being explored.
The market demonstrates otherwise. At the end of the day, if there's a shortage of lithium and the price goes up, it doesn't really matter what people are writing on tech forums.
> Look at the lowest point of energy demand. That's base load. How big is it relative to the peak of energy demand? Depends on the season, but it's usually 70-80% of peak demand. So, the vast majority of energy demand is in fact base load. I'm really confused about why renewable proponents talk about base load all the time - it's really not relevant to decarbonization of the grid.
I think this is just a confusion of terminology. What "renewable proponents" are talking about is baseload power plants. To quote the Wikipedia article that you linked to:
"Power plants that do not change their power output quickly, such as large coal or nuclear plants, are generally called baseload power plants."
If you've been missing that key piece of terminology, I can see the source of the confusion. People just call this "base load" to be short, for various reasons. You can criticize this if you want to, but that is what is happening.
> The majority of Germany's electricity comes from fossil fuels [1]. It's not a mostly renewable grid, occasionally supplemented by peaker plants. It's a majority fossil fuel grid supplemented by renewables. By comparison, here's France's electricity production [2]. One of these is a mostly decarbonized grid. The other is a primarily fossil fuel grid, supplemented by renewables here and there.
This has nothing to do with my previous comment. I have no idea what point you're trying to get at.
> This is an idea that no sane grid operator would ever accept. First of all, most of these vehicles actually lose energy in cold weather [3].
You're really confused about a lot of things in this part of the discussion. EVs have less range in winter because more of the energy is being used to heat the cabin, and that was done with resistive heating until recently (more vehicles are starting to use heat pumps). A grid operator accepting V2G would not be insane, except from the point of view of a very traditionalist one. The future is dynamic.
> The market demonstrates otherwise.
It really doesn't? The market demonstrates that demand has risen sharply, and it will take time for production to catch up. Every demand spike results in a "shortage".
"base load power plant" is a meaningless term. Base load is a feature of energy demand. Power plants produce energy, the same power plant might serve peak load, base load, or both.
EVs are not a solution to energy storage. Even ignoring the challenge of getting people to hook their cars up to the grid, the battery production figures aren't remotely close to what we need.
People have been assuming that battery cost will continue to decline exponentially. We'll need a century to provision just 18 hours of battery storage at current rates. Renewable activists hand wave this saying that battery production will increase a hundred fold. In reality, prices are rising: https://about.bnef.com/blog/lithium-ion-battery-pack-prices-....
> We'll need a century to provision just 18 hours of battery storage at current rates.
That's a weird way of spelling "enough batteries were produced in 2022 alone (about 760 GWh) to replace the entire world's nuclear fleet with a renewable mix with minimal overprovision, more flexibility to meet peak demand, and a lower forced outage rate."
And this has nothing to do with reserves and everything to do with extraction. The price spike of lithium was a combination of unpredicted demand and the supply dip from covid (because brine ponds take about 3 years to process).
Lithium will be irrelevant to grid storage anyway. There are already two GW-scale sodium-ion factories running, and the supply chain CATL is building will completely dwarf them, as it is much larger and compatible with the TWh/yr of existing lithium cell production and packaging facilities.
> People have been assuming that battery cost will continue to decline exponentially.
I think the incredible success of the transistor industry has created some very unrealistic expectations about the pace of technological advance more generally. I know this is a fallacy I used to fall for. If a pocket sized computer today can beat out a supercomputer the size of a small building from when I was a kid, anything seems possible. But in reality, the rapid miniaturization of transistors is an extraordinary outlier.
> EVs have less range in winter because more of the energy is being used to heat the cabin, and that was done with resistive heating until recently (more vehicles are starting to use heat pumps).
I thought it was this PLUS the fact that lithium batteries degrade quicker in freezing temps?
I agree cold has some effects on batteries, but the main impact on EV range is related to heating, so linking an article on EVs is not very useful when talking about stationary storage.
Batteries generate heat when charging and discharging, so grid batteries should naturally keep themselves warm, but maybe in extreme climates it would be worth adding some heat pumps to keep them in the optimal temperature range for longevity. It all depends on how the cost calculations work out, but it is not some major obstacle.
> Batteries generate heat when charging and discharging
/scratches head.
Huh? How do you explain why my iPhone drains so much quicker in the cold then, if it's able to heat itself from the discharge?
> but it is not some major obstacle.
Ignoring the efficiency of battery consumption is a hilarious oversight. If it costs you energy ("let's just put heat pumps on them") to keep an energy source from depleting faster, then you're still depleting energy faster because of the energy required to heat the thing. That's not trivial.
A tiny battery generates a very tiny amount of heat. Batteries have internal resistance, so charging and discharging is not perfectly efficient. Larger batteries generate much more heat by moving larger amounts of energy. Secondly, consider the relationship between volume and surface area. Volume grows to the power of 3, and area grows to the power of 2. A small battery naturally has a much lower ratio of volume to surface area than a big battery. With a higher ratio, it takes a lot longer for the heat to be lost to the cold because of how heat transfer works, and that's in addition to the larger batteries generating much more total heat energy. Thirdly, consider that the goal of an iPhone is to use as little power as possible so that your battery lasts as long as possible, which also means generating as little heat as possible. A grid battery does not have that same goal.
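The scaling argument can be made concrete with a toy calculation. Treating a cell or container as a cube, resistive heat generated during cycling grows with volume while heat lost to cold air grows with surface area; the watt figures below are made-up illustrative numbers, not measured data:

```python
# Toy comparison of self-heating for a phone-sized cell vs a grid-scale
# battery container, each idealized as a cube: resistive heat generated
# during cycling scales with volume (L^3), while heat lost to cold air
# scales with surface area (L^2). All coefficients are illustrative.

def heat_balance(side_m, heat_gen_w_per_m3=2000.0, loss_w_per_m2=50.0):
    """Return (watts generated, watts lost, generated/lost ratio) for a cube."""
    volume = side_m ** 3
    area = 6 * side_m ** 2
    generated = heat_gen_w_per_m3 * volume   # grows as L^3
    lost = loss_w_per_m2 * area              # grows as L^2
    return generated, lost, generated / lost

phone = heat_balance(0.01)   # ~1 cm cell
grid = heat_balance(2.0)     # ~2 m container

print(f"phone cell ratio: {phone[2]:.3f}")  # well below 1: cools off fast
print(f"grid unit ratio:  {grid[2]:.3f}")   # well above 1: stays warm
```

The ratio grows linearly with the side length, which is why the iPhone-in-the-cold experience says little about a container-sized installation.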
It is not a "hilarious oversight". Heat pumps are incredibly efficient these days even in very cold weather and there are ways to mitigate the heat loss, such as insulation. Grid scale battery deployments up to this point are typically already done in small structures, so the batteries aren't just sitting out directly in the elements. Obviously it's better if you don't have to do anything at all, but it is a very legitimate option. Every system will have some losses, no matter what. A very efficient heat pump could result in less losses than relying on the batteries to resistively heat themselves in extreme cold.
Just because it wouldn't work for a phone has nothing to do with how it would work for a grid battery, since scaling things up has a dramatic impact on efficiency for what we're discussing.
For how condescending your comment seemed to be, you seem to be missing a lot of basic points. Please give people the benefit of the doubt. I didn't come here to be a jerk to anyone, I was trying to help people understand things they seemed to be missing, but HN got super toxic over this topic. Some of this stuff is very nuanced and difficult to explain, which means that some people have no patience for it. If I'm wrong about things, or make a mistake in my explanation, I'm always happy to learn, but your comment was not attempting to teach anything.
"Base load" is just a description of how demand currently behaves; it's high now because we are using fossil fuels unsustainably and keeping intra-day price variations artificially low. It can dip much lower, and hourly pricing and spot markets mean that usage adapts to production without blackouts. (This also makes storage profitable to build where needed.)
Every fact you state may be true, but “fossil fuels are a bigger cause of blackouts than renewables” doesn’t follow from the facts you gave; you’re just assuming it is true. That is the last thing I will say on this.
I'm not assuming it. I'm referring to the very real, major blackouts that have occurred in the US over the past couple of years. These events have plenty of reliable sources that tell exactly what happened.
In the Texas blackouts, the problems were coal and gas plants going offline due to the cold that they weren't winterized against. It had nothing to do with Wind or Solar failing unexpectedly, despite the governor's claims early in the blackouts.
In the recent rolling blackouts in the Southeast, TVA and Duke Energy reported that the cause was their coal and gas plants freezing up. Nothing to do with renewables again.
I cannot recall any major blackouts caused by the fossil fuel plants operating normally while renewables failed to produce on schedule. I would love to see some examples, if they exist.
The Cleantechnica article that I linked to earlier (which you surely didn't read) also provided quotes on this exact topic, and that analysis agrees with my own, FWIW.
I could dig into the specifics of these events and provide more sources, but you don't seem likely to care.
Perhaps unintentionally, you changed your argument from “fossil fuels are responsible for blackouts” to “ I cannot recall any major blackouts caused by the fossil fuel plants operating normally while renewables failed to produce on schedule”. These are different arguments.
I am very much on team renewable, but the only reason those fossil fuel plants are needed in the first place is because of the very nature of renewables. So it’s disingenuous to say that fossil fuels are responsible for blackouts when it’s the lack of dispatchability of renewables that required fossil fuel burning in the first place.
I don’t think the GP meant anything other than this.
> So it’s disingenuous to say that fossil fuels are responsible for blackouts when it’s the lack of dispatchability of renewables that required fossil fuel burning in the first place.
No... as I recall, in the Texas blackouts, the renewables actually generated more power than originally forecasted. If the natural gas and coal plants go offline completely, that has nothing to do with the dispatchability of renewables. The fossil fuel plants would be part of the grid regardless, because it takes decades for those plants to reach the end of their lifecycle and be decommissioned.
The grid relies on every type of power doing what it says it will do. It is possible to predict when solar and wind will deliver power, and how much they will deliver. The stability of the grid relies on that. The fossil fuel plants were the ones that had problems. If they had produced power as they normally do, there would have been no blackout.
> These are different arguments.
Do you still think I changed my argument? I'm fairly sure I didn't, but I can see how you reached that conclusion without the clarifications above.
> the renewables actually generated more power than originally forecasted.
They were forecasted to produce 6% and actually made 8%, vs 41% from the prior week. You can’t blame renewables in this case, but you aren’t presenting facts in an unbiased way.
My primary point is that renewables are working the way that they say they will work. Certain people refuse to admit that fossil fuel plants struggle in adverse conditions, but then rant about renewables being somehow unreliable.
Renewable integration into the grid is heavily dependent on forecasting. You may think it is biased, but I see that statistic as things going better than forecasted. With proper forecasting, you can overbuild and make up for lower production. With proper forecasting, you can employ demand response.
When the fossil fuel plants turn off completely and unexpectedly, there’s nothing you can do, because no one planned for that. The grid was relying on the power plants, and they weren’t there. If the grid had planned for the unreliability of those plants, the grid might’ve had more renewables to make up for it, who knows.
People building renewables plan for the variability. As I’ve mentioned, energy storage is essential for the long term.
> there’s nothing you can do, because no one planned for that
You're comparing a future of renewables to a present of non-renewables. You could also add in additional generation to cover the gap you mentioned in the future as well.
You may not have intended to change your argument, but I think you were assuming evancox100 was arguing against renewables where I thought they were just making a point about your language. I can’t speak for them or you, but I think their comment was fair in the absence of the clarity you’ve now provided.
Energy is fungible, so if fossil and renewables are both online and producing energy, how can you blame one over the other? My understanding is that neither source of energy was winterized to transmit the energy to the grid, not that they were unable to produce. So blaming one or the other is a red herring.
> But unlike utilities under traditional models, they don’t ensure that the resources can deliver power under adverse conditions, they don’t require that generators have secured firm fuel supplies, and they don’t make sure the resources will be ready and available to operate.[0]
The question should squarely be: "can renewables create the same base load and peak load required to run the entire demand of the grid?" From there we can talk about the other topics, such as what is required to deliver the energy, which is cheaper to operate, etc.
> My understanding is that neither source of energy was winterized to transmit the energy to the grid, not that they were unable to produce.
Transmission was not the problem in any of these blackouts that I’m referring to. Powerlines failing affects extremely localized parts of the distribution network, but the coal and gas plants literally stopped producing. It was a production failure.
> "It is possible to “winterize” natural gas power plants, natural gas production and wind turbines, experts said, which prevents such major interruptions in other states with more regular extreme winter weather."
This is really my point. The production sources (and how the elemental source of that energy is acquired) and transmission of energy to residents can be winterized regardless of what they are (wind, gas, etc.). So the point isn't whether wind or gas is better.
> that goes back to how we desperately need energy storage and demand response.
Don't disagree (because this would create a much larger buffer), however if you have fully winterized energy production and transmission, then this point is moot.
I agree with most of what you say, but renewables can’t be counted on, so they can never be blamed. This is tautologically true and yet a meaningless point.
If a hospital loses power to lifesaving equipment, 100% of the time it is due to a failure of backup generators.
> I agree with most of what you say, but renewables can’t be counted on, so they can never be blamed.
Renewables are fairly predictable, so they can be counted on. The production is variable, but not unreliable or unpredictable. It's an important distinction, and it gives more time for the grid to coordinate with Demand Response (or peaker plants) to match load and production.
Obviously the weather models involved are still improving, but for solar especially, it's easy to predict when the sun will go down. Cloud coverage and wind forecasting are active areas of development to make things easier and more predictable for everyone.
> Renewables are fairly predictable, so they can be counted on. The production is variable, but not unreliable or unpredictable. It's an important distinction, and it gives more time for the grid to coordinate with Demand Response (or peaker plants) to match load and production.
Was Germany's months-long period of low wind in 2022 predictable in this sense? Was Germany supposed to stop using electricity for months as Demand Reduction? I buy that you can maybe store enough power for overnight demand if solar generates enough during the day (at some potentially large cost), but wind can just stop blowing for weeks. Storage cannot solve that without outrageous cost, or blackouts.
I can't find any actual accounts of the wind stopping entirely for "weeks" in Germany. It is a theoretical possibility. Periods of "low wind" are fully expected in summer, since there is usually more wind power generated in winter, although I'm not sure when you're referring to specifically. Similarly, solar produces more power in summer and less in winter.
Critically, we have the ability to transmit power over distance. That's the whole point of having a grid, instead of just having each person operating off-grid and only having access to the resources that are within arm's reach of their house. Individual solar panels may be under a cloud or individual wind turbines may experience no wind, but it is tremendously less likely for entire regions to experience this for an extended period of time.
The US has encountered several winter storms that brought a number of our fossil fuel plants to a halt for days at a time, causing blackouts. That's not theoretical.
We do have transmission, but it is lossy and expensive, and the places it is cheaper to transmit from are geographically adjacent and are going to have similar solar and wind conditions.
If you're talking about Texas' gas plants, yeah, that was not good. But gas plants are an essential part of preventing blackouts in a wind/solar grid, so I'm not sure this is a point in wind/solar's favor.
Rare "dark calm" backup can be done with gas plants. The gas can be hydrogen. Hydrogen is not cheap, compared to what natural gas typically costs, but for rare event backup it doesn't have to be cheap.
95% of stored energy in the US is hydro-electric pump storage. People really have no understanding of how the grid works. We don't just run gas plants all night.
Gas plants aren't "stored energy", you're comparing apples to oranges. 95% of stored energy amounts to little more than bupkis, the US grid doesn't run off stored energy at night.
Comparing renewables without storage with a non-intermittent source is comparing apples to oranges. Until said storage system is developed, renewables have to be paired with a dispatchable source - usually fossil fuels. Existing batteries are nowhere near the scale required to capture and re-release intermittent energy production.
Nuclear power is cheaper when built at scale [1]. When dozens of plants were being built of the same few designs, costs were less than a quarter of what they are now. Most nuclear plant construction is first-of-a-kind in the country it's being built. These have always cost more.
> Most nuclear plant construction is first-of-a-kind in the country it's being built. These have always cost more.
I don't see how this is relevant. The Vogtle reactors are here in the US. The US has plenty of experience building nuclear reactors, no? I would love it if nuclear were cost effective to build, but I would like to see any recent examples of that, anywhere in the world. Even NuScale is predicting that they won't be cost competitive with Combined Cycle gas plants.
Vogtle 3 and 4 are the first AP1000 reactors built in the US. That means most of the parts and components used in this plant are the first attempt at building and integrating such components. It's a lot cheaper to retain all this knowledge and churn out a run of, say, 2 dozen steam generators [1] instead of building them as a one-off every time. There's many such components where there's no market outside of nuclear power plants, and so there's no economy of scale to be had if we're only building 1 or 2 nuclear plants at a time.
> I would love it if nuclear were cost effective to build, but I would like to see any recent examples of that, anywhere in the world
Of course it's probably not as cost competitive with fossil fuels. The whole point is to get off of fossil fuels. This is where wind and solar really struggle: they're great at reducing fossil fuel use by ~40% by shutting down gas plants when wind and sun are available. But it's ultimately still fossil fuels forming the backbone of the grid. Nuclear provides a path towards actually removing fossil fuel generation entirely, instead of just opportunistically supplementing it with intermittent renewables.
Only one model of US reactor has ever gone down in price after repeated builds (and then not by much), South Korea's 'cheap' reactors are suddenly $10/W net when they built one somewhere else and couldn't get creative with the accounting. And the Messmer plan reactors turned out just great (in addition to going up in price with each reactor and having many hidden costs that make them not comparable to a privately funded project).
> This is where wind and solar really struggle: they're great at reducing fossil fuel use by ~40% by shutting down gas plants when wind and sun are available
I love how this number that renewables can't possibly go beyond keeps going up every month but is said with the same level of ridiculous overconfidence every time (you've got to update it to 60% now for NE Brazil, South Australia and a few other generation grids, and much of Europe has also crossed your 40% threshold too). Any realistic analysis puts the limit in the mid 70% range with no storage or overprovision and well above the threshold where biogas and existing hydro can cover the rest once you add diurnal storage and 3 day dispatchable loads like EV charging and electrolysis.
> You've got to update it to 60% now for NE Brazil, South Australia and a few other generation grids
Most of that is hydroelectricity, not wind and solar. Why stop at 60%? Norway produces 100% (or very close to it) of its electricity from hydro.
Of course, the answer is that geographically dependent energy sources aren't very useful outside places that have the right geography. Most places with hydroelectric potential are already making use of it. The question is, how do we decarbonize the rest of the grid?
I was specifically citing the wind and solar share delivered to loads on those grids over the last year and you know this, this attempt to derail is hilariously transparent and pathetic. Additionally South Australia has almost zero storage or hydro and is at 69% for the year. Their link to the rest of NEM suffered storm damage throughout the year so there has been very little interconnect, but unlike last time when they relied on imported coal, gas and hydro this has not resulted in anything remotely like a shortage. This also puts the lies about transmission being more precarious with renewables to bed.
And why stop at 60 indeed. Feed a little surplus into dispatchable loads like EV charging, district heat storage and electrolysis, and 80-90% is pretty trivial.
It's very easy to see this as you can just scale up the current mix in many grids until curtailment hits 30% or so (plus whatever portion can be peak shaved by plugging your car in at work, plus the extra during low generation for new generation including less-correlated offshore wind or vertical solar) and see how much of an obvious lie the 40% claim is, and how incapable of critical thinking someone would have to be to believe they could pass the lie off as anything related to reality.
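For what it's worth, the "scale up the current mix until curtailment hits 30%" exercise is easy to sketch. The model below uses synthetic profiles and a flat demand curve, not real grid data, so the exact percentages mean nothing; it only illustrates how served demand and curtailment both climb as capacity is scaled:

```python
# Toy model of scaling a variable-renewable fleet against flat demand:
# as capacity scales up, the share of demand served rises while the
# curtailed fraction of generation grows. Profiles are synthetic
# (a sinusoidal "solar" arc plus noisy "wind"); purely illustrative.
import math
import random

random.seed(0)
HOURS = 24 * 365
demand = [1.0] * HOURS  # flat 1 GW demand

# synthetic hourly generation, normalized to a 1 GW average
profile = []
for h in range(HOURS):
    solar = max(0.0, math.sin(math.pi * (h % 24 - 6) / 12))  # daylight arc
    wind = max(0.0, random.gauss(0.5, 0.3))
    profile.append(solar + wind)
mean = sum(profile) / HOURS
profile = [p / mean for p in profile]

def served_and_curtailed(scale):
    """Fraction of demand served and fraction of generation curtailed."""
    served = sum(min(p * scale, d) for p, d in zip(profile, demand))
    generated = sum(p * scale for p in profile)
    return served / sum(demand), 1 - served / generated

for scale in (1.0, 1.5, 2.0):
    s, c = served_and_curtailed(scale)
    print(f"scale {scale:.1f}x: {s:.0%} of demand served, {c:.0%} curtailed")
```

With real demand shapes, storage, and dispatchable loads soaking up the surplus, the served share rises faster than this toy suggests; the qualitative point is only that overprovision plus curtailment is a tunable knob, not a hard 40% ceiling.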
You already know the answer to that, as it has been explained to you dozens of times.
Once again: renewables+storage, backed by combined-cycle turbines. They will be powered by NG until synthetic fuel becomes plentiful.
Right now, most storage is at hydroelectric plants built in past decades, with small amounts of battery. In the future, it will still be small amounts of battery, along with plenty of other storage, much of it underground hydrogen and hilltop hydro.
There is no hint of any shortage of hills, most places. Flat places will use other methods.
Yes it's been explained repeatedly - but almost always in vague terms referring to "storage" but carefully avoiding any nuanced discussion of what form of storage. When they do mention storage, they mention infeasible forms of storage. I'm happy to explain the shortcomings of the ones you listed:
"Small amounts of battery" are still amounts that would take centuries to provision. Again, I don't think people realize that 1 day's worth of storage is well over a hundred times annual battery production.
Hydroelectric storage requires more than just a hill. It requires a reservoir on top of the hill, another reservoir on the base of the hill, and access to a lake or river to fill these reservoirs. This is a much more specific set of geographic features than just "a hill".
And lastly, nobody has successfully operated a hydrogen electrolysis storage facility. This is totally unproven technology.
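The arithmetic both sides keep invoking here is easy to reproduce. The 760 GWh production figure is the 2022 number cited upthread; the world demand figure is an assumed round number, not a sourced one:

```python
# Rough check of the "one day of storage vs annual battery production"
# comparison. World electricity demand is an assumed round number;
# the production figure is the ~760 GWh (2022) cited upthread.

world_demand_twh_per_year = 25_000        # assumed, roughly current global total
daily_storage_twh = world_demand_twh_per_year / 365
annual_cell_production_twh = 0.760        # ~760 GWh

years_for_one_day = daily_storage_twh / annual_cell_production_twh
print(f"one day of world demand: {daily_storage_twh:.1f} TWh")
print(f"years of 2022-level output to build it: {years_for_one_day:.0f}")
```

At 2022 rates this comes out near 90 years for a single day of world demand, so the "centuries" and "hundred times" claims hinge entirely on how fast production actually scales from here.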
It has been explained repeatedly in very specific terms.
3 hours of cyclable storage (primarily in the form of pumped hydro, because the exaggerations about geographic limits are another lie, but also in batteries) and a few tens to a few hundred hours per year of high-power dispatchable generation (again, existing hydro and biogas cover the overwhelming majority of this). The list of technologies and insignificant behavior changes that make it even easier is too long to state.
As has been explained to you numerous times, reservoirs are normally constructed, not found, and no lake or river is needed. So, you really do just need a hill with unused top. (Penstocks and turbines are also constructed.)
There will be no need for "one day's" worth of batteries, but in any case battery production is ramping up fast, and that will continue as long as demand increases.
Numerous storage technologies will be used. It is not clear yet which will be cheapest. Hydrogen might not be among them.
You need a lake to fill the artificial reservoir. Pumping water over long distances is prohibitively expensive. You also need a lower reservoir, otherwise how do you refill the upper reservoir once it's been drained? It's more than just a hill.
Numerous storage technologies are proposed. And you're just assuming that one of the proposed solutions will work.
China's reactors are cheap in China Bux. But we just got to see how 'cheap' South Korea's $2.50/W reactors are when they exported one and let slip the 'service' contract that put the final price at $10/W (net)
The most reliable output of the nuke industry is shown, again, to be dishonesty. Never trust a figure delivered by the nuke industry, or by someone who believes the nuke industry.
Just pick a claim and examine it yourself. They're all lies and half truths.
Claim: The price will come down with repeat builds
Reality: All but one model of reactor in the US went up in price with repeat builds, and that one went down only modestly. Every program except Japan's had costs that increased with time, and Japan's costs only went down modestly during a period when money was extremely cheap and the prices of other projects decreased far more.
Claim: It was TMI and chernobyl
Reality: Prices went up 20% year over year from the time the first non-turnkey reactor went online until 1979.
Claim: If there had just been more funding
Reality: Billions to tens of billions are spent on fission research every year. All reactors have substantial public support and most have government backed loans.
Claim: It's soooo dense
Reality: The majority of Uranium ore has similar energy density to fossil fuels. Uranium mines overlap in average area power density with solar farms.
Claim: Reprocessing turns waste into energy
Reality: It only uses the Pu239 and dregs of U235, which gives a boost of about 15% over not reprocessing, and works only once. Separation dumps large quantities of fission products into the environment. It creates more Pu240 and Pu241, which are the worst isotopes, and just as much fission products as not using MOX.
Claim: Breeeeders have a closed fuel cycle
Reality: No closed fuel cycle has ever been demonstrated start to finish. On top of breeders generally having horrible reliability when run in breeding mode. Every program fails at the refuelling part because it's so dirty and expensive.
Claim: Coal makes more radiation
Reality: Coal is way worse in general, but this is a lie carefully crafted by comparing one specific reactor a long time away from refuelling to an old coal plant. One reprocessing facility releases more radiation in a year than 90% of the coal ash that ever made it into the air. Dust from poorly remediated mines is also much worse.
Claim: It's available 24/7/365
Reality: EAF averages under 77%, and there are now wind farms with higher capacity factors than the EAF of many UK and French reactors (except they don't pretend to be available all the time). Nameplate capacities are often set lower than actual output, or left unchanged when expensive upgrades are made, to make load factor look artificially high.
Claim: It's dispatchable
Reality: Being able to pay a little more in maintenance to throw away energy you already paid for isn't dispatch.
Thanks. I understand your position now, which is interesting, but is there anything more verifiable than HN comments out there that is the source of your claims?
Your link literally shows reactor costs going up in price over 20% per year for reactors finished before TMI, many of which were NOAK. Prices only went down before reactors had been operated commercially, and that's only because the fixes to stop them catching fire or being offline the vast majority of time hadn't been invented (and retrofitting them to existing reactors cost just as much as adding them to new ones).
> Comparing renewables without storage with a non-intermittent source is comparing apples to oranges.
Absolutely correct. There are a lot of magical hand-wavy arguments and false stats used when comparing solar to nuclear.
My 13kW array went down to 600 W (yes, six hundred Watts) peak, not constant, during the last few weeks of rains in Los Angeles. I cannot possibly imagine an entire city relying on this for energy.
At some point we have to get real. Solar isn’t the solution. Nuclear is. Solar can help, yet it is very far from being a reliable solution.
I’ll post power output graphs when I get a moment.
You can back up solar with hydrogen at $1/W of generating capacity for those rare prolonged outages. Because they are rare, the fuel cost is inconsiderable. At the same time, the backup generators are 1/10th (or, if you use simple cycle instead of combined cycle, 1/20th) the cost of building a new nuclear power plant, per unit of output.
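The "rare backup doesn't need cheap fuel" logic is just amortization. A back-of-envelope sketch, where every input is an assumption chosen for illustration rather than a sourced figure:

```python
# Back-of-envelope for rare-event backup: amortize $1/W turbine capital
# over very few running hours and add an expensive assumed fuel cost.
# All inputs are illustrative assumptions.

capex_per_kw = 1000.0          # $1/W generating capacity (from the comment)
lifetime_years = 30
hours_per_year = 200           # rare, prolonged-outage duty only
h2_fuel_cost_per_kwh = 0.15    # assumed expensive hydrogen

capex_per_kwh = capex_per_kw / (lifetime_years * hours_per_year)
total_per_kwh = capex_per_kwh + h2_fuel_cost_per_kwh
print(f"capital: ${capex_per_kwh:.2f}/kWh, fuel: ${h2_fuel_cost_per_kwh:.2f}/kWh")
print(f"total: ${total_per_kwh:.2f}/kWh for backup energy")
```

Because so few kilowatt-hours are ever delivered, the per-kWh cost looks high but the contribution to average grid cost is small; that is the sense in which rare-event fuel doesn't have to be cheap.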
No, you can't, because nobody is offering hydrogen electricity storage. If you're okay with energy plans involving heretofore unused technology, then I've got a fusion plant to sell you.
Ah yes, your old "if no one is offering it, it cannot ever exist" argument.
Yes, I am perfectly comfortable imagining the future will be using technologies that we are not currently using. Hydrogen is not much of a stretch, as it involves integrating technologies that already exist.
Fusion doesn't need non-existent technologies either. It's just electromagnets and plasma. We just need to integrate these existing technologies to make the fusion process more efficient.
His argument is apparently "if there is some technology that it is not reasonable to expect will happen, then there is no technology that it is reasonable to expect will happen."
(Except maybe whatever advances are needed for nuclear fission to power the world, I'd guess.)
> Yes, I am perfectly comfortable imagining the future will be using technologies that we are not currently using.
Sure. OK. The problem is that this is science fiction, not reality.
I hear these kinds of arguments from people who have never done any construction project of non-trivial scale in their lives. Sure, from that perspective anything is possible.
Let me tell you about reality in the US.
If you want to build, say, a new instant-on hydrogen-based gigawatt-scale power generation plant, you need at least four things:
- A design
- A site
- Environmental studies
- Permits
The design is tightly coupled to the site. The site is tightly coupled to the environmental studies and, of course, the permits.
It could take 5 to 10 years to find a site and get it approved for a specific design.
The permits could take another 5 to 10 years in the aggregate. This means you'll get some permits in a few years and others will be a battle you will have to fight for probably a decade as things are built.
Finally, the construction project will likely take somewhere around 20 to 25 years.
You are looking at 20 to 30 years. Just for one power plant. And I could be 100% off. It could take double that time.
Here's the key:
The clock starts NOW. Which means you have to design it with the technology you have NOW. Not dilithium crystals or magical hydrogen generators that do not exist. If you want the 25 year clock to start ticking today, the only way is to design with what you have, not what you wish you could have or what you think you might have.
That's the problem with all of these hand-wavy arguments. They are fantasy.
If we got our heads out of our collective asses we could start building modern nuclear power plants very quickly. They are not fantasy. They work. And they are far better than most, if not all, of the alternatives.
We do not, in fact, need to build storage now. What we need now is renewable generating capacity to displace fossil fuel burning.
At a time in the (not very distant) future, when we have enough of that above immediate needs to spare enough to charge storage, then we will start to need storage.
There is no point in even talking about building nukes. Nukes are dead, dead, dead. Not because of regulation, or hippies, but because no one with the money would waste it building one.
No, we do. Well, perhaps saying storage isn't quite accurate. What we need is reliable power, because solar is not.
Perhaps you have not seen the charts I posted from my 13 kW array showing what we produced in the last three months compared to the same period last year?
This was caused by rain and weather. The very direct implication of this is that solar power requires an external reliable power source. Without it you could have entire cities go dark.
And so, the question is: If solar cannot work without an additional power source capable of delivering 100% of the required power for prolonged periods of time, why are we insisting on building two power systems, one solar and one using a different technology?
> There is no point in even talking about building nukes. Nukes are dead, dead, dead. Not because of regulation, or hippies, but because no one with the money would waste it building one.
Forget I said nuclear then. Solar at scale cannot happen without having a reliable power source available to support it. If we want to stick with clean sources, the only real options are wind and hydro. Nuclear, I would highlight, is cleaner than burning stuff to make energy. Yet, again, let's not discuss nuclear for the moment.
Because of the characteristics of solar you have to do at least two things:
- You have to grossly over-build by ten times or more
- You have to have a backup power source that can deliver 100% of the required peak power for minutes, hours, days and even weeks.
The grossly overbuild part is very easy math to understand. Let's take the simplest of them all: No sun at night. This means --in very rough strokes-- that if you want to store the equivalent amount of energy for night-time use, you have to double the system. One half of the array supports daytime use while the other half charges 100% efficient storage (not a reality) for use at night.
That's not the end though. In a practical reality (feeding a neighborhood, town, city) you need constant power. In a perfect day (no clouds, rain, etc.) the output of a solar array looks like an inverted parabola. Here's a chart from my system.
In order to deliver the same amount of energy as a constant-power system of the same peak power output, you need to build a solar array 1.5x larger than this. That's because the integral of the area under the inverted parabola is 2/3 the area of the enclosing rectangle. Simple math.
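That 2/3 factor is easy to sanity-check numerically. A minimal sketch (the inverted parabola is an idealization of the clear-day curve, not measured data):

```python
# Idealized clear-day output: p(x) = 1 - x^2, peaking at 1 at solar noon
# (x = 0) and reaching zero at sunrise/sunset (x = -1, +1).
# Compare the energy (area under the curve) to the enclosing rectangle.

def parabola_energy_ratio(n=100_000):
    """Midpoint-rule integral of 1 - x^2 over [-1, 1], divided by the
    enclosing rectangle's area (width 2, height 1)."""
    dx = 2.0 / n
    area = sum((1.0 - (-1.0 + (i + 0.5) * dx) ** 2) * dx for i in range(n))
    return area / 2.0

ratio = parabola_energy_ratio()
print(round(ratio, 4))      # ~0.6667, i.e. 2/3 of the rectangle
print(round(1 / ratio, 2))  # ~1.5x array needed to match constant output
```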
Now we are up to having to build a system 2 x 1.5 = 3 times larger.
This happens more often than most people might imagine. The cause, in this case: clouds. Not ugly dark clouds, but beautiful white clouds on a beautiful blue-sky day. When it comes to solar, clouds are evil.
I won't continue with the math. I'll just say that, when you consider all the issues with solar (including seasonal output, negative power coefficient and dirt) you can easily see that if you want 1 GW of output you better consider building a 10 GW array, or more. And this requires massive amounts of storage, otherwise you have no power at night or during some of the issues I presented above. As the other charts show, the last few months taught me a lot about what can happen.
Going back to having to build a 100% reliable power system that can supply 100% of the power needs to support unreliable solar. At some point you have to ask yourself: if you are going to build a full duplicate power system just to have solar, does it really make sense?
This is where reality smacks you in the face again. Sure, there are places in the world where one could use hydro and wind. That isn't going to solve the problem though. You can't use these technologies everywhere. Wind also has its problems.
This is why I tend to reach for nuclear. I can't think of any other technology that can provide 100% power availability nearly 100% of the time. The other requirement is that we have to be able to start building it now, not in ten years (see above). In the US, it could take well over 25 years to build any type of reliable-power generation plant. We just don't have the ability to move quickly any more. Which means that there's a practical limit to how far we could take solar, because it isn't reliable and it requires 100% backup.
Not a simple topic. I obviously believe in solar enough to have spent my own money and built a nice 13 kW system. I will be expanding it to 20 kW this year. I might consider going to 30 kW next year. Why? I can't charge enough batteries for the system to deliver power reliably enough to support electric vehicles. This is another reality. Most of my neighbors have small 3 to 5 kW systems. They are all screwed. I talk to them all the time. Some regret having solar because it is costing them more per month (due to leasing and the rising cost of power) than before they put these inadequate systems in. Some were told they could charge electric cars with solar, which was 100% false.
I love solar. I believe in it. I simply prefer to talk about it in real terms and not in a fantasy world where the technology is perfect, reliable and has no issues.
None of what you have posted is surprising. None of it changes the equation dictating build schedule. None of it favors nukes in any role.
There is no need for "ten times" overbuild. Instead, you just need a backup generator you can fuel at need.
Any tropical country can put up a solar farm and start exporting synthetic fuel. Until those are built, we can burn NG in shortfalls, at radically reduced average total carbon output.
Building storage after you have enough renewable overbuild to charge it from, in normal conditions, incrementally reduces duty cycle on the generator. So, for a utility, a 1.5x overbuild and a few hours' 1x storage means they hardly ever run it. A transmission line to a neighboring utility cuts the fuel bill more, and makes selling excess easier.
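A toy hour-by-hour simulation can illustrate the duty-cycle point. Everything here is assumed for illustration (a half-sine solar day, a uniform random daily weather factor, a flat load), not a real utility model:

```python
import math
import random

def generator_fraction(overbuild, storage_hours, days=365, seed=1):
    """Fraction of hours the backup generator must run. Load is a flat
    1 unit/hour; solar follows a half-sine day profile scaled by a
    random daily weather factor; storage charges only from excess solar."""
    rng = random.Random(seed)
    soc, cap = 0.0, float(storage_hours)   # storage in hours of load
    gen_hours = 0
    for _ in range(days):
        weather = rng.uniform(0.2, 1.0)    # 0.2 = heavy clouds, 1.0 = clear
        for h in range(24):
            sun = math.sin(math.pi * (h - 6) / 12) if 6 <= h <= 18 else 0.0
            # scaled so 1x overbuild on a clear day yields ~24 units
            # (one full day of load): pi * integral of the half-sine = 24
            solar = overbuild * weather * math.pi * sun
            net = solar - 1.0
            if net >= 0:
                soc = min(cap, soc + net)  # charge storage with excess
            else:
                draw = min(soc, -net)      # discharge storage first
                soc -= draw
                if -net - draw > 1e-9:     # remaining gap -> generator
                    gen_hours += 1
    return gen_hours / (days * 24)

print(generator_fraction(1.0, 0))    # no overbuild, no storage
print(generator_fraction(3.0, 12))   # overbuilt, half a day of storage
```

With 1x nameplate and no storage the generator covers every night; with 3x overbuild and 12 hours of storage it runs far less often, mostly during bad-weather stretches.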
At home, with a grid tie-in, you need the generator only when a tree takes out the lines, and you can sell excess most days. A low duty-cycle backup generator should be, above other considerations, cheap. Don't you have one? They are cheap.
In the future, 10x overbuild will be much cheaper than today, and would reduce your residual backup fuel bill if you care enough, and you can sell more power, most days.
The correct course for a public utility is to focus on exceeding 1x average renewable generating capacity, and then add a bit of very dispatchable and quickly built storage--batteries. After that, incrementally overbuild, and add cheapest usable storage--not batteries--still using the combined-cycle gas turbine at need. The more overbuild and storage they add, the better things get. Maybe add some fuel synthesis equipment and tankage, and sell excess beyond local tankage.
For home, keep your generator and/or grid tie-in ready. At 13 kW nameplate, you can usefully add some battery to carry you past peak evening price and, with a bit more, through most nights.
Your neighbors with 4 kW are substantially reducing their power bill. There was no expectation of anything else.
You can sell excess when you have excess. In some places (sadly, not all), adding battery lets you sell excess at a higher price during peak times and maybe buy back off-peak.
Ah, sorry to hear that. I opted not to build the array on the roof. Too complex. Too many issues I did not want to deal with, including damage to roofs reported by neighbors. This is what it looks like from space:
40 x 325 W panels feeding two SMA 6000W inverters. That's enough because you'll never make peak label power. The most I've seen is 11 kW peak. Lots of reasons for this. One of them being the permit authorities limiting the tilt angle of the array to 15° due to zoning/planning/whatever code. No big deal. I am adding a third inverter and more panels this year.
Not sure how things are done in New York. Here in SoCal it isn't as simple as selling excess energy back. The accounting is like going to the casino: it's heavily tilted in favor of the house. They pay a useless amount of money per excess kWh. With NEM-3 coming up, the incentives to install solar are mostly gone (which is a serious problem).
I have been looking at shifting into the tiered power metering plan. I can't do it until I take the time to fully model it and understand what will happen. A quick top-level analysis says it should actually be better than NEM TOU plans because you never go into high tiers at all, particularly with a decent system.
To make things worse, the true-up calculation is just killing people these days. We have great relationships with a dozen of our immediate neighbors. We talk about stuff like this all the time. I can't think of one who is currently happy with solar. Please understand that none of them bought their system. They were sold a small leased system that was barely adequate. So, they end up with something like a $150 to $200 per month lease payment, another $150 to $300 per month in energy bills (because they can't make enough with solar) and, on top of that, they get hit with hundreds of dollars every 12 months when the true-up calculation is done. It's crazy and NEM-3 isn't going to help at all.
Going back to what you were saying...
> There is no need for "ten times" overbuild. Instead, you just need a backup generator you can fuel at need.
I hope you are not proposing that every solar owner should have a backup generator. For one, those things are horrible polluters.
Please note that the 10x overbuild is based on simple mathematical realities. One of the examples I gave was the need to double the system to store energy for night use and a multiplier of 1.5 (a total of 3.0) as a result of energy being the integral of the ideal-case parabolic generation profile. In other words, these numbers are not a matter of my opinion, this is physics. If you want constant power 24/7 you have to start there. Once you calculate all other factors the number very easily goes above 10x.
Another example, here's a basic derating calculation:
This math requires increasing the array size by 1.5x again. We go from 3.0 to 4.5.
Etc.
If you do the math, 10x is an understatement.
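Stringing the multipliers together (using the thread's own illustrative factors, plus an assumed seasonal margin; none of these are engineering constants):

```python
# Cumulative overbuild from stacking each factor:
factors = [
    ("night-time storage (double the array)", 2.0),
    ("parabolic daily profile vs constant power", 1.5),
    ("derating (temperature, dirt, wiring losses)", 1.5),
    ("seasonal/weather margin (assumed here)", 1.5),
]

overbuild = 1.0
for name, f in factors:
    overbuild *= f
    print(f"x{f} for {name} -> {overbuild:.2f}x total")
# 2.0 -> 3.0 -> 4.5 -> 6.75; add storage losses and a bad-weather
# reserve and the total climbs toward the 10x figure argued above.
```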
> Building storage after you have enough renewable overbuild to charge it from
I understand what you are saying. However, there's an intersection of curves where things just do not line-up. It's already happening. Utility companies are raising prices and consumers are going to get hurt (NEM-3, again). I forget the exact numbers. When I installed my system 1 kWh was about $0.15 off-peak. Today, depending on the plan, you are in the $0.28 to $0.35 range.
As more solar is installed the balance will cause serious side-effects. I don't think we fully understand what will happen.
> A low duty-cycle backup generator should be, above other considerations, cheap. Don't you have one? They are cheap.
I don't need one. My system allows me to generate power (up to 4 kW) if the grid goes down. That's why I chose the SMA inverters. When I add a third inverter this will increase to 6 kW. Yes, of course, the sun has to be up and no clouds, etc.
I will eventually add batteries. It just doesn't make any sense right now. The ROI is horrible. Also, if you do the math, the amount of energy you have to store to literally weather the storm (if you want to be 100% off-grid) is insane. Looking at what happened in December/January, we are talking something in the order of 500 kWh. And, of course, this would require an equally massive array to keep the batteries charged.
My conclusion is that going 100% off grid with 24/7/365 reliable power and no compromises is pretty close to a fantasy. Before concluding this is crazy, take a look at this calculation from a battery manufacturer:
Their conclusion is you need 56 kWh to survive just FOUR days if your home requires 10 kWh per day. My air conditioning system alone draws approximately 5 kW and, in the heat of summer, it is on some 12 to 16 hours per day. That alone blows up the 10 kWh/day budget. Still, if we stick with that figure and decide we need to be 100% reliably off-grid for 30 days, not 4, the number is 420 kWh.
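The arithmetic behind those figures checks out against a simple sizing rule. A sketch, where the 0.8 usable depth-of-discharge and 1.1 design margin are my assumed typical values (they happen to land close to the vendor's 56 kWh figure):

```python
def battery_kwh(daily_load_kwh, autonomy_days, usable_fraction=0.8, margin=1.1):
    """Rough off-grid battery sizing: autonomy_days of load, scaled up
    for usable depth-of-discharge and a design margin (both assumed)."""
    return daily_load_kwh * autonomy_days / usable_fraction * margin

print(f"{battery_kwh(10, 4):.1f} kWh")   # 55.0 -> close to the vendor's 56
print(f"{battery_kwh(10, 30):.1f} kWh")  # 412.5 -> close to the 420 figure
print(f"{5 * 14} kWh/day")               # A/C alone: 5 kW x 14 h = 70 kWh/day
```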
In other words, my estimate isn't crazy at all and the idea of homes existing 100% off-grid is a fantasy for most.
Which brings me full circle to having to have 100% reliable external power sources that can deliver 100% of the power a town or city might require at any given moment for short or long periods of time. That's the part I can't reconcile. I can play with solar. I can throw money at it. However, at scale, not sure. That's why I talk about nuclear. Pick any other technology. My conclusion is solar alone isn't going to happen. Not at scale.
> Please understand that none of them bought their system. They were sold a small leased system that was barely adequate. So, they end-up with something like a $150 to $200 per month lease payment,
So these neighbors are paying the full cost of their 3 kW to 5 kW systems every 2 to 3 years? Sounds like they're right to be pissed if they didn't knowingly opt in to being early adopters in 2008, but the problem isn't the solar panels.
Even with that, they're still only paying a little above retail for their electricity. If the bill is causing hardship then maybe the solution is to stop using an order of magnitude more electricity than is necessary on luxuries?
My guess is millions of people were scammed into these ridiculous leases. People here regularly pay $500 to $600 per month for air conditioning. It isn't a luxury. It's a necessity. Solar companies showed up and sold people on $200/month leases. In that context it was a deal. Not so if you did the long-term math...but people don't do math...even in conversations here on HN. Rates went up and they got screwed. I bought my system, engineered and installed it myself. Even with that we get true-ups of a few hundred dollars every so often. Some of my neighbors have seen $2,000 true-ups. And it is going to get worse.
There's a world between running a 3.5kW output mini split in the room you spend most of your time in for 2-4 hours a day over the worst 6 weeks (which costs about $50/yr) and running a 20kW output central system 24/7. Only one of them is a necessity and only for people with health issues.
And again, having your society be full of scammers has nothing to do with the viability of solar. Nor is signing a lease whereby you pay full retail cost for the electricity generated actually a step down from not having the system installed.
It will get continually better as prices continue downward at an exponential rate, as they have done year after year.
People taken in by hucksters of any stripe suffer. There is nothing unique to or characteristic of solar in that. People who have shopped carefully are doing extremely well with solar, especially for recent installs.
Oh, so your system is shaded for 3 hours a day, has a tree over it, is wedged in a small space where 60% of the diffuse light won't hit it on a cloudy day, has no back face diffuse illumination, and you're still trying to conflate a 100% solar single location off grid system with a wind + solar utility scale mix.
Add the same constraints to a nuclear system and it a) does not exist and b) still needs a month of storage for outages.
There is nothing wrong with running a backup generator when mains power is down.
Solar alone obviously can't happen, most places. Solar + wind + hydro + (sometimes) geo + (even) old nukes + storage + xmission line + backup combined-cycle gas turbines will, instead. Backup gas turbines will burn NG at first, synthetic ammonia or hydrogen later.
Getting 100% off-grid is not a goal worth the extra cost, for most. For anyone without access, or with unreliable mains, a backup generator is just prudent. Batteries, if you have any overbuild, mean you spend on mains or run your generator less often.
At utility scale, with regional-grid tie-in, calculations come out differently. Maybe the gas turbine runs part of most weeks, this decade, rather than once or twice in winter like your backup generator. Wind and transmission line complement solar.
Battery cost is radically less than 2 years ago, and still falling. If you haven't checked lately, prepare for a surprise.
> Solar alone obviously can't happen, most places.
Exactly my point.
Yet, beyond that, the analysis shows you have to have 100% backup rated at 100% of the required peak power output.
That's the conundrum. For every deliverable GW of solar you have to have a GW of backup.
And so my question is simple: How does that make any sense at all?
Let's make sure we don't engage in hand-wavy arguments though.
Wind: Regional, seasonal and varies throughout the day
Hydro: Not available everywhere, highly seasonal. Check out this chart [0]. Hydro is susceptible to water availability. Hoover Dam lost 33% of its output capacity for this reason --about 600 MW.
Geothermal: Sounds great. Only 0.4% of US utility scale generation
Old Nukes: 20% of US power generation. We have over 80 of them. Only three of them in the west [1] (which is astounding)
Transmission Lines: They are only good for about 300 miles. Power is relatively local. Hoover Dam feeding Los Angeles is about 250 miles straight line.
Combined Cycle Gas turbines: About 20% of US capacity. None west of Texas.
Anyhow, the point is that it is easy to list all of these power generation options and lose sight of the fact that they are not available everywhere and that adding capacity is nearly impossible. In fact, if you look at the maps showing the distribution of the above power sources in the US, it isn't too hard to conclude that the west coast is not in good shape at all.
Here's the kicker: I haven't even introduced the reality that 100% conversion to electric powered vehicles will require the US to double its power generation capacity. This is monumental. Conceptually, it means we have to fully duplicate our existing generation and transmission infrastructure. Frankly, I don't know how we make that happen. Please don't say solar. Once again, nuclear keeps coming back as a source that is impossible to ignore. We need so much power to support electric vehicles that we would need to build 1,200 new 1 GW class nuclear power plants. This is impossible. Clearly a hybrid system will be required. Again, no, solar can't do it. It can be a part of it, but it just can't rise up to the occasion.
> Transmission Lines: They are only good for about 300 miles.
Say what now?
Are you trapped in the 1950s or something?
> Most HVDC links use voltages between 100 kV and 800 kV. However, a 1,100 kV link in China was completed in 2019 over a distance of 3,300 km (2,100 mi) with a power capacity of 12 GW [1].
I realise the USofA is stuck with a lot of old infrastructure that they've only recently barely started to upgrade .. but better things are possible.
> Say what now? Are you trapped in the 1950s or something
No. Reality.
Where in the US do we have a power transmission line that goes significantly farther than 300 miles?
Also, if I had to guess, I would not be surprised if the super high voltages required to go farther are so dangerous to people, wildlife and the environment that they will never be built in the US or Europe. China doesn't care about such things.
> Here's the kicker: I haven't even introduced the reality that 100% conversion to electric powered vehicles will require the US to double its power generation capacity. This is monumental. Conceptually, it means we have to fully duplicate our existing generation and transmission infrastructure. Frankly, I don't know how we make that happen. Please don't say solar. Once again, nuclear keeps coming back as a source that is impossible to ignore. We need so much power to support electric vehicles that we would need to build 1,200 new 1 GW class nuclear power plants. This is impossible. Clearly a hybrid system will be required. Again, no, solar can't do it. It can be a part of it, but it just can't rise up to the occasion.
This is an unfathomably stupid take.
If every truck including all the small ones were a Tesla Semi, and every car, pickup or motorbike were a Ford Lightning, then this is an average load of around 200-300 GW; 150-200 GW is more realistic. Simply plugging in most cars most of the time and using grid-aware charging would solve this, but if that's really unfathomable you can just duplicate whatever portion of the battery is used each day (<10% on average, less on some cars, more on others) and leave some of it at the charger and some at the solar array or wind turbine.
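The 150-200 GW figure is reproducible from round numbers. A sketch with assumed inputs (the vehicle-miles and efficiencies below are ballpark illustrations, not official statistics):

```python
# Back-of-envelope average grid load if all US road travel went electric.
light_duty_miles_per_year = 2.9e12   # ~2.9 trillion vehicle-miles (assumed)
light_duty_kwh_per_mile   = 0.35     # sedans to pickups, blended (assumed)
truck_miles_per_year      = 3.0e11   # ~300 billion heavy-truck miles (assumed)
truck_kwh_per_mile        = 1.7      # heavy trucks (assumed)

annual_kwh = (light_duty_miles_per_year * light_duty_kwh_per_mile
              + truck_miles_per_year * truck_kwh_per_mile)
average_gw = annual_kwh / 8760 / 1e6   # kWh/year -> average GW
print(f"{average_gw:.0f} GW average")  # lands in the 150-200 GW range
```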
Daily gasoline demand is fairly consistent and quite elastic, it drops about 20% in winter and has minor spikes on holidays. Price signals can and regularly do control for this, but overbuilding 20% or simply taking advantage of many of the air conditioners or heaters being off is more than sufficient.
This will lower transmission strain rather than raise it, as it allows all the energy to be transferred during off peak and allows peak smoothing.
Long haul trucking will likely opt to schedule charging in blocks rather than have two redundant batteries for each truck. This is fairly easy to do as they know where they are going. Running chargers at 3MW per bay without storage wouldn't work anyway and no large multi bay truck chargers are planned without colocated storage or generation.
You're right that 1.2TW of nameplate generation would solve this easily but it is 1.2TW of solar built at <30c/W over the next 10 years, not 1.2TW of nuclear built for an average cost of $20/W (including all the failures that still need to be paid for) and rising. 500GW solar and 300GW wind would probably also be plenty.
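The capex gap is the core of this argument, so it is worth making explicit, using the comment's own claimed unit costs (they are the commenter's assumptions, not quoted market prices):

```python
# Build-out cost at the claimed unit costs:
watts = 1.2e12                      # 1.2 TW nameplate either way
solar_cost = watts * 0.30           # <30c/W claimed for future solar
nuclear_cost = watts * 20.00        # ~$20/W claimed average for nuclear

print(f"solar:   ${solar_cost / 1e9:,.0f}B")    # $360B
print(f"nuclear: ${nuclear_cost / 1e9:,.0f}B")  # $24,000B
print(f"ratio:   {nuclear_cost / solar_cost:.1f}x")
```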
> For every deliverable GW of solar you have to have a GW of backup
That is utterly silly. Think.
You need only just enough backup generation capacity to fill a shortfall. In contrast, you overbuild solar and wind so you can charge up your storage while also satisfying peak load even under reduced yield.
You will never, ever charge storage from your backup generator: that would be beyond idiotic. So you need only a very strictly limited backup capacity, no matter how much renewable generation you have overbuilt. You will overbuild, because it is by far the cheapest way to get power, and you can sell the extra.
Wind and solar are variable, but not random: you know well ahead of time what they will produce, and whether and when your storage will run low. So, you can schedule a fuel delivery or a slice of transmission line capacity. You might charge storage from the transmission line if you are not producing enough excess, which will be cheaper than ordering in liquid fuel.
Your figures on transmission lines are badly dated. UK is building a transmission line to a solar farm in Africa, which is rather more than 300 miles away. (Siting a solar farm in the desert is also idiotic, but people who control money love the idea, so here we are.) China wants to build one to Chile, their antipodes; they figure on 50% loss, which is fine.
We will need a lot of power generation and transmission. Renewable generation is cheap and still getting cheaper fast. So it is obvious what to do: build that. Keep building that.
Diverting capital to nukes (or geo) would radically cut generation capacity. More expensive means, exactly, fewer watts per dollar. Fortunately nobody with money is dumb enough to buy a nuke, even with DoE doing their level best to sucker somebody on board.
If we are very, very lucky, civilization will not collapse before enough is built out.
> Your figures on transmission lines are badly dated. UK is building a transmission line to a solar farm in Africa.
Perhaps I have not made it clear enough that I am dealing in reality rather than things that do not exist? This cable and project are not real yet. They have yet to build the very factory that will manufacture this cable.
Also, I am talking about land-based power. The US infrastructure. Europe is screwed in so many forms that they have to resort to such things.
This cable is horrifically dangerous in so many ways. Talk about boiling the ocean! 10 GW will do that. I wonder how that accident will compare to an oil spill?
From a strategic perspective, bringing in 8% of your nation's power needs from a place like Morocco via an underwater cable does not sound like a great geopolitical strategy. Unless we want to imagine something like a large or regional war can never happen...
Then again, the UK might have few choices. I don't think they can do solar at this scale in-country. I would think they can do a massive amount of offshore wind. This is puzzling to me. I don't understand a strategy that creates such a dependency. Look at what happened to Germany and others with Russia. I don't get it.
An underwater cable pushing that kind of voltage and power for such a long distance is likely an ecological disaster waiting to happen. I'm sure they have safeguards...just like offshore oil rigs.
>> For every deliverable GW of solar you have to have a GW of backup
> That is utterly silly. Think
Here's a concept: How about we have a conversation without insults?
Explain this to me then:
Let's say you have a 1 MW array in, say, Los Angeles. Let's assume not a cloud in the sky most of the year. Not true. Let's go there anyway.
We then get pounded with rains and have black clouds and dark skies for a couple of months (pretty much the kind of thing that happened in December/January).
Power output goes down from one million watts peak to 75,000 watts peak for days and days.
How much backup --in any form, any technology-- would you propose to have in order to ensure the town, homes, people, hospitals, traffic lights, etc. relying on this 1 MW of power does not go dark during those two months?
Pick any technology. I don't care what it is. What I want from you is: We need x Watts of power as backup to survive this period. Power, not energy. There's a huge difference.
Since you think my statement is silly, I am assuming you think you can supply this town with a lot less than the 1 MW they lost during this blackout. How?
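For what it's worth, the arithmetic of the challenge itself is simple, and it is about power, not energy. A sketch with the scenario's own numbers:

```python
def backup_power_kw(required_kw, available_kw):
    """Dispatchable backup must cover the instantaneous shortfall."""
    return max(0.0, required_kw - available_kw)

# 1 MW nominal solar; storm weather cuts peak output to 75 kW for weeks.
# If the town still draws the full 1 MW:
print(backup_power_kw(1000, 75))  # 925.0 kW of backup capacity
# Storage only shifts *when* energy is delivered; over a weeks-long
# shortfall, the backup must be rated near the full load.
```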
Quite possibly, but to 100% and for use cases with few charge/discharge cycles? In any case, hydrogen provides an existence proof that renewables can get to 100%, and probably more cheaply than nuclear.
I am showing daily energy generation for the last three months and the same period 12 months ago for comparison.
I have also added day charts for January 14th through the 19th of this year (the period indicated by the red arrow) for readers to get a sense of what solar reality looks like. I've done this because it is too easy to say "On January 17th we generated 42.6 kWh" and fail to understand that between 12:20 and 13:15 the system dropped from 6.672 kW to 1.632 kW (power, not energy).
Intelligent readers will be able to take these charts, play with some very basic numbers and understand the significant issues facing solar.
If I wanted a system that delivered a reliable, usable 40 kWh of energy per day, it would likely grow from the 13 kW array I have today to somewhere between 50 kW and 100 kW (if not more). And, on top of that, I would probably need somewhere in the order of 400 kWh of batteries for storage. In other words, an unrealizable monster.
I know people are going to laugh at these numbers. These are the folks who never bother to fire-up a spreadsheet, run the numbers and reason. Take a look at the daily chart for January and tell me how much energy you would need to store to be able to have a real 40 kWh per day supply and how large the array would have to be. Then tell me how you are going to charge that pack in December, because it has to be fully charged so you can use it in January. That's the reality you need to understand. And the only way you will is to do the math.
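That spreadsheet exercise can be sketched directly. The daily yields below are invented for illustration (kWh per kW of array on good versus stormy January days, in the spirit of the posted charts; they are not the actual measurements):

```python
# Assumed run of January days: kWh generated per kW of array per day.
daily_yield_per_kw = [3.8, 3.5, 0.9, 0.6, 0.5, 1.1, 3.2, 3.6, 0.8, 3.7]
LOAD_KWH_PER_DAY = 40.0

def min_storage_kwh(array_kw):
    """Smallest battery that never runs dry over the run, assuming it
    starts full and recharges from any daily surplus."""
    balance, worst = 0.0, 0.0
    for y in daily_yield_per_kw:
        balance = min(0.0, balance + array_kw * y - LOAD_KWH_PER_DAY)
        worst = min(worst, balance)
    return -worst

for kw in (13, 25, 50):
    print(f"{kw} kW array -> {min_storage_kwh(kw):.0f} kWh of storage")
```

The shape of the result is the point: a small array needs an enormous battery to ride through the cloudy stretch, and overbuilding the array shrinks the battery requirement sharply.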
And, BTW, this is living in Southern California. Almost anywhere north of this latitude --places with far more weather-- the situation is even worse.
So your system in perfect weather has a performance ratio of around 80% of what an average system optimized for total power would produce on average over the month, and it has terrible low-light performance (almost as if it's 2010 tech, or you picked a bad inverter for your use case, or it's badly installed). You're pretending transmission and wind don't exist, that any suggested mix involving solar must be exclusively off-grid, and that dispatchable backup like biomethane and hydro for a few hundred hours a year is both impossible and never going to be involved in any nuclear-based system?
You're also trying to pretend weather is a 1:1 correlation with latitude and that wind and solar aren't anti-correlated. You know a vertical south-facing bifacial panel a bit north of Calgary will produce just as much power in December as a flat one in Singapore, right?
California already has about 7% hydro and biomass capacity. Include 7% dispatchable generation, then adjust your stats to match or look up the output of a real modern fixed tilt utility (or well designed off grid) system with a decent MPPT, adequate low light performance and bypass diodes, and you'll be able to see you've actually provided fairly strong evidence that 12hr storage, 7% dispatch and 100% overprovision (which you can use with your 3 day dispatchable load called an EV to avoid curtailment if you drive an average amount) on a 93% solar system is more than sufficient. Include transmission to the other side of a range to get less correlated weather and the storage and overprovision drop significantly.
Add onshore wind and the requirement for dispatch, overprovision and storage plummets.
Add offshore wind and HVDC and 3 hours with 30% curtailment into an electrolyser for fertilizer is overkill.
> So your system in perfect weather has a performance ratio of around 80% of what an average system optimized for total power would produce on average over the month
Sorry buddy, the level of ignorance you continue to exhibit about real-life solar is astounding. The fact that you say stuff like this continues to show you are what I call a “google search expert”. Go build something. Learn. Maybe then you’ll understand. You also have to learn to listen to people who know more than you think you know.
Perfect solar only exists in fantasy land. In the real world things are different. Nobody has a system on their roof that meets your fantasy specifications. Nobody.
You mentioned Singapore. One of our customers has a 300 kW solar array there. Care to guess how much power they actually generate? Hint: Rain. Lots of rain.
> Sorry buddy, the level of ignorance you continue to exhibit about real-life solar is astounding. The fact that you say stuff like this continues to show you are what I call a “google search expert”. Go build something. Learn. Maybe then you’ll understand.
Good thing the industry has specific metrics and models for all of these things, and there is plenty of data published for integration studies. GTI on a fictional perfect bifacial system on a good day at that latitude in January would be 70-80 kWh, not 50. A real utility system in easy transmission range of LA, including an 85-90% performance ratio, is about 50 kWh/day in January. You can see clearly from the posted graphs that the system has poor low-light performance and a less than ideal tilt, and that's where the missing energy is. A utility site or well-sited standalone off-grid system (or one where the building was designed with solar in mind) would not have this.
3.8 kWh/day/kWp in January in California isn't some magical ideal. It's completely normal.
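For readers unfamiliar with the metric: specific yield normalizes output by nameplate so differently sized systems can be compared. Using the thread's own numbers (42.6 kWh on a good January day from a 13 kW array):

```python
def specific_yield(daily_kwh, array_kwp):
    """kWh generated per day per kW of nameplate capacity."""
    return daily_kwh / array_kwp

print(round(specific_yield(42.6, 13.0), 2))  # 3.28 kWh/day/kWp
```

That puts the system in the same ballpark as the 3.8 kWh/day/kWp figure cited as normal for January in California.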
> You mentioned Singapore. One of our customers has a 300 kW solar array there. Care to guess how much power they actually generate. Hint: Rain. Lots of rain.
You completely missed the point here and made it for me. Weather is not latitude. The idea that solar is useless anywhere outside the tropics or that a summer optimised system's winter performance is representative of a winter-optimized system is a myth made up by insane conservatives. Local weather has a larger effect than 50 degrees of latitude during winter. Tilt is also very important -- you do not want to install an off grid system (or a system in a saturated market) at the angle which maximises annual output, you install it at the angle which maximises winter output.
Once Europe saturates summer PV, adding more doesn't become useless, you just slap some on a south facing wall or use it as a fence in a paddock.
Read. Pay some minimal amount of attention to what you're responding to and to new developments. You also have to learn to listen to people who know more than you think you know.
Your google search isn't a substitute for reality. Lots of words. No link whatsoever to practical, realizable reality.
We've had prolonged back-and-forth on this subject a few times. Not interested.
I am more than thrilled to talk to anyone who is actually interested in exploring and learning. Just like me. It is obvious that you have no experience whatsoever with solar. If you did you would not continue to post these platitudes. They simply do not make sense to anyone who actually owns and operates a non-trivial solar array. Zero.
I understand where you are coming from. You don't know much about this yet think you do, because Google allows you to post great-sounding statements.
As the great race car mechanic Smokey Yunick was fond of saying: when all the smoke and bullshit clears out, you have to drive the car and win the race. Smoke and bullshit = fantasy. Go build a nice solar array. Run it for a few years. Then come back and read some of the stuff you are posting. I know exactly what your reaction will be at that time.
Anyhow, as I have said in the past. Good luck buddy. Live long and prosper.
Care to share which make and model of panels the array is, what the controller is, what angle they're at and what sight lines they have? Or would that prove me right?
Being a condescending twit isn't the same thing as knowledge. And being unable to understand that a utility array designed for maximum uptime might behave differently to a home array doesn't make you right. And not understanding that clouds aren't infinitely large doesn't help your case either.
Even if you insist that all solar must behave exactly like your array, then simply upping the dispatchable power + pumped hydro to 20% of the total power for the year or adding wind still easily covers a constant load with 12 hours storage and <50% curtailment.
I like how you're unable to imagine it being windy when it rains or transmitting between places that don't rain at the same time, but imagining transmission from two states over when there's a correlated outage in the local nuclear generation (which happens just as often) is fine.
If we did live in this fiction where long term storage is impossible rather than simply not being the lowest hanging fruit I'd also far rather spend 5% of the next century building out renewables and then 5% of the time running fossil fuels when the alternative is:
Spend 30% of the next century building reactors whilst running fossil fuels, then 30% of the next century realising the uranium ran out immediately and we have to spend another 30 years building breeders and then finally realising that nuclear needs storage and load shifting too because correlated outages aren't that rare and there's not much demand for energy at 3am.
Nuclear stans assure us the road to success involves standardized reactor designs. Now imagine what happens when a terrible design flaw is discovered in that reactor type and all must be shut down to fix it.
Are you saying that will cause as many power delivery lulls as will happen with lack of wind in wind power or lack of sun in solar? That is the context of this.
The forced outage rate of nuclear is between 1% and 20% depending on how the program is run (abandoning plants that are having problems vs. maintaining the whole fleet), and outages last weeks. The scheduled outage rate is 15%. Your backup has to be able to cover several months.
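As a rough sketch of how fleet redundancy interacts with outage rates (assuming independent failures, which is exactly the assumption a common design flaw across a standardized fleet would violate):

```python
from math import comb

def prob_at_least(n_modules, p_up, k_needed):
    """P(at least k of n modules are up), modelling each module as an
    independent coin flip with availability p_up. Independence is a strong
    assumption: a shared design flaw takes the whole fleet down at once."""
    return sum(comb(n_modules, k) * p_up**k * (1 - p_up)**(n_modules - k)
               for k in range(k_needed, n_modules + 1))

# 12 modules, each 95% available, needing 10 up to meet load:
print(f"{prob_at_least(12, 0.95, 10):.4f}")
```

Under independence the fleet looks very reliable (around 98% here), which is why correlated failure modes, not average availability, are the interesting risk.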
Spending an equal amount on VRE, transmission and storage gives you a much lower shortfall per dollar for a given power target than spending the same money on nuclear. Shortfalls over 1 week are incredibly rare. The curtailed energy also decarbonizes many other things like iron reduction and ammonia production.
> To provide more context, wind and solar were both in the low $30's/MWh of LCOE (levelized cost of energy) 3 years ago[0], with that number predicted to continue falling rapidly.
This is like comparing cost of water during a flood and during a drought.
London's water authority can purify rainwater/river water for $0.10 per tonne, or desalinate seawater for $2 per tonne. Why would they go with expensive desalination?
Obviously when there is a drought there is no rainwater to purify, and it has worked out cheaper to install a more expensive, reliable source of water, than it was to create water storage for all of London to last through the worst possible drought.
Water and energy are similar in that, if they really run out, people start dying.
They are different because you can easily store a week's worth of water in your house, but try storing a week's worth of energy.
Some countries, like India, Australia and the US, can really rely on solar. But northern countries really cannot. In the UK solar panels give 10x less energy in the winter than in the summer.
And some countries don't even have good wind sources.
The biggest difference between electricity and water is that water is really hard to transport (because it's heavy) and droughts can last for years and affect giant areas. With solar and wind, on the other hand, supply is fairly predictable and variation is mostly local (i.e. it is sometimes cloudy in Germany, but Germany and Britain have almost completely uncorrelated weather). Also, it's easy to send electricity 2-4 thousand miles with only minor (10%) losses using high-voltage DC. As such you can build a grid with 60-80% renewables with minimal storage: you just make it large enough to remove local variation in weather and use a mix of wind and solar for your renewables (which are anti-correlated, giving you better reliability). You can then make up any renewable shortfall with peaker plants that burn fossil fuels, but if you have a little extra renewable capacity you can keep them from running most of the time.
Edit: Also hydro makes a really good battery for the several week timespan. It can't meet 100% of power needed but (especially if you bank water) can provide a decent percent of total demand for a while.
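The transmission-loss claim above checks out on the back of an envelope. Using rough textbook figures of about 3.5% line loss per 1,000 km plus about 0.7% per converter station (real links vary with voltage, conductor sizing, and loading):

```python
def hvdc_delivery_fraction(distance_km, line_loss_per_1000km=0.035,
                           converter_loss=0.007, n_converters=2):
    """Fraction of sent power delivered over an HVDC link.

    Default loss figures are rough rules of thumb, not data for any
    specific line: ~3.5% per 1,000 km of line, ~0.7% per converter.
    """
    line = (1 - line_loss_per_1000km) ** (distance_km / 1000)
    converters = (1 - converter_loss) ** n_converters
    return line * converters

for km in (1000, 2000, 3000, 4000):
    print(f"{km} km: {hvdc_delivery_fraction(km):.1%} delivered")
```

Around 3,000 km (roughly 2,000 miles) this lands near 11-12% total loss, consistent with the "minor (10%)" figure.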
If you actually look at those northern cloudy countries case by case, they all have solutions. You can vaguely handwave at "not everywhere has hydro resource", "some countries don't have neighbours with uncorrelated wind", "offshore wind is prohibitive in deep water", "some countries have cloudy winters" and so on, but where about 98% of the world lives, each negative in one category is met with enough positives in the other categories that VRE turns out to be the most cost-effective strategy, and the gaps can be filled with existing known solutions like W2E and turbine upgrades on hydro.
Having 2% of the world needing to source 30% of their electricity from gas isn't a good reason to put the brakes on the 99% of electricity and 80% of other energy that can be decarbonized much more quickly with wind and solar than any other choice.
Even if it were impossible to decarbonise fully with a VRE-dominant strategy, building it out until it hits around 50% curtailment avoids more emissions while the nuclear is being built than funding that nuclear 20% sooner would, and the curtailed output remains useful for producing hydrogen/ammonia/etc.
In this case the optimal strategy would be to fund both immediately (which China, India, Japan, and France are doing), rather than using hypothetical nuclear to attack and slow real VRE buildout.
Solar and wind are great! I’m sure they’ll be a huge part of the future. However, there are places that don’t have good conditions for either, and applications they struggle with, like providing large amounts of power for things like refining aluminum or casting steel. Not to mention how useful it would be if you could fit one in a Super Galaxy and power a military base with it, or quickly connect one and get it pumping power into a grid that’s experiencing blackouts. Wind and solar will probably be cheaper, but this kind of tech could still be very useful in quite a few places.
Cost estimates for novel nuclear designs have a track record of being all but worthless. I wish I could dismiss your pessimism, but the flip side of the economics (a large part of what makes this so difficult) is that if they do succeed and make SMRs a real thing, the cost could go down dramatically.
There's a lot of pressure on the industry to emphasize that the designs are "new" and not like ones that failed. Innovation is good, I think, but the reason nuclear is even in the conversation now is because it already has been done, mostly safely, and could be scaled with existing tech. This makes it a decent player for a transition energy source. High cost shoot-the-moon future designs hopefully will never be necessary.
I say this as an absolutely fervently pro-nuclear person. Comparing France and Germany is really all the information you need for this kind of case.
But I don't understand this push for small reactors outside of niche military applications etc.
Small reactors are being pushed because new big reactors in the US are stone cold dead. This is why I call them HMRs, "Hail Mary Reactors". They're nuclear's last desperate chance in the US.
I think you’re maybe wishcasting. The state and federal governments are spending billions to keep Diablo Canyon open after a rush of blood to the head thinking they could close it.
If those applications benefit from particularly favorable circumstances, then those applications will migrate to the places with those circumstances. We don't grow bananas in the Yukon; we won't put energy-intensive industries in places where energy is more expensive.
When doing calculations on such costs, we need to consider the total cost of operation of the plant, distribution costs, load balancing, etc. In the case of intermittent power sources, many calculations tend to favour them without taking into account the entire operational cycle of the power grid. These intermittent sources tend to require more hands on deck, additional backup sources or power diversions, stability management, battery storage, and a plethora of other indirect costs that go unconsidered. I really do appreciate these efforts, since they generate energy, which allows progress and growth. However, we do need to ensure our calculations and accounting for such infrastructure consider the system as a whole.
Surely if given half a chance, the cost of building SMRs will also fall? Comparing wind and solar pricing to the pricing of SMRs when this is the first one to ever have been approved, never mind built, is pretty unfair.
But, I can only speak to the numbers NuScale is providing. As I have said several times around this discussion, it would be awesome if SMRs were cost effective, and I hope NuScale can prove they’re up to the challenge, but the same people who are trying to sell the technology keep announcing that it’s going to cost more than expected, which does not make me confident. Hopefully things turn out better than expected.
I agree that scaling up production would be helpful for cost, if they avoid getting tangled in a regulatory quagmire. But will it be enough to reduce the cost by more than half?
That question isn’t without consequence. If the cost is too high there will be a populist revolt and return to “roll coal!”
You can’t impoverish people today to prevent a future catastrophe like severe climate change that can only be argued for on the basis of science many people don’t understand… not unless you are in a North Korea level dictatorship and can just shoot people who disagree.
We have combined-cycle gas turbines already, that will not be torn down. They will just be fired up only at need. Eventually they will burn synthetic fuel. At need.
> I think nuclear is a fine source of energy if you have it, but evidence over the last several decades shows that it is virtually impossible to build for myriad reasons.
> What we need is more energy storage
If nuclear is impossible despite existing, storage is Even More Impossible
Sadly, LCOE is a very bad metric when looking at intermittent and non-dispatchable generation sources in a power grid, especially as they approach a meaningful fraction of total generation. The market goal of power generation, after all, is not to produce as many kWh as possible but to satisfy demand at a specific time. System-level LCOE figures which take dispatch into account are a better way of looking at power generation costs.
Here is a pretty good paper on the topic, albeit a bit older so some parameters might have changed slightly.
System level numbers are dependent on the details of the system. LCOE has the advantage that it's independent of the system. Of course everyone understands that in specific cases one has to look at details local in time and space.
Ah, and from that abstract:
"DOSCOE shows that to cost-effectively remove the last 10-20% of fossil fuels requires a moderate price on carbon and either low-cost nuclear power or carbon capture and sequestration. Alternatively, a hypothetical zero-carbon source needs to have a net present cost less than $2200/kW to displace existing fossil-fuel plants."
A combined cycle power plant burning hydrogen satisfies that last requirement. Studies that purport to show that nuclear is needed for the last 10-20% do so by ignoring hydrogen (and other e-fuels), which slam that door in nuclear's face.
> What we need is more energy storage, whether that's in the form of traditional batteries or more novel forms of energy storage.
Batteries are a nightmare at grid scale from an environmental perspective.
Other forms of storage are needed (pumped hydro for example), or nuclear plus renewable on top of a smart-grid capable of adjusting demand instead.
It's fundamentally far more difficult and costly to adjust supply (or to buffer with storage) than it is to reduce demand during periods of low renewable generation. As more EVs and their chargers come online, instantaneous load reductions become cheap and easy - and possible.
> Batteries are a nightmare at grid scale from an environmental perspective.
Which part[0], exactly? I think most people dramatically overestimate the level of "nightmare", and battery contents are highly recyclable. We don't have a ton of battery recycling right now because there aren't enough failing batteries yet to support the necessary facilities, but several companies are starting to ramp up.
Also worth considering that even after a battery is "too old" to use in an EV, it is perfectly fine to use in stationary storage applications for quite awhile longer ("reuse") even before it is time to recycle and rebuild those components into a new battery.
> It's fundamentally far more difficult and costly to adjust supply (or to buffer with storage) than it is to reduce demand during periods of low renewable generation. As more EVs and their chargers come online, instantaneous load reductions become cheap and easy - and possible.
I completely agree with this, and most people either can't or won't see this point in discussions about renewables. The more predictable load that comes online, the easier it is to justify more production. Even if that production is using so-called "intermittent" renewables, when the need arises, asking people to voluntarily avoid charging for a day would be equivalent to adding a huge amount of production suddenly, just by removing load. (And EVs have enough range for a week of normal commuting for most people, easily. The few people who need to charge desperately would be able to charge without problems.) If you pay people for volunteering to participate in Demand Response, you will get plenty of volunteers.
"Demand response" is a critical part of the grid of the future.
> Which part[0], exactly? I think most people dramatically overestimate the level of "nightmare", and battery contents are highly recyclable. We don't have a ton of battery recycling right now because there aren't enough failing batteries yet to support the necessary facilities, but several companies are starting to ramp up.
Lithium mining is horrible for the environment. [1, 2]
We will keep doing it for as long as it remains cheaper to extract than to recycle, which is why we don't recycle. It's the reason we don't recycle the vast majority of what you put into the recycle bin.
All mining is bad for the environment to some degree or another. Lithium mining allows us to stop doing other harmful forms of mining, and it is infinitely recyclable; it isn't being blasted away into the atmosphere like gasoline. Eventually, we should have enough in the recycling pipeline that mining it becomes relatively uncommon.
The researchers claim it has the potential to be very cost effective, but that remains to be seen. Their process sounds very environmentally neutral, which is always something to strive for.
For it to be an "environmental nightmare", it has to be worse than what we’re already doing. So, no, a couple of articles complaining about lithium mining is not equivalent to evidence that this is worse for the environment than mining coal and oil, or other things you might want to mine instead of lithium.
Nickel and cobalt are more of a problem than lithium according to my understanding, but we have some nickel-free and cobalt-free battery chemistries that are becoming more common, like LFP batteries.
> We will keep doing it for as long as it remains cheaper to extract than to recycle, which is why we don't recycle. It's the reason we don't recycle that vast majority of what you put into the recycle bin.
This is a misunderstanding of the economics, then. The batteries involved are huge, so it is very hard to “lose” these lithium-rich containers. These are not small coke cans which could easily end up in a landfill. But even then, more than 50% of the aluminum in coke cans is made from recycled aluminum. Recycling giant lithium ion batteries should be very profitable for everyone involved compared to mining new lithium.
Plastic recycling is unfortunately a bad joke, of course.
> All mining is bad for the environment to some degree or another. Lithium mining allows us to stop doing other harmful forms of mining, and it is infinitely recyclable; it isn't being blasted away into the atmosphere like gasoline. Eventually, we should have enough in the recycling pipeline that mining it becomes relatively uncommon.
I agree, but the choice isn't mine lithium or burn gasoline. There are other choices. Nuclear, renewables, and a grid that can adjust demand instead of needing to adjust supply. Transit. We don't need electric cars if we have trains, and trains have pantographs or third rails so don't require batteries.
If we're willing to adjust our way of life, then we can have a much smaller impact.
Lithium from seawater is definitely interesting. But yes to your point other metals are equally or more problematic, for instance rare earths, copper, nickel, etc.
This is a very optimistic take and I like that. It’s just my experience that it’s very hard to convince people to make those kinds of changes, but that doesn’t mean we shouldn’t try to do those things.
Plain old combined cycle can handle a lot of the demand peaks. We already have that in place.
If they operate 1% or even 5% of the time, we've still cut vast amounts of carbon. There would be much lower hanging fruit than trying to replace that last fraction with nuclear. We have a solution already in place.
That doesn't mean all research on modular reactors should stop. It would have a niche if it worked. It's just not the thing holding back decarbonization, and not an excuse to hold back as much renewables as possible as fast as possible.
I don't know anything about the electric grid, but I'm surprised we don't have more pumped hydro. Seems like a great way to suck up energy from solar during the day and release it when it's needed. Guess capital costs are high compared to "oh we'll just borrow some power from your electric car if we need it"?
Many of the geographically-convenient spots to do pumped hydro in are already being used, which makes this hard to scale beyond what we currently have.
There is no hint of a shortage of places good for pumped hydro.
What is in short supply is existing hydro-power dams that have not yet been retrofitted with pumps.
Retrofitting an existing hydro plant is cheaper than building a hilltop reservoir, penstock, turbine, and pump. The latter might cost more than other alternatives. Anywhere that is true, expect to see one of the others used.
It's not a question of whether you can, it's a question of whether pumped hydro is cheaper than lithium ion batteries, and that price is heavily influenced by the available geography and water supply. Wherever pumped hydro is cheaper, then by all means, we should build a bunch of it.
It is far from clear which will be the cheapest storage medium, in each place. Count on people to install whatever is cheapest where they are at the time they build. Batteries are expensive right now, but costs are still falling. It is conceivable that a substantial fraction of installed storage will actually end up batteries.
My favorite medium, at the moment, is heavy weights hung from a disused supertanker moored over a sea trench. Each weight would have its own cable reel, with clutch and brake, sharing a shaft with the rest, the shaft driven by a winch and motor/generator kept out of the weather. A net full of ironstone riprap would serve for the weight. The winch would be whatever is the biggest available off the shelf, with the weights chosen to match the winch. Maybe 1000 tons each?
A supertanker is wide enough for multiple shafts, and you can rack together multiple supertankers. A smallish one can hold up 100,000 tons.
Taiwan has an excellent trench right off the SE shore, but there are a lot of near-shore trenches, off SE India, SW Mexico, E Korea, SE Japan, and even Monterey and Monaco. Probably a deep trench is not even needed for viability; 1000 meters is probably plenty.
A supertanker seems to run $50-100M new, probably a small fraction as scrap. There will be a lot of supertankers to scrap; it is already starting.
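A back-of-envelope check of the energy in this scheme, using E = mgh (the 85% round-trip efficiency is a guess on my part, not from any design):

```python
G = 9.81  # m/s^2

def gravity_storage_mwh(mass_tonnes, drop_m, round_trip_eff=0.85):
    """Recoverable energy of one suspended weight: E = m * g * h,
    derated by an assumed round-trip efficiency for the
    winch / motor-generator chain."""
    joules = mass_tonnes * 1000 * G * drop_m
    return joules / 3.6e9 * round_trip_eff  # 3.6e9 J per MWh

per_weight = gravity_storage_mwh(1000, 1000)  # one 1000 t weight, 1 km drop
print(f"per weight: {per_weight:.2f} MWh")    # about 2.3 MWh
print(f"100 weights (100,000 t total): {100 * per_weight:.0f} MWh")
```

So a supertanker floating 100,000 tons of weights over a 1 km drop stores a couple hundred MWh, i.e. on the order of a large grid battery installation, which is why the cheap hull and the trench depth matter so much to the economics.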
> Batteries are a nightmare at grid scale from an environmental perspective.
More tired lies.
Diurnal storage provided via LFP requires around 1 kg of lithium to serve 1 kW.
1 kg of natural uranium can likewise serve around 1 kW.
The battery lasts 12-20 years. The uranium lasts 3-6.
Mining a kg of lithium has less environmental impact than mining a kg of uranium.
Meanwhile, in reality, sodium-ion and iron batteries are made from fully abundant materials and are far closer to mass commercialisation than an SMR or even new traditional nuclear.
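Taking the figures in the comment above at face value, normalising both to mined mass per kW-decade makes the comparison explicit:

```python
# Figures from the comment, taken at face value:
# 1 kg of lithium serves 1 kW of diurnal storage for 12-20 years;
# 1 kg of natural uranium serves 1 kW of generation for 3-6 years.
def kg_per_kw_decade(kg, years_served):
    """Mined mass needed to serve 1 kW for ten years."""
    return kg / years_served * 10

li_best, li_worst = kg_per_kw_decade(1, 20), kg_per_kw_decade(1, 12)
u_best, u_worst = kg_per_kw_decade(1, 6), kg_per_kw_decade(1, 3)
print(f"lithium: {li_best:.2f}-{li_worst:.2f} kg per kW-decade")
print(f"uranium: {u_best:.2f}-{u_worst:.2f} kg per kW-decade")
```

On those numbers, uranium needs roughly 2-7x more mined mass per kW-decade than lithium, before considering the relative impact per kg mined.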
> cost increases are mainly due to the rise in construction material prices as well as financing costs; nothing inherent to nuclear power or the novel technology itself.
I would argue that construction is inherent to nuclear power, and is in fact the biggest draw back about nuclear power.
SMRs were the attempt to mitigate most of the disadvantages of a constructed product, versus a manufactured product.
There's still significant work needed to convert nuclear into a technology that has a learning curve. I think this work has some of the best insights about which technologies do or do not experience learning curves with price drops:
This is one reason (among many) to prefer Helion. It doesn't use DT, so it doesn't have the problem of closing a tritium cycle. It won't even have a lithium blanket. Tritium does get produced from DD reactions, but it can be just separated from D, stored in metal hydride beds, and allowed to decay to 3He.
> To be fair, it says the cost increases are mainly due to the rise in construction material prices as well as financing costs; nothing inherent to nuclear power or the novel technology itself.
The problem is mostly cost. Now that we're entering a higher interest environment, the situation is unlikely to improve.
Nothing inherent, except that the nuclear portions of the plant need a higher level of construction to build the (non-nuclear) support infrastructure to operate.
And the article notes that the cost increases were due to factors such as the price of steel and steel fabrication increasing. Perhaps the cost of building new reactors like Vogtle has gone up similarly?
The report cites recent large increases in the price of structural steel and copper wire as driving the cost increases, together with higher interest rates. It would seem to follow that other sources of power which rely on steel and copper, and which are financed by loans in dollars — most of them, last I checked — would be similarly affected. Nuclear power does usually involve a lot more concrete than alternatives, but this was not cited as a cost driver.
Source? I would imagine rooftop panels use aluminum, but I'd be surprised if the solar farms did, since it's more expensive and there's no particular advantage. Wind turbine blades are fiberglass, but it seems likely the towers themselves would be steel (although there are relatively few towers, making this less important), since again it has excellent cost-to-strength ratio.
But I was also referring to natural gas, coal, hydro, etc — steel is ubiquitous.
Wind towers use a surprisingly large amount of zinc. I assume this is sacrificial, to protect the steel against corrosion. But, yes, a wind turbine tower is a steel pipe bolted to a concrete base.
Solar panel mounting rail hardware I priced were all aluminum. On a solar farm it would not be surprising if the uprights were steel. Floating on a reservoir, supports are probably fiberglass.
It would be a mistake to build most solar farms not floating, but that doesn't mean it won't happen.
The thing about magnesium is that it's available in unlimited quantities. It's one of the 30 or so elements that industrial society can keep using forever without fear of exhausting mineral resources.
I think the idea behind these technologies is that they’re industrialized, and continued production will see the cost of production and operations fall rapidly over time. It’s unfair to judge a new tech on its performance relative to long-established and optimized tech.
While I don't really think a PWR in a module is that huge of an improvement, they did help to develop some certification that will help many other SMR companies.
Congrats on getting this certification through. It's a huge achievement, even with a PWR.
Sadly, if it wasn't a PWR it would essentially have been impossible in the US.
For instance my favourite reactor type - lead bismuth cooled fast reactors. They've only been actually used once, on the Soviet "Alfa" class submarines, and have some interesting advantages (lead naturally blocks gamma radiation, in case of a leak temperatures will go down and the coolant will solidify thus preventing radiation leaks, high efficiency due to the high temperatures, etc.), but are pretty expensive and impractical (the coolant solidifies if temperatures get lower than expected, thus you need specialised equipment to keep them hot/operate them 24/7).
There is a very, very large number of different reactor types you can build.
The main decision points are first, do you want to have 'slow'/'thermal' or 'fast' neutrons. If you want to have 'slow' neutrons you need some material to slow them down. This both has advantages and disadvantages.
Then you have to decide what the primary material is you want to use to fuel the reactors, here the options are basically U-238 (natural uranium), U-235 or Thorium.
Then, if you are going to slow neutrons down, there are a few options for how to do that: light water (PWRs are an example) or heavy water (like CANDU reactors). Another common option is graphite (as in the Molten Salt Reactor Experiment). But there is also more exotic stuff like beryllium.
Once you've made that choice you need to somehow cool your reactor. For that you have tons of options, all with advantages and disadvantages. Water is the common example, but water has to be under a huge amount of pressure for this to work.
There are also salt-cooled reactors (like fluoride salt), or lead-cooled, sodium-cooled, gas-cooled and so on and so on.
There are more options, of course: the fuel can be liquid or solid, and if it's solid it can be shaped in different forms. Nuclear reactors can work in so many different ways.
If you want to see some interesting examples of current commercial reactors check out for example:
- Moltex Energy SSR-W
- Terrestrial Energy IMSR
Here are lots of companies currently engaged with the Canadian regulator:
Nuclear waste is in general mostly a solved problem. A bunch of barrels less or more over the lifecycle of the plant in the storage site does not move a needle much, economically or otherwise.
Initial investment dominates the equation. It mostly does not make sense to ramp nuclear power up and down, for economic reasons. Gas costs are mostly dominated by the fuel, so there ramping up and down makes sense. Nuclear costs are dominated by the initial investment.
But even without ramping up and down, still nuclear power helps integrate variable renewables, since it is providing valuable inertia to keep the grid stable.
> Nuclear waste is in general mostly a solved problem. A bunch of barrels less or more over the lifecycle of the plant in the storage site does not move a needle much, economically or otherwise.
Storage of nuclear waste was one of several issues that led residents around the Indian Point nuclear plant to push hard for its shutdown, at which they eventually succeeded.
Other issues included ecological harm to the adjacent Hudson River due to discharge of unnaturally warm water, and (probably the biggest one) the risk of widespread death and injury in the event of an accident or terrorist attack.
Indian Point is in relatively densely populated area with relatively low-capacity highways, and there was a pervasive sense that the official evacuation plans would be insufficient in case of a disaster.
If the plan here is to make more, smaller, cheaper nuclear plants in or near more cities and towns, this means you are going to have even more towns putting up the same local resistance on those same issues. Advocates of nuclear will need to have answers.
Not really. Nobody ever wants power generation in their backyard. Doesn't matter if it's dangerous nuclear or polluting fossil fuel or eye-sore wind. The answer is always one of the same two options: do it anyway, or put it in someone else's backyard.
This is nihilistic and disrespectful of the actual humans involved.
Resistance is a matter of proportion and intensity. A few assholes will complain about wind. A lot of people remember Three Mile Island and Chernobyl, and wouldn't care about an eyesore wind farm but would worry about the effects of an accident at a nuclear plant.
You are basically saying that NIMBYs exist and therefore nobody's complaints about nuclear power are valid. Good luck with that attitude.
That's not what I said. I even called nuclear dangerous, so how did you conclude that I invalidated concerns besides NIMBYism?
Every option has downsides. Every proposal faces opposition. Every project is decided individually.
The nuclear advocates don't need a special extra requirement to convince everyone in the world to choose nuclear. They're going to go through the same process as everyone else every time.
I said something neutral about nuclear and you invented a personal attack on yourself and reprimanded me for it. Good luck with that attitude, I guess?
> Not really. Nobody ever wants power generation in their backyard. Doesn't matter if it's dangerous nuclear or polluting fossil fuel or eye-sore wind. The answer is always one of the same two options: do it anyway, or put it in someone else's backyard.
This is what I responded to. It sounds pretty absolute to me.
I'd love a small reactor in my backyard. A nice little 10 kilowatt system would be lovely. As it stands, solar on the roof and a backup battery is the dream, but if it were a self-maintaining nuclear system with a battery to smooth things out, that would be super cool!
> Storage of nuclear waste was one of several issues that led residents around the Indian Point nuclear plant to push hard for its shutdown, at which they eventually succeeded.
Solar power plants produce zero waste, and those are opposed:
The solution in many countries basically amounts to letting future generations deal with the problem. A lot of nuclear waste is being stored in temporary places awaiting a permanent solution that does not yet exist. That solution is going to be expensive and nobody is particularly eager to have that in their back yard even though the risks are very low. So, lots of countries have been deferring solving that problem and only talk in terms of hypothetical solutions that could work that somebody else (i.e. future generations) might want to pay for.
Only a few countries (e.g. Finland) have permanent storage underground for nuclear waste. But that was only opened fairly recently. France actually used to just dump it in the ocean. These days they are a bit more responsible. I think they are building a storage facility that is supposed to open some time next decade. Meanwhile, just like in most other places, they just store the waste in sealed containers and store those on site.
I would say it's a solvable problem but not a solved problem. The solution has a large price tag with absolutely no return on investment. And that's a problem that doesn't have any solution.
Ramping down means multiplying the cost per kWh generated while ramped down, at a time when that kWh has depressed value. Ever turning it off means multiplying the cost of every kWh produced while on, in proportion to how much time it spends off.
All this is because absolute cost is not much affected by whether it is producing.
Stupid question: the article says that each module can produce 50 megawatts. What is the time component of that number? Like, how many people can have their energy needs met by one module?
50 megawatts (MW) of power is (perhaps obviously) able to supply 50 megawatt-hours (MWh) of energy every hour.
According to a Google search, the average US residential customer consumes 886 kWh (~0.9 MWh) of energy per month.
50 MW -> approximately 36.5 GWh/month
36.5 GWh / (886 kWh/home) -> ~41,196 homes
So, about 42,000 homes worth of power could be supplied each month in theory, but there are a lot of asterisks on that. (One example of an asterisk: residential load factors are really low. Another quick google search suggests Phoenix, AZ homes have a load factor of 33%, so 50MW might only be good for 16,000 homes if you want to avoid blackouts. There are other factors that would affect the number further, but 16k is probably a good approximation.)
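The arithmetic above can be sketched in a few lines of Python. The 886 kWh/month and 33% load factor figures are the rough Google-sourced numbers from this comment, not authoritative data:

```python
# Back-of-the-envelope: homes served by one 50 MW module.
MODULE_MW = 50
HOURS_PER_MONTH = 730  # 8760 h/yr divided by 12

monthly_kwh = MODULE_MW * 1000 * HOURS_PER_MONTH  # 36,500,000 kWh = 36.5 GWh
avg_home_kwh = 886                                # avg US home, per month
homes_by_average = monthly_kwh / avg_home_kwh     # ~41,200 homes

# Residential demand is peaky; a 33% load factor means peak demand
# is roughly 3x average, so fewer homes can be served without blackouts.
load_factor = 0.33
homes_by_peak = homes_by_average * load_factor    # ~13,600 homes

print(f"{homes_by_average:,.0f} by average use, {homes_by_peak:,.0f} by peak demand")
```

Applying the load factor literally lands closer to ~14,000 homes than the 16,000 quoted, but both are well within the error bars of these inputs.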
In the future, with electric cars and gas cooking bans, the average energy consumption of a household will go up significantly, but perhaps it will be somewhat offset by local solar/wind generation and/or battery storage.
A typical household uses something on the order of 1-2 MWh of energy a month. This is something like 2-3 kW on average. 1 MW can thus support something like 300-500 households, and 50 MW can support 15,000-25,000 households, i.e. a small to medium size town.
That may suffice for where they are installing the first one. [1] The county has 19k people. I assume it will be tied into the grid to shed some load from the other power plants.
Strictly biologically, the necessary energy need is small: adults run at about 100 watts. So if humans could use electricity as an energy source directly (or indirectly without losses) instead of a biomass diet, 50 MW would supply about 500k people.
Electricity for our everyday applications is nice, but its demand is elastic (depends on price) and there is no rule of thumb about consumption per person outside of just measuring what various societies happen to use currently, but that's bad data to plan by as we are hugely overusing and underpricing fossils. Some communities don't use any electricity, etc.
50 MW should be able to power a small town (around 80,000 people) AFAIK. (ed: or half that, see sibling comment. I saw that an average US house uses up a kWh in about 50 minutes, but that might be with gas heating etc., so "full electric" might very well be more in the 30-40k people range.)
One way to think about this is that the maximum power of a Tesla Supercharger is 250 kW. So with one such SMR you can supply the electricity to power 200 EVs at the maximum possible power.
Also notice that 50 MW is only the approved level. Each module can actually produce more, but NRC only approved 50 MW so far. Towards the end of the article you can see that NuScale is applying for uprating the modules to 77 MW, and it's expected the NRC will review this in 2024.
Technically, the person you replied to is correct. Energy / time (energy over time) is power. Power * time is energy.
If you had a battery with 20kWh and it was empty after two hours, you would know that it was providing 10kW, which is a measure of the energy released over time, aka. average power.
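A minimal check of the power/energy bookkeeping, using the battery example above plus the module from the article (pure arithmetic, nothing plant-specific):

```python
# Energy (kWh) = power (kW) * time (h); average power = energy / time.
battery_kwh = 20.0
hours = 2.0
avg_power_kw = battery_kwh / hours  # 10 kW, as in the example above

# Same relation scaled up: a 50 MW module running flat out for one hour.
module_mw = 50.0
energy_mwh = module_mw * 1.0        # 50 MWh of energy delivered per hour

print(avg_power_kw, energy_mwh)
```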
I don't think nynx's comment made things much clearer to anyone, though.
There is a lot of discussion here about renewables. I recommend a recent paper published in Nature Communications titled "Geophysical constraints on the reliability of solar and wind power worldwide"[1]. It considers mixes of Solar and Wind, with or without excess capacity, and storage facilities zero, 3, or 12 hours. The paper's model (it looks like the code is available on GitHub) makes optimist assumptions (e.g. no transmission losses within a single country and feasibility of 12hrs of storage for the whole country). Nevertheless, the results are interesting.
Figure 3 (see [1]) indicates how much power outage must be tolerated across the entire country for different mixes of renewable capacity and storage, depending on the country:
US -- Generation 1.5 times the capacity needed, 12 hours of storage, wind and solar have a power supply gap of 15% when the goal is to tolerate 10 hours of power outage.
Germany -- Generation 3 times the capacity needed, 12 hours of storage, wind and solar have a power supply gap of 60% when the goal is to tolerate 10 hours of power outage.
The larger the power supply gap, the more additional dispatchable power that must be provided.
That study is useless from a system standpoint. Sure, it is a bit interesting to look at the basic facts, but all energy systems are more complicated. For example, it only mentions the word "sector coupling" once, in the discussion, while it is central in any renewable system.
HVDC connections are being strung up across Europe. Sweden and Norway can together in an hour vary their hydro output by 15 GW, that is 15 nuclear reactors worth of balancing power backed by tens of TWh stored.
The research on 100% renewable systems have long embraced the thought of holistic approaches.
> The majority of studies show that a global transition to 100% renewable energy across all sectors – power, heat, transport and desalination – is feasible and economically viable.[5][6][7][8] A cross-sectoral, holistic approach is seen as an important feature of 100% renewable energy systems and is based on the assumption "that the best solutions can be found only if one focuses on the synergies between the sectors" of the energy system such as electricity, heat, transport or industry.[9]
Yep, doable and already getting done in lots of places. Discussions like this usually devolve into alarmist what-ifs and vague assertions about needing something called "base load", which is a surprisingly poorly defined notion. There are now several places in the world that regularly have hours/days/weeks of being exclusively powered by renewables. And it's fine. When that happens the cost is low. When it doesn't, they pay more to import power from elsewhere or they switch on some peaker plants. Typically, without any outages or downtime.
Applying some systems thinking is indeed key. If you look at each solution in isolation, they indeed each have issues, but they are different issues. If you take them all together, you end up with a resilient grid with far fewer issues. Wind by itself has issues. But together with solar and some short-term storage, it gets a lot more resilient. There are still some issues left when you do that, because cold gloomy winter days with no wind are a thing, and those conditions can last for days or weeks in some places. We don't (yet) have storage to bridge such gaps. Months is actually unusual, but weeks is fairly common in places like Germany in the winter.
So, obviously, you can't rely on that mix exclusively. Which is something nuclear proponents love to point out, forgetting that turning nuclear plants on and off is really expensive and slow and generally not something that is done regularly. They kind of suck for backup power. Nuclear peaker plants are not a thing. Hence they like to talk about base load, because that means leaving them on permanently to provide that base load.
The alarmist view of this is that without this base load we need enormous amounts of storage to survive these horrendous apocalyptic spells of gloomy days (i.e. winters). Exaggerating here, but this goes to the core of what nuclear proponents advocate: yes it is stupendously expensive, but we have to have it because we need the "base load". The fallacy in that argument is pretending that nuclear is the only option for this and ignoring the cost aspect. Also, nobody ever specifies how much of this base load is actually needed (in GWh). They just assume that we need lots of it.
The system thinking pragmatic real world solution to this is realizing that these gloomy conditions are typically localized, seasonal, predictable, etc. and that running some cables across the continent adds a lot of resilience. Like Scandinavian hydro power, or solar power imported from places like Spain or Morocco (both of which are a thing). Moving power around with cables means you can shape and shift demand around as well. Mostly we're not talking about 100% collapses in generation but supply and demand variations of more reasonable percentages.
And of course the reality is that we have all these legacy plants still providing much more base load than is actually needed right now. We don't actually need more of that right now. It's not an urgent problem (aside from getting rid of emissions). And they aren't going to be switched off overnight and will be around for quite some time. The mix is gradually shifting to more and more renewables, all sorts of storage solutions. It's going to asymptotically converge on 100%.
I don't object to using nuclear for grid power, but it's really not a great complement for solar and wind. What you want to complement those variable sources is something that ramps up and down easily, when you hit extended reduced production periods in the variable sources. Something like nuclear, which is inherently a baseline source, which has basically no storage utility, and which is already far more expensive per unit of energy produced, even when run at high utilization, misses the boat.
Nuclear power is used as base load because of economics, not physics or regulations. France's nuclear fleet is used for load-following. See "Load-following with PWR nuclear plants" in https://www.world-nuclear.org/information-library/country-pr.... (SMRs are easier to use for load-following, since you enable and disable reactors, as opposed to using the "grey control rods" mentioned in the linked article.)
Technical ability to ramp is not the issue, it's the economic cost of ramping. A nuclear plant must operate with as high a capacity factor as possible or else the cost per kWh inflates. Almost all the costs are fixed.
Exactly. To be useful for filling the dips in renewable production, you have to run the baseline contribution of nuclear at low power, so you've got headroom to expand into when needed. That makes nuclear kWh, already inherently expensive, prohibitively so.
The true reason why nuclear does not ramp up and down easily is that you need to run it all the time to make it even remotely economical. Otherwise it won't recoup the upfront costs even in 50 years.
No, not vice versa, because people are buying and installing the renewables. It's the nuclear people who are complaining. Talk to the hand, nuclear stan.
Yep. And if nuclear advocates had a plausible story about a path to a zero-carbon, nuclear-powered grid that produces affordable power, deals with its own waste issue, and doesn't scare the shit out of the population, we should be looking at that as an option. Seen any evidence of that? I haven't.
US generation capacity in Feb 2022 was 1.2e12 Watts [1]
(let's take that as the value for 2021 as well)
US electricity consumption in 2021 was 3.93e15 Watt-Hours. [2]
Now, 1 Watt over a 365.25-day year is 8766 Watt-Hours. So, US potential generation for 2021 under our assumption was 8766 * 1.2e12 = 1.05e16 Watt-Hours. That's nearly 3x capacity over need.
So, those doom-and-gloom descriptions of inadequacy of solar and wind seem to rely on a low capacity/need rate.
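The parent's arithmetic checks out, though the ratio is closer to 2.7x than 3x (assuming, as the parent does, that the Feb 2022 capacity figure also held through 2021):

```python
capacity_w = 1.2e12           # US generation capacity, Feb 2022 (W) [1]
consumption_wh = 3.93e15      # US electricity consumption, 2021 (Wh) [2]
hours_per_year = 365.25 * 24  # 8766 h

potential_wh = capacity_w * hours_per_year  # ~1.05e16 Wh if run flat out
ratio = potential_wh / consumption_wh
print(f"capacity/need ratio: {ratio:.2f}x")  # ~2.68x
```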
Until folks in Europe stop liking each other again.
You think it’s bad when Russia plays with Gas supplies, wait until France/Germany/North/South start using power (literal) plays to mess with each other.
Your post mentions "hours of power outage" without a context of which interval this is within. I think it is annually from looking at your citation, so "hours of power outage per year"
Please don't post in the flamewar style and please don't break the site guidelines with complaints about downvotes, tendentious generalizations about the community, etc. All of this noticeably lowers discussion quality.
"Why Small Modular Reactors Can’t Compete With Renewable Energy"
"So the physics of thermal efficiency are important. So is modularity and manufacturability. There’s an optimizing curve in there that the SMR firms are trying to figure out"
Looking at the control room displays they use imperial units all over the place - lb/ft/F. I'm very surprised that none of that stuff is metric. I wonder if this is part of some regulatory requirement.
These old units are common in engineering applications across North America, the US especially. When I was in chemical engineering school, we had to be effectively bilingual in terms of units. We even used some bizarre units like the lb-mol, defined as the number of atoms in 12 lbs of carbon-12 (ie 454 times more than a gram mol).
I am an enormous advocate for nuclear as a transition energy source and I know energy geeks love decentralization, but I don't think nuclear and decentralization is a good mix.
The basic rationale is that you are risking a few dead spots on the planet (Chernobyl, Fukushima, etc.) in exchange for not destroying the entire planet, but that only makes sense if the number of places you are risking is quite small.
Ideally, it would be places that are already in use for radiological purposes.
Well, it says it's an LWR, and the video says "fuel rods", which means solid rods and meltdown risk. Eh.
If it's a solid fuel rod, then if you get a runaway reaction, and if circumstances mean the safety systems go offline (see: Fukushima) then meltdown.
Contrast this with something like LFTR: the liquid fuel needs to stay in a certain shape/containment/vessel to maintain criticality. If it starts overreacting/overheating, the "plug" at the bottom of the containment melts and the liquid flows into a shallow distributed pool in which, per nuclear physics, it is impossible to maintain criticality.
That type of system is inherently meltdown-proof, even if all the systems go offline, the plug will melt. You know, assuming gravity still works.
A pebble bed, where the solid fuel rod is replaced by a bunch of solid pellets, works similarly: if they get too hot you can likewise melt a plug so the pellets fall into a shape that wouldn't stay critical. That might also be meltdown-proof, but I haven't read nearly as much on pebble bed designs.
Nuclear engineer here. That's not quite how it all works.
Solid vs. liquid fuel is not tied directly to reactivity stability, as quantified in the power coefficient of reactivity. If it goes up in power, you want the chain reaction to naturally go down.
In solid fueled reactors, this is usually accomplished via the moderator. If the moderator temperature goes up, it reduces in density, thereby reducing the overall neutron moderation in the core. Fewer neutrons make it down to the energy range that causes fission, so reactivity goes down and the reaction stops. This is inherently stable, just like fuel density in a fluid fuel reactor going down and reducing the overall fission rate.
LFTRs are pre-melted. You melt 100% of the core and then bring it critical. That's a lot of pretty mobile fission products!
As for the melt plug, that's also a false solution. Given that achieving subcriticality is trivial in modern solid and fluid fuel reactors, the challenge in an accident is afterglow heat removal. As you may know, the plants at Fukushima had all rods in and were fully subcritical an hour before the tsunami hit. But when they lost afterglow heat removal, it still melted some containment barriers. The same can happen with fluid fuel, regardless of whether or not you've moved it from one tank to another.
Fluid fuel is not the panacea many people want to think it is.
Passive afterglow heat removal is the thing that lets reactors of any fuel form be safer than today's typical reactors, which generally require backup power to run the cooling systems. If you use certain molten salt, liquid sodium metal, liquid lead metal, etc. cooling configurations you can achieve indefinite heat removal without any external power. That reduces core damage frequencies by about 100x from modern large LWRs. Again, regardless of fuel form.
That said, 100x safer than current nuclear, which is already very safe, is kind of just playing with very small numbers. Fossil and biofuel combustion kills 8 million people per year from particulate emissions, according to the WHO, and also causes climate change. So we should just be building hundreds of regular large water-cooled reactors now, then switch over to fancier-cooled ones later, and also breeders that are ~infinitely sustainable for the long term.
For people who don't know this stuff, like myself a few years ago:
When uranium atoms are hit by neutrons, some will split immediately, releasing new neutrons, but some will go into an unstable state and then split some period of time later. If the instantaneous splits alone are enough to keep the reaction going, that's called a "prompt critical" configuration, usually seen only in atomic weapons. If the neutrons released by both the immediate and the delayed reactions together are enough to keep the reaction going, that's merely "critical".
Because there are many atoms in the reactor that have been hit by neutrons and are unstable but haven't split yet, a reactor continues to release a lot of heat even when it's no longer critical: initially somewhere around 6-7% as much power as when it was fully on, decaying from there.
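For a feel for how that afterglow heat falls off, the Way-Wigner approximation is a standard back-of-the-envelope formula (this is my addition, not from the thread; real analyses use tabulated ANS decay-heat standards):

```python
def decay_heat_fraction(t_s, t_op_s=1e7):
    """Way-Wigner approximation: decay heat as a fraction of full power,
    t_s seconds after shutdown, for a reactor operated for t_op_s seconds.
    Order-of-magnitude only."""
    return 0.066 * (t_s ** -0.2 - (t_s + t_op_s) ** -0.2)

for t in (1, 60, 3600, 86400):  # 1 s, 1 min, 1 h, 1 day after shutdown
    print(f"{t:>6} s: {decay_heat_fraction(t):.2%} of full power")
```

The takeaway: roughly 6-7% of full power at the moment of shutdown, around 1% an hour later, and still a substantial heat source for days, which is why passive afterglow heat removal matters so much.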
> Passive afterglow heat removal is the thing that lets reactors of any fuel form be safer than today's typical reactors, which generally require backup power to run the cooling systems.
NuScale does not require backup power. They solve the afterglow heat removal problem by running the reactors under millions of gallons of water. By the time it all boils away (about a month), passive air cooling is sufficient.
I agree the freeze plug isn't magic, but passive heat removal can be much easier if you can move the fuel into a different containment with different geometry and different passive heat removal features.
However, it has to be noted that most of the companies that work on molten salt reactors don't use that method: Terrestrial Energy and Moltex Energy, for example.
Meh, I dunno I think freeze plugs are really falling out of favor in general. They take a long time to melt, aren't that reliable or predictable, and can spuriously actuate, dumping the whole core.
Why make two vessels when you already need one? Just add passive afterglow heat removal to the one and you're done. Moving stuff around for no reason doesn't add anything.
If you have a breeding reactor with a blanket, and a graphite core that you might want to replace, then being able to easily remove material from the core might be a good thing.
People currently doing commercial designs have moved away from this; they mostly just do fast reactors, and mostly don't want to do breeding or core replacements.
So if the moderator fails (like Fukushima and all meltdowns) and the active cooling fails, what does the passive cooler do to drop the neutron economy/chain reaction in the fuel rods? It just keeps the rods cool so they don't melt through the floor, and they do that until the rods finally drop the economy?
I still don't like it, because nothing in the fuel rod safety does anything about the continued criticality. What are the passive cooling systems, big heat sinks and pipes? What happens if an earthquake or explosion disrupts the heat sinks or the heat pipes' connection to the solid rods, or makes the coolant leak out?
Speed of melt of the plug doesn't seem like a big deal, you simply use a thinner plug if you're worried about that. Dumping out of the core in a liquid fuel isn't a big deal, if the core is intact but a dump-out occurs with a plug you simply replace the plug and send the liquid fuel back into the reactor.
I mean, liquid fuel reprocessing obviously isn't simple, the materials around the liquid fuel and high temperature isn't simple.
The bottom line is that you can downvote me, but I'm basically an example of the first tier of people you need to convince for politically viable nuclear. Here's what LFTR really appeals to me on:
- total fuel use. Yes I know it won't be 0% waste, those fission products can be nasty, but ... still total fuel use is a big selling point to me
- ability to breed / consume spent fuel waste ... clean up the mistakes of the past
- modular : some hope to be economical
- contained on single facility: no transport of waste, no disruption of transportation infrastructure, no risk of terrorism/hijacking, no Yucca mountain
- closed loop economics: you see the full lifecycle. No hiding costs in reprocessing or transportation or storage, you have the facility, it's operating cost, and you know.
- safety: you didn't refute that liquid fuels are safer than solid rods. No one really knows what at-scale processing of MSR fission products involves, so I could be wrong, but "mobile" implies "processable" to me.
I've always wondered that even if LFTRs aren't economical, they might be an economical cleanup facility: let cheaper nuclear designs generate the power, then send the spent fuel to a LFTR facility that ... maybe ... melts the spent fuel and breeds/processes the products, and at least the processing cost is offset by the power you get from the LFTR and the useful/valuable products.
The real issue is that nuclear isn't cost competitive with solar/wind, and might not be competitive with solar/wind+storage. Solar/wind and especially batteries are going to go through nonlinear cost improvement in the next decade that ... probably ... drops their cost by half.
So nuclear will only be a load leveller tech, and needs to compete with hydro (and pumped hydro storage), geothermal, whatever comes out of synthfuels/"green" hydrogen.
I'm of the view we need to invest in nuclear research, but going all-in on nuclear plants? Nope, the nuclear industry should have gotten off its tush 20-30 years ago with a more compelling design that addresses full lifecycle and safety and economics.
Nuclear should have recognized the enormous opportunity global warming represented, but the nuclear industry seems so full of people hostile to "greens" (from the antinuke conflicts) and to regulators that it couldn't bring itself to align with left-wing environmentalism.
The problem with waste destruction is that it can always be made cheaper by waiting. Storing spent fuel in dry casks is cheap. The net present value of the cost of destruction declines the longer you wait. It turns out to be cheaper to just keep the waste in dry casks indefinitely.
I think the point about pebble-bed designs is that the density of reactive material in the pebbles is low enough that, without external control, they maintain a moderate temperature, low enough not to cause a meltdown under any circumstances.
Engineered to cool by convection and gravity, yes. Somewhat safer than reactors that melt without continuous active cooling. But that comes at a cost: the good parts of low power density AND the bad part of low power density, such as being "huge" compared to a tiny little submarine reactor of similar power level.
Really, any reactor "could" be engineered this way, but it does make them big. And "big" competes against the natural desire of engineers to run them at high temps, thus high pressures, to keep efficiency high. But what if lower efficiency results in net cheaper and safer electricity? It's not like they're paying Silicon Valley prices for the land, and nuclear fuel is stunningly cheap, so burning twice as much is still cheaper than coal, LOL.
A lot of the "old school" reactor design was based around the nuclear navy where both weight and volume are NOT cheap, not cheap at all. I don't think you could ever have a pebble bed reactor in an aircraft carrier.
This is one of the things that annoys me about the hype about DT fusion reactors. "They can't melt down like fission reactors!" "Yeah, if your fission reactor had 1/40th the volumetric power density of a PWR it would be really hard to make it melt down too."
Sometimes in these discussions you see some precious spirit advocating a DT fusion reactor for use in ships.
It also means the fuel takes up much more volume. This has two downsides: the fuel now becomes more expensive to make, and it becomes more expensive to dispose of.
I would love to see SMRs deployed at older coal-fired power plant sites to replace the baseload power those facilities offered. These locations are already connected to their respective grids and are environmentally degraded.
Meh, old reactor tech with new "engineered" safety features. I would have liked to see stuff more like FAST or slow wave reactors with inherent physics based safety features.
A big part of the problem with today's reactors is that they are full of water, which requires huge heat exchangers (often bigger than the reaction vessel but still safety critical) and a huge steam turbine.
Even if heat were free you'd have a hard time making the steam turbine powerset competitive in 2023.
Nuclear might be able to compete if we can get rid of the water. In Japan they are talking about producing hydrogen directly with thermochemistry, no powerset at all. There is also talk of coupling fast reactors or molten salt reactors to that kind of process.
The claimed price of a NuScale reactor isn't going to beat a large LWR, but it might actually be buildable at the price that NuScale quotes, whereas the large LWR struggles.
If you want "the power to save the Earth" you have to get costs down and reactors that can do that are still a decade + out.
Thermochemical hydrogen — solar or nuclear — has been studied for a while. The simulation thermodynamic efficiency numbers are excellent (beating electricity generation by 1.5-2x), but the reaction cycles in practice tend to leak process chemicals or corrode equipment too quickly to be sustainable (even losing, e.g., 0.1% of your iodine per cycle is unacceptable). I believe Japan is considering a reactor made of tantalum. Canada started such a project in 2010 that was supposed to be online by 2016ish but has continued hitting roadblocks.
The steam loop is a tiny fraction of current nuclear reactors costs. If you didn’t need to worry about nuclear safety etc then a pure steam loop would be wildly profitable.
The steam turbine and other systems that are bloated by low-temperature overhead comprise much of the rest. Some of the "nuclear island", such as the steam generators, is also bloated by low-temperature overhead.
It is no accident that we stopped building coal-burning power plants at the same time we stopped building LWRs and that is because gas turbine power plants with much lower capital cost became available.
28% of construction costs not total costs. If we are assuming magic such that you don’t need fuel then you also don’t need armed security, nuclear decommissioning etc etc.
Construction costs are the big cost in nuclear power. Uranium comprises between $0.0015/kWh and $0.000015/kWh.
You don't really need armed security, but a couple guys with guns in America are a dime a dozen.
Decommissioning is $300-400M after a 30-50 year lifecycle, and operators are generally allowed to collect that money over the plant's life. [1] That's compared to the $17B in construction costs for Vogtle.
Uranium costs are the kind of meaningless fact that’s true and wildly misleading. Fuel rods are not simply long sticks of unprocessed uranium.
Refueling is expensive because of many separate costs. Even simply being forced to take a power plant offline for a long period is inherently expensive. Similarly, building a cooling pond and equipment to move extremely high-level nuclear waste is costly. Add up all those individual costs and fuel represents a significant fraction of the total lifetime costs for a nuclear reactor.
> We don't actually know what most decommissionings will cost. $300M is just a lower bound.
Of course we do, we've decommissioned plants before. About 200 commercial and 500 research reactors. That's a sufficient sample size. [1]
> Nukes' fuel cost seems low only in comparison to their other very high costs.
$0.0015/kWh is objectively cheap on an absolute scale. I don't know if you noticed, but California pays about $0.19/kWh, so this would be about 0.79% of the delivered cost.
The fact it's expensive on a relative scale is irrelevant because it's cheap on an absolute scale. In fact the price of raw uranium input is 100X lower than that in a breeder reactor because more of the fuel is consumed, $0.000015/kWh. Being objectively cheap also allows you the flexibility to collect it in more expensive ways, for instance seawater extraction (which makes nuclear renewable) is only double the price, and falling as the technique is improved.
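Taking the figures in this thread at face value, the share-of-retail arithmetic is trivial to check (the $0.19/kWh California price is the number quoted above, not an official tariff):

```python
uranium_cost_per_kwh = 0.0015    # $/kWh, upper figure cited in this thread
breeder_cost_per_kwh = 0.000015  # $/kWh, ~100x lower in a breeder, per the comment
retail_price_per_kwh = 0.19      # $/kWh, California retail price quoted above

share = uranium_cost_per_kwh / retail_price_per_kwh
print(f"uranium: {share:.2%} of the delivered cost")  # ~0.79%
```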
> That’s ~7% of the total cost for solar per kWh and doesn’t even get you to fuel rods.
Which again doesn't matter because it's still objectively and on an absolute scale very cheap. About 0.8% of the cost you pay for electricity.
Solar is great, nobody, certainly not me, is trying to tell you not to build solar.
The reality is the future of the grid is going to be a mix of generation sources. That's going to include solar. It should, in my opinion, also include nuclear due to their different generation characteristics.
> The only thing that matters in economics is the relative scale. Nuclear being more expensive compared to the alternatives is a deal killer even if it’s not that expensive per kWh in absolute terms.
I disagree because it has different supply characteristics. One supplies a constant amount over a long period of time and is difficult to adjust. The other varies massively over the course of a day and zeroes out at night. Solar alone isn't going to meet needs, you need either or both of base load plus storage in addition.
It's disingenuous to compare the price of a kWh of solar by itself to nuclear when one works at night. If you want power at night, which I think many of us do, then you need to price into the $/kWh rate the cost of storage. You need to compare like for like.
[edit] In 2021, utility scale solar-plus-storage with a capacity of 50 MW/200 MWh is estimated to reach $0.085-$0.158/kWh. Nuclear is $0.131-$0.204/kWh. [1] They're actually quite comparable, and we have line of sight to making nuclear cheaper. Again, nobody is advocating for a 100% nuclear grid, it's not possible, because it's only suited to providing base load. A 100% solar grid is impossible because of the night time.
[edit2] Are you not reading what I'm saying? Nobody is advocating for a 100% nuclear grid. I am advocating for a mixed grid of renewables and nuclear where each operates according to its optimal utility function.
> [edit2] Are you not reading what I'm saying? Nobody is advocating for a 100% nuclear grid. I am advocating for a mixed grid of renewables and nuclear where each operates according to its optimal utility function.
The point was illustrative. We already have nuclear, the only point of advocating nuclear is if you want to increase it. Unfortunately, a 35% nuclear grid costs more per kWh than a 30% nuclear grid, and I don’t think most nuclear advocates understand why.
Your 50 MW/200 MWh numbers are basically what it costs for a 100% solar grid. So it’s not even clear if any nuclear would be cost effective in most areas. Alaska and Russia clearly can benefit from nuclear power, it’s not obvious if California, Texas etc will.
The article is talking about small modular reactors, assuming the price per kWh is similar for 3x50MW as it is for current 1GW designs they should be able to lower the 18c/kWh average prices. https://www.electricitylocal.com/states/alaska/
Wind power generation does not, in fact, require any rare-earths at all. (And, they are anyway not rare.) You have asserted this falsehood, and been corrected, many times before. Please do not repeat this falsehood again.
Copper is used heavily in all electric power systems, so does not count against any. Likewise, all industrial energy infrastructures depend heavily on mining, less-electric ones most of all. So, please do not throw up this irrelevant canard again.
The only thing that matters in economics is the relative scale. Nuclear being more expensive compared to the alternatives is a deal killer even if it’s not that expensive per kWh in absolute terms.
Again, if 7% represented total fuel costs that might be a helpful benchmark, but nuclear's actual fuel costs are higher than solar's total costs.
Yes, that's right: even if the only costs were fuel rods, nuclear would already be more expensive than solar per kWh.
Edit: In response to your edit unsubsidized Nuclear is currently more expensive than unsubsidized solar + batteries which can not only provide 24/7/365 power but actually respond to changing grid demand. Base load power isn’t a benefit it’s a major limitation to adoption because demand isn’t constant.
A true apples to apples comparison shows a 100% Nuclear grid would required vast price increase, while a 100% solar grid is roughly the same price as what we pay today.
Edit2: “Nuclear is $0.131-$0.204/kWh” is only for base load; nuclear costs skyrocket if you want to respond to changing grid demand on a 100% nuclear grid. Rough estimates are close to 50c/kWh for a pure nuclear grid, which is why nobody built one, and why even France was forced to import and export a large fraction of its generation and usage.
There is no place for nukes, simply because they cost so many times more than renewables. Per kWh, they cost even more when run intermittently.
What you need in backup generation is cheap construction. Storage cost is falling even faster than solar and wind.
We already have the combined-cycle gas turbines. Their opex falls with duty cycle. Duty cycle falls with renewable generation buildout and, eventually, storage buildout.
First of all you're double-counting decommissioning costs because as I said operators are collecting decommissioning costs during operation. So that XXc/kWh includes the cost of decommissioning but you're talking about it like it's a surprise on top after the fact. It's not.
While you are actually underestimating the cost of Indian Point decommissioning (the scale of which was a surprise to me too - but likely attributable to there being 3 reactors), it has a $2.4B fund that's already been collected and held in trust for its clean-up. But of course Indian Point is in New York, home of the most expensive subway track on earth - $3.9B per mile for the Q line extension. So decommissioning a 67-year-old 3-reactor NPP costs the same as 0.58 miles of Q line extension.
Since commissioning, it generated 777TWh based on a 73% lifetime capacity factor (src: Wikipedia). About 1/4 of NYC's entire energy needs for 60 years. Decommissioning cost $0.0029/kWh generated. Why are we still pretending this is expensive?
On the other hand you know who isn't responsible for collecting the cost of recycling during operation? Solar and wind operators.
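A quick back-of-the-envelope check of the figures quoted above (the $2.4B fund and 777 TWh are the commenter's numbers; treat them as rough):

```python
# Sanity check of decommissioning cost per kWh generated,
# using the approximate figures quoted in the thread above.
fund_usd = 2.4e9                      # decommissioning trust fund, dollars
generated_kwh = 777 * 1e9             # 777 TWh lifetime generation (1 TWh = 1e9 kWh)

cost_per_kwh = fund_usd / generated_kwh
print(f"{cost_per_kwh:.4f} $/kWh")    # → 0.0031 $/kWh
```

That lands a hair above the $0.0029/kWh quoted (the small gap presumably reflects rounding in the generation estimate), but either way it is a fraction of a cent per kWh.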
I don't know why people keep saying that nuclear has a waste problem when even cursory research will show it simply doesn't, due to the sheer energy density of the fuel. Further, fast-neutron reactors produce two orders of magnitude less waste, with incredibly short half-lives.
The reality is we've been okay with the quantity of waste in large part because the input uranium is just so cheap it hasn't been economically effective to reprocess and reuse it, and nuclear has been such a whipping boy of the so-called environmentalist movement that we just haven't meaningfully invested in developing the technology in decades. The outcome of this so-called environmentalism in the 60s though was 60 years of coal and oil plants.
We've had a renewable, effectively-unlimited source of clean zero-carbon energy for decades now. We've simply elected not to pursue it.
We know the biggest cost of nuclear is the up-front expenditure and there was a really good interview recently with the head of the DOE loan program, Jigar Shah.* He talked about how a ton of the cost was simply that each NPP is a one-off snowflake, and that there were huge savings to be realized by standardizing designs and copy-pasting them onto favorable spots. There are tons of low hanging fruit that we can actually invest in addressing that would dramatically reduce the cost. We just lack the motivation. Stop mistaking the current state with the potential.
Yet, no one has been willing, in decades, to lend any of their own money to building a nuke. What are you missing? That is a rhetorical question; the topic is dead and buried. Good day.
A story I've been gathering bits and pieces of evidence for but haven't put together completely is that nuclear decommissioning projects, unlike construction projects, frequently end up completed ahead of schedule and under budget.
This is even true in cases where the situation is unprecedented and people are having to develop new techniques.
As is so often the case, it isn’t any one issue like decommissioning that’s the problem alone. It’s that nuclear has such a wide range of costs that they collectively become expensive even if each cost on its own isn’t prohibitive.
In other words, if we only needed to pay for fuel rods and waste management then nuclear would be wildly profitable. Similarly, if the only cost was a large workforce and expensive maintenance then again it would be wildly profitable. Being forced to act as base load generation, with long periods offline for refueling, isn’t a deal killer on its own. If it were just the long construction times and NIMBY issues, that would be fine. Etc.
Unfortunately because there is such a diverse range of costs there isn’t a single silver bullet that’s going to solve all problems with nuclear power. At best by addressing individual issues we might increase the percentage of electricity generated by nuclear power. That’s very realistic and IMO a worthwhile goal.
If your intuition was correct, I think we'd see a trend towards much smaller steam turbines with fewer stages. It's a deliberate choice to engineer them at the size scale they are: the marginal efficiency gains from the largest [0], lowest-pressure stages have to justify their cost.
So why don't they just build a ton of un-safe nuclear reactors where they used to test nukes and use long-distance high-voltage lines to transfer the power?
If you can test Tsar Bomba somewhere - why can't you build a nuclear reactor there that might melt down?
They can't even get people to put nuclear waste in Yucca Mountain, which just so happens to be adjacent to the Nevada Test Site. That's one of the most contaminated locations in the entire United States. The US government detonated 928 nuclear weapons there between 1951 and 1992.
There's a difference between dropping a bomb in the desert and hauling nuclear waste across federal highways through people's "land" to get to Yucca Mountain. It's not exactly an apples-to-apples comparison you're making here.
While it is definitely full of NIMBYism, there is a bit more complexity to the Yucca Mountain decision.
I don't know to what extent the public is aware of it but another problem with Yucca Mountain is that used LWR fuel is by no means waste and it doesn't make sense at all to dispose of it in its current form.
At best the LWR gets 2% of the energy out of natural uranium. A fuel cycle that removes the small fraction of fission products and feeds the plutonium and uranium back into reactors can extract vast amounts of energy from today's "nuclear waste". It is the plutonium that is radioactive for tens of thousands of years; if you use it as fuel, the remaining fission products decay quickly and are less radioactive than the original ore in less than 1000 years.
So Yucca Mountain makes no sense from the viewpoint of the nuclear industry (which isn't going to fight for it), and if some people don't like it there is no point in pursuing it.
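The "at best 2%" figure above can be roughly cross-checked. A minimal sketch, using assumed round numbers that are not from the comment (typical LWR discharge burnup of ~45 GWd per tonne of enriched fuel, ~8 tonnes of natural uranium consumed per tonne of enriched fuel, and ~930 GWd per tonne if the uranium were fully fissioned at ~200 MeV per fission):

```python
# Order-of-magnitude estimate of the fraction of natural uranium's
# fission energy a once-through LWR extracts. All inputs are
# assumed round numbers, not authoritative figures.
burnup_gwd_per_t = 45           # typical LWR burnup, GWd per tonne of enriched fuel
nat_u_per_enriched_t = 8        # tonnes of natural U per tonne of enriched fuel
full_fission_gwd_per_t = 930    # GWd per tonne if fully fissioned (~200 MeV/fission)

extracted = burnup_gwd_per_t / nat_u_per_enriched_t   # GWd per tonne of natural U
fraction = extracted / full_fission_gwd_per_t
print(f"{fraction:.1%}")        # → 0.6%
```

So a once-through cycle gets well under 1% under these assumptions, which makes the "at best 2%" claim look like a generous upper bound rather than an understatement.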
Yucca Mountain is irrelevant to the nuclear industry; the US government has already agreed to deal with the fuel, so it’s quite literally not a problem for the industry.
The Act established a Nuclear Waste Fund composed of fees levied against electric utilities to pay for the costs of constructing and operating a permanent repository, and set the fee at one mill per kilowatt-hour of nuclear electricity generated. Utilities were charged a one-time fee for storage of spent fuel created before enactment of the law. … The Nuclear Waste Fund previously received $750 million in fee revenues each year and had an unspent balance of $44.5 billion as of the end of FY2017. … In late 2013, a federal court ruled that the Department of Energy must stop collecting fees for nuclear waste disposal until provisions are made to collect nuclear waste.[12]
Anyway, an LWR can extract more than 2% of the energy in its fuel; it mostly comes down to how enriched the uranium you feed it is, because the U235:U238 ratio in the fuel an LWR burns is different from natural uranium's. They can extract far more energy from weapons-grade uranium, but using it would be a bad idea.
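The quoted Nuclear Waste Fund numbers are internally consistent, which is easy to verify: a fee of one mill ($0.001) per kWh raising ~$750M per year implies a particular amount of nuclear generation.

```python
# Cross-check of the Nuclear Waste Fund figures quoted above:
# one mill per kWh at ~$750M/year of fee revenue.
fee_per_kwh = 0.001        # one mill per kWh
annual_revenue = 750e6     # dollars per year, from the quote

implied_kwh = annual_revenue / fee_per_kwh
print(f"{implied_kwh / 1e9:.0f} TWh/year")   # → 750 TWh/year
```

That implied ~750 TWh/year is in the same ballpark as actual annual US nuclear generation (roughly 800 TWh/year in recent decades), so the quoted fee and revenue figures hang together.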
Agreed - in one case you're irradiating the land and air. In the other case you're safely trucking low-risk spent fuel in safe, secure enclosures and likely retaining the ability to reprocess and re-use that fuel in the future.
This is currently a big problem with renewable energy in the UK and other places. One reason you see negative wholesale costs for electricity in some places is you have a lot of generating capacity but no power lines to get it to demand.
It turns out the lead time to build long power lines is long and it is a politically difficult proposition because you have to get permits for a whole line from Point A to Point B.
One of the ways where the sticker price of renewables is higher than what is quoted is the cost of transporting it and one advantage of nuclear is it could be sited closer to demand in some cases.
I believe a compounding factor in this problem was a 2015 decision to ban onshore wind farms in England.
Leading to the somewhat perverse situation where it's now necessary to build new or bigger lines from Scotland all the way down to the south of England where most of that generated energy is needed.
Nuclear weapons and nuclear meltdowns don't have comparable radiation fallout. It's not even remotely similar. Nuclear bombs release radiation in a bang (with the worst of it decaying within a couple of days, IIRC), but nuclear meltdowns release materials that continue to be radioactive for an eternity.
Think about Chernobyl vs Hiroshima. Chernobyl is uninhabitable and will remain so for a very long time. Hiroshima was rebuilt in the exact same spot that was destroyed and is a healthy, thriving city, by all accounts.
Even in some far-out place, nuclear fallout will eventually make its way into the air and water of the world, count on it.
Could you expand on why the disappearance time is so much faster for a bomb than for a power plant? If 1kg of uranium undergoes fission, I would expect there to be a little less then 1kg of fission products resulting from it. No matter if it were a bomb or a power plant, the amount and lifetime of the fission products would be the same. There would have to be something else going on, like the bomb only splitting a tiny fraction of its uranium, or maybe something about the environment of the explosion destroying fission products?
At a very high level, bombs try to convert as much of their fuel into energy as possible. The ideal bomb consumes most of the fuel and produces a lot of very nasty, extremely short-lived nuclear waste, making a big detonation over a small fraction of a second. Waste products should be short-lived isotopes, to make them even more deadly as weapons.
Meltdowns aren’t controlled reactions the waste includes perfectly useful fuel, short and long lived waste products, plus a mix of things such as control rods and the walls of the reactor etc. https://en.wikipedia.org/wiki/Corium_(nuclear_reactor)
There’s a few other effects such as mushroom clouds moving material away from the blast location, and reactors containing more nuclear fuel.
Do we actually have that much control over what the fission products are? My impression was that you hit U-235 with a neutron, and you get the same kinds of fission products out whether it's a bomb or a reactor. It's just determined by physics what kinds of isotopes are likely to come out, and my impression was that we don't know how to influence it so that only the shorter-lived kinds are created. Bombs may be designed to consume a very high fraction of the fuel, but that would tend to make them worse, since the fission products are more radioactive than the starting uranium, and there will be more of them. The mushroom cloud thing does make sense as an explanation for why Hiroshima is still inhabited, though. And it's clear that corium wouldn't be able to form in the middle of a nuclear bomb explosion.
We don’t have direct control over the specific products, but as I understand it the extreme flux of neutrons in a nuclear bomb destabilizes large atoms.
Even if you don’t care about public safety, worker safety is going to be expensive. You can’t pay someone enough to handle fresh from the core fuel rods by hands because it will quickly kill them.
Similarly, you need a design that’s likely to last long enough to pay back construction costs.
Finally, there are logistical issues in locating power plants in the middle of nowhere. You need massive quantities of water and a large workforce, plus dedicated power transmission to someone in need of power, etc.
Reminds me of what we humorously learn from The Systems Bible[0]: "When a fail-safe system fails, it fails by failing to fail safely."
The book mentions Three Mile Island, where a problem in a secondary system (an added safety system) spread and caused the system as a whole to fail. This is a tongue-in-cheek way of illustrating a serious issue when designing systems, though I wonder if that interpretation of what happened at Three Mile Island is a bit of a stretch? (And I may misremember the book.)
"The accident to unit 2 happened at 4 am on 28 March 1979 when the reactor was operating at 97% power. It involved a relatively minor malfunction in the secondary cooling circuit which caused the temperature in the primary coolant to rise..."[1]
> a problem in a secondary system (an added safety system)
"Secondary" in nuclear parlance for a PWR refers to the loop of water that cycles through the steam generators and turbines, while the "primary" loop cycles through the reactor and steam generators.
Not to detract from your point, which is a good one, and the pressurizer relief valve that stuck open and through which the cooling water escaped was indeed an added safety system.
"Stationary Low-Power Reactor Number One, also known as SL-1 or the Argonne Low Power Reactor (ALPR), was a United States Army experimental nuclear reactor in the western United States at the National Reactor Testing Station (NRTS), later the Idaho National Laboratory, west of Idaho Falls, Idaho. It experienced a steam explosion on the night of January 3, 1961, killing all three of its young military operators, and pinning one of them to the ceiling of the facility with a reactor vessel plug. The event is the only reactor accident in U.S. history that resulted in immediate fatalities."
Personally I'm not worried about the biggest nuclear accident so far. I'm worried about the potential future very unlikely but very severe nuclear accident that kills a non-negligible fraction of the population. An accident that makes Chernobyl look small.
Sure, it's a risk vs reward thing and my comment was just focusing on the risk - a risk that I think is continually downplayed because we are bad at appreciating the costs of extremely unlikely but extremely bad events that have never occurred before.
If the reward is high enough the risk might be justified. Personally I doubt it (mostly because economically solar + wind + power storage seems like a better bet), but that's a whole other discussion.
NuScale's main safety feature is an enormous pool of water, so the reactors can cool down without human intervention. That's more physics than engineering.
Not really. You have to engineer that water containment, delivery, and steam venting. Sure, it's making use of physics for some of those aspects, but there are still others that rely on how it was designed and manufactured.
I don't disagree that there are more-elegant proposed reactors with more-inherent safety, but NuScale has an incremental/conservative design with 20 years of effort behind it. We need to fight climate change immediately, and prioritize "good enough and politically feasible" over "technically optimal" solutions.
If by 30% tax credit you mean they are being handed enough money to pay for 3x the net capacity in renewables and then having their energy price subsidized by an amount higher than the total cost of unsubsidized renewables on top of that, and that rate payers must pay however much they spend on it no matter how far over budget it goes, then yes.
The first one is going into Lincoln County Wyoming in the city of Kemmerer. The city, county and state are all excited to get this reactor. They have the full support of the state. This state has a very diverse power production profile and provides power to most of the western states.
The only people not so happy are the coal miners that are soon to be out of work. Kemmerer is also a coal mining city. Some of them have already started relocating.
I think it's just virtue-signalling to the people soon to be out of work as the coal mines slowly shut down, as also mentioned here [1]. That said, the state has little demand for EVs, so probably not too many people noticed. It's probably also meant to give some confidence to oil investors, but I am not an investment expert.
> a pesky nuclear reactor would surely also endanger their precious oil and gas profits
(A), stop taking obviously unserious legislation seriously, it's a bad look.
(B), they can comfortably rely on other jurisdictions having their heads shoved way too far up their own assholes that those other jurisdictions will not ever adopt nuclear, and will therefore remain reliable oil and gas customers.
That's great. The issue is finding 99 more buyers so this can work at scale (the whole point of SMRs). Then you just have to deliver 100 reactors, on budget and without any defects.
This is why I say the regulation is the EASY part. People were amazed when Tesla got off the ground because it was the first time anyone had succeeded at starting a new car maker in 100 years. This is the same idea, but much harder.
You could be right. Time will tell I suppose. I will keep a close eye on these and submit articles here as they are created and add comments from the locals that end up working there.
If other energy sources were regulated like the NRC, they would also be more expensive. Coal wouldn’t have been built at all. Nuclear regulation is not “the easy part.”
I think what that person means is that to get this actually built is likely going to take at least 5 to 10 years of regulatory and government action at the local level. That's not counting anyone that might throw up roadblocks, such as environmental or safety objections. Those could easily extend this out another five or ten years.
If a company wanted to build this reactor today, speaking as a government bureaucrat, you are looking at least 10 years before they even break ground.
Federal land and less regulation are closer to antonyms than synonyms. Federal government contracts tend to be more involved than most.
The military might consider a reactor like this for remote installations. I think they've had similar ideas/tests in the past. Not sure those ever panned out though.
That's fine if you have a few tens of billions to risk building products that might never be bought.
There are 2 key issues here: you have to convince people these will work for a decade plus without issue despite being new AND you have to convince a large number of people (companies, municipalities etc) (>100 to make the factory viable and get the economies of scale) who actually have the cash to buy them.
This is a key moat for a lot of tech: anyone could design a decent airliner. Can you convince enough airlines to order them to make it viable to mass-manufacture them, despite having no name or track record? Hence Boeing and Airbus remain the only games in town (and Airbus only got there with a lot of state assistance).