This study doesn't correct for the baseline exponential decay due to inflation, which would better highlight the meaningful variations. By comparing everything in 1914 dollars it also makes older variations look relatively more extreme and newer inflationary events look less extreme. You must compare apples to apples.
Finally, the events are quite cherry-picked. It is a conclusion looking for supporting data: the statistical reason for choosing those four events simply isn't evident when you look at the data itself. There is no mathematical rule you could apply to the dataset that would distinctly highlight those four periods.
Yes, a log chart would be better. That said, apples cannot be compared in this case; probably very few of us would choose to go back to 1914. A Tesla Model Y would cost $1,680 in 1901 dollars, but would have been worth millions of those same 1901 dollars. Or nothing, depending on how much charging tech you could fit in the frunk. Many quality of life items are not covered by PPP (or money supply or other measures) adjustments.
Why would anyone pay millions of dollars - that would be the equivalent to a billionaire's entire fortune - for a Tesla Model Y in 1901?
You'd have nowhere to charge it. Electricity would be more expensive than gas even if you did.
You'd benefit almost nothing from the technology. There's no internet. Not much of it would work. And it wouldn't really help move you forward technologically, as it's just too advanced.
I think you are interpreting the comment too literally. The point is just this: calculating inflation is an art and depending on what kinds of assumptions you make, the results will vary wildly.
Before the printing press, very few people in Europe owned even a single book. But even a lower class, modern European might easily own several dozen books. Depending on how you account for this, you might conclude that the given lower class, modern person is among the richest people in Europe in 1400. Or you might not properly account for the wealth of a 1400 European noble and rank them as middle class by modern standards.
It's simpler with commodities like a bushel of wheat, but still complicated. Depending on what you are trying to explain, you can use different methods, but there is no straightforward way to convert the cost of something in one time period to another time period.
Right. Many, many reasons why the dollar that bought the Tesla in 2026 is not quite so disadvantaged against the dollar of 1901 it’s being compared to by CPI.
There was little reason for anybody to buy a Model T either. The reason they did is that the government picked winners and let people drive cars through city roads. Commute times have only gotten longer, and GDP growth has been more than offset by the cost of roads and road deaths.
This is dogma, but I'm not sure it's possible or even ideal. Booms and busts seem endemic to any economy that targets inflation, and of course most entities (that don't understand the balance) want to encourage booms and limit busts. Meanwhile, there's another way to think of inflation (and also deflation):
Inflation obfuscates the value of money and therefore of goods, services, etc. In an environment where value is volatile, it makes sense to keep moving, keep trading, because you might come into possession of something that was undervalued before you owned it, or that you'll need in a future when it would otherwise be too expensive. The people who skim off the top of all of this activity love this environment.
Deflation, on the other hand, makes value readily and immediately apparent. What was speculative and risky goes to zero and people hold onto things with intrinsic value. Those who skim profit off of economic activity hate the slowdown, obviously, but maybe you need periods of this to reset when valuations become too far removed from reality.
Is it a bad thing for people to buy what they need, when they need it, instead of being forced by inflation to anticipate their needs further and further out?
You're basically critiquing a chart showing how purchasing power has decayed due to inflation because it isn't adjusted for "baseline" inflation. That doesn't make sense.
And yes, earlier variations are more impactful because of compounding.
I will say that a better representation would be a logarithm of the inverse. The problem with doing it this way is that later changes look very small: $1.00 to $0.99 is the same y-axis delta as $0.05 to $0.04, but the latter is a very different change.
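The point about the two one-cent drops can be made concrete with a quick calculation (the dollar values here are just the hypothetical ones from the comment, not chart data):

```python
import math

# Two hypothetical one-cent drops in purchasing power:
linear_delta_high = 1.00 - 0.99  # one cent lost near a dollar
linear_delta_low = 0.05 - 0.04   # one cent lost near a nickel

# On a linear y-axis both deltas plot identically.
# On a log axis the second is much larger, because losing 1 cent of 5
# is a 20% loss while losing 1 cent of a dollar is a 1% loss.
log_delta_high = math.log10(1.00) - math.log10(0.99)  # ~0.0044
log_delta_low = math.log10(0.05) - math.log10(0.04)   # ~0.0969

ratio = log_delta_low / log_delta_high  # ~22x larger on a log axis
```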
Also we’re looking at periods that involve dramatically different monetary policy (gold standard before WWII, Bretton Woods from 1944-1976, then the current regime).
One could argue that the defining aspect of each of those shifts in monetary policy has been to devalue the dollar further. I have a relatively basic understanding of economics, though, and I do understand the arguments that even if that's the outcome, it's not an inherently bad one as an American; still, a notable effect appears to have been massively widening inequality.
Maybe the next step of the business plan is to sell blue (or green) screen shirts to individuals on which AR glasses can display targeted advertising that only you see (eg: everyone you see in those shirts is wearing Nike gear, but everyone I see is wearing Ralph Lauren because I am fancy).
Then everyone whose shirt is used to display ads can get revenue-share.
Batteries are an expensive solution that doesn't scale well at the grid level. They are useful for grid stability (fast frequency response) but simply a non-starter when you're dealing with national grids.
Batteries are an added cost to the system, without producing more electricity, and as a result prices will go up.
A far cheaper source of flexibility is Demand Side Response, particularly data centres that are willing to be market actors. Compute can happen anywhere, so it should happen where the wind blows and the sun shines. It is cheaper to transmit bits than megawatts.
> Batteries are an expensive solution that doesn't scale well at the grid level.
I'd like to see the reasoning behind why they don't pan out. LiFePO4 cells have dropped to $60/kWh in China. At 3,000 cycles, that means they add about 2¢ to every kWh they store.
We don't get that cheap price where I live, of course, but they're being installed at a rapid pace now. I think most are being installed "behind the meter", which means they are being installed by people who pay retail. That's happening because, paired with solar, they've dropped below the break-even point at retail prices. Grid scale needs roughly another factor-of-3 price drop to hit the same point. If CATL's $10/kWh sodium batteries that get 10,000 cycles pan out, it will drive the price down by another factor of 10.
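The amortization arithmetic above can be sketched directly. This is a deliberately crude model: it assumes full-depth cycling and ignores efficiency losses, financing, and balance-of-system costs.

```python
# Amortized cost added per kWh stored = capacity price / cycle life.
# Figures taken from the comment above; simplifications as noted.
def cost_per_kwh_stored(price_per_kwh_capacity, cycle_life):
    return price_per_kwh_capacity / cycle_life

lfp_cost = cost_per_kwh_stored(60.0, 3_000)      # $0.02, i.e. ~2 cents/kWh
sodium_cost = cost_per_kwh_stored(10.0, 10_000)  # $0.001, i.e. 0.1 cents/kWh
```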
Your "demand" side response arises naturally with batteries. Those who can do without the power simply won't buy one. Or if they can get by with only a little emergency power, they buy a small one.
I experienced that first hand. I owned a 4.8 kWh battery a while ago. That is, by any definition, small. It cost about the same as a generator at today's prices (it didn't back then). A flood caused power to be cut off for a week. We only fired up the generator once, before discovering we could reduce our usage to what a small battery and a 6.6 kW solar array could cope with, even in the very overcast conditions that accompany a heavy rain event.
Demand side response drives up costs a lot. You end up with expensive, rapidly depreciating capital equipment sitting idle and not earning any revenue. The same problem applies whether the equipment is a GPU cluster or aluminum smelter. If we're going to have a modern industrial economy then we need to have enormous quantities of cheap electrical power available 24 hours a day.
Long distance high voltage transmission lines can help to an extent but create the same sort of concerns about dependence on unreliable foreign countries as fossil fuel imports.
Demand side management is a nice concept, but it is neither free nor a cure-all:
It has real costs because it limits the utilization of involved infrastructure and is simply not feasible for a lot of industries. It does not help when residential demand exceeds the available supply either.
The most practical solution will probably be a mix of overprovisioning (especially considering how cheap solar panels have become), battery storage and fuel powered fallback, with the balance shifting as long as batteries and panels get cheaper.
In California, batteries have in recent years decreased fossil gas usage by ~40% and essentially removed the duck curve.
Demand side response is of course cheaper, but there will always be people willing to buy expensive electricity to fulfill a certain demand.
Take a BEV. Charging is generally optimized for when electricity is cheap and abundant, but when going on a road trip without flexibility in their charging, people are willing to pay more.
Paying more opens the possibility for batteries and other solutions to fulfill the demand.
A huge portion of compute is triggered on request, so there isn't that much ability to time shift it. A build was just kicked off because I merged some code. In theory, that could happen overnight. In reality, changing the delay from 20 minutes to 12 hours would be unworkable.
Gas power generation is a necessary evil to balance out the variability of intermittent energy generation (i.e. wind and solar).
Hydropower isn't a feasible alternative because the easy resources have been developed.
The only alternative source of flexibility available today is demand side response.
Edit: I appreciate the downvotes, as I've not explained in detail. It is a complex issue. My opinions are based on having a PhD in the topic, 10+ years in control rooms, years of market operations and design, and years contributing to Europe-wide risk assessment methodologies.
Maybe you're the person to answer this question then.
How can I find the price of battery storage, per kWh delivered to the customer, assuming a pure wind/solar/battery grid?
I can easily find the price per kWh of battery capacity but that's not the same thing. I'm looking for the effective levelized cost of electricity, over the lifetime of the battery, so I can compare against generation sources.
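For what it's worth, a minimal levelized-cost-of-storage (LCOS) calculation looks roughly like this. Every number below is an illustrative assumption, not data from this thread, and a real LCOS would also add charging energy cost, O&M, and financing:

```python
# Illustrative inputs (assumptions, not measured values):
capex_per_kwh = 150.0      # installed grid-scale cost, $ per kWh of capacity
cycle_life = 5_000         # full cycles before end of life
round_trip_eff = 0.90      # fraction of charged energy actually delivered
depth_of_discharge = 0.90  # usable fraction of nameplate capacity

# Lifetime energy delivered per kWh of nameplate capacity:
lifetime_kwh_delivered = cycle_life * round_trip_eff * depth_of_discharge

# $ per delivered kWh, before adding the cost of the wind/solar
# energy used for charging:
lcos = capex_per_kwh / lifetime_kwh_delivered  # ~$0.037/kWh
```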
> It also does nothing to help transmission grid frequency stability and control.
they don't help grid stability via the inertia of spinning masses, but PLLs and the like exist, where you can control frequencies and phases without a spinning mass.
you don't need to burn gas to have a flywheel either
Battery prices are falling constantly, and grid-sized battery production has not even started; the focus was and is on mobile batteries.
So expect prices to drop further.
Also yes, batteries help very much with grid stability, as they can supply steady power on demand anywhere. Lots of batteries everywhere == lots of on-demand grid stabilizers.
> Why do people think that residential power is the issue here?
My experience has been that the vast majority of people, even very technical people, don't really understand the energy mix required to sustain modern industrial technology. Their only experience is with their utility bill which shows them a pie-chart with a big area showing "green" so they can feel better about the state of things.
Electricity production accounts for a minority of energy usage, and residential use for a minority of electricity. People don't think about the energy required to send an Amazon package to their door, or to have fruits from South America stocking their grocery store year round, or even to create the industries that ultimately make up their paychecks each month.
The pandemic was the best view of what real energy usage changes would look like. Early pandemic was a rare moment when global energy usage dipped and that had nothing to do with the demand on the residential grid.
Many things are technically possible. Fewer things are economically practical. Does Europe have the capacity to manufacture batteries that are big enough? How much will that cost and how many years will it take? A few local small-scale demonstration projects don't tell us much about the difficulties of scaling up by orders of magnitude. Have you actually done the math on this or are you just repeating platitudes?
> A few local small-scale demonstration projects don't tell us much about the difficulties of scaling up by orders of magnitude.
The UK is forging ahead with large scale battery storage projects. I have not done the math, but I assume there is a sound economic case in order for these projects to receive this level of investment.
Edit: Here's some more data on revenue for battery storage in the UK [3]
Yes, I have done the math. Thing is, if you ignore the climate, coal and the like are still cheaper. That's why they are still used so much.
If you factor in climate costs, things are different.
> Does Europe have the capacity to manufacture batteries that are big enough?
why is this relevant? clearly europe can also buy from outside of europe.
the nice thing about batteries is you don't need a new battery for each watt-hour you store, unlike gas, which is consumed every time.
the simplest thing is to keep buying russian gas, and also pay ukraine to attack russia. no need to change anything or do any new buildouts, whether that's batteries or US LNG export terminals + european import terminals. those also take time, whereas the russian fuel is readily available. the russian invasion isn't gonna last forever, so a move to US gas is wasted investment when europe can move back to Russian gas eventually anyways
Apart from the obvious advantage of cheap energy, the reason Europe bought so much Russian gas is the theory that interconnected economies don't go to war. Now that Europe considers Russia a belligerent threat, even after Nord Stream was completed, the reintegration might not happen.
Residential energy use is the least interesting thing to think about at a grid scale. The grid actually will get more brittle and/or expensive if everyone wealthy enough to get batteries and solar gets them.
What about the manufacturing and industrial uses? Or the need for natural gas to be a feedstock?
How many batteries does it take to power a giant hyperscaler datacenter for a few days during poor weather conditions? You can’t really rely on backup generators at that usage rate as the expense (and environmental impact) gets to be crazy. Or you end up just building natural gas turbines co-located with such facilities and we are back to where we began.
this is to say that natural gas isn't the necessary evil to account for intermittent power sources.
it's a necessary evil to fully capitalize on other investments. i don't care if the hyperscaler can run their GPUs overnight. perfectly happy for them to delay their training because they're running in daytime.
the capital owners who bought the GPUs sure care, but why should i accept their pollution in order for them to run a bit faster?
This is just another way to say you are in favor of western hemisphere degrowth.
Without industry you don’t have an economy in the long run. Replace hyperscaler with aluminum smelter or manufacturing line if you prefer. If you can’t operate those capital assets 24x7 they simply will not be built in your country.
Cheap, plentiful, and reliable energy is the foundation of wealth. Nuclear fission was likely humanity's transition technology, but we fumbled the ball 40 years ago, so here we are.
Could you explain what you would use, that can be produced in Europe, to generate the electricity to fill the batteries with? The batteries cannot be produced in Europe and have a very limited lifetime.
Europe has 100 days worth of natural gas storage facilities. All it needs to do is to get renewables + batteries + nuclear above ~70% or so to be able to withstand being cut off for a year. Getting to ~95% is relatively cheap and easy. 100% is hard and expensive, but they don't need 100%. If they get to 95%, that's multiple years worth of storage.
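The arithmetic behind that claim, as a quick sketch: gas only has to cover the fraction of demand not met by the other sources. This is a simplification that ignores seasonal clustering of residual demand.

```python
# Days a fixed gas reserve lasts when gas covers only the residual load.
def days_of_autonomy(storage_days, clean_fraction):
    return storage_days / (1 - clean_fraction)

at_70_pct = days_of_autonomy(100, 0.70)  # ~333 days, about a year
at_95_pct = days_of_autonomy(100, 0.95)  # ~2000 days, over five years
```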
> An exercise to the reader, calculate the space and materials required to replace the average norwegian hydro reservoir with batteries.
Solution: I can't compute the space and materials, but can estimate the cost.
Norway has 1240 storage reservoirs with a total capacity of 87 TWh [1], which yields an average of 70 GWh/reservoir.
Last year, in China, a 16 GWh battery storage plant received an average bid price of $US66.3/kWh [2]. From this we can compute that a 70 GWh plant should cost $US4.65 billion.
A bit on the high side, but can battery prices fall by another order of magnitude? Then again, this is for replicating one reservoir. Replicating 1240 would be a 5 trillion dollar endeavor.
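Reproducing the arithmetic above, using only the figures already cited:

```python
# Figures from the comment above (refs [1] and [2]).
total_capacity_gwh = 87_000  # 87 TWh of Norwegian reservoir storage
reservoirs = 1_240

avg_gwh = total_capacity_gwh / reservoirs  # ~70 GWh per reservoir

bid_price_per_kwh = 66.3  # $US per kWh, Chinese tender
per_reservoir_usd = avg_gwh * 1e6 * bid_price_per_kwh  # ~$4.65 billion

# Replacing all 1240 reservoirs = pricing the full 87 TWh:
all_reservoirs_usd = total_capacity_gwh * 1e6 * bid_price_per_kwh  # ~$5.8 trillion
```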
That's why I said 100% renewable was hard and expensive. A grid that gets 5-10% of its energy from natural gas, but can get 100% of its power from nuclear + gas during a dunkelflaute, provides optimally cheap + secure power.
> The problem is dispatchability/flexibility, not storage. At a more complex level the issue is grid inertia and frequency response.
The easiest money I ever made was investing in natural gas infrastructure during the COVID insanity of everyone calling it dead due to renewables.
I really don’t understand the disconnect otherwise very intelligent people have on this subject. Every single person I’ve talked to in the actual industry seems to be aware of this fact and how dire things are getting. However it seems that everyone else believes that grid scale batteries are somehow going to save the day in the next decade or two.
Energy storage is energy storage. Natural gas is just a giant underground battery.
And that’s before you get to industrial uses of natural gas as a feedstock, while ignoring how much is still used for heating infrastructure and how long it would take to retrofit everything to heat pumps.
I often wonder what I’m missing, but I’m confident enough in this one to have put my money where my mouth is at least.
Agreed they are. But they want to move away from it, especially for air quality reasons. They've had a huge problem with air pollution. They are big into EVs. This means less reliance on foreign oil and cleaner air.
Nuclear is as dead as a great technology can be. A few more incremental improvements in solar and battery industry and nuclear won't be profitable even in theory, to say nothing of construction cost overruns.
Reactors are only good at providing baseload, but that isn't how grids operate anymore. Renewables are too cheap; if a power plant can't drop output fast enough, it is punished.
nuclear plants can cut power as quickly as any other power plant, you are just controlling steam. divert the steam from the turbine and you aren't generating power anymore.
I agree with that, I have just seen on here before that people think you can't regulate the electrical output of a nuclear plant like with more traditional ones.
Mostly because it's very expensive and slow to build, what with nuclear engineers not wanting their workplaces to be as dangerous as a construction site. Look up who invented the Maximum Credible Accident, it wasn't the environmentalists.
Ironically, that response runs into the standard problem that many "limit" arguments have.
Generally speaking just because something looks like it's converging from some angle, it doesn't mean that it actually has a well-defined limit, and if it does then it also does not mean that the limit shares the properties of the items in the sequence of which it is the limit.
Examples: 1/n is strictly positive for all n. Its limit for n going to infinity, while well-defined, is not strictly positive. Another example: You can define pi as the limit of a sequence of rational numbers. But it's not rational itself.
So, no, your argument does not prove that pi is a number.
(I'm not arguing that pi is not a number. It definitely is. It's just that the argument is a different one.)
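The two counterexamples can be stated compactly (using the Leibniz series as one concrete rational sequence converging to pi):

```latex
\frac{1}{n} > 0 \;\; \forall n \in \mathbb{N},
  \qquad \text{yet} \qquad \lim_{n\to\infty} \frac{1}{n} = 0 \not> 0; \\
\pi = \lim_{k\to\infty} 4 \sum_{i=0}^{k} \frac{(-1)^i}{2i+1},
  \qquad \text{every partial sum rational, yet } \pi \notin \mathbb{Q}.
```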