Hacker News | lifty's comments

So everything basically.

Solar capacity is always misleading because it’s intermittent. Capacity of a gas power plant can’t be compared to capacity of a solar power plant, even though it sounds like you are comparing the same thing. Would love to know total kWh generated.

Yep. The key difference is that a gas power plant can be cut off completely at any time. For example if a trigger happy leader decided to cause military mayhem in an unpredictable region supplying a large proportion of the world’s gas. The sun, however, keeps on shining.

I didn’t mean to compare them or to imply that gas or anything else is better. I’m a big fan of renewables, especially solar, but just wanted to bring this aspect up. It’s frustrating because I get excited when I see these numbers, only to deflate later when I work out the total generated kWh. It would be great if there were a “synthetic” calculation that takes the estimated generation and smooths it out with batteries, while also accounting for the extra cost of those batteries. That would be a more apples-to-apples comparison, both in net generation and in cost.

I understand why people are downvoting you, but we still have a bit to go before renewables make up 50% of yearly electricity generation.

Not as far as you’d think, though. According to [0], in 2024 it was 6.9% solar, 8.1% wind, and 14.3% hydro, i.e. ~29% renewables. Given the trajectory, I wouldn’t be surprised if that total were ~33% in 2025.

[0]: https://ourworldindata.org/grapher/electricity-prod-source-s...


Sadly, my country (Uruguay) is not on that map. Right now, ~99% of the energy we get comes from renewables.

By your definition/chart, we were 0% solar, 0% wind, and 20% hydro in 1985 for 20% total renewables. So, 20% -> 29% in 4 decades

Yes, but that’s a bad extrapolation, because per-capita electricity consumption was still rising then, has been mostly flat or decreasing in Western countries since about 2000, and the significant rise in the renewable fraction mostly started after 2000.

The hydro fraction is also a really bad indicator in general, because it basically just reflects geography of a country and not really its effort to reduce CO2 emissions.


> The hydro fraction is also a really bad indicator in general, because it basically just reflects geography of a country and not really its effort to reduce CO2 emissions.

As a ‘clean green New Zealander’, your comment is perfect.

We trash our country in such appalling ways. The fact that there aren’t many of us and that the easy way of getting power is hydro is coincidence, not a national conscience.


IEA had been predicting 2030 as peak fossil fuel usage up until recently. They revised it back upon Trump's election and shifting policy, but it's possible the Iran War has moved it forward again. Either way, it's within reach.

That being said, peak fossil fuels is the future date at which we burn more than ever, followed by a slow decrease. Meaning we are still accelerating CO2 emissions, and since every emission is cumulative, the march towards actually fixing the climate only starts at peak fossil fuels. We still need to remove all that GHG.


What’s the point of saying one stat is better than another, when each of them is meaningful in a different way? When renewables reach big TWh numbers, someone will say “total generation is misleading if it doesn’t line up with demand; what matters is capacity for power when we actually need it”.

> what matters is capacity for power when we actually need it

Uh... yeah?


And due to weird nuclear fetishism, people seem unaware that solar lines up really well with when people need power.

Both on daily cycles, and seasonally for anywhere that uses air conditioning. It's a good fit for two-thirds of the global population.

Some people live nearer the poles and wind lines up better with their heating needs. And of course you can combine them because they anti-correlate.


Perhaps a meta-evolution: they become experts at writing harnesses and prompts for discovering and patching vulnerabilities in existing code and software. My main interest is whether, now that we have LLMs, the software industry will move to adopting techniques like formal verification, and other perhaps more lax approaches, that massively increase the quality of software.

> Perhaps a meta evolution, they become experts at writing harnesses and prompts

Harnesses, maybe, but prompts?

There's still this belief amongst AI coders that they can command a premium for development because they can write a prompt better than Bob from HR, or Sally from Accounting.

When all you're writing are prompts, your value is less than it was before, because the number of people who can write a prompt is substantially larger than the number of people who can program.


Also, synthetic data and templates to help them discover new vulnerabilities or make agents work on things they're bad at. They differentiate with their prompts or specialist models.

Also, like ForAllSecure's Mayhem, I think they can differentiate on automatic patching that's reliable and secure. Maybe test generation, too, with full coverage. They become drive-by verification and validation specialists who also fix your stuff for you.


Testing exists.

> formal verification

Outside of limited specific circumstances, formal verification gives you nothing that tests don't, and it makes development slow and iteration a chore. People know about it, and it's not used for a lot of reasons.


This statement shows an intense lack of technical knowledge. You’re probably one of those ignorant managerial types.

First, type checking is a form of formal verification, and it’s used everywhere. Second, have you heard of Rust? Do you know why it’s becoming an alternative to C++ and C? Entirely because of its type checker, a.k.a. formal verification. It is the literal main reason Rust was created.

Have you heard of typescript? It’s essentially a formal verification layer over JavaScript. Everyone uses it now for the front end.

You don’t know what you’re talking about. I recommend you do some research before saying anything on this site.


I agree with this take. Nothing changes, everything just evolves. Been happening for 60 years, will (likely) continue to happen for the next 60 years.

Regarding the growing log in CRDTs, it doesn't have to be that way. Most of the popular CRDT libs use a Merkle DAG only to build up the log of changes. But there is a better way: you can combine a Merkle DAG for ordering with prolly trees for storing the actual state of the data. That gives you total ordering, an easy way to prune old data when you choose to, and an easy way to sync. Look into Fireproof for how these are combined.

Regarding distributed schemas, I agree, there's a lot of complexity there but it's worth looking into projects like https://www.inkandswitch.com/cambria/ and https://github.com/grafana/thema, which try to solve the problem. It's a young space though. If anyone else knows about similar projects or efforts, please let me know. Very interested in the space.


Interesting! Do you mind explaining the idea in more detail?


As far as I understand, libraries like Automerge use the Merkle DAG to encode a document as an immutable bundle of state changes (an operation log) plus the causal ordering, which enables conflict-free merging between multiple peers. The final document is reconstructed by combining the state transitions. So the Merkle DAG is both the state and the causal relationship between mutations, which allows the merge "magic".

Prolly trees let you store history-independent data that is trivial to sync, diff, and merge, regardless of insert order, merge order, or originating peer. A Merkle DAG layered on top of prolly trees (events reference prolly tree roots) gives you causality, so peers can agree on a common view of history. It's very useful because you can check integrity and travel in time, but you can keep as much of the history as you want, because it's not necessary for constructing the current state. Prolly trees give you the current state plus easy syncing, diffing, and merging. So you can truncate the history as needed for your use case.
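A toy sketch (Python, hypothetical names, nothing like a real prolly tree implementation) of the history-independence property I mean: the digest depends only on the final contents, not on insert or merge order, so syncing can reduce to comparing hashes.

```python
import hashlib

def root_hash(kv: dict) -> str:
    """Digest over the sorted key/value pairs: depends only on the
    final contents, never on insert or merge order."""
    h = hashlib.sha256()
    for k in sorted(kv):
        h.update(f"{k}={kv[k]};".encode())
    return h.hexdigest()

# Two peers apply the same writes in different orders...
peer_a = {}
peer_a.update({"x": "1"})
peer_a.update({"y": "2"})

peer_b = {}
peer_b.update({"y": "2"})
peer_b.update({"x": "1"})

# ...and still converge on the same root digest.
assert root_hash(peer_a) == root_hash(peer_b)
```

Real prolly trees do this recursively per chunk, with content-defined chunk boundaries, so two peers with mostly equal data only transfer the chunks whose hashes differ.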

For a production-ready implementation of prolly trees, check Dolt. For a combination of a Merkle DAG (https://github.com/storacha/pail) and prolly trees, check https://github.com/fireproof-storage/fireproof


Those are lovely data structures and I know about them. But how are you planning on using those data structures? What CRDT are you building?

You might also be interested in Alex Good's Beelay algorithm: https://www.youtube.com/watch?v=neRuBAPAsE0


Prolly trees can act as CRDTs if you have a merge function that always merges and doesn’t block.
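A minimal sketch of such an always-merges function, here a last-writer-wins map keyed by (timestamp, peer id) tuples; the names are hypothetical, not Fireproof's or Dolt's actual API:

```python
# Each value is tagged (timestamp, peer_id, payload); ties break on peer_id,
# so the winner is deterministic and the merge never has to block or ask.
def merge(a: dict, b: dict) -> dict:
    out = dict(a)
    for k, v in b.items():
        if k not in out or v > out[k]:  # tuples compare lexicographically
            out[k] = v
    return out

a = {"title": (1, "peerA", "draft")}
b = {"title": (2, "peerB", "final"), "tags": (1, "peerB", "crdt")}

# Commutative, associative, idempotent: the CRDT requirements.
assert merge(a, b) == merge(b, a)
assert merge(merge(a, b), b) == merge(a, b)
assert merge(a, b)["title"][2] == "final"
```

Because merge order doesn't matter, every peer that has seen the same set of writes lands on the same map, and hence the same prolly tree root.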

So my initial comment merely tried to make the point that there is a design space where you’re not stuck with the tradeoff of carrying the full Merkle DAG history just to be able to reconstruct the latest version of your document.

Thanks for the video, will check it out!


It’s clear that even in 2026 most people still don’t get the pixel density (PPI) argument. Or perhaps they get it but don’t appreciate it. For me, any monitor that is not HiDPI (218 ppi) is a non-starter. Maybe my eyesight is better than average, but looking at a non-retina display seems atrocious after having spent time working on a retina display.
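For reference, the back-of-the-envelope math behind that 218 ppi figure (a 27-inch 5K panel), as a quick Python check:

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: pixel-count diagonal divided by physical diagonal."""
    return hypot(width_px, height_px) / diagonal_in

print(round(ppi(5120, 2880, 27), 1))  # 27" 5K "retina": ~217.6 ppi
print(round(ppi(2560, 1440, 27), 1))  # 27" 1440p: ~108.8 ppi, half the density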


Ironically, I find this even more true for text. It's not just photographers who need HiDPI; people reading and writing text benefit as well.


I come from the text angle as well. I stare the whole day at terminals and IDEs.


Yeah, I've always subscribed to the idea that the monitor is the most important part of the computer. Everything else is secondary, because you can work around a lot of other problems with a computer, but there's no overcoming a low-quality display.


So it seems the new Studio Display XDR is the only display on the market that offers:

- 5K resolution at HiDPI (27-inch)

- 120Hz refresh rate

- TB5 and single-cable connectivity

There are a couple of other HiDPI 5K displays with a 120Hz refresh rate, but they don't do TB5.


Super disappointed that the base model doesn't get 120Hz. I own the old model and it's great, but I will have to look for an alternative 5K display with a 120Hz refresh rate. There are a few on the market now, and I won't pay 3.5k just for 120Hz.


What model do you use it with? And through which API, OpenRouter? Wondering how you manage cost, because it can get quite expensive.


I am dumb. I use the Anthropic API, with Opus for some tasks and Sonnet for others. Accumulated quite some costs.

But I book it as a business expense, so it's less painful than it would be privately.

But yeah, I could optimize for cost more.


Too bad we can’t use it. Whenever Google releases something, I can never seem to use it in their coding cli product.


You can, but only via the Gemini Ultra plan, which you can buy, or the Gemini API with early access.


I know, and neither of those options is feasible for me. I can't get the early access, and I'm not willing to drop $250 just to try their new model. By the time I can use it, the other two companies have something similar and I lose interest in Google's models.


I agree with your characterisation of what is going on, and at some point the EU states will have to decide on full fiscal integration or on removing the common currency. You can't have a common currency without a common fiscal union. So we either have to integrate more or disintegrate more; this in-between we have now is not working very well. Speaking as a European, I'm not sure which is better.


Not related to the comment, but in general I agree with you.

You can't have a single monetary system without complete unification: tax systems, budgeting systems, governance models, retirement systems, benefits. I mean, you can, like we have now, but it's not sustainable, and eventually we all end up worse off.

As a European, I would not want to go that way, since I'm afraid such a unified EU will be a bureaucratic monster that is even more centralized than the USA, and way more autocratic than any current EU state.

I'd rather take a step back, dissolve much of the EU's competences and go back to a pure trade union, dissolve the euro as a currency, and let every member state take sovereign decisions on its own.


Even more centralized than the USA? That must be an outsider's view. Are you aware that each state has its own tax laws? Corporate laws? Even criminal and traffic laws? And so on? Obviously there is federal law too, but that's more of an umbrella covering all states in addition to local laws. The USA is a federation, just like Germany, Switzerland, or Austria, and one common complaint you hear from these nations all the time is that they are not centralized enough!

The EU, on the other hand, is not a federation; it is an association of nations trying to figure out how to go forward together, be it as a federation or something different, because in today's world you need size and power to survive. Any EU state alone would be insignificant on the world stage and would be sidelined one way or another. The most powerful inside the EU would suffer the most, e.g. Germany or France, because while the EU is second or third on many metrics, the individual nations are small. Depending on which ranking you look at, the EU is just behind the USA in military and economic power, and sometimes behind China in economy, too.

The strongest individual economic power in the EU is Germany, which globally usually comes in 4th place after the USA, China, and the EU as a whole. But this is deceiving when you look at the absolute numbers. Germany's output is impressive for its size, but is still only about a quarter of China's! Germany alone is a dwarf and without the EU would be inconsequential. The post-Brexit UK is learning this the hard way, and all the sovereignty does not matter a damn. The top leadership of EU nations are not stupid, even if they could explain their reasoning better, and so they try to keep the EU going despite its many flaws.

This is a Chesterton's Fence situation.


I recently became much more pro-total-unification, so let me give you a counterpoint: individually, European nations are no match for the major superpowers, neither economically nor militarily. We'll get gutted by a divide-and-conquer approach. In contrast, bound much closer together (particularly with some form of pan-European armed forces), the EU would become a proper global superpower and a counterbalance to the USA and China.


There is not a single example of a multiethnic or culturally diverse empire or state that was successful after the rise of nation-states at the end of the 19th century.

All of them crumbled either peacefully or in bloody wars.

Why do you think such a bureaucratic monster that no one really wants would be an exception?

Maybe we can try again in the 24th century, when humans have evolved enough, but for now, your best bet for a successful country is an ethnonational or religion-national state.


You can't really believe any of the EU countries actually want to work together? The EU was only possible on the premise of countries keeping their autonomy. Believe me when I say that we Germans would rather go to war than merge with France, just as an example.

