Hacker News: vanviegen's comments

On that subject, I'd be curious to see any computer that's not mostly made in Asia.

HP makes them, so does Dell. They cost a bit extra, but essentially the whole Federal government runs on nothing else.

The difference between the EU and the US is that it's possible to make all components in the US, using US equipment, and so some companies do, because it commands a pretty decent premium. It's not even that hard, since most components (e.g. reference motherboard designs) are still designed and actually built in the US. China still mostly does what you might politely call "commercializing US tech". And let's not discuss too deeply whether they correctly pay licensing for all the components they make, because nobody enjoys that discussion.

And yep, as you might expect, only Intel chips, no Nvidia cards ... and that's not the end of the limitations. The previous version had no USB-C monitor support, never mind one USB-C cable driving multiple monitors, but last year Intel really pushed a bit harder. Even this year, though, I'd hope you're not planning to use these machines for gaming.

The EU can't even make a modern motherboard's USB port chip.

Oh and yes, there are cracks in the US version too. The phones used, for example, are iPhones. Radio designed in South Korea ...


Can you point to the models that are entirely made in the USA?

I’m having trouble searching for this - but all the top results seem to be SEO or AI slop, so perhaps I’m just not finding them.


> The difference between EU and US is that it's possible to make all components in the US, using US equipment

False. ASML is in the EU.


The most technologically critical component of ASML's EUV lithography machines (the EUV light source) is designed, developed, and manufactured in California by Cymer.

I don't understand how it would work either, but it may be something similar to this: https://developers.openai.com/api/docs/guides/predicted-outp...

> Particularly, they allow you to write code whose output is unpredictable

Is that an easy mistake to make and a hard one to recover from, in your experience?

The way you have to bend over backwards in Terraform just to instantiate a thing multiple times based on some data really annoys me.
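For the uninitiated, the complaint above is about HCL's `for_each` meta-argument: repeating a resource per element of some data means reshaping the data into a map or set first, rather than writing an ordinary loop. A minimal sketch (the variable and bucket names are made up for illustration):

```hcl
# Hypothetical input: one entry per team.
variable "teams" {
  type = map(object({ region = string }))
}

# One bucket per map key; each.key / each.value refer to the current entry.
resource "aws_s3_bucket" "team_bucket" {
  for_each = var.teams

  bucket = "example-${each.key}"
  tags = {
    Region = each.value.region
  }
}
```

If the source data is a list rather than a map, you first have to convert it (e.g. with `toset()` or a `for` expression building a map), which is exactly the kind of backwards-bending being complained about.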


> Is that an easy mistake to make and a hard one to recover from, in your experience?

If you're alone in a codebase? Probably not.

In a company with many contributors of varying degrees of competence (from your new grad to your incompetent senior staff), yes.

In large repositories, without extremely diligent reviewers, it's impossible to prevent developers from creating the most convoluted, anti-pattern-ridden spaghetti code, which then gets copy/pasted ad nauseam across the codebase.

Terraform as a tool and HCL as a programming language leave a lot to be desired (in hindsight only, because, let's be honest, they've been a boon for automation), but their constrained nature makes it easier to rein in the zealous junior developer who just discovered OOP and insists on trying it everywhere...


> but their constrained nature makes it easier to rein in the zealous junior developer who just discovered OOP and insists on trying it everywhere...

I don't think this is true anymore. Junior devs of today seem to be black-pilled on OOP.


> and especially how China caught up so fast.

Isn't that largely nationalism and pressuring companies to use (initially) mediocre local tech solutions though? Once the market is there, quality catches up rapidly.


GUI apps are good for discoverability. They generally are not optimized for rapid use by power users though. That's of course not an inherent limitation of GUI apps, it's just that dominant paradigms hardly seem to consider power users.

I'm still annoyed and baffled by the fact that Ubuntu had searchable application menus 10 years ago, which were awesome and worked for just about any program, and then dropped them when they abandoned Unity. And neither KDE nor GNOME thought to bring them back. Instead, many apps have since dropped application menus entirely, in favour of... some mishmash of icons and ad hoc menu buttons?

Also, even in the post-LLM era, building a decent GUI app is a lot more work than building a decent terminal app.


Another factor is the lack of a good cross-platform GUI toolkit (by which I mean one that (a) doesn't feel out-of-place and jarring on any platform and (b) doesn't turn a 50K app into a 1GB download.)

Between that and the level of complexity of modern GUI toolkits - and the size of their dependency trees - distribution of a GUI app is a much bigger headache than distributing a terminal app.


Make the core feature of your app a library, then write different interfaces according to the targeted platforms.

There is Tcl/Tk. :)

Super easy to use, fast, almost zero RAM usage, and battle-tested for two decades.


"Equally jarring and out-of-place on all platforms" isn't quite what I asked for, but I guess it's the next best thing! ;)

That hasn't been true for a while, it's easily the best of the bunch at this point. It's also always been trivial to change, which can't be said of the others.

I'd say it's easily the least bad of the bunch, anyway, if you're really committed to cross-platform.

I think most/many banks had their own NFC tap-to-pay solution before Google/Apple Pay came along. Any idea why the banks chose to give that up?

On smartcards, yes; maybe on Android, but certainly not on iPhones. On iOS, it's only been possible to implement alternatives to Apple Pay since 17.4 (2024), and only in Europe (EEA).

Ah, I didn't realize the landscape was different on the Apple side of things.

Because it cost money to develop and Google/Apple Pay works really, really well everywhere on the planet.

But they already had their own solutions that worked just fine. I can't see how switching to integrate a new system instead would save on dev costs. There surely must be some other reason?

Discouraging superfluous production is not nothing.

And what about games that are actually just great fun? That would be easy to confuse with addictive, right?

The important indicator is "I spend more time on this than I myself want to." That applies equally well to games or anything else.

No.

You'd add a lot of technical complexity, especially if you need this to be instant. You'd lose the ability to effectively fight fraud, and because of this paint a huge target on your back, attracting all sorts of unwanted behavior.

On the other hand, you'd gain... nothing? Especially since consumers cannot be expected to run their own blockchain stack, they'd need to fully trust their banks and intermediaries anyhow.


That's not even close to dominating in chess.
