
> slop. If you have a LinkedIn account you must fill it with a constant stream of something

This is 100% baloney. Almost none of the people I work with are heavy LinkedIn posters, and I’ve never met a hiring manager who cared what your LinkedIn feed looked like. This has held true across startups, FAANGs, and mid-cap tech companies.


Apple has offered an “iCloud for Windows” app for ages that literally syncs your iCloud Keychain (passwords and passkeys) to a Windows box, where you can use browser extensions for Chrome, Edge, etc.

You’re still not platform locked…


There are countless options - 1Password, for example, works everywhere, as do RoboForm, Bitwarden, KeePass, LastPass, NordPass, and many others.

All sync seamlessly and support the major (and often minor) browsers.


> Safari requires iCloud keychain for passkeys

Repeating this doesn’t make it true. https://developer.apple.com/documentation/authenticationserv...

All of the 3rd-party credential managers I’ve used that support passkeys work with Safari, and through the APIs Apple provides to credential managers you can even pick your default CM and never think about iCloud again…


> All of the 3rd party credential managers I’ve used that support passkeys work with safari

I've already addressed this pedantry: https://news.ycombinator.com/item?id=46304137


> what annoys me the most is that the use of passkeys in Safari requires iCloud Keychain

Completely untrue: Safari on both Mac and iOS supports third-party password managers for both traditional passwords and passkeys.


You're repeating yourself, and also posting way too many pedantic comments here: https://news.ycombinator.com/item?id=46304159

> I have no problem using Anthropic's cloud-hosted services

Anthropic - one of GCP’s largest TPU customers? Good for you.

https://www.anthropic.com/news/expanding-our-use-of-google-c...


> whereas the Gemini models, even 3 Pro, always answer after a few seconds and never cite their sources

Definitely has not been my experience using 3 Pro in Gemini Enterprise - in fact, just yesterday it took so long on a similar task that I thought something was broken. Nope, it was just re-checking a source.


Does Gemini Enterprise have more features?

Just tried once again with the exact same prompt: GPT-5.1-Thinking took 12m46s and Gemini 3.0 Pro took about 20 seconds. The latter obviously has a dramatically worse answer as a result.

(Also, the thinking trace is not in the correct language, and doesn't seem to show which sources have been read at which steps; there is only a "Sources" tab at the end of the answer.)


Google appears to be changing what Flash is “meant for” with this release - the capability it has, along with the thinking budgets, makes it superior to previous Pro models in both outcome and speed. The likely-soon-coming Flash-Lite will fit right into where Flash used to be - cheap and fast.
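
For the curious, "thinking budget" refers to the knob the Gemini API exposes for capping reasoning tokens before the model answers. A minimal sketch, assuming the google-genai Python SDK's thinking_config; the Flash model id below is a placeholder, not a confirmed name:

    # Minimal sketch: cap how many tokens the model may spend "thinking"
    # before answering. The model id is hypothetical; swap in whatever
    # Flash identifier Google actually ships.
    from google import genai
    from google.genai import types

    client = genai.Client()  # reads the API key from the environment

    response = client.models.generate_content(
        model="gemini-3-flash-preview",  # hypothetical model id
        contents="Explain passkeys vs. passwords in two sentences.",
        config=types.GenerateContentConfig(
            # Larger budgets trade speed for answer quality; 0 disables
            # thinking entirely on models that allow it.
            thinking_config=types.ThinkingConfig(thinking_budget=2048),
        ),
    )
    print(response.text)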

Two orders of magnitude would imply that these models cost $28/M in and $42/M out. Nothing is even close to that.

To me as an engineer, 60x for output (which is most of the cost I see, AFAICT) is not that significantly different from 100x.

I tried to be quite clear about showing my work here. I agree that 17x is much closer to a single order of magnitude than two. But 60x is, to me, far enough along the way to 100x that I don't feel bad saying it's nearly two orders (it's 1.78 orders of magnitude). To me, your complaint feels rigid & ungenerous.

My post is showing to me as -1, but I stand by it right now. Arguing over the technicalities here (is 1.78 close enough to 2 orders to count?) feels beside the point to me: DeepSeek is vastly more affordable than nearly everything else, putting even Gemini 3 Flash here to shame. And I don't think people are aware of that.

I guess for my own reference, since I didn't do it the first time: at $0.50/$3.00 per M-i/o, Gemini 3 Flash here is 1.8x / 7.1x (1e0.25/1e0.85) more expensive than DeepSeek.


GPT-5.2 Pro is well beyond that, IIRC.

Whoa! I had no idea. $21/$168. That's 75x / 400x (1e1.875/1e2.6). https://platform.openai.com/docs/pricing
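
For reference, the arithmetic behind those multipliers. The DeepSeek baseline of $0.28 in / $0.42 out per M tokens is my working assumption; the other prices are the ones quoted above:

    import math

    # Assumed per-million-token prices in USD (input, output). The DeepSeek
    # baseline is my assumption, not an official quote.
    deepseek = (0.28, 0.42)
    gemini_3_flash = (0.50, 3.00)
    gpt_5_2_pro = (21.00, 168.00)

    def compare(name, prices, baseline=deepseek):
        for label, p, b in zip(("in", "out"), prices, baseline):
            ratio = p / b
            print(f"{name} {label}: {ratio:.1f}x (1e{math.log10(ratio):.2f})")

    compare("Gemini 3 Flash", gemini_3_flash)  # ~1.8x in, ~7.1x out
    compare("GPT-5.2 Pro", gpt_5_2_pro)        # ~75x in, ~400x out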

So they’re not killing the Lightning, they’re adding a range extender? I guess that’s not gonna get as many clicks, but it hardly seems controversial given the market reception of the current Lightning (basically everyone who wanted one bought one, and then sales tanked).

Yes, they're killing the Lightning. They are replacing it with a new Lightning EREV; there will not be a BEV version. It's not just a new option they are bringing to the existing truck.

How much are they actually changing though? We could imagine they take literally the same design and stick a generator in the corner. Making that mandatory would not be "killing" the Lightning. At the other end is a total redesign that cuts most of the battery, which would be killing it.

They have a patent for a range extender under the bed, behind the axle, which I guess could work, depending on how big an engine it needs to be. By my math it probably needs to make around 50 hp, though for towing maybe double that. But it doesn't need to be a full-size 400 hp V6 or anything like that.
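
Rough math behind the ~50 hp figure, with per-mile consumption numbers that are my assumptions rather than anything Ford has published:

    # Back-of-envelope: sustained power a range extender must supply to hold
    # the battery level at highway speed. Consumption figures are assumed.
    WH_PER_MILE_HIGHWAY = 500    # ~0.5 kWh/mi unloaded
    WH_PER_MILE_TOWING = 1000    # roughly double when towing
    SPEED_MPH = 70
    HP_PER_KW = 1.341

    for label, wh_per_mile in (("highway", WH_PER_MILE_HIGHWAY),
                               ("towing", WH_PER_MILE_TOWING)):
        kw = wh_per_mile * SPEED_MPH / 1000.0  # sustained draw in kW
        print(f"{label}: ~{kw:.0f} kW, ~{kw * HP_PER_KW:.0f} hp")
    # highway: ~35 kW, ~47 hp; towing: ~70 kW, ~94 hp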

If they can throw it under the bed, keep the weight under control, maintain a decently big battery, and not lose the frunk ... then I'll be an optimist about this.

I'm planning to keep mine another 7 years, though, and they only announced the current Lightning 4.5 years ago, so a lot can change before I'm in the market again.


The EREV is a BEV. The EV in EREV means it will use a battery for the powertrain. No transmission on the engine.
