What do you expect from them? Unless saxenaabhi is code for "US CBP Commissioner" or some position of equivalent power, there is very little that an HN user can do about it at this point.
I'm not an American, but I doubt there are 100 million Americans out there who think this is wrong, let alone to an extent where they would be willing to do anything about it. If you look at it in terms of scenarios that can actually happen, the US already made its decision about a year ago - now it and the rest of the world are along for the ride. Something truly momentous would have to happen for an agency as powerful as the CBP to change course, barring an unlikely sudden mood change from up top.
It's not even about like or dislike. Some people dislike the UK, but I imagine few feel threatened by the prospect of crossing its border. It's an easier sell to get someone to visit a country they otherwise dislike to attend a big event. But with the US, who knows at this point? The system has been shaken up so much in the last year that there's no telling what will happen to any given entrant (especially someone from one of the "disfavored" countries), or what the rules will look like tomorrow. It's not just preference, it's preference combined with fear.
Some people have to travel for work. For Canadians, lots of international flights connect through the US (especially on the cheapest routes), and there's no way to transit without "properly" entering the country. While the policy in the post doesn't yet apply to Canada (since we don't require an ESTA), it could very well be extended soon. That would be pretty awful for everyone.
This is literally a data collection scam run by US intel / law enforcement to harvest biometrics under some "plausible" pretext. Now it's a chance to grab your passwords and private conversations too. Act accordingly.
Israel, or the US and its current admin? While it's pretty obvious that these two countries have ties like no others, it would seem weird if they were looking for criticism of a foreign state first rather than their own, despite the circumstances.
Going after "anti-semitism" gives the admin the political top-cover to systematically abridge various Constitutional rights and anyone who pushes back is "obviously" an anti-semite.
You must be misremembering, or maybe your social circle mixed up the two by accident and the mix-up stuck. The Canadian border agency is never called CBP, because its actual name is CBSA. CBP always refers to the US agency.
1. You assume that your LLM of choice is perfect and impartial on any given topic, ever.
2. You assume that your prompt doesn't interfere with said impartiality. What you have written may seem neutral at first glance, but from my perspective, wording like yours would probably prime the model to try to pick apart absolutely anything, finding flaws that aren't really there (or making massive stretches), because you already presuppose that whatever you give it was written with intent to lie and misrepresent. The wording heavily implies that the text already uses "persuasion tactics" and "emotional language", or downplays/overstates something - the model just needs to find it all. So it will try to return anything that supports that implication.
It doesn't matter whether you make assumptions or not - your prompt does. I think the point of failure isn't even necessarily the LLM but your writing, because you leave the model no leeway to report back that something is genuinely neutral or impartial. Instead, you're asking it to dig up proof of wrongdoing no matter what, basically asserting that lies surely exist in whatever you post and that you just need help uncovering all the deception. Told to do this, it will read absolutely anything you give it in the most hostile way possible, stringing together any coherent-sounding arguments that reinforce the viewpoint your prompt implies.
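To make the contrast concrete, here's a minimal sketch (assuming something like the OpenAI Python client; the model name, prompts, and placeholder text are mine for illustration, not your actual wording):

    from openai import OpenAI

    client = OpenAI()
    TEXT = "..."  # whatever you want analyzed

    # Loaded prompt: presupposes deception, so the model is pushed to "find" it.
    loaded = ("List every persuasion tactic, emotionally loaded phrase, and instance "
              f"of downplaying or overstating in the following text:\n\n{TEXT}")

    # More neutral prompt: leaves room for the answer to be "nothing notable".
    neutral = ("Describe the tone and rhetorical techniques of the following text, "
               f"if any. If it reads as neutral, say so:\n\n{TEXT}")

    for prompt in (loaded, neutral):
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
        )
        print(response.choices[0].message.content, "\n---")

With the first prompt, the model will almost always produce a list of "tactics", even for a bland press release; with the second, it at least has a way out.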
> Apps can't be 100MB on modern displays, because there are literally too many pixels involved.
What? Are you talking about assets? You'd need a considerable amount of very high-res, uncompressed or lightly compressed assets to use up 100MB. Not to mention all the software that uses vector icons, which take up near-zero space compared to raster images.
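For a rough sense of scale, here's a back-of-envelope sketch in Python (the asset sizes and counts are made-up illustrations, assuming uncompressed 32-bit RGBA):

    # Back-of-envelope: how much raster art it takes to reach 100 MB.
    # All figures are illustrative assumptions, not measurements.
    def raw_rgba_mib(width, height):
        """Uncompressed 32-bit RGBA size of one image, in MiB."""
        return width * height * 4 / 2**20

    icon = raw_rgba_mib(512, 512)       # ~1 MiB per uncompressed 512x512 icon
    splash = raw_rgba_mib(2560, 1440)   # ~14 MiB per uncompressed QHD image
    print(f"icons needed for 100 MiB: {100 / icon:.0f}")    # ~100
    print(f"QHD images for 100 MiB:   {100 / splash:.0f}")  # ~7

And that's before PNG/JPEG compression, which typically shrinks those numbers by a large factor.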
Electron apps always take up a massive amount of space because every separate install is a fully self-contained version of Chromium. No matter how lightweight your app is, Electron will always force a pretty large space overhead.
I was talking about RAM - running Chromium on its own already carries a baseline RAM penalty simply because of how complex it is.
But window buffers are usually in VRAM, not regular RAM, right? And I assume their size is more or less fixed for a given system, depending on your resolution (though I don't know precisely how they work). I would think the total memory taken up by window buffers stays roughly constant no matter what you have open - everything else is overhead that a given program incurred, which is what we're concerned about.
Well, you see, there's a popular brand of computers that don't have separate VRAM and have twice the display resolution of everyone else.
Luckily, windows aren't always fullscreen, so memory usage is somewhat up to the user. Unluckily, you often need redundant buffers for parts of the UI tree even when they're offscreen, e.g. because of blending or because we want scrolling to work without hitches.
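To put rough numbers on that, a quick sketch (the resolutions and layer counts are assumptions; real compositors use various pixel formats and tricks, so treat this as a naive upper-bound estimate):

    # Rough cost of 32-bit RGBA window buffers at different resolutions.
    def buffer_mib(width, height, bytes_per_pixel=4):
        return width * height * bytes_per_pixel / 2**20

    for name, w, h in [("1080p", 1920, 1080),
                       ("4K", 3840, 2160),
                       ("5K retina", 5120, 2880)]:
        one = buffer_mib(w, h)
        # Double/triple buffering plus offscreen layers for blending or smooth
        # scrolling can multiply the single-buffer cost several times over.
        print(f"{name:10s} one fullscreen buffer ~ {one:5.1f} MiB, x3 ~ {3 * one:6.1f} MiB")

On a machine with unified memory, all of that comes out of the same pool as everything else.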
> I'm assuming you wouldn't see it as fine if the corporation was profitable.
I feel like the implication of what they said was "think of how much worse it would be if they could truly spare no expense on these types of things". If an "unprofitable" company can do this, what could a profitable company of their size do on a whim?
This seems like a simple conclusion, to the point where I'm surprised that no one replying to you has put it more directly. "Slave of the state" is pretty provocative language, but let me map out one way this could happen - one that already seems to be unfolding.
1. The country, realizing the potential power that extra data processing (in the form of software like Palantir's) offers, starts purchasing equipment and massively ramping up government data collection. More cameras, more facial scans, more data collected at points of entry and government institutions, more records digitized and backed up, more unrelated businesses contracted to provide all sorts of data, more data about communications, transactions, interactions - more of everything. It doesn't matter what it is; if it's any sort of data about people, it's probably useful.
2. Government agencies contract Palantir and integrate its software into their existing data pipelines. Palantir far surpasses whatever rudimentary processing was done before - it allows automated analysis of gigantic swaths of data and can draw conclusions and inferences that would otherwise be invisible to the human eye. That is its specialty.
3. Using all the new information about how those bits and pieces of data connect, government agencies slowly start folding it into the way they work, refining and perfecting the usable intelligence they can deduce from it along the way. Just imagine being able to estimate nearly any individual's movement history from data points spread across different sources. Or being able to predict associations between disfavored individuals and the formation of undesirable groups and organizations. Or being able to flag new persons of interest before they've done anything interesting, just based on seemingly innocuous patterns of behavior.
4. With something like this in place, most people would likely feel pretty confined - at least those who are aware of it. There's no personified Stasi secret cop listening behind every corner, but you know that almost everything you do leaves a fingerprint on an enormous network of data, one where you should probably avoid seeming remarkable or unusual in any way your government might find interesting. You know you're being watched, not just by people who will forget about you two seconds after seeing your face, but by tools that will file away anything you do forever, just in case. Even if the number of people prosecuted isn't high (which seems unlikely), the chilling effect would be massive, and it would be a big step towards metaphorical "slavery".