I use my own tools and scripts, and those aren't for everyone - so I'm just gonna make some general suggestions.
1. You should try Aider. Even if you don't end up using it, you'll learn a lot from it.
3. Conversations are useful and important. You need to figure out a way to include the necessary files in the context (efficiently, with a few clicks) and then start a conversation. Refine the output as part of the conversation, by continuously making suggestions and corrections.
3. Conversational editing as a workflow is important. A better auto-complete is almost useless.
4. GitHub Copilot has several issues - the interface is just one of them. The conversational style was bolted on to it later, and it shows. It's easier to chat on Claude/Librechat/etc and copy files back manually. Or use a tool like Aider.
5. While you can apply LLMs to solve a particular lower-level detail, it's equally effective (perhaps more effective) to have a higher-level conversation. Start your project by having a conversation about features, then refine the structure/scaffold and drill down to the details.
6. Gradually, you'll learn how to better organize a project and how to write better prompts. If you are familiar with best practices/design patterns, they're immediately useful for two reasons: (1) LLMs are also familiar with them, which helps with prompt clarity; (2) modular code is easier to extend.
7. Keep an eye on better-performing models. I haven't used GPT-4o in a while; Claude works much, much better. And sometimes you might want to reach for the o1 models. Lower-end models might not offer any time savings, so stick to the top-tier models you can afford. Deepseek models have brought down the API cost, so it's now affordable to even more people.
8. Finally, it takes time, just like any other tool.
I agree with your overall point, and your despair at software engineers who are still refusing to acknowledge the value of these tools during the process of writing code. However
> A better auto-complete is almost useless.
That's not true. I agree that Copilot seemed unhelpful when I last tried it, but Cursor's autocomplete is extremely useful.
> Oh no, political parties moving in response to the expressed will of the general public in a democracy.
Consider a very simple model of a two-party system, with views expressed on a single left/right axis. Members of the population vote for the political party that is closest to their views on that axis. Political parties want to win as many votes as possible.
Under that model and those assumptions, if one party moves in a particular direction, the 'correct' behaviour for the other party is to move in that direction also to capture more votes, regardless of what the underlying 'will' of the general voting public is.
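This toy model is easy to simulate. The sketch below is purely illustrative (voter positions, party positions, and the nearest-party voting rule are all assumptions of the simple model above, not real data): when party B shifts right, party A captures more votes by shifting right as well.

```python
# Toy median-voter simulation: voters sit evenly on a left/right axis
# from -1 to +1 and vote for whichever party is nearest to them.
def share_a(pos_a, pos_b, voters):
    """Fraction of voters strictly closer to party A than to party B."""
    return sum(abs(v - pos_a) < abs(v - pos_b) for v in voters) / len(voters)

voters = [i / 5000 for i in range(-5000, 5001)]  # 10,001 evenly spaced voters

baseline = share_a(-0.5, 0.5, voters)   # symmetric positions: ~half each
b_moves = share_a(-0.5, 0.8, voters)    # B shifts right; A gains by standing still
a_follows = share_a(0.1, 0.8, voters)   # A shifts right too, and gains even more
```

Under these assumptions `a_follows > b_moves > baseline`: A's best response to B's rightward move is to move right as well, independent of where the electorate's "will" actually sits.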
It's a nice story, but you're right to say it's not true. If easy piracy were the cause, you'd expect the attach rate (games sold per console sold) to be much lower than other contemporary consoles', because pirated games still needed legitimate Dreamcasts to be sold. But it just wasn't [1]. They simply didn't sell enough Dreamcasts, for the reasons you say and others.
'Piracy killed the Dreamcast' is very commonly put around, and piracy was incredibly easy for a contemporary console (literally just burn a CD-R; no hardware modifications required), but if you look at the attach rates [1] for the console, they are comparable to successful consoles'. Pirated games still needed consoles to be played on, so we would expect a much lower attach rate than normal if this were a primary factor.
Ultimately, it was almost everything else going against the console. [2]
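The attach-rate argument is just a ratio, and it can be made concrete. All the numbers below are invented purely to illustrate the reasoning, not real sales figures:

```python
# Attach rate = total games sold / total consoles sold.
# All numbers here are hypothetical, chosen only to illustrate the argument.
def attach_rate(games_sold, consoles_sold):
    return games_sold / consoles_sold

# A healthy console: owners buy plenty of legitimate games per machine.
healthy = attach_rate(80_000_000, 10_000_000)          # 8.0 games per console

# If rampant piracy displaced legitimate sales, the same installed base
# would buy far fewer games, dragging the ratio well down.
piracy_scenario = attach_rate(20_000_000, 10_000_000)  # 2.0 games per console
```

Since the Dreamcast's real attach rate looked like the "healthy" case rather than the "piracy" case, piracy can't have been the primary killer.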
Yeah, this is the common refrain after the fact, but it wasn't as easy at the time as people think; the ability to just burn a disc and go on a stock Dreamcast came pretty late in the lifecycle. And this is 1999 and 2000 - it's not like literally everyone was on Facebook sharing tips on how to burn discs. Heck, even an internet connection fast enough to download disc images at all, and the hard drives to store them, weren't that common. In 2022, optical media is clearly inferior on the density front to cheap microSD cards, let alone other storage tech... in 1999, even a single CD was a whackload of data and hard to deal with. Not impossibly large, but it's not like now, where you can lift a couch cushion, find an 8GB microSD card you forgot you had, and then throw it away, because what use is 8GB anyhow?
I think the more conventional marketing discussions are much more relevant. In an alternate universe where Sega hadn't burned the entire market with the Saturn, but had simply released the Dreamcast at the same time otherwise, it may well have done much better. I'm not sure there is any way it could have "won": while I have my quibbles with some decisions in the PS2 hardware, it is generally superior enough that it probably would have won anyhow, though it might have been a more grinding war. (Bear in mind that part of this alternate universe is better support from the gaming companies, because they weren't burned and still had some residual good feelings about Sega, so the DC would have been going in with more high-quality games in this version of reality.)
The simple truth is that six months into the Dreamcast's run in the US, before any of the next-gen competition had emerged, I could just tell it wasn't in for the long haul. The library was just too lopsided and the support from the big names, while not entirely absent, just was never there in the necessary quantity. The competition actually coming to market merely buried an already-dying console. But it's the only console I've ever purchased on release day and I didn't ever regret it. There was a lot of good and interesting stuff... there just wasn't enough.
It's not quite point 1, but Natural Selection 2[1] is close, and might even be better than what you're asking for if you're looking to play with (rather than against) your friends. Each team has one player who plays with an RTS view, with all the other players being the units, playing with an FPS view.
Yes, it appears so. But there seem to be a few requirements [0].
Also bear in mind a /64 is more than 18 quintillion addresses, so it might be a bit more expensive. And it appears they only offer /48s to individual businesses if they meet the criteria, but a /48 is a vastly larger block still.
A /64 is just the normal allocation that every home connection should get. (A /48 in this notation is bigger.)
Given these amounts of addresses, subnetting philosophy is quite different from IPv4's.
I completely forgot about that difference between IPv6 and IPv4. You are right. There are still many thousands, if not millions, of addresses available to those who want them. And unlike with IPv4, everyone on the planet could have a /64.
Here is a decently comprehensive Server Fault post [0] on the subject of IPv6 subnetting.
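The prefix arithmetic behind these numbers is simple: an IPv6 address is 128 bits, so a prefix of length n leaves 128 - n host bits. A quick sketch (2001:db8::/48 is just the reserved documentation prefix, used here as a stand-in):

```python
import ipaddress

# An IPv6 address has 128 bits, so a prefix of length n leaves 128 - n bits.
def addresses_in(prefix_len):
    return 2 ** (128 - prefix_len)

per_slash64 = addresses_in(64)          # 2**64, the "18 quintillion" figure
slash64s_in_slash48 = addresses_in(48) // addresses_in(64)  # 65,536 /64 networks

# The stdlib ipaddress module agrees:
net = ipaddress.ip_network("2001:db8::/48")
n_subnets = sum(1 for _ in net.subnets(new_prefix=64))  # 65536
```

So a /48 isn't "billions of IPs" but 65,536 entire /64 networks, each of which is itself the 18-quintillion-address block discussed above.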
This isn't really my scene, but the Link to the Past // Super Metroid randomiser [1], where you keep switching between the two games, potentially discovering items from one in the other, is impressive in that it's even technically possible.