mattgreenrocks's comments

It’s not about perfectly architected code. It’s more about code that is factored in such a way that you can extend/tweak it without needing to keep the whole of the system in your head at all times.

It’s fascinating to watch the sudden resurgence of interest in software architecture now that people are finding it helps LLMs move quickly. It has always been similarly beneficial for humans. It’s not rocket science; it got maligned because it couldn’t be reduced to an npm package or a discrete process that anyone could follow.


Which makes me wonder: how is serving static content at all nondeterministic?

Yet it still fumbles even when limiting context.

Asked it to spot-check a simple rate limiter I wrote in TS. Super basic algorithm: allow at most one action every 250ms, sleeping if necessary. It found bogus errors in my code three times because it failed to see that I was using a mutex to prevent reentrancy. This was about 12 lines of code in total.
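
For the curious, here's a minimal sketch of the kind of limiter described (not my original code; the promise-chain mutex and the names are just illustrative):

    // Allow at most one action every intervalMs, serializing callers
    // through a promise-chain mutex so the check-then-sleep-then-run
    // sequence can't interleave (the reentrancy the mutex prevents).
    class RateLimiter {
      private last = 0;                                // time of the last action
      private lock: Promise<void> = Promise.resolve(); // promise-chain mutex

      constructor(private readonly intervalMs = 250) {}

      async run<T>(action: () => Promise<T>): Promise<T> {
        const prev = this.lock;
        let release!: () => void;
        this.lock = new Promise<void>((res) => { release = () => res(); });
        await prev; // wait for the previous holder to release
        try {
          const wait = this.last + this.intervalMs - Date.now();
          if (wait > 0) await new Promise((r) => setTimeout(r, wait));
          this.last = Date.now();
          return await action();
        } finally {
          release();
        }
      }
    }

Holding the lock across both the sleep and the action is the point: without it, two concurrent callers can each read a stale `last` and fire less than 250ms apart.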

My rubber duck debugging session was insightful only because I had to reason through the lack of understanding on its part and argue with it.


Once you've gone through that, you might want to ask it to codify what it learned from you so you don't have to repeat it next time.

I would love to see that code.

Try again with gpt-5.3-codex xhigh.

The goalposts have been moved so many times that they’re not even on the playing field.

Try again with Opus 4.5

Try again with Sonnet 4

Try again with GPT-4.1

Here I thought these things were supposed to be able to handle twelve lines of code, but they just get worse.


If you think AGI is at hand why are you trying to sway a bunch of internet randos who don’t get it? :) Use those god-like powers to make the life you want while it’s still under the radar.

How do you take over the world with access to 1000 normal people? That's what AGI gives you under the original definition (long forgotten by now): surpassing the MEDIAN human at almost all tasks. How the rebranding of ASI into AGI happened without anyone noticing is kind of insane.

IMO, you need to have the capacity to write Good Code to know what Good Enough Code is. It's highly contextual to a particular problem and season in a codebase's life. One example: ugly code that upholds an architecture that confers conceptual leverage on a problem. Most of the codebase can operate as if some gnarly problem is solved, without having to grapple with it directly. Think about the virtual memory subsystem of an OS.
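
A tiny hypothetical sketch of that shape (the interface and names are invented for illustration):

    // The rest of the system codes against this contract and never
    // sees eviction, dirty tracking, or disk I/O. The implementation
    // behind it can be gnarly as long as the contract holds.
    interface PageStore {
      read(pageId: number): Uint8Array;
      write(pageId: number, data: Uint8Array): void;
      flush(): Promise<void>;
    }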

The problem with this argument is many do not believe this sort of leverage is possible outside of a select few domains, so we're sort of condemned to stay at a low level of abstraction. We comfort ourselves by saying it is pragmatic.

LLMs target this because the vast, vast majority of code is not written like this, for better or for worse. (It's not a value judgment; it just is.) This is a continuation (couldn't resist) of the trend away from things like SICP. Even the SICP authors admitted programming had become more about experimentation and gluing together ready-made parts than building beautifully layered abstractions from which programs just fall out easily.

I don't agree with the author, BTW. Good code is needed for certain things. It's just that a lot of the industry really tries to beat it out of you. That's been the case for a while. What's different now is that devs themselves are seemingly joining in (or at least are being perceived to be).


> IMO, you need to have the capacity to write Good Code to know what Good Enough Code is.

I completely agree, and it's one of the biggest problems with trying to talk about "how you use agents". People using the same agents with the same workflow may see wildly different results depending on their ability to evaluate the end result.

> The problem with this argument is many do not believe this sort of leverage is possible outside of a select few domains, so we're sort of condemned to stay at a low level of abstraction.

I think there's a similar, tangential problem to consider here: people don't think they are the person to create the serious abstraction that saves every future developer X amount of time, because it's so easy to write the glue code every time. A world where every library or API was as well thought out as the virtual memory subsystem would be overspecified, but it would also enable creations far beyond the ones we see today (imo).

> Even the SICP authors admitted programming had become more about experimentation and gluing together ready-made parts than building beautifully layered abstractions from which programs just fall out easily.


> Plus, it makes a natural moat against masses of normal (i.e. poor) people, because it requires a spaceship to run. Finally, intelligence can also be controlled by capital the way it was meant to be, joining information, creativity, means of production, communication, and such things

I'd put intelligence in quotes there, but it doesn't detract from the point.

It is astounding to me how willfully ignorant people are being about the massive aggregation of power that's going on here. In retrospect, I don't think they're ignorant; they just haven't had to think about it much in the past. But this is a real problem with very real consequences. Sovereignty must occasionally be asserted, or someone will infringe upon it.

That's exactly what's happening here.


>massive aggregation of power that's going on here

Which has been happening since, what, at least the bad old IBM days, and nobody's done a thing about it?

I've given up tbh. It's like the apathetic masses want the billionaires to become trillionaires as long as they get their tiktok fix.


> It's like the apathetic masses want the billionaires to become trillionaires as long as they get their tiktok fix.

It's much worse. A large demographic of Hacker News loves generative AI. These are usually highly educated people showing their true faces on the plethora of problems this technology generates and the norms it violates.


There's nothing wrong with generative AI as a technology.

The problem is that it's in the hands of sociopaths. But that is a general problem with our socioeconomic system, not with AI.


>I've given up tbh. It's like the apathetic masses want the billionaires to become trillionaires as long as they get their tiktok fix.

Especially at the cost of diverting power and water from farmers and the people who need them. And the benefit of the AI seems quite limited, judging from the recent Signal post here on HN.


Water for farmers is its own pile of bullshit. Beef uses a stupid amount of water. Same with almonds. If you're actually worried about feeding people, and not just producing an expensive economic product, you're not going to make them.

Same goes for people living in deserts where we have to ship water thousands of miles.

Give me a break.


And one of my favorites: alfalfa grown in Arizona for Saudi Arabian horses.

Water usage must be taxed according to its use, unfortunately.


Very important. There is more than just 1 bullshit line of business.

Popularity’s flip side is that it can fuel commodification.

I'd argue popularity is an insufficient signal. React as tech is fine, but the market of devs it is aimed at may not be the most discerning when it comes to quality.


It’s wild to me that we both see people like Jensen as great while also tolerating public whining of the sort in the linked article. Don’t get me wrong, there are people who are far worse! But why do we put up with a billionaire whining that people are critical of what they make? At that scale it is guaranteed to have haters. It’s just statistics, man.

Indices are fine. Fixating on the “right” shape of the solution is your hang-up here. Different languages want different things. Fighting them never ends well.

The right job for a person depends on whether they can rise above the specific flavor of pain that the job dishes out. BigTech jobs strike me as having an inextricable political element to them: do you enjoy jockeying for titles and navigating constant reorgs?

The pay is nice but I find myself…remarkably unenvious as I get older.


Big companies are political and re-orgs lead to layoffs. Startups are a constant battle for funding and go out of business. Small companies mean a lot of exposure to bad management and budget issues. Charities are highly regulated and audited environments. Government jobs have no perks and entrenched middle management.

Every type of work has its idiosyncrasies, which people will either get on with or not. Mentioning one without the others is a bit disingenuous, or it's whatever the opposite of the grass-is-greener bias is.

