Specification-driven design has been around for a long time. Let's see if AI can make it come true. Dr. David Parnas wrote about Parnas Tables https://research.cs.queensu.ca/home/cisc323/2006w/slides/Bil... ; then there was Eiffel with Design by Contract; and the type system in F# seems like magic.
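For readers who haven't seen Design by Contract outside Eiffel, the core idea (preconditions the caller must meet, postconditions the implementation must guarantee) translates to any language. A minimal sketch in Python, using plain asserts; the function and its contracts are illustrative, not taken from any of the works above:

```python
def isqrt_floor(n: int) -> int:
    """Floor of the square root, with Eiffel-style contracts as asserts."""
    # Precondition (Eiffel's "require"): caller must pass a non-negative int.
    assert isinstance(n, int) and n >= 0, "precondition violated: n >= 0"
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
    # Postcondition (Eiffel's "ensure"): r is the largest int with r*r <= n.
    assert r * r <= n < (r + 1) * (r + 1), "postcondition violated"
    return r

print(isqrt_floor(10))  # 3
```

Eiffel checks these contracts automatically and inherits them along the class hierarchy; asserts are the poor man's version, but they make the spec executable.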
Look, maybe AI for professional programming is overhyped, but for the huge number of professionals who need to program as a part of their job, but programming is decidedly NOT their job, but who do not have programmers available to them, LLMs are monumental. For these people, these LLMs are huge productivity enhancers. We shouldn't be measuring productivity gain by professional programmers, but by the larger population of professionals who are required to wear many different hats.
Look, maybe low-code for professional programming is overhyped, but for the huge number of professionals who need to program as a part of their job, but programming is decidedly NOT their job, but who do not have programmers available to them, low-code is monumental. For these people, these low-code systems are huge productivity enhancers. We shouldn't be measuring productivity gain by professional programmers, but by the larger population of professionals who are required to wear many different hats.
Meanwhile, my company finally shut down its last low-code instance this year, after 10 years of struggling to maintain a mountain of unmanageable, slow-ass code, all the while paying through the nose for the licenses. Spoiler alert: literally zero non-programmers in my company ever constructed even a single low-code "thing", as was advertised initially. Every person who worked with it was a programmer, or a QA who could program (aka an SDET), and they simply had to suffer all the limitations of the low-code platform while gaining none of the supposed benefits.
I suspect it will end up this way with LLM-generated code. At most, managers will generate some non-operational prototype of an app and then throw it to the dev team, who will have to spend time deconstructing it and then rewriting it from scratch (with or without LLM assistance).
The thing is, the main reason for the hype is that software engineers are the main cost of extremely lucrative SaaS companies. If you could do software without the engineers, that's basically a VC's wet dream: an extremely low cost of revenue, on top of very low marginal costs, typically zero delivery costs (compared to shipping), often unencumbered by things like tariffs.
So like, 2-10xing the productivity of someone who gets paid <<< a software engineer, and whose work is more 1:1 with the profit from each widget? They don't care that much about that.
But juicing an already extremely juicy fruit? Now that's worth 70x ARR.
Except now you have oversupply (assuming the promise really is fulfilled). Even with steady demand, basic economics says it will just push prices down. So where are the VCs making their returns?
> For these people, these LLMs are huge productivity enhancers.
Yes, it allows them to solve their desperate need to combine spreadsheets, creating one big merged spreadsheet with pockets of corrupted data and fake last-names. It either lies undiscovered until clients complain, or I notice and have to redo the work for them properly, without an LLM. :p
There is a big gulf between know-nothing vibe-coding and a professional programmer using an LLM, with lots and lots of people somewhere in the middle who can code, but slowly, because they are always learning on the fly, coding just a few hours a week, targeted at small goals like data wrangling in a pipeline or small applications to replace clunky, error-prone Excel workbooks. No, I don't have a study to back it up, just my own experience of how much I have done this last year. Maybe it doesn't apply to corps with a lot of top-down bureaucracy, but in small to medium-sized enterprises, I am convinced it is an engine of agile innovation.
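As one concrete example of the kind of small tool this middle group builds (file names and data are made up for illustration): a short script that joins two CSV exports on a shared key and, unlike a careless copy-paste merge, surfaces the rows that failed to match instead of silently dropping or corrupting them.

```python
import csv
import io

# Two "spreadsheets" as CSV text; in practice these would be open files.
people = "id,name\n1,Alice\n2,Bob\n"
salaries = "id,salary\n1,100\n3,250\n"

def merge_on_id(a_csv, b_csv):
    """Join two CSV sources on 'id', reporting keys found in only one side."""
    a = {row["id"]: row for row in csv.DictReader(io.StringIO(a_csv))}
    b = {row["id"]: row for row in csv.DictReader(io.StringIO(b_csv))}
    merged = [{**a[k], **b[k]} for k in sorted(set(a) & set(b))]
    unmatched = sorted(set(a) ^ set(b))  # symmetric difference: orphan keys
    return merged, unmatched

rows, missing = merge_on_id(people, salaries)
print(rows)     # [{'id': '1', 'name': 'Alice', 'salary': '100'}]
print(missing)  # ['2', '3'] -- keys present in only one file
```

Ten lines, stdlib only, and the failure mode is a visible list of orphan keys rather than a quietly wrong spreadsheet.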
Differences in LLM usage are common, so I guess we'll just "argue" here :)
Yesterday I was talking with the head of a company that has been developing software for 30 years, and he told me they had some old assembly code left over; the original programmer retired years ago, so they converted the code with the help of an LLM...
I use LLMs to create scripts for various manipulations and data extraction from documents; once I even managed to create a working "Wikipedia" after several attempts at making the prompt as accurate as possible...
The question is not whether I could have done it, but how long it would have taken me without an LLM.
But looking for bugs in "someone else's" code that I created is a scary, frustrating job, especially when the LLM hallucinates extensively somewhere in the middle of the work :)
The question is, will the vibe coder care at all about those bugs? Will the company using vibe-coded apps care about those bugs? Or will they just add another requirement to the prompt and churn out the next version? Yes, they will never have a guarantee that the new version is better, but arguably most software products don't have such guarantees either. So are we users worse off, or about the same? Because if we're about the same, and the producer saves a shitload of money, you can guess where the future is.
Learning to code is easier than ever today; it's no longer the domain of a few elite nerds sitting behind desks in large offices. It takes less than two days to learn Python from free online resources like YouTube and SoloLearn; surely a small price to pay compared to a lifetime of limping along on LLM crutches?
While I agree with the sentiment that tech workers should know how to code at least a bit, I think "2 days" is quite a bit of an exaggeration unless someone already has a lot of prerequisite knowledge.
> McKinsey stated AI could “automate work activities that absorb 60 to 70 percent of employees’ time today” across all occupations, not specifically software development.
I would wager a large portion of that would be both generating and then triaging a certain level of institutional "bullshit communications".
For example, the employee must give daily status updates, and they have a three-item list for the day. They use an LLM to pad out their three-item bullet list into prose that sounds smart and active and impressive.
Then their manager says "TLDR" and uses an LLM to "summarize" it back into almost the original three-item bullet list. Each employee believes they've "saved time" using the tool, but it's a kind of broken-window fallacy [0].
Haha, in my workplace, I feel it's making me come up with BS. I do use it occasionally in my job, and it can be helpful. But there's a very strong push from my C-suite to make everyone in the company use more AI. Doesn't really matter how, we're just supposed to be rubbing AI on stuff. So a few times a day I boop our AI tools and find excuses to mention to my boss that I did an AI thing.
Without a definition of thought as work, the entire argument about AI productivity is noise. The value of AI to work is utterly unquantifiable.
But command of human attention is another matter: it can be quantified, and it is command of attention that defines the value of FAANG+.
The AI boosters hype it in exactly those terms, both inside and outside the models: track the attention, sell seats in the musical chairs of "investment", and leave it to users to explain the productivity gains to themselves.
I always find the "X% of developers use AI tools for their work" claims suspect, because I know many people who are being pressured by their company to adopt AI tools for their day-to-day workflows.
As they should be. It’s a necessary tool to learn. If your entire dev team circa 2010 was writing code in notepad++ and sharing code versions in a zip file, you would push them to use an IDE and source control. It’s the same concept.
Did you read the article? It cites the METR study[1], which found that while people using AI tools to program report feeling like they are producing more, their tasks actually took about 19% longer than without the tools.
Ironically you could get the same effect and save compute fees by simply having programmers stay home one day a week.
That study is straight out of the school that measures productivity in kLOC. Completely worthless except as fuel for Internet arguments and poorly-informed policymaking.
You act like that's a "gotcha" instead of a normal thing. All they mean by that [0] is that they can't mathematically prove their developers/tasks/tools are representative of the majority of worldwide developers/tasks/tools.
You're demanding an unreasonable level of investment for anyone to "prove a negative."
The burden of proof lies on the people claiming zillion-fold boosts in productivity across "enough" places that they don't really define. This is especially true because they could profit in the process, as opposed to other people burning money to prove a point.
If you want to shit-talk LLMs, you'd better come armed with research, buddy. Claims about how they will revolutionise every profession just need n=1 anecdata, though.
Historical comparison: "I just had a pizza delivered on the new Segway and it was super duper cool because they came right into the conference center, so say goodbye to cars and bikes, by 2025 it's all going to be Personal People Movers!"
That said, I think LLMs will have a bigger effect than a self-balancing scooter, both positively and negatively.
Correct, the study showed that it slows down experienced developers. We don't know what it does to inexperienced developers so that sounds like a good research topic. But it still leads to the question of why experienced developers should be told to adopt it, given that it slows them down.
Revenue does exist, showing demand/value. But a valuation multiple of 25-70x ARR is firmly in "Cap'n, the engine's gonna blow any minute, sir" territory. Also, given the current design surface, rapid changes to the "core intelligence layer/substrate", and good usage workflow patterns that are still being discovered... we are probably creating tech debt faster than value. Specs are great, but we also discover the [stable/actual/valid] spec as we build (and often at the end of the build).
The “X% AI coded” metric needs to die. It is completely meaningless.
The nature of development work is changing. A project can be 100% written by AI but guided so closely by humans that the process wasn’t much faster. Alternatively, AI can make a project faster by helping with architecture and other things beyond writing code.
When people quote “70% AI coded”, it implies that 100% is some mythical goal that means AI is writing all code unsupervised. But most production AI code at the moment is still developed in lockstep with humans.
Everyone knows this, including the companies that say these things and their customers.
They leverage/exploit the fact that it takes time to verify anything (the hype-verification gap).
Their theory goes: if you use that time delay to beat the drum louder than the next guy, you have a shot at attracting more customers, investors, employees, and government support than if you don't. Those who don't do it appear less ambitious or capable. Exploit and amplify that too.
Chimps have 3-inch brains, and this is all easier to do than solving quantum field equations. So they do it happily, cluelessly, patting each other on the head about how fascinating they are.
As soon as one chimp troupe (corp) does it well, everyone else does the same, and that's when the entire system experiences a phase transition.
It's no longer about what individual chimps or companies say or think; a superstructure has emerged that traps everyone inside it, into patterns they can't escape.
Stories emerge about what the superstructure is doing. Outsider and insider stories start diverging, and we get a bifurcation point, a moment in chaos theory where one neat line splits into 2, 4, 1000, until you can't tell what's signal and what's noise[1]. It's all feedback, where every claim feeds on every other claim, a forest of branches growing out of thin air, and verification moves too slowly to prune it.
So what's the message to the kids watching it all and getting absorbed by it?
Recognize when you have jumped into the chaos stream between bifurcation and verification. When everything starts to sound urgent, revolutionary, world-changing, but no one can show you how or why, you're standing dead center in chaos. Stories multiply faster than facts can catch up, and everyone's pretending to see patterns that aren't there yet. The louder it gets, the less meaning there is. That's your cue to walk sideways. Don't let it drain your attention, time, and energy. There are better things to do in life.
[1]https://upload.wikimedia.org/wikipedia/commons/thumb/c/c8/Lo...