Wasn't most of that caused by the 2022 change (Section 174) requiring R&D expenses to be amortized over several years instead of deducted immediately, thus making R&D spending (like retaining dev staff) less financially attractive?
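To make the incentive concrete, here's a back-of-the-envelope sketch with assumed figures (the $1M spend is hypothetical; the 5-year schedule, mid-year convention, and 21% federal rate are the commonly cited Section 174 parameters for domestic R&D):

```python
# Hedged illustration of the 2022 Section 174 change, with assumed figures.
# Before: domestic R&D (e.g. developer salaries) was fully deductible in the
# year it was paid. After: it must be amortized over 5 years with a mid-year
# convention, so only 10% is deductible in year one.

salary_spend = 1_000_000   # hypothetical annual dev payroll counted as R&D
tax_rate = 0.21            # US federal corporate rate

deduction_before = salary_spend            # full deduction in year one
deduction_after_y1 = salary_spend / 5 / 2  # 1/5 per year, halved in year one

extra_taxable_income_y1 = deduction_before - deduction_after_y1
extra_tax_y1 = extra_taxable_income_y1 * tax_rate

print(f"Year-one deduction drops from {deduction_before:,.0f} "
      f"to {deduction_after_y1:,.0f}")
print(f"Extra year-one tax bill: {extra_tax_y1:,.0f}")
```

So on paper the same headcount suddenly costs roughly $189k more in year-one taxes per $1M of payroll, which is the mechanism people point to for the layoffs.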
Yes. US big tech have regional offices in loads of other countries too, and they fired loads of those developers at the same time, so the US job market collapse affected everyone.
And since then there's been a constant doom and gloom narrative even before AI started.
The same thing happened to farmers during the industrial revolution, to horse-drawn carriage drivers, to accountants and mathematicians when Excel came along, and on and on the list goes. Just part of human progress.
I don't know, I go back and forth a bit. The thing that makes me skeptical is this: where is the training data that contains the experiences and thought processes that senior developers, architects, and engineering managers go through to gain the insight they hold?
I don't have all the variables (OpenAI's financials, debt, etc.), but a few articles mention that these companies offload part of their work to their {claude,gemini,chatgpt} code agents internally, with good results. It's a first step in a singularity-like ramp-up.
People think they'll have jobs maintaining AI output, but I don't see how maintaining is much harder than creating for an LLM able to digest requirements and a codebase and iterate until a working build runs.
I don't think so either; people forget that the agents themselves are also developing.
Back then, we pasted all the source code into the AI to create things; then we manually added files to the context; now agents find the files they need on their own. I think we can do even better by letting the AI generate file and API documentation, reading a full file only when it's really needed, and selecting just the APIs and docs relevant to the task. And I bet more is possible, including skills and MCP on top.
So not only are the LLMs getting better, but so is the software built around them.
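The "index first, read files on demand" idea above can be sketched in a few lines. Everything here is hypothetical (the file contents, the `select_files` heuristic standing in for the LLM's judgment); real agents do this with model calls, not keyword matching:

```python
# Minimal sketch of an "index first, read on demand" context strategy.
# A one-line summary per file stays in context; full source is loaded
# only for files judged relevant to the task. All names are made up.

FILES = {
    "auth.py": "def login(user, pw): ...",
    "billing.py": "def charge(user, amount): ...",
}

def build_index(files):
    """Cheap per-file summary -- the only thing kept in context by default."""
    return {name: src.splitlines()[0] for name, src in files.items()}

def select_files(task, index):
    """Naive keyword filter standing in for the LLM deciding what to read."""
    words = task.split()
    return [name for name, summary in index.items()
            if any(word in summary for word in words)]

def build_context(task, files):
    index = build_index(files)
    chosen = select_files(task, index)
    # Only the chosen files' full source enters the prompt window.
    return {name: files[name] for name in chosen}

context = build_context("fix the login flow", FILES)
print(sorted(context))  # only auth.py makes it into context
```

The point isn't the toy heuristic, it's the shape: the scaffolding around the model decides what the model gets to see, and that scaffolding keeps improving independently of the model itself.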