Software started as flipping switches, then moved to tape and punch cards, then to keyboards and the ability to write in words instead of bare machine logic. Soon we were in the print phase, where "print" meant putting physical words on physical paper; some time later you could actually view your program on a screen without needing to reprint it. A generation later we got code generation from a WSDL and squiggly lines under typos. A generation after that we get AI that tries to predict whether I'm more likely to buy the Smooshtek or the Prodex version of the exact same item on Amazon, but can't come close to solving unknown business problems independently.
I guess what I'm trying to say is that the first and last job of software is to eat itself. That's the environment, and if you want to make it making software, you have to learn how to use the new impossible thing that software can do, not take it as the brand new end of the world.
We could draw parallels to warfare, the one industry that has historically warped everything around itself. First we fought with spears, swords, and shields, then with primitive muskets. Suddenly tanks, planes, and helicopters join the fray. Then carriers, submarines, missiles, and targeting systems. Radar. Space satellites. I'd be surprised if there aren't secret projects that simulate the same potential conflict a million times to estimate the chance of success under new parameters.
And every time, the new advancement blows the previous ones out of the water, yet is open to its own exploitable faults. Tanks beat infantry in an open field, but they're far less suited to urban warfare if you don't want to leave the city in ruins.