Hacker News | 9rx's comments

> How long do you think it'll take for the AI trend to mostly automate the parts of your job that still make you excited?

The exciting part of the job is, and always has been, listening to idle chitchat where you pick up on the subtle cues of where one is finding difficulty in their life and then solving those problems. I think AI could already largely handle that today just fine, except:

You have to convince people, especially non-technical people, to have idle chitchat with machines instead of humans

-or-

Convince them to accept a machine always listening in on their idle conversations with humans

Neither of those are all that palatable in the current social landscape. If anything, people seem to be growing more wary of letting technology into their thoughts. Maybe there is never a future where humans become accepting of machines being always there trying to figure out what is wrong with them.

The trouble with AI replacing jobs is that a lot of jobs exist only because people want to have other people to talk to and are willing to pay for the company.


> and your app doesn't have 300% test coverage with fuzzing like SQLite does

Surely it does? Otherwise you cannot trust the interface point with SQLite and you're no further ahead. SQLite being flawless doesn't mean much if you screw things up before getting to it.


That's true, but relying on a highly tested component like SQLite means that you can focus your tests on the interface and your business logic, i.e. you can test that you are persisting to your datastore rather than testing that your datastore implementation is valid.

Your business logic tests will already, by osmosis, exercise the backing data store in every conceivable way to the fundamental extent that is possible with testing given finite time. If that's not the case, your business logic tests have cases that have been overlooked. Choosing SQLite does mean that it will also be tested for code paths that your application will never touch, but who cares about that? It makes no difference if code that is never executed is theoretically buggy.
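To make the "by osmosis" point concrete, here is a minimal Python sketch. The business layer and its names (`make_store`, `save_order`, `get_order`) are hypothetical, invented for illustration; the idea is that a test written purely against business operations still exercises the SQLite paths underneath.

```python
import sqlite3

# Hypothetical business layer backed by SQLite (names are illustrative).
def make_store():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
    return conn

def save_order(conn, total):
    cur = conn.execute("INSERT INTO orders (total) VALUES (?)", (total,))
    conn.commit()
    return cur.lastrowid

def get_order(conn, order_id):
    row = conn.execute(
        "SELECT total FROM orders WHERE id = ?", (order_id,)
    ).fetchone()
    return row[0] if row else None

# A business-logic test: it never tests SQLite directly, yet every
# assertion here also runs the INSERT/SELECT code paths underneath it.
conn = make_store()
order_id = save_order(conn, 19.99)
assert get_order(conn, order_id) == 19.99
assert get_order(conn, 999) is None
```

The test only states business facts (a saved order can be read back; a missing order is absent), but a failure in the datastore wiring would surface here all the same.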

Business logic tests will rarely test what happens to your data if a machine loses power.

Then your business logic contains unspecified behaviour. Maybe you have a business situation where power loss conditions being unspecified is perfectly acceptable, but if that is so it doesn't really matter what happens to your backing data store either.

> what's special about 3d printers?

They are typically stocked with material and ready to deploy at a moment's notice. When the time comes that you need a weapon, casually walking into Home Depot won't be an option.


Global food production now produces more than enough calories to feed everyone, but still hasn't figured out how to produce enough nutrients to feed everyone.

I'm not really sure that holds. Carb-heavy grains form the bulk of global food production, but meats, legumes, fruits, etc. are still massively overproduced. Of course, I think you get at something very important to recognize which is that modern industrial variants of vegetables/fruits are bred in favor of crop yield, size, shelf-stability and visual appeal/consistency. Many of those contribute to lower nutrient densities.

I don't think it matters as far as getting enough micronutrients is concerned, but speaking from a lot of experience, "heirloom" produce is absolutely superior in terms of texture and flavor. It's not even close. I think we could really stand to put more effort into making robust food supply chains without turning everything into bland mush.


If you ignore waste then it is likely that we also produce enough nutrients. That isn't a useful way to look at it, however, as, just as with all things in life, losses are inherent. A 100% efficient system will never exist. Calories, though, we produce enough of even accounting for all the waste.

> If the market demands more chicken over beef, producers are perfectly capable of making a switch.

Depends on local laws. In Canada, you cannot simply switch to chicken. It is supply managed.


The vast majority of beef is finished on grain, but starts on grass.

> The article suggests that there is a lot of programming being done without considering what exactly needs to be programmed.

And the parent rightfully points out that you cannot know exactly what needs to be programmed until after you've done it and have measured the outcome. We literally call the process development, for good reason. Software is built on hunches, and necessarily so. There is an assumption that in the future the cost of the work will pay back in spades, but until you arrive in that future, who knows? Hence why businesses focus on metrics that try to observe progress towards finding out rather than tracking immediate economic payoff.

The interesting takeaway from the article, if you haven't given this topic much thought already, is that the changing financial landscape means that businesses are going to be more hesitant to take those risks. Right now there still seems to be enough optimism in AI payoffs to keep things relatively alive, but if that runs out of steam...


What is hard about it? Young children seem to pick it up with ease. It cannot be that hard?

Determining what to program can be hard, but that was already considered earlier.

The only other place where I sometimes see it become hard for some people is where they treat programming as an art and are always going down crazy rabbit holes to chase their artistic vision. Although I would say that isn't so much that programming is hard, but rather art that is trying to push boundaries is hard. That is something that holds regardless of the artistic medium.


> What is hard about it? Young children seem to pick it up with ease. It cannot be that hard?

That's like saying "becoming a writer can't be that hard, since kids learn how to write in the elementary school".

Given a set of requirements, there are many different ways to write a program to satisfy them. Some of those programs will be more efficient than others. Some will scale better. Some will end up having subtle bugs that are hard to reproduce.
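As a small illustration of "many programs satisfy the same requirement, with different efficiency", here is a hedged Python sketch: two sorts that both meet the requirement "return the items in ascending order", where one scales quadratically and the other uses the language's built-in sort. The function name `bubble_sorted` is invented for this example.

```python
# Two programs satisfying the same requirement ("return the items in
# ascending order"); one scales far worse than the other.
def bubble_sorted(items):
    out = list(items)
    for i in range(len(out)):            # O(n^2) comparisons overall
        for j in range(len(out) - 1 - i):
            if out[j] > out[j + 1]:
                out[j], out[j + 1] = out[j + 1], out[j]
    return out

data = [5, 3, 1, 4, 2]
# Both meet the requirement; only their scaling behavior differs.
assert bubble_sorted(data) == sorted(data) == [1, 2, 3, 4, 5]
```

Externally the two are indistinguishable on small inputs, which is exactly why the "subtle" differences (scaling, edge cases) are where the difficulty hides.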


> That's like saying "becoming a writer can't be that hard, since kids learn how to write in the elementary school".

Is writing hard? I expect most can agree that determining what to write, especially if you have an objective (e.g. becoming a best-selling novelist), can be extremely hard — but writing itself?

> there are many different ways to write a program to satisfy them.

"What to program" being hard was accepted from the onset and so far we see no disagreement with that.


> Is writing hard? I expect most can agree that determining what to write, especially if you have an objective (e.g. becoming a best-selling novelist), can be extremely hard — but writing itself?

Being able to transcribe sentences in a certain language is the skill kids pick up in elementary schools. Being a writer requires a whole set of skills built on top of that.

The reason why I brought up that difference in the first place is because both of these are called "writing". When a fan says "I heard the author is writing the next book in the series" or when an author says "I haven't been able to focus on writing due to my health issues", they're not talking about the low-level transcription skill.

> "What to program" being hard was accepted from the onset and so far we see no disagreement with that.

Similar to your interpretation of "writing", you're choosing to interpret "programming" as a process of transcribing an algorithm into a certain programming language, and everything else ends up being defined as "what to program".

That's an overly reductive interpretation, given the original context:

> For reasons which it would take a while to unpack, it is often the case that the best (or sometimes only) way to find out what programming actually needs to be done, is to program something that's not it, and then replace it. This may need to be done multiple times. Programming is only occasionally the final product, it is much more often the means of working through what it is that is actually needed.

> [...]

> Most of what is being done, during programming, is working through the problem space in a way which will make it more obvious what your mistakes are, in your understanding of the problem and what a solution would look like.

Notice that the original comment defines "determining what to program" as a process of refining your understanding of the problem itself.

In my reading of the original comment, understanding what your users need is "what to program". Writing code that solves your users' requirements is "programming".


> Writing code that solves your users' requirements is "programming".

For me, I need to have a solution figured out before writing code. I am not even sure how you could write code before having the problem solved. An explanation of your approach would be insightful.

Like, I get that it is effectively impossible to gather all user requirements upfront and that you need to build a product to find out what details you overlooked. That means software is an iterative process. But within each iteration, surely you still need to have the solution — to the extent that it satisfies the known requirements — prepared before you can write code? Maybe if you had an infinite number of monkeys you could have them bang out something that accidentally works by throwing down random keywords, but in the real world where it is only you and your code has to be meaningful, what you program is simply the expression of what to program.

Writing code is just a means of conveyance, no?


> Writing code is just means of conveyance, no?

Yes, which is why I have been making the distinction between "programming" and "writing code" all this time.

Programming is hard because it's not merely writing code. Determining what to program is not the same as determining what code to write. "What to program" is about requirements. Going from "what to program" to "what code to write" is what programming is about.


> Going from "what to program" to "what code to write" is what programming is about.

Once your requirements are established there isn't anything left to choose from, other than maybe whether to use a while loop instead of a for loop — stuff that makes absolutely no difference. The structure of your code, the algorithms you choose, etc. are all dictated by the requirements. So what lies in this nebulous in-between state and what makes it so hard? Is it choosing between for and while what you think is hard?
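The "while vs for" equivalence above can be sketched in a few lines of Python (the function names are invented for illustration): two syntactic renderings of the same requirement with no observable difference between them.

```python
# Two syntactic renderings of the same requirement ("sum the integers
# from 0 up to n-1"); the choice between them changes nothing observable.
def sum_with_for(n):
    total = 0
    for i in range(n):
        total += i
    return total

def sum_with_while(n):
    total, i = 0, 0
    while i < n:
        total += i
        i += 1
    return total

assert sum_with_for(5) == sum_with_while(5) == 10
```

Both compile down to the same behavior for every input, which is the sense in which the choice is "stuff that makes absolutely no difference".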


> The structure of your code, the algorithms you choose, etc. are all dictated by the requirements.

Only if you expand the meaning of the word "requirements" to encompass a full specification of the solution.

> Is it choosing between for and while what you think is hard?

You want to know what I think? I think this conversation is crossing into rudeness.


> Only if you expand the meaning of the word "requirements" to encompass a full specification of the solution.

They are one and the same, no? Why would one write code that isn't required?

Are you referring to the aforementioned iteration process where the act of writing code and measuring the results will lead to realizing that not all requirements have been gathered? When you start with bubble sort and then users complain that your program is too slow, you will realize that you missed efficiency as a requirement when you were first determining "what to write", sure, but theoretically the requirement was there all along. I don't think when you discover "what to write" really matters with respect to the discussion. The process was understood in the original comment.

We can recognize that transient state where one hypothetically thought bubble sort was a suitable algorithm for the user's requirements when it actually wasn't. But isn't it still chosen under some misaligned understanding of the user requirements, like seeing bubble sort as being quick to implement and thinking fastest delivery is the requirement in the moment? The choice to use bubble sort is not random. I'm not sure refinement as more information becomes available changes anything.

> You want to know what I think? I think this conversation is crossing into rudeness.

True. I can think of nothing more rude than asking for clarification in an effort to understand someone. My apologies. I will only talk past you henceforth... But seriously, appeal to emotion is a poor device. There is no good faith discussion that can venture into the world of logical fallacies. What were you trying to accomplish here?


If good writing was easy then "LLM slop writing" wouldn't be a thing.

Not at all. LLM slop exists exactly because writing is easy, but figuring out what to write is hard.

You sound like you would confidently say that you can play chess. Basic moves are easy to learn by very young children.

But if the only thing you know is the basic moves, you are not going to win against a 1600 Elo player without serious training, and 1600 is still far below grandmaster level.


Absolutely. I am quite capable of moving the pieces around a chess board within the confines of the rules. I think you would be hard-pressed to find many who are incapable of that, given exposure to the game. If that isn't easy, what is? I am not all that good at figuring out what moves to make, but that is analogous to "what to program", not "programming", as it pertains to the discussion that has been taking place. Nobody has ever suggested "what to program" is easy.

You missed the point.

At some level, even if you know the basic moves, the moves you make are wrong.

Some things are hard to express in code even if you know exactly what you need to achieve and you know all the basic moves like loops and if statements.

If you know that you have to checkmate or reach a certain number of points, and you know how to make the basic moves but don't know any openings, you are going to lose in 3-5 moves. If you get past the opening you might get to 15 moves. If you take the easiest greedy approach you lose; if you take the easiest defensive approach you lose.

It is not "what to program", because in chess you know exactly what the goal is. Getting to that goal alone is hard.


> Some things are hard to express in code even if you know exactly what you need to achieve and you know all the basic moves like loops and if statements.

Like what? Are you, perhaps, confusing "hard" with "time consuming"? Some things take a long time to express in code (absent AI, at least). It's not hard, though. It's just rote transcription of what you have already determined needs to be programmed. Getting to the point where it is just rote transcription can be difficult, but that is decidedly in the "what to program" phase.


Well we don’t operate on the same definitions so I agree to disagree and move on.

Giving an example will take too much time from my perspective in regards to how much time I am willing to spend here arguing. Have a great day


So what you are saying is that writing is easy, but figuring out what to write is hard?

> Young children seem to pick it up with ease. It cannot be that hard

It is the other way around. Children can pick up a lot of skills that adults struggle at, like languages for example.

Plenty of research has shown that the brain's reduced plasticity as it ages correlates strongly with slower learning and reduced ability. Most breakthrough research happens at age 40 or younger, and chess grandmasters' skill fades after that. 25-40 is probably the age range with the optimal balance between knowledge, experience, and learning ability for the best outcomes.


> What is hard about it? Young children seem to pick it up with ease. It cannot be that hard?

They do? I've known plenty of kids and young adults who utterly failed to become even borderline competent at programming.


They don't? It is taught in schools in the early elementary level. I see no indication that most are failing.

I think we can agree that few of them would be economically useful due to not knowing what to program. There is no sign of competency on that front. Certainly, even the best programmer in the world could theoretically be economically useless. Programmers only become economically useful when they can bridge "what to program".


> They don't? It is taught in schools in the early elementary level. I see no indication that most are failing.

Programming in elementary schools typically involves moving a turtle around on the screen. (My mother taught 4th grade in New York for many years, and I believe her when she explained the computer instruction.)

Economically valuable programming is much more complex than is taught in many schools through freshman college. (I taught programming at the college level from 1980 till I retired in 2020.)


Because economically valuable programming has to consider what to program, not simply follow the instructions handed down by a teacher of exactly where and how to move a turtle on the screen. But nobody questions that "what to program" is hard. It was explicitly asserted in the very first comment on this topic as being hard, and that has also carried through the comments that have followed.

If customers were willing to pay more, then a higher price wouldn't solve anything. The price is said to be too low exactly because people are trying to buy more than there is available to sell. The whole point of higher prices is to try and scare people away. Not enough supply and a price too low are the same thing.

It's not a significant concern because we've learned the hacks to work around it, but it is pretty freeing to not have to put hacks into your app.
