The universal theme with general purpose technologies is 1) they start out lagging behind current practices in every context 2) they improve rapidly, but 3) they break through and surpass current practices in different contexts at different times.
What that means is that if you work in a certain context, for a while you keep seeing AI get a 0 because it is worse than the current process. Behind the scenes the underlying technology is improving rapidly, but because it hasn’t crossed the viability threshold you don’t feel it at all. From this vantage point, it is easy to dismiss the whole thing and forget about the slope, because the whole line is under the surface of usefulness in your context. The author has identified two cases where current AI is below the cusp of viability: design and large scale changes to a codebase (though Codex is cracking the second one quickly).
The hard and useful thing is not to find contexts where the general purpose technology gets a 0, but to surf the cusp of viability by finding incrementally harder problems that are newly solvable as the underlying technology improves. A very clear example of this is early Tesla surfing the reduction in Li-ion battery prices by starting with expensive sports cars, then luxury sedans, then normal cars. You can be sure that throughout the first two phases, everyone at GM and Toyota was saying: Li-ion batteries are totally infeasible for the consumers we prioritize who want affordable cars. By the time the technology is ready for sedans, Tesla has a 5 year lead.
> The universal theme with general purpose technologies is 1) they start out lagging behind current practices in every context 2) they improve rapidly, but 3) they break through and surpass current practices in different contexts at different times.
I think you should say successful "general purpose technologies". What you describe is what happens when things work out. Sometimes things stall at step 1, and the technology gets relegated to a footnote in the history books.
We don’t argue that microwaves will be ubiquitous (which they aren’t, but close enough). We argue that microwaves are not an artificial general barbecue, as the makers might wish were true.
And we argue that microwaves will indeed never replace your grill, as the makers, again, would love you to believe they will.
Your reasoning would be fine if there were a clear distinction, like between a microwave and a grill.
What we actually have is a physical system (the brain) that somehow implements what we know as the only approximation of general intelligence and artificial systems of various architectures (mostly transformers) that are intended to capture the essence of general intelligence.
We are not at the microwave and the grill stage. We are at the birds and the heavier-than-air contraptions stage, when it's not yet clear whether those particular models will fly, or whether they need more power, more control surfaces, or something else.
Heck, the frontier models have around 100 times fewer parameters than the most conservative estimate of the brain's equivalent parameter count: the number of synapses. And we are like "it won't fly".
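For a rough sense of scale, here is a back-of-envelope check on that "around 100 times" figure. Both numbers below are commonly cited order-of-magnitude estimates, not exact values:

    # Order-of-magnitude sanity check; both figures are rough assumptions.
    brain_synapses = 1e14          # human brain: on the order of 10^14 synapses
    frontier_model_params = 1e12   # largest current models: on the order of 10^12 parameters
    print(brain_synapses / frontier_model_params)   # -> 100.0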
There was a lot of hubris around microwaves. I remember a lot of images of full chickens being roasted in them. I've never once seen that "in the wild" as it were. They are good for reheating something that was produced earlier. Hey the metaphor is even better than I thought!
It has been many years since I switched to cooking only with microwaves, due to minimal wasted time and perfect reproducibility. And I normally eat only food that I cook myself from raw ingredients.
Attempting to roast a full chicken or turkey is not the correct way to use microwaves. You must first remove the bones from the bird, then cut the meat into bite-sized chunks. After using a boning knife for the former operation, I prefer to do the latter operation with some good Japanese kitchen scissors, as it is simpler and faster than with a knife.
If you buy turkey/chicken breasts or thighs without bones, then you have to do only the latter operation and cut them into bite-sized pieces.
Then you can roast the meat pieces in a closed glass vessel, without adding anything to the meat except salt and spices (i.e. no added water or oil). The microwave oven must be set to a relatively low power for a long time, e.g. 30 minutes @ 440 W for turkey meat, and somewhat less for chicken, e.g. 20 to 25 minutes. The exact values depend on the oven and on the amount of meat, but once determined by experiment they remain valid forever.
The meat cooked like this is practically identical to meat cooked on a covered grill (the kind with indirect heating, through hot air), but it is done faster and without needing any supervision. In my opinion it is the tastiest meat of any cooking method. However, I do not care about a roasted crust on the meat, which is also unhealthy, so I do not use the infrared lamp that the microwave oven has for making such a crust.
Vegetable garnishes, e.g. potatoes, must be microwaved separately from the meat, as they typically need much less time, usually under 10 minutes (but at higher power). Everything is mixed into the final dish after cooking, including things like added oil, which is better not heated to high temperatures.
Even without the constraints of a microwave oven, preparing meat like this makes much more sense than cooking whole birds or fish or whatever. Removing all the bones and any other inedible parts and cutting the meat into bite-sized pieces before cooking wastes much less time than having everybody repeat those operations at the table, so I consider serving whole animals at a meal just stupid, even if they may look appetizing to some people.
Neither "roasted" nor "steamed" are completely appropriate for this cooking method.
While there is steam in the vessel, it comes only from the water lost from the meat, not from any additional water, and the meat is heated by the microwaves, not by the steam.
Without keeping a lid on the cooking vessel, the loss of water is too rapid and the cooked meat becomes too dry. Even so, the weight of meat is reduced to about two thirds, due to the loss of water.
In the past, I roasted meat on a covered grill, where the enclosed air was heated by a gas burner through an opening on one side, at the bottom. With such a covered grill, the air in which the meat cooked also contained steam from the water lost by the meat, so the end result was very similar to the microwave cooking that I do now: it kept the meat from becoming too dry, unlike roasting on an open grill, while also concentrating the flavor instead of diluting it with added water or oil.
I greatly appreciate this type of thinking. If you debone the meat yourself, do you make stock or use the bones in any way? You obviously care about personal process optimization and health factors, and I'm curious to what extent you are thinking of the entire food/ingredient supply chain.
> It has been many years since I switched to cooking only with microwaves, due to minimal wasted time and perfect reproducibility. And I normally eat only food that I cook myself from raw ingredients.
Some of us also like the food to taste nice, besides the reproducibility of the results.
Like I have said, meat prepared in this way is the tastiest I have ever eaten, unless you are one of those who like the burned crust more than the real meat.
Even for the burned-crust lovers, many microwave ovens have lamps for this purpose, but I cannot say whether the lamps are good at making tasty burned crusts, because I have never used them; I prefer eating the meat to eating burned crusts.
The good taste of meat prepared in this way is not surprising: the meat is heated very uniformly, and its flavor is not leached away by added water or oil but concentrated by the water the meat loses. The taste is very similar to that of meat well cooked on a grill, neither burned nor under-cooked, and with nothing added but salt and spices.
“Burned crust” is not what people want. If it’s burnt it went too far. The fact that you are equating the two makes me think you haven’t ever had properly cooked meat or are very confused about what “burned” means. https://www.thetakeout.com/how-to-brown-meat-while-cooking-1...
This convo is hilarious, you rock. I'm not surprised someone was open-minded enough to master the art of cooking with a microwave. Also, there are different types of fast-cooking apparatus that are usually reserved for restaurants, but I could imagine they might be up your alley. (I can't recall the name of such a device right now, but it's similar in function to a microwave; maybe it is a microwave at its heart?)
As I understood it, if you used the esoteric functions of the microwave, you COULD cook food like it was cooked on a range, but it required constant babysitting of the microwave and reinput of timers and cook power levels.
And what is the point? "We know that a microwave is not a grill, but pushy advertisers insist otherwise?" The analogy is plainly wrong in the first part. We don't know.
The second part is boring. Advertisers are always pushy; it's their job. The interesting part is how much truth there is in their statements, and that depends on the first part.
Microwaves are pretty terrible, and proof that wide consumer adoption does not really indicate quality or suggest that consumers adopt technology which _should_ exist.
I lived without a microwave for a ~year and ended up buying one again because they are pretty convenient.
So maybe it's not high on the list based on the value you are measuring but it definitely has a convenience value that isn't replaced by anything else.
The generations of microwave after the first few were fantastic. They did what they were supposed to and lasted decades. Once that reputation was solidified, manufacturers began cutting corners, leaving us with the junk of today.
The same thing happened with web search and social media. Now, those selfsame companies are trying to get us to adopt AI. Even if it somehow manages to fulfill its promise, it will only be for a time. Cost-cutting will degrade the service.
I grew up in a country where microwaves were not a thing. When they suddenly got introduced, it felt like a miracle even just for the ability to quickly heat things up.
They warm up things that a) I don't want to put in a kettle and b) don't want to put in a dedicated pot on the stove.
Like the remainder of the soup I made yesterday that I put in a china bowl in the fridge. I was able to eat warm soup out of that bowl without dirtying any other dishes. Pretty convenient if you ask me.
Bonus: you can take a cherry tomato or a single grape and make a small plasma arc in the microwave. Pretty cool trick to impress and scare people at house parties.
They are not. But they are the AI slop of cooking - it's easy to get an acceptable result, so people associate it with junk food made with zero effort.
That does not match my experience or my reading of history at all, w.r.t. general-ish purpose technologies.
What usually happens is that they either empower unskilled (in a particular context) personnel to perform some set of tasks at a "good enough" level, or replace highly specialized machinery for some set of tasks at, again, a "good enough" level.
At some point (typically, when a general purpose technology is able to do "good enough" work in multiple contexts), operating at a "good enough" level in multiple verticals becomes more profitable than operating at a specialized level, and this is when the general purpose technology starts replacing specialized technology/personnel. Very much not all general-purpose technologies reach this stage; it applies only to highly successful ones.
Then the market share of the general technology starts increasing rapidly while the market share of specialized technologies drops; R&D in the general tech explodes while the specialized technologies stagnate. Over time this may lead to cutting-edge general purpose technologies surpassing the now-old specialized technologies and taking over in most areas.
> A very clear example of this is early Tesla surfing the reduction in Li-ion battery prices. <...> By the time the technology is ready for sedans, Tesla has a 5 year lead.
> everyone at GM and Toyota was saying: Li-ion batteries are totally infeasible for the consumers we prioritize who want affordable cars.
We are nearly two decades on from Tesla's "expensive sports car", and pure BEVs are still the significantly more expensive option, despite massive subsidies. If anything, everyone at Toyota was right. Furthermore, they have been developing their electric drivetrains in parallel via their hybrid tech: surfing the same wave while raking in profits.
In fact, BEV sales outpace other drivetrain sales only in regions where either registrations of the alternatives are artificially limited, or the government heavily subsidizes both purchase and maintenance costs. If you don't have government-subsidized rooftop solar, the cost per mile of a BEV is more or less on par with an HEV and in most cases worse than diesel for long-range trips.
> pure BEVs are still the significantly more expensive option
New technology often has ‘new’ tradeoffs: GPUs are still only situationally better than CPUs.
DC fast charging is several times more expensive than home charging, which heavily influences the economics of buying an EV without subsidies. Same deal with plug-in hybrids or long-range batteries on a PEV: if you don't need the range, you're just wasting money. So there are cases where an unsubsidized PEV is the cheapest option, and that line will shift over time even if it's not going away anytime soon.
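To make the charging-cost point concrete, a back-of-envelope comparison; every number below is an illustrative assumption of mine (local prices vary a lot), not a figure from this thread:

    # Cost per mile under three scenarios. All prices and efficiencies are
    # assumed illustrative values, not data from the discussion above.
    ev_miles_per_kwh = 3.5        # assumed BEV efficiency
    home_usd_per_kwh = 0.15       # assumed residential electricity price
    dcfc_usd_per_kwh = 0.45       # assumed DC fast-charging price
    hev_mpg = 50                  # assumed hybrid fuel economy
    gas_usd_per_gallon = 3.50     # assumed gasoline price

    print("BEV, home charging:", round(home_usd_per_kwh / ev_miles_per_kwh, 3))  # ~0.043 $/mi
    print("BEV, DC fast:      ", round(dcfc_usd_per_kwh / ev_miles_per_kwh, 3))  # ~0.129 $/mi
    print("HEV, gasoline:     ", round(gas_usd_per_gallon / hev_mpg, 3))         # ~0.070 $/mi

With those assumptions the BEV wins comfortably when charged at home and loses to the hybrid when it depends on DC fast charging, which is roughly the tradeoff described above.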
AI, on the other hand, is such a wide umbrella that it doesn't really make sense to talk about specific tradeoffs beyond the short term. Nobody can say what the downsides will be in 10-20 years, because they aren't extrapolating a specific technology with clear tradeoffs. Self-driving cars could be taking over industries in 15 years, or still quite limited; we can't say.
GPUs are a good example - they started getting traction in the early 2000s/late 90s.
Once we figured out in the mid 2000s that single-thread perf wouldn't scale, GPUs became the next scaling frontier, and it was thought that they'd complement and supplant CPUs. With the Xbox and smartphones having integrated GPUs, and games starting to rely on general purpose compute shaders, a lot of folks (including me) thought that the software of the future would constantly ping-pong between CPU and GPU execution. Got an array to sort? Let the GPU handle that. Got a JPEG to decode? GPU. Etc.
I took an in-depth CUDA course back in the early 2010s, thinking that within 5 years or so, all professional signal processing would move to GPUs, GPU algorithm knowledge would be just as widespread and expected as knowing how to program a CPU, and I would need to Leetcode a bitonic sort to get a regular-ass job.
What happened? GPUs weren't really used that way: data sharing between CPU and GPU is still cumbersome and slow, and dedicated accelerators like video decoders weren't replaced by general purpose GPU compute; we still have special function units for these.
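The "let the GPU handle that" idea, and the transfer overhead that undermines it, can be sketched in a few lines. This uses CuPy purely as an illustration; the same pattern applies to any GPU array library:

    # A minimal sketch of the CPU<->GPU "ping-pong" that never became routine.
    import numpy as np
    import cupy as cp   # GPU array library, chosen here just for illustration

    x = np.random.rand(1_000_000).astype(np.float32)   # data starts on the CPU

    x_gpu = cp.asarray(x)              # host -> device copy over PCIe
    sorted_gpu = cp.sort(x_gpu)        # the actual work: sort on the GPU
    x_sorted = cp.asnumpy(sorted_gpu)  # device -> host copy to keep using it on the CPU

    # For a one-off sort of this size, the two copies can easily cost more than
    # simply calling np.sort(x) on the CPU, which is one reason the constant
    # ping-pong between CPU and GPU never materialized for everyday code.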
There are technical challenges to doing these things, sure, but very solvable ones.
GPUs are still stuck in 2 niches - video games and AI (which incidentally got huge). Everybody still writes single-threaded Python and JS.
There was every reason to be optimistic about GPGPU back then, and there's every reason to be optimistic about AI now.
Not sure where this will go, but probably not where we expect it to.
I heard a very similar sentiment expressed as "everything is not good enough to be useful until it suddenly is".
I find it a powerful mental model precisely because it is not a statement of success rate or survival rate: Yes, a lot of ideas never break any kind of viability threshold, sure, but every idea that did also started out as laughable, toy-like, and generally shit (not just li-ion batteries, also the wheel, guns, the internet and mobile computers).
It is essentially saying 'current lack of viability is a bad indicator of future death' (at least no more of one than the high mortality of new tech in general already implies), I guess.
Even if you consider a car a general purpose technology, Tesla displacing GM is a car displacing a car, so it's not really an example of what you're saying, is it?
You took a very specific argument, abstracted it, then posited your worldview.
What do you have to say about the circular trillions of dollars going around 7 companies and building huge data centers and expecting all smaller players to just subsidize them?
Sure, you can elide the argument by saying, "actually that doesn't matter because I am really smart and understood what the author really was talking about, let me reframe it properly".
I don't really have a response to that. You're free to do what you please. To me, something feels very wrong with that and this behavior in general plagues the modern Internet.