Hacker News

I am wondering if Apple's focus is off lately with this drive for AI. So far all they are showing in that presentation is that I can have

"the ability to transform 2D photos into spatial scenes in the Photos app, or generating a Persona — operate with greater speed and efficiency."

And by making Apple AI (which is something I do not use for many reasons, but mainly because of climate change) their focus, I am afraid they are losing their way and making their operating systems worse.

For instance, Liquid Glass, which I was lucky enough to uninstall before they put in the embargo against doing so, is, well, a mess. An alpha release, in my opinion, which I feel was a distraction from their lack of a robust AI release.

So by blowing money on the AI gold rush that they were too late for, will they ultimately ruin their products across the board?

I am currently attempting to sell my iPhone 16e and my M1 MacBook Air to move back to Linux because of all of this.



Running AI on the MacBook or phone is probably really energy efficient compared to data centers. I think AI hardware makes sense. Dunno about recent software, though; Liquid Glass and Apple Intelligence both seem useless.


Assuming you've read https://andymasley.substack.com/p/a-cheat-sheet-for-conversa... or the longer full essay/related works, could you elaborate on why you don't use Apple Intelligence?

I totally understand why someone would refuse to use it due to environmental reasons (amongst others) but I'm curious to hear your opinions on it.


Some commenters already answered for me. To me there is no real benefit. I am a rather simple user, and it seems to take up space on the phone as well. I refuse to use iCloud, so space is important to me, since photography is what I do the most.

Also, I like researching things old school, the way I learned in college, because I think it leads to unintended discoveries.

I do not trust the source you linked to. It is an organization buried under organizations, for which I cannot seem to find the funding source after looking for a good 15 minutes this morning. It led me back to https://ev.org/, where I found out one guy used to work for "Bain and Company", a consulting firm, and was associated with FTX funding:

https://oxfordclarion.uk/wytham-abbey-and-the-end-of-the-eff...

Besides, "Effective Altruism" makes no sense to me. Altruism is altruism, IMO.

Altruism: unselfish regard for or devotion to the welfare of others

There is no way to be ineffective at altruism. The more you have to think about altruism the further you get from it.

But the organization stinks as some kind of tech propaganda arm to me.


Not sure why one would think that article is anything other than a distraction attempt, because emissions are adding up.

I'm from a country (in Europe) where CO2 emissions per capita [0] are 5.57 tonnes, while the number for the USA is 14.3. So reading this sentence in that article, "The average American uses ~50,000 times as much water every day...", surely does not imply that one should use ChatGPT because it is nothing. If the "average American" wants to decrease emissions, then not using LLMs is just a start.

[0]: https://ourworldindata.org/grapher/co-emissions-per-capita
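For scale, the gap between the two per-capita figures quoted above can be worked out directly (a minimal sketch, assuming the quoted numbers are accurate; they are not re-verified against Our World in Data):

```python
# Per-capita CO2 emissions quoted in the comment above (metric tons/year).
country_t = 5.57   # the commenter's European country
usa_t = 14.3       # USA

ratio = usa_t / country_t
print(f"The average American emits {ratio:.1f}x more CO2 per capita")  # 2.6x
```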


This isn’t about ChatGPT; this is about Apple Intelligence, which is an on-device, low-power ML system.


For me: unproven trust and no killer feature.

If I can't search my Apple Mail without AI, why would I trust AI?


> could you elaborate on why you don't use Apple Intelligence?

Why would I trust this when they can't deliver a voice assistant that can parse my sentences beyond "Set a reminder" or "Set a timer"? They have neglected this area of their products for over a decade; they are not owed the benefit of the doubt.


> I totally understand why someone would refuse to use it due to environmental reasons

Huh. This one baffles me.


Energy use, presumably.

Of course, are those same users always running their screens super dim? Are they using pen + paper instead of typing whenever they can?


Consuming kilowatts is not intrinsically bad for the environment. If you are worried about the environmental impact of power generation, then advocate for cleaner generators.


Most of the AI and machine learning Apple has shipped so far runs primarily on-device, so you can judge for yourself whether there is any climate-change concern.


> making Apple AI [...] their focus

Are they really doing that? Because if it's the case they have shockingly little to show for it.

Their last few attempts at actual innovation seem to have been less than successful. The Vision Pro failed to find an audience. Liquid Glass is, to put it politely, divisive.

At this point, it seems to me that a good SoC and a captive audience in the US are pretty much all they have left, and competition on the SoC front is becoming fierce.


Yeah, I agree, they have a captive audience for sure. But they still need to satisfy shareholders. If people stop upgrading, that is a problem. And the battery drain on my iPhone 16e under Liquid Glass was horrific. I know casual users who did not notice until I pointed it out, and then they tracked it more closely. This, unfortunately, makes me think conspiratorially: even a modest amount of extra battery use and degradation will mean more upgrades in the future.

But I think $500 billion is a lot of money for AI:

Apple accelerates AI investment with $500B for skills, infrastructure

https://www.ciodive.com/news/Apple-AI-infrastructure-investm...

Imagine using that $500 billion on the operating system, squashing bugs, or making the system even more energy efficient? Or maybe figuring out how to connect to an Android tablet's file system natively?


If you don’t use AI for climate reasons, then you should read the recent reports about how little electricity and water is actually used. It’s basically zero (image and video models excluded). Your information about this is probably based on GPT-3.5 or something, which is now three years old: a lifetime in the AI world.


Big data centers running tons of GPUs, and the construction of even bigger ones, are not carbon neutral, come on.


Don't newer models use more energy? I thought they were getting bigger and more computationally intensive.


They use a massive amount of energy during training. During inference they use a tiny amount of energy, less than a web search (turns out you can be really efficient if you don't mind giving wrong answers at random, and can therefore skip expensive database queries!)


Right, but the comment I was responding to suggested that GPT-3.5 used lots of energy and newer models use less.


Indeed, this is correct. See today's Claude Haiku 4 announcement for an example.


Looking at https://platform.openai.com/docs/pricing, GPT-3.5 is $1.50-4 per million output tokens, and GPT-5 is $0.40-120, with plain "gpt-5" with no qualifiers going for $10/million.

GPT-5 is probably cheaper in the sense that gpt-5-nano is at least as capable as 3.5 while costing less, but the "normal" newer models are more expensive, and that's what people are generally going to be using.
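To make the comparison concrete, here is a minimal sketch of the output-token arithmetic, assuming the prices quoted above (they may not match current OpenAI pricing, and input-token costs are ignored):

```python
# Assumed per-million-output-token prices, taken from the comment above.
PRICES_PER_M_OUTPUT = {
    "gpt-3.5-turbo": 1.50,  # low end of the quoted $1.50-4 range
    "gpt-5-nano": 0.40,     # cheapest quoted GPT-5 tier
    "gpt-5": 10.00,         # plain "gpt-5", no qualifiers
}

def cost_usd(model: str, output_tokens: int) -> float:
    """Output-token cost only, in US dollars."""
    return PRICES_PER_M_OUTPUT[model] / 1_000_000 * output_tokens

for model in PRICES_PER_M_OUTPUT:
    print(f"{model}: ${cost_usd(model, 500_000):.2f} per 500k output tokens")
```

Under these assumed prices, gpt-5-nano undercuts GPT-3.5 while plain gpt-5 costs several times more, which is the pattern described above.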


I think they will continue ruining their products via software updates. That's implied by the walled garden approach they chose for their business: it forces users to consoom more and thus generates profits. Apple isn't a "lean" company; it needs outrageous profits to stay afloat.


I'm interested in reading about your low-carbon lifestyle that is so efficient you got to the point of giving up machine inference.


I live in a van full time. I have a 200W solar panel and a 1500W-output solar battery that powers everything I use, mostly for cooking, sometimes heat. I also poop in the woods a lot. :) I do not use the internet much, really. Driving is my biggest carbon footprint, but I really do not put on much more mileage than the average suburban person. Anyway, I try my best. I am permanently disabled, so that makes a lot of it easier. Being poor dramatically lowers one's carbon footprint.


[flagged]


What a nice way to talk to another person who... didn't attack you?

A typical passenger car driving 12,000 miles puts out about 5 metric tons of CO2.

The person driving that passenger car likely has a 1,000 sq ft or larger home or apartment, whose footprint varies widely but could reasonably be estimated at another 5 metric tons of CO2 (Miami vs. Minnesota makes a huge difference).

So we're at 10 metric tons for someone who doesn't live in a van but still drives like a suburbanite.

Care to be a little kinder next time you feel whatever compelled you to write your response to the other user? Jeesh.


First, I need my van. My van is my house.

> Burning 1000 gallons of motor fuel has the same GHG impact as 300 million uses of Google Gemini, and the CO2 impact of local inference on a Mac is even less

Still, even if, let's say, your numbers are correct (and I feel they are not), does that mean I should just add to the problem and use something I do not need?

Driving my van for my yearly average creates about 4.4 metric tons of CO2.

"A more recent study reported that training GPT-3 with 175 billion parameters consumed 1287 MWh of electricity, and resulted in carbon emissions of 502 metric tons of carbon, equivalent to driving 112 gasoline powered cars for a year."

https://news.climate.columbia.edu/2023/06/09/ais-growing-car...

Just to get an idea of how I conserve, another example: I only watch videos in 480p because it uses less power. This has a double benefit for me, since it saves my solar battery as well.

I am not bragging, just showing what is possible. Right now, being still in the desert this week, my carbon footprint is extremely low.

Second, I cannot really trust most numbers that are coming out regarding AI. Sorry, just too much confusion and green-washing. For example, Meta is building an AI site that is about the size of Manhattan. Is all the carbon used to build that counted in the equations?

But this article from May 2025:

https://www.technologyreview.com/2025/05/20/1116327/ai-energ...

says "by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households."

And

"Tallies of AI’s energy use often short-circuit the conversation—either by scolding individual behavior, or by triggering comparisons to bigger climate offenders. Both reactions dodge the point: AI is unavoidable, and even if a single query is low-impact, governments and companies are now shaping a much larger energy future around AI’s needs."

And

"The Lawrence Berkeley researchers offered a blunt critique of where things stand, saying that the information disclosed by tech companies, data center operators, utility companies, and hardware manufacturers is simply not enough to make reasonable projections about the unprecedented energy demands of this future or estimate the emissions it will create. "

So the confusion and obfuscation are enough for me to avoid it. I think AI should be restrained to research, not used for most of the silliness and AI slop that is being produced. Because, you know, we are not even counting the AI slop views that also take up data space and energy from people looking at it all.

But part of why I do not use it is my little boycott. I do not like AI, at least how it is being misused to create porn and AI slop instead of doing the great things it might do. They are misusing AI to make a profit, and that is also what I protest.


Depends where you are. People in some countries have a lot of catching up to do: https://ourworldindata.org/grapher/co-emissions-per-capita

Maybe they are in the USA; every little thing counts there.


I am in the US, and thanks for that link. I am of the opinion that the Climate Crisis should be the number one focus for everyone right now.

So, to keep this on point: Apple making a faster chip is not on my climate-change agenda, and is anything but positive.


No, in the USA it is the opposite. The little things do not and cannot add up to anything. The only things that make a difference are motor fuels and hamburgers.


Oh, grow up. People can make cuts wherever they choose, and no cut is a bad cut. These decisions are so complicated, personal, and nuanced that it is ridiculous to try to police someone else's best efforts.


At the end of the day, they're building silicon that can do this to be ready for when the software side of the house actually figures this stuff out. Of course, it doesn't seem like the software side is close, and a very real risk for Apple is a world where the local AI use-cases don't grow enough to justify this level of silicon investment. More specifically: personal context is a big thing that Apple is uniquely positioned to capitalize on, but will a mobile-sized LLM and mobile-sized memory ever be able to coherently handle the volume of contextual data that might be necessary to be truly great? I have 400 GB in iCloud; I don't want to get into the weeds of most of that being images and such, but you don't need to in order to recognize that modern data-center-scale LLMs can handle, like, less than a megabyte of context.

There will always be local-first use-cases, but it's also possible that, you know, we're already near the global maximum of those use-cases, and the local AI coprocessors we've had can do it fine. This would be a severe shock to my perceived value of Apple right now, because my view is: their hardware division is firing on all cylinders and totally killing it. But when you're putting supercomputers into the iPad... maybe that doesn't actually matter. Meanwhile, their software is getting worse every year that goes by.



