Hacker News | Gigachad's comments

They could start by not renaming Microsoft Office and laptops as copilot.

> start by not renaming Microsoft Office

To my understanding, Office (or "Microsoft 365") itself becoming "Copilot" was just confused messaging about the "Office Hub" app/shortcut being repurposed.


I feel like even iPad kids are more capable with a computer than HN users these days.

“Father, how do I <do basic thing>”

“Ask your mother.”

"Ask your AI Agent"

I’m in Melbourne and I honestly haven’t seen any around. I see all this talk about the pubs being full of them but all of the pubs I’ve been to have none.

I’m thinking it must exclusively be an outer suburb thing or places where old people hang out.


RSLs and pub/bistros in shopping centres and along major thoroughfares are the big ones. They're absolutely everywhere, just not in traditional pubs in the inner to mid suburbs as much.

I think Crown basically bought up all the licenses in the CBD. I think there might have been one pub at the corner of William and Collins or something, but the last time I was there it was closed.

If there were ads promoting breeding mosquitos or deliberately inducing cancer, we could look at banning them. But there aren’t so this is a pointless take.

Advertised or not, you can take my breaded mosquitos from my cold, dead hands!

Apple's plan has been pretty obvious. They invested in small, locally running features that provide modest utility rather than massive hosted models that cost a fortune and aren't profitable.

There also doesn’t seem to be much risk in falling behind. If you wait longer you can skip buying the now obsolete GPUs and training the now obsolete models.


They did invest a ton in their Private Cloud Compute, though, and are barely using the capacity.

https://9to5mac.com/2026/03/02/some-apple-ai-servers-are-rep...


Only if you are asking surface level questions. There are also certain topics that seem to be worse than others. When asking how to do things in software GUIs, modern LLMs seem to have a high rate of making up features or the paths to reach them. When asking for advice in games, I've seen an extremely high rate of hallucinations. Asking why something is broken in my codebase has about a 95% hallucination rate.

If you are just asking basic science questions or about phone reviews then it's pretty reliable.


> Only if you are asking surface level questions.

I find it pretty accurate well beyond that level. How much of that is actually a problem in K-12 education?


I've used it for languages and studying some reinforcement learning stuff, including examples in PyTorch. I haven't had many problems with it really.

Once when I asked it some questions about a strategy game (Shadow Empire) it got them wrong, but the sources it cited had the correct information.


You could sell them afterwards, too. Now the book is the same price but it's a 1-year license. The platform we used was so restricted that it would block access the moment your network connection dropped.

It's true that you can use LLMs as a learning resource and to unblock you. But students just aren't. They are using them as a way to avoid thinking, avoid research, and just spit out an answer they can paste in to their homework.

Because the students have learned that school is designed by old morons, without understanding that writing book reports and doing math drills is intended to create students who can read and write, or to build other transferable skills.

They should at least require handwritten work, the kids will still be AI-stupid but will at least be able to write.

You remember better when you write, too.

I keep hearing this at work but so far no one has explained what “learning ai” actually means. It seems to just be nonsense like those people selling prompt recipes or claiming to be prompt engineers.

No one needs training in prompting AI. I could understand if they meant a deeper layer of integrating the tech with systems, but all they ever mean is typing things into a text box.


I suspect that, in practice, what many enthusiastic advocates mean by “learning AI” is actually “learning to need AI”.

In other words, the aim is to get kids used to using AI as soon as possible, so that they do not learn the skills to function without depending on it.


If you’re smart AI saves you time getting to something you could probably achieve anyway. If you’re… not smart… then it will be a necessary crutch for you to get through life.

I can see the angle for making sure kids start using it before they develop the skills to become independent of it.


You absolutely need prompting skills to use AI usefully. You need to know how to eliminate sycophancy, how to ask for and check primary sources, and how to use follow-up questions.

I've been using AI for some legal issues, and it's been incredibly good at searching for case law and summarising the key implications of various statutes - much more efficient than web search, with direct links to the primary sources it finds.

I'm still the one gaming out "What if...?" and "Does that mean..?" scenarios and making sure the answers are grounded in the relevant statutes, and aren't mistakes or hallucinations.

It's not so much a prompting problem as a critical thinking and verbal reasoning problem.


Learning those prompting skills was very useful for you, but in the context of schools it's a lot more difficult to make the investment worth it.

Schools are slow; by the time the teachers get around to teaching the sophisticated techniques you use today, those techniques will be obsolete, and the new AI models will require a completely different style of prompting.

As for critical thinking and reasoning, those are even harder to teach. How can teachers teach what they don't know?


> It's not so much a prompting problem as a critical thinking and verbal reasoning problem.

And that means you have to learn without AI to understand when the AI is wrong. It's just like how it's dangerous to use a calculator without knowing math, since you won't spot when you've entered something wrong.


As someone who sells AI... You'd be shocked at how bad people are at using AI.

My 6 year old kid who watches me is a better prompter.


Especially since kids these days aren't even very good at using computers:

http://www.coding2learn.org/blog/2013/07/29/kids-cant-use-co...

It seems to me that if someone can read and think critically, they can RTFM and get much better much quicker at computers and AI than people who spent all their time tapping an iPad to watch the next video.


I'd think really the only AI skill you need is the ability to think independently and be able to verify the results you are getting or spot when something is wrong in the response.

It would take a few sessions at most to take someone from 10 years ago and get them fully up to speed with AI tools, since the tools themselves have almost no learning curve.


I think exercises where a student is given pre-generated AI output and told to identify as many issues or mistakes as possible might be sensible. I'm not sure how long creating such an exercise would take, or what tools or sources should be used to verify the output, but it might be a helpful exercise.

Similar to Google and Wikipedia lessons back in my day.

We had something like this when I was in school but it was reading a news article, or the same story covered by different people and identifying the bias or missing information.

Evaluating AI output isn't a skill on its own. It's just general critical thinking and literacy.


You also need to understand the limits of AI: it has failure modes that a human who usually gives you correct, authoritative answers does not.

I think it comes easily to the sort of people who comment here. Most people have a very vague understanding of computers in general.


It would probably be useful to learn how the models work under the hood, particularly about high dimensional vector spaces, as a counter to magical thinking. But I doubt that's what is meant.
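A toy sketch of the "high dimensional vector spaces" idea mentioned above: models represent words and phrases as long vectors of numbers, and "meaning" reduces to geometric closeness between those vectors. The 4-dimensional vectors and the specific values below are invented for illustration; real embedding models use hundreds or thousands of dimensions.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: the dot product of two vectors divided by the
    # product of their lengths. 1.0 means pointing the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Made-up toy "embeddings" (not from any real model).
king = [0.9, 0.7, 0.1, 0.3]
queen = [0.8, 0.75, 0.15, 0.35]
banana = [0.1, 0.2, 0.9, 0.8]

print(cosine_similarity(king, queen))   # close to 1.0: similar directions
print(cosine_similarity(king, banana))  # much lower: dissimilar
```

Seeing that "related meanings" is just "nearby directions in a big space" is the kind of under-the-hood intuition that pushes back on magical thinking.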


Are these supposed to be the "skilled" prompts? This just reads as a basic conversation, not as particularly well-written or well-defined prompts. So far everything I've seen about prompting "skills" has just come down to being able to articulate and think critically a bit.

Yeah, my post was kind of sarcastic and it doesn't show.

I’m not sure anything was clarified. Nothing about that conversation is special or unique?

This is possible in Australia via the new PayTo system. But it’s quite new, doesn’t work for international payments and so far not much uses it.
