
I thought this was sarcasm, but it isn't. It seems like a very weird choice to me to build network infra on top of Python. C or Rust would have been the more obvious choice, since you can then bind to it from any language (at least via a C ABI; see the sketch below).
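
As a hedged aside on the "bind from any language" point, here is a minimal Python sketch of consuming a C ABI via ctypes. libc and getpid are just stand-ins for whatever C (or Rust-with-a-C-ABI) networking core you would actually expose, and it assumes a Unix-like system where find_library("c") resolves:

    # Minimal sketch: calling into a C ABI from Python via ctypes.
    # libc stands in for a hypothetical C/Rust networking library;
    # the mechanism is the same for any library exposing a C ABI.
    import ctypes
    import ctypes.util

    libc = ctypes.CDLL(ctypes.util.find_library("c"))

    # Declare the return type of a plain C function, then call it.
    libc.getpid.restype = ctypes.c_int
    print("pid via libc:", libc.getpid())

The same shared object can be loaded from Node, Ruby, Lua, and so on, which is the whole appeal of a C ABI over a Python core.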


Investment disclosure?

Forcing Chrome to display a search engine choice doesn't even begin to solve the issue. Chrome spies on you regardless, and it also banned uBlock Origin and other ad-blocking extensions. It is a product deliberately built to spy on you and harvest more of your data.


It advertises that it runs locally and that it is "extensible" but then requires you to set up a remote/external provider as the first step of installation? That's a rather weird use of "local" and "extensible". Do words mean anything anymore?


You went as far as checking how it works (thus "requires you to set up a remote/external provider as the first step").

But you didn't bother checking the very next section in the sidebar, Supported LLM Providers, where ollama is listed.

The attention span issue today is amusing.


> The attention span issue today is amusing.

I find it rather depressing. I know it's a more complex issue, but it really feels, IRL, like people have no time for anything past a few seconds before moving on to the next thing. It shows in the results of their work all too often, too. Some programming requires a very long attention span, and if you don't have one, the result isn't going to be good.


But this is an elevator pitch. I didn't come here to be marketed to, yet I am being marketed to.

So if you're going to market something to me, at least do it right. My attention span is low because I don't really give a shit about this.


But people really have no time. There is only one brain and thousands of AI startups pitching something every day.


Yeah, no need to try any of them until everyone says 'you have to', which is what happened with Aider and later Cline & Cursor.


Can't you just run ollama and point the tool at its localhost endpoint? I don't think it's within scope to reproduce the whole local LLM stack when anyone wanting to do this today can easily use existing, better tools to solve that part of it.
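
For concreteness, a minimal sketch of what "just run ollama" looks like against its local HTTP API. This assumes ollama is serving on its default port 11434 and that the named model has already been pulled; the model name is only an example:

    # Minimal sketch: query a locally running ollama server.
    # Assumes `ollama serve` is up on the default port 11434 and that
    # the example model has been pulled (e.g. `ollama pull llama3`).
    import json
    import urllib.request

    payload = json.dumps({
        "model": "llama3",   # example model name; use whatever you pulled
        "prompt": "Say hello in one short sentence.",
        "stream": False,
    }).encode("utf-8")

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])

Any tool that lets you override the provider's base URL can be pointed at that same localhost endpoint.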


Did you not see Ollama?


You can use it with ollama too.


Yeah, they seem to be referring to the Goose agent/CLI being local, not the models themselves.


You can run ollama, so no, it's not only Goose itself that's local.


Fair, but the repeated references to local/on-machine on the project's homepage, which OP criticized, are, I would think, in reference to the Goose agent.


I think that comment is a copy-paste mistake. If you look at the next code snippet, the comment actually makes sense there.

That being said, I've also given up on C++ and learn it mostly to keep up with the job, if that's where you are coming from. I don't find Rust to be a satisfying replacement, though. No language scratches the itch for me right now.


Excellent catch. The std::move in the snippet is a copy-paste mistake. It was carried over from the previous code snippet.


You put it well. It should also be noted that the embargoes and tariffs will only sabotage the US's own interests in the long term, as if the Chinese were not intelligent enough to build their own in-house technology. Wait until their own GPUs beat Nvidia's; there are already startups at work [1]. The NSA used to put backdoors in US-made hardware, which it was then happy to distribute worldwide; now somebody has decided that encouraging China to make its own will work out well for US interests? I have no idea what this foreign policy is meant to accomplish. Even if you were the most patriotic of US patriots, I have no idea why you'd support this policy. Even Huawei is striking back [2].

[1] https://www.tomshardware.com/news/chinese-gpus-made-by-moore...

[2] https://www.bloomberg.com/news/features/2025-01-28/huawei-ha...


> I have no idea what this foreign policy is meant to accomplish. Even if you were the most patriotic of US patriots, I have no idea why you'd support this policy.

Because the vast majority of Republicans in this country have no fucking idea how e̶c̶o̶n̶o̶m̶i̶c̶s̶ anything works, and no desire to learn. The entire platform now is fuck liberals.

And they still won't learn even after they lose their pensions, their federal funding, jobs, whatever else. They're married to the dumbass now. The best we can hope for is the rest of us riding out their finding out phase of their fuck around journey.


It's basically a slow return to the Cold War-era strategy of brinkmanship.

https://en.wikipedia.org/wiki/Brinkmanship


Well, I can't wait for the day Microsoft just disappears. They've spent their whole existence trying to stifle innovation and competition, and here we are again: this time they've essentially been scammed by OpenAI, thinking they could pull off their anti-competitive practices once more through exclusive access to OpenAI's models, only to learn that they've lost and to resort to litigating their sorry ass out of the situation. Meanwhile, the US government is living a crypto-wars déjà vu, manufacturing as much propaganda as possible to make us believe China is the new enemy we should be worried about this time.

Yep, nope, thanks. Keep those papers coming, bois. Make those models small enough that they can run locally so we don't depend on an online feudal lord.


I can't even listen to 50 Cent unfiltered on YouTube.


Nicaragua, El Salvador, Honduras, Indonesia, Vietnam, ...


Your thoughts, containing errors, must be greyed out.


What do you mean? Are you too stupid to read some literature?


Muddling the term 'open source' is one of his latest achievements, for example.


> who solve only 50% of the problem and have another tool to solve the rest, which in turn only solves 50% etc.. ad infinitum

This actually converges to 1:

1/2 + 1/4 + 1/8 + 1/16 + ... = 1
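
For the record, a small LaTeX sketch of the standard argument, via the closed form of the partial sums:

    % Partial sums of the ratio-1/2 geometric series:
    %   S_n = \sum_{k=1}^{n} 1/2^k = 1 - 1/2^n
    \sum_{k=1}^{\infty} \frac{1}{2^k}
      = \lim_{n \to \infty} \left( 1 - \frac{1}{2^n} \right)
      = 1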

You just need 30 kLOC of Maven in your pom before you get there.

