Hacker News
crystaln · 22 days ago · on: Show HN: NanoClaw – “Clawdbot” in 500 lines of TS ...
Seems much more likely the cost will go down 99%. With open-source models and architectural innovations, something like Claude will run on a local machine for free.
walterbell · 22 days ago
How much RAM and SSD will be needed by future local inference, to be competitive with present cloud inference?
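As a rough way to frame that question: local inference RAM is dominated by the quantized weights (parameter count × bits per weight), plus KV cache and runtime overhead. The sketch below is illustrative back-of-envelope arithmetic, not a benchmark; the model size, cache size, and overhead figures are assumptions, not numbers from this thread.

```python
# Back-of-envelope RAM estimate for local LLM inference.
# All figures are illustrative assumptions, not measurements.

def inference_ram_gb(params_b: float, bits_per_weight: float,
                     kv_cache_gb: float = 2.0, overhead_gb: float = 1.0) -> float:
    """Approximate RAM: quantized weights + KV cache + runtime overhead."""
    weights_gb = params_b * bits_per_weight / 8  # 1B params at 8 bits ~ 1 GB
    return weights_gb + kv_cache_gb + overhead_gb

# A hypothetical 70B-parameter model:
print(f"fp16:  {inference_ram_gb(70, 16):.0f} GB")  # ~143 GB
print(f"4-bit: {inference_ram_gb(70, 4):.0f} GB")   # ~38 GB
```

Under these assumptions, aggressive quantization (16-bit → 4-bit) cuts weight memory roughly 4×, which is a big part of why "runs on a local machine" keeps getting more plausible.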