
I listened to the founders explain that the current process of syncing up to review or question code is very multi-step and inefficient, almost adversarial at a high level. A Slack mention, a GitHub discussion, a screen share, etc.: it all ends up being disorganized and painful versus just being able to edit the document collaboratively and directly, perhaps leaving some metadata tagged at certain locations (e.g., notes from a conversation about the code).

It's not like the editor prevents one from still using Slack and other external tools, either. I guess I just see the value in in-editor integration to handle that stuff more smoothly, at least for those using the same editor. I can see myself really appreciating the feature if there's a part of the codebase that consistently trips people up or is under active discussion.


Has the team commented on this? Coming from Emacs, it seems insane not to implement an API for the UI. GPUI looks great too; it’d be a real shame if they opted to keep the extensibility limited to just LSP servers and whatnot.


Code is amazing. I'm not sure why OpenAI isn't using it as their default CLI. I was cancelling my membership and stumbled upon it right before; now I'm dropping my other subs to move to this.


It’s weird how divisive it is. For me it’s completely dependent on the quality of the output. Lately, it’s been more of a hindrance.


You won’t be scrolling TikTok on the Remarkable, though.


I only began to love my RM2 when I stopped trying to use it as a PDF reader and writer and instead treated it only as a scratch-paper replacement. But it’s not as economical if limited to this.

I do wish they’d improve the PDF usability or embrace open-sourcing the UI. There are a lot of features that should be easy to implement, like split screen or floating sticky notes, but they seem almost wholly focused on the hardware. I thought it’d be the ultimate tool for studying math and saving money on books, thus paying for itself, but it’s just not there yet, and I’m not sure they plan to get it there.


Is there an alternative to Remarkable that offers good drawing/writing, but at a lower price? That's the only thing I'd want. I have stacks of dot-rule notebooks full of various notes and sketches. It'd be nice to have a replacement for all that.


I considered all the options back in 2021 and went with the iPad Mini.

My reasons: much better software for sketching, not bound to a single ereader app, multiple ways to send stuff around, perfect size.

Many years later, I would still choose the same. I use it to annotate webpages, sketch, read books and read queued articles in instapaper. It's distraction-free but still connected. I can Airdrop drawings or load my handwritten notes on the Macbook app. Tap to define is so good I've absent-mindedly tried it on a paper book.

The LCD screen is great for some things and bad for others. You have to turn it on and unlock it. You can't SSH into it or sync your drawings as simple files. Otherwise, it's really good.


I like using my iPad better than my RM2 for similar reasons (I have both, but really only use the iPad anymore). The pickup is much better on the iPad, in my opinion.

However, one thing that I think makes the biggest difference is adding a screen protector. I particularly like the ones from https://paperlike.com/. It adds a layer that makes it less like writing on glass and more like writing on paper. For me, this was the biggest increase in usability for taking notes.


The Boox is a little cheaper; I have one. It's mostly just an E Ink Android tablet. I absolutely love it.

It will run most Android apps (modulo Eink screen support). The built-in note taking app is terrific.


Ratta's Supernote is an option. The second-hand market isn't as good for buyers, though.


reMarkable (1) user here since 2019. What does the software stack look like for the Supernote? The A6 looks interesting as a form factor for someone like me who uses it solely for note-taking (all I want is a "non-linear notebook") rather than annotation and reading (I use a printer and scanner for annotation/feedback and an "ancient", never-online Kindle for reading books). reMarkable has always been open-ish rather than properly open, so I would hope for Supernote to be more open to the idea of users having access to code and control over their devices (even if I never connect my reMarkable to the network).


The software support is decent. Currently it's running an old version of Android that allows you to sideload apps. They are supposedly working on moving the OS over to a custom Linux build, but we haven't really seen anything come of that. They do release updates fairly frequently, and they have a publicly viewable Trello software-development board, so you can see the status of features they are working on.


Interestingly, https://supernote.com/pages/supernote-nomad says

> Customize Your System

> A Linux-based system will be open in the future for community modifications and customizations.

> *Not built-in with the device

Anyone have details? Internet says it's a very custom Android. https://www.ereaders.org/supernote-manta-versatile-durable-e...


Supernote hasn't posted anything about the potential future Linux system. I'm personally not expecting this to actually come around.

The standard OS is derived from Android, but has no Google Play services and only supports sideloading (which works well via ADB; I've not tried other app stores like F-Droid). It lacks a lot of standard Android things: there's no home screen/launcher, no notifications, and no UI for switching between apps other than using the sidebar to bring up the application list and going back that way. There are no speed modes for adjusting refresh rates.

IMO, it's a great eink notepad & sketchbook, and makes a good ereader with something like KOReader, but it's not good as an Android tablet.


+1. I own two Supernotes (Nomad and Manta) and can only recommend them.


+2 for Ratta's Supernote. Amazing product, amazing company.


Love my supernote as well


Same; the writing experience on the Supernote is extremely good (most reviewers say it’s the best because it mimics writing on a stack of paper), and the parts are supposedly replaceable.


The PineNote is only slightly cheaper; I suspect that Remarkable isn't making a lot of profit on their product.


The Boox Go 10 is ~$400 and has a note-taking app built in. While it runs Android, the note taking experience is poor with apps other than the built-in one.


Honestly I found the base iPad excellent for this. The writing experience isn't a lot like paper, but is still quite good. You can get a little closer by applying a matte screen guard.


I heard these destroy the Pencil tips and are pretty loud. Is that true?


Paper?


They specifically said they want to replace their paper usage.


> only a scratch paper replacement. But it’s not as economical if limited to this.

That's exactly the use case though. It's a replacement for pen and paper, and the lack of functionality is seen as a feature.


What’s the ‘30% take’?


30% of all App Store sales go right to Apple.


15% if you’re part of the small business program.


What program do I have to join for 0%?


The one where you create your own mobile operating system.


The one where you collect cash directly from users and magically make handling that have zero overhead.

Credit card processing is hard... Go price out Stripe + customer service + dealing with chargebacks, and tell me if you really want to do processing yourself.


Being in the EU and releasing in an alternative marketplace.


Moneyball highlights an inefficiency that I would have thought would've stopped existing sometime in the '80s or '90s as data-driven approaches became standard.


One of the analytics leads for the Red Sox came to Harvard to give a presentation. I asked if he could quantify the effects analytics was having compared to the conventional wisdom developed over the course of 150 years of pro baseball.

He thought that analytics was changing the probabilities of discrete events by single digits. Essentially, nobody was doing anything wrong, there were just optimizations that were/are available.

Remember that the book/movie is about the A’s, who were eliminated in the first round of the baseball playoffs.


Each baseball game is so high variance that even a 5- or 7-game series is still largely a crapshoot. Unlike in the NFL or NBA, any MLB team that makes the playoffs has a puncher's chance to win the title. It's one of the beauties of the game. (Unless you're a Dodgers fan.)


This highlights one of the things I've always loved about baseball as a sport ... and always bugged the shit out of me about college football.


Why would a game of baseball be any more high variance than an NFL game?

On the face of it, the NFL’s playoff format, where a single game decides who advances, seems higher variance than a best-of-x series.

Football also seems more high variance just due to the explosive, physical nature of the game.

I wonder what the stats say about lower seeds winning the tournament for MLB vs NFL playoffs.


https://www.vox.com/videos/2017/6/5/15740632/luck-skill-spor...

Why it’s so much harder to predict winners in hockey than basketball


The way I look at it is the Patriots in '85 had maybe a 1 in 10 chance to beat the Bears in the Super Bowl. And I think that's being charitable.

Meanwhile the best regular season team of my lifetime, the '01 Seattle Mariners, still only win maybe 75% to 80% of the time vs. the historically terrible 2024 White Sox.

Now put the '85 Bears against a historically inept team instead of one of the other best teams in the league that year. I'm not sure the '76 Buccaneers win 1 game out of 100.

It's just not possible to physically dominate a game in baseball the way it is in football.


I think part of the point, though, was that the A's were performing far better and progressing further than you'd expect given a dramatically lower budget for players.

If you can have a competitive team filling 80% of the seats your competitors do at 50% of the payroll...


Before Moneyball there were other controversial stats, largely driven by Bill James. By and large they had the same footprint you're describing. In any particular game they were such a small effect as to be meaningless. But over the course of a season? That could be the difference between making the playoffs or not.

The one that pops to mind offhand was putting the "cleanup hitter" at leadoff, even though at the time the "leadoff hitter" was a very specific physical archetype, as was the "cleanup hitter". Yet the latter was often the best hitter on the team, and by putting them at the top of the batting order they'd get more at-bats over the course of a season.


>He thought that analytics was changing the probabilities of discrete events by single digits.

That is actually huge in baseball. As an example, the player who was least likely to get a hit last year did it in 19.6% of their at bats (this typically isn't represented as a percentage and would instead be listed as .196 batting average), while the player who was most likely to get a hit did it 33.2% of the time, meanwhile the league average was 24.3%. That means "changing the probabilities of discrete events by single digits" is what separates the average player from the outliers at both the top and bottom of the talent pool.
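A back-of-the-envelope sketch of those gaps (the 600 at-bat season is an assumed round number, not from the comment above):

```python
# Treat the batting averages quoted above as per-at-bat hit probabilities.
worst, league_avg, best = 0.196, 0.243, 0.332

AT_BATS = 600  # assumed round number for a full season


def expected_hits(avg: float, at_bats: int = AT_BATS) -> float:
    """Expected hits over a season at a given per-at-bat hit probability."""
    return avg * at_bats


# Single-digit percentage-point gaps separate the outliers from the average.
gap_top = (best - league_avg) * 100      # ~8.9 points
gap_bottom = (league_avg - worst) * 100  # ~4.7 points

extra_hits = expected_hits(best) - expected_hits(league_avg)
print(f"top vs. average: +{gap_top:.1f} points, ~{extra_hits:.0f} extra hits per season")
```

Over 600 at-bats, that ~9-point edge compounds into roughly 53 extra hits: the difference between an average hitter and a batting champion.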


Famously, Billy Beane said his "(stuff) doesn't work in the playoffs." This is because top-end talent still generally wins championships. The regular season is a slog, and you can do very well by always beating the teams you should beat. But in the playoffs, when everyone is giving maximum focus and effort, the talent gap matters much more than in the regular season. In basketball, I think of teams like this year's Cavaliers or the Budenholzer-era Hawks. These teams won a lot of regular-season games but never felt like a legitimate threat to win the championship.


That might be true in-game (though I suspect the marginal gains compound fairly strongly) but I think the numbers when applied to recruitment and avoiding costly mistakes are much more impactful.


It was set in 2002, just after the '90s, so not too far off. Teams were collecting more data but looking at it wrong, and the approach is likely much less effective now that it's been brought out into the open by both the book and the movie. Though small-market teams are still paying way less per win than the big boys like the Yankees, who can pay almost 3x per win.


Personal question, but I had to drop out a couple of years ago as a Math/CS senior. I felt like the clock was really ticking back then to get my foot in the door. I'm considering re-enrolling (only two terms remaining); would it be better to wait until if/when the market has recovered, so I can enter it with a fresh degree? I worry a downside of completing it now would be a stale degree by the time the job market recovers. Assuming equal job experience, employers seem to prefer fresh graduates. But you know what they say about time in the market vs. timing the market...


How much would it cost to complete your degree? Could you take online or CC courses that you could transfer to your old program?

While having no degree hasn't historically been much of a blocker, during a weaker job market credentials can and do play a role in tie breaking during hiring.

Honestly, even a WGU style bachelors degree can be enough depending on years of experience.


Not much given that it's a state school and I only have two terms (6-8 courses) remaining. I'll probably re-enroll this fall, perhaps I've been too negative. Thanks for the input.


Cool!

That said, be strategic when re-enrolling - if you have a job today, hold on to it tightly and try to finish your program remotely and/or via transfer credits. Finishing a degree at the expense of having a job today is really dumb given the current market.

If you can take low-cost and/or remote offerings from a CC or normal college, you can continue to hold a job. Also, most colleges now offer their courses online for credit - even Stanford and Harvard type programs. I'm sure your original program offers a significant portion of its coursework online for credit.

Finally, Ds get degrees. You can skate by with the bare minimum and you'll be fine. If you are mid-career and kept your day job, no one will ask about your GPA.


Yeah, I have to keep my job or I can't afford to live (it's a non-CS/blue-collar job, however). I'm done with CC; I only have two terms remaining to get a CS or Math degree now (I was double majoring prior, but life got in the way, and I'll likely only complete one now). I'm unsure whether at this juncture it'd be better to complete a math degree or a CS degree, tbh.

My uni's online offerings drop off like a stone after a student's sophomore year. Very few online classes for seniors, which has been a big problem for me as I work full time and live over an hour from campus. If I get a schedule change mid-term, I just have to drop out. It drives me nuts, as this policy disproportionately impacts the lower-income / higher-expense students who can't afford to live in the city and need to work full time (i.e., we get 3/4 of the way through our degree via online classes and debt, then it becomes inaccessible as in-person attendance is required for completion). It's especially ironic as a CS major!

But anyway, I'll probably wrap it up this academic year if things remain stable for me, it's just a questionable time to complete a degree now more than ever. I just want to work on a team doing interesting, challenging work. If I could I'd go to grad school.


You are overanalyzing this. It doesn’t matter. No one cares about when you graduated. You are going to need to hustle. If you are truly passionate about this industry, you will make it work. Trying to fit some ideal on paper with timing is impossible.


Thanks, you're probably right. I do feel that letting the dust settle on the AI layoff hype train might be prudent, but getting it done sooner is likely best.


Pursue a degree now to get your foot in the door sooner. I went back to school after a few years' break. I also worked nights and weekends full time in the restaurant industry. It took sweat, tears, and burnout, but over a decade later it’s 100% worth it.

Don’t worry about timing at all.


Thanks for the input, I'll aim to wrap it up promptly then.


But, unfortunately, it also runs the risk of hallucination and improper logic.


But that's fine for a mode of interface, right? The risk is significantly mitigated the same way GUI workflow risks are mitigated.

Every RDS database with a dozen terabytes that holds the entire value of the business running it still comes with a "Delete permanently, skip snapshot" button, and, believe it or not, accidentally clicking it is not THAT unheard of.

If AI is thought of as an interface for an application where the "destructive" functions are all explicitly and clearly represented to the user, and all the other actions are safe to experiment with, that's acceptable.

Bad UX (be it GUI, CLI, TUI, AIUI or even physical) can cause catastrophic bugs. Remember the Cisco switch with a reset button above an RJ45 port? https://thenextweb.com/news/this-hilarious-cisco-fail-is-a-n...

