Hacker News | freehorse's comments

And a link to that analysis from the linked discussion: https://snubi.net/posts/Show-HN/

It also includes the average voting scores; interestingly, these actually fall at the same time the quantity increases (while the average story scores remain the same).


This is the first time I am trying to skip a macOS version. I really hope they will fix things in macOS 27. I used to skip every second Windows version, so here we are again.

Same here. Using Mac since OS 9, and Tahoe is the first time I skipped a version (downgraded after 2 months).

Yeah, I think nobody is gonna tell their boss "I did not like the way you treated me, so I will take a day off for feeling slightly sick". So, while it all sounds obvious, the extent of "idgaf then" is not easy to quantify.

I think zfg is a measurable quantity.

And even if you are allowed to use a computer, you cannot use the internet (and it should not be hard to prevent that).

I had a Continuous and Discrete Systems class that allowed everything to be open during exams. You could google whatever you wanted, but the exam was so lengthy that if you had to google something, you really did not have much time to do it, and you would definitely not have enough time to do it a second time. I would load up a PDF of the chapters and lectures I needed, and my homework for that unit, with everything properly labeled. It was much faster to look for a similar problem you had already done in the homework than to try to find the answer online.

Local LLMs

Be sure to bring an extra power strip for all your plugs and adaptors.

https://www.tomshardware.com/pc-components/gpus/tiny-corp-su...


My laptop runs gpt-oss 120B with none of that. I don't know for how long, though; I suspect a couple of hours of continuous use.

Which laptop?

ROG Flow Z13 with maxed-out RAM.

Nice laptop. I love my current laptop in general, but it is lagging in performance.

how is anyone going to be able to take a test with all of the noise from that fan as it cranks through tokens?

Offer to make everyone espresso and macchiato with your GPU cooling module. They won't be able to hear the fan over the grinder and pump and milk foamer!

You can make it as slow as you want. At half TDP it is silent.

> when one buzzes we all instinctively check our pockets because they all sound the same

Isn't that the same for every brand? I have a friend who worked in cybersecurity in a certain phone company and was getting very stressed whenever my phone, which happened to be from the same brand, was ringing :D

I guess one can change the default sound; isn't that the case with Fairphones?


I have a Samsung Moto, and it has a very default ringtone. Not really a tone, since it says "Hello, Moto", which is embarrassing, but I haven't made the effort to switch tones. At any rate, while I will be confused if someone in proximity to me gets a call on their Moto, in my experience they don't have to be very far from me before I realize, instinctively, that the sound is far enough away that it can't be my phone, although it irritates me nonetheless.

And I've been seated eating with people who had the same phone, and I realized no, it must be their phone (although I felt a strong urge to check), because my ears are able to determine the direction of a sound.

I'm also old and keep getting told I'm going deaf, so my question is: are people really not able to tell it's not their phone, or are they just not thinking it through before checking?


Samsung Moto? Those are two different companies with very different phones. I'm surprised that such a mutant exists. It reads to me like a car with square wheels.

Moto is the only big brand I ever consider for a phone, while Samsung has never been so much as a consideration. Moto has had (though this is changing) a bit of freedom, enough to tweak it into resembling a pure Android experience. Samsung is incorrigibly infested, and if they ever start giving phones to prisoners, they'll be Samsungs.


You're right; for some reason I had my son's Samsung Galaxy Tab in mind, and I made the mutant.

My experience with the latest Moto I have is that the AI assistant is an anti-pattern, but the phone is nearly unusable for a lot of things without it.


Just in case you wondered, and even if you didn't,

I admire ignorance of smartphones and consider it a virtue. I obtained my first in 2018 after years of resistance. But driving a semi and not being the best with maps and logistics, I finally capitulated.

And back then, although CyanogenMod was gone, they weren't too bad. 2019 changed a lot, with autonomous, respawning, immutable "services", and things have regressed severely since. Hence my visiting this post for Fairphone.

So take pride in your purity. It only gets worse the more you know.


It's less the sound, and more the buzz when it's on vibration. I've never found a way of changing that, unfortunately. It's probably true for other brands, but I've never really had a phone that other people have also used, whereas now I'm in a (very small) bubble that seems to be happily converging on Fairphones...

> brings some novelty to our scientific work

Is this satire? I hope it is. Otherwise, it seems like science is in a sorry state if it needs emojis to bring some novelty into it.


Heh, I didn't intend it to be satire. When you spend 7 hours a day cleaning data, sending queries to research sites and doing patient profile review, emojis spice it up and can be eye-catching and fun. Why not?

I generally don't use them in routine practice but when I see some of my straight-laced coworkers strategically deploy them I don't hate it!


NO FUN IN MY SCIENCE ONLY SERIOUS BUSINESS LIKE GOD INTENDED

Novelty doesn't mean fun; it could have been a joke, because the work of scientific research is literally finding novelty: that which is new, pushing the boundaries of knowledge, etc.

Novelty can increase enjoyment which can imply that the activity is "fun" (though not all enjoyable activities can be categorized as "fun").

However, using the context clues, I surmised that the original poster, that is the one who enjoys seeing emojis being used as bullet points in literature produced by his colleague, finds this to be "fun".

Is that pedantic enough? AM I GOOD ENOUGH???? WILL YOU LOVE ME NOW DADDY?


Jesus, my dude, lighten up a bit.

Consider this: You're a grad student who's been reading page after page after page after page after page after page after page after page after page after page of black and white text.

How is marking a particularly explosive comment with a graphic representation of an explosion any different from highlighting it? Or from da Vinci's marginal scribbles? Or from Feynman's wave diagrams?

Or, for that matter, simply bolding, italicizing, or underlining it?

Shit, why even format it at all? Who needs page breaks and indented paragraphs in something as serious as a scientific paper?

God forbid we ever go so far as to implement more than one font.

Changes to the methods by which we communicate are made on a regular basis. If people find them useful enough to put them in their own communications, and they do not harm the clarity of the transmission, who are we (or you in particular) to cry about it on the sidelines?

You remind me of the person in the back of the room trying to invalidate a proof based on a misspelling that in no way impacts the validity of the proof.

As if adding an emoji somehow invalidates the months or years of work that went into producing the content that you are consuming at no cost and will likely benefit from without having contributed to the project in any meaningful way.

I mean, seriously. Imagine someone's finally created a genuine cure for all cancers. They've spent the entire lives of hundreds of people and billions of dollars, and oh no! What's this? Aww, damn there's an emoji in one of the graphs. Damn. Too bad, I guess it's not going to be good enough for freehorse. Better go ahead and send it back for revisions. Can't publish it like that. Not now, not ever. Curing cancer's going to have to wait until we can force the author of this paper to conform to our arbitrary preferences.


> You're a grad student who's been reading page after page after page after page after page after page after page after page after page after page of black and white text.

What an interesting way to describe reading a book. It's amazing that anyone can read an entire book, composed of hundreds of pages, without getting bored of the black text.


The rise in illiteracy rates is really fucking disturbing and this attitude (the parent) is part of what's to blame.

Looking at the examples in the comment above, I really hope it's not that bad.

It's like the stupid "ROFL"/clown emoji used by political fighters, and the (handclap) 500 times in a row. Or the ROFL from people who are trying to make their shit seem "funnier" than it ever could be, which only makes it more obnoxious.

There's a difference between "making a PowerPoint at a conference and using emojis as bullet points" and throwing in emojis every other word to be cute in papers that get published, or in medical records.


You're being downvoted, but I tend to agree that communication is not the part of science you want to "innovate" on. The purpose of (scientific) communication is to be understood, not to be novel.

The science you're writing about is hopefully extremely novel of course.

In general I've found "innovating on the wrong thing" is surprisingly common, especially from people who are bored and/or hungry for promotions, etc.


They're not putting emojis in peer-reviewed papers in Science and Nature or in poster presentations at ASCO; they're putting them in emails, Teams chats and meeting minutes.

Believe it or not, researchers enjoy joking around sometimes. There's a global shortage of a specific DAKO antibody we need for biopsy analysis right now, and on a call with 50 people one of our chief scientists deadpans, "it's because I stopped making it in my basement."


I do believe it, and am glad for it. The paper indicates clinical notes and patient communications, though, not internal messages. Which means I've been talking past you the whole time anyway, my bad.

As long as they don't force people to eat the cake, it sounds fine?

I am also in OP's boat and, even though these are great suggestions, personally I would like to be able to do a basic thing such as launching an app in a built-in way, rather than having to download yet another app to do that. Every major macOS update I have to worry about Spotlight reindexing things.

What I find really annoying with macOS is that with stock/default settings it is the worst UX. You have to download an app to launch apps, an app to move and resize windows, an app to reverse the mouse wheel's scroll direction so that it is the opposite of the trackpad's, and an app to manage the menu bar (especially to decrease the spacing, so that you can fit items up until the notch). Then you also need to spend an hour tweaking settings and running custom commands (such as `defaults write -g ApplePressAndHoldEnabled -bool false`, so that you can actually type stuff like aaaaaaaaaaaaaaaaaaaaa). These are just what is needed to make using macOS bearable, and do not include any kind of "power user" stuff.
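
For reference, here is a minimal sketch of the kind of one-off setup I mean. Only the ApplePressAndHoldEnabled line is the command quoted above; the other defaults keys are assumptions from memory and may vary across macOS versions, so verify them with `defaults read` before relying on them:

    # Sketch of one-time macOS tweaks after a clean install.
    # Only ApplePressAndHoldEnabled is from the comment above; the rest are
    # assumed from memory -- check with `defaults read` on your macOS version.
    defaults write -g ApplePressAndHoldEnabled -bool false        # key repeat instead of the accent popup
    defaults write -g KeyRepeat -int 2                            # faster key repeat
    defaults write -g InitialKeyRepeat -int 15                    # shorter delay before repeat starts
    defaults write -g AppleShowAllExtensions -bool true           # always show file extensions
    defaults write com.apple.finder AppleShowAllFiles -bool true  # show hidden files in Finder
    killall Finder                                                # restart Finder to pick up the changes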

I used to hate macOS before getting my own Mac, because I had to use some at work with their default settings, and it was just a horrible experience.


Not submitting to state censorship requests is not a great example of the problem with Big Tech being discussed here.

I wasn't referring to the state censorship request, but rather to the 'flocking' to self-proclaimed champions of free speech in the current Trump administration as a cry for help.

I personally find the fact that a private company can compel a list of IPs and domains it wants blocked to actually get blocked more alarming than that.

As not all codebases are well written, I once found it useful to get an LLM to produce code that does X, essentially distilling it from a codebase that does XYZ. I found that reviewing the code the LLM produced, after feeding the original codebase into the context, was easier than going through the (not very well written) codebase myself. Of course this was just the starting point; there were a ton of things the LLM "misunderstood", and then there was a ton of manual work, but it is an (admittedly rarer) example for me where "AI-generated" code was easier to read than code written by (those) humans, and it was actually useful to have at that point.
