
Why fintech specifically?

Destructive operations are both tempting to some devs and immensely problematic in that industry for regulatory purposes, so picking a tech that is inherently incapable of destructive operations is alluring, I suppose.

I would assume that it's because in fintech it's more common than in other domains to want to revert a particular thread of transactions without touching others from the same time.

Not only transactions - but the state of the world.
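
For intuition, here's a toy sketch of that "revert one thread of transactions without touching others" idea on an append-only ledger (all names and data are hypothetical, and real immutable databases differ in the details):

    # Nothing is destroyed: a revert appends compensating entries for
    # just the targeted thread, leaving concurrent transactions alone,
    # and the full history (the "state of the world") stays queryable.
    ledger = [
        {"txn": 1, "thread": "A", "account": "acct1", "delta": +100},
        {"txn": 2, "thread": "B", "account": "acct2", "delta": -50},
        {"txn": 3, "thread": "A", "account": "acct1", "delta": -30},
    ]

    def revert_thread(ledger, thread_id):
        next_txn = max(e["txn"] for e in ledger) + 1
        # Snapshot the matching entries first, since we append as we go.
        for e in [e for e in ledger if e["thread"] == thread_id]:
            ledger.append({"txn": next_txn, "thread": thread_id,
                           "account": e["account"], "delta": -e["delta"],
                           "compensates": e["txn"]})
            next_txn += 1

    revert_thread(ledger, "A")  # thread B and its history are untouched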

compliance requirements mostly (same for health tech)

Because, money.

I will always fly Ryanair ahead of other low-cost carriers in Europe because, unlike easyJet for example, they don't overbook. The most painful experience I've had was arriving at an airport with a young family and getting all the way to the easyJet gate, only to be told the flight was overbooked. And unlike in the US, where this starts an auction, it's basically tough luck. It should be outright fraud in my opinion.


Interesting, are you at least entitled to EU261 compensation?


I honestly don't know whether I'd be able to keep it together if something like that happened to me and my family. It should definitely be fraud, and compensated VERY HEAVILY if it happens to someone due to a technical glitch or something similar.


Some kind of causal or eventual consistency related invariants perhaps?


I don't understand why the nuclear industry wouldn't pile in to fund research into this area (as a potential way to clean up nuclear waste). Probably I don't understand how this fungus actually works and it is impossible!


As mentioned elsethread, it doesn't actually clean up anything, since it doesn't affect the waste at all, just turns some of the radiation into metabolism in the same way that plants turn solar radiation into metabolism.

Even if it did somehow accelerate the decay, it wouldn't be that useful, since (Chernobyl aside), all the waste from the typical civilian nuclear reactor can fit in a side lot on the site of the reactor complex itself (and often does!). There just isn't that much radioactive waste to clean up!


Yeah, waste has been a red herring that anti-nuclear people like to bring up. Yes, it's nasty stuff, but there isn't that much of it and it can be buried or reprocessed; it's not a real problem.


Low level waste is an expensive pain in the buttocks. I toured a local medical and research reactor back in high school, and they were running out of space to store their discarded PPE and other minimally contaminated waste. You could probably empty most of the barrels on the floor and roll around in the contents without any noticeable effect, and yet they still needed to be treated like real waste, just in case.

Not to disagree with you, just to say that even though it's a minor nuisance it nevertheless occupies a lot of mental space because of how annoying it is.


I don’t see a straightforward way this would actually help with the cleanup. A hypothetical microbe that “eats” oil would be useful in an oil spill, as it would chemically break down the oil and harvest its carbon.

The radiotrophic fungus in TFA can’t meaningfully affect the rate at which nuclear decay happens. What it can do, supposedly, is harvest the energy that the decay releases; normally there’s too much energy for an organism to safely handle.

At the risk of vastly oversimplifying, you can’t plug your phone into high voltage transmission lines. These fungi are using melanin to moderate the extra energy, stepping it down into a range that’s useful (or at least minimally harmful).


clean it up how? by having fungus grow near it?


almost nobody cares about solving actual problems :C


For sure no one fully understands how a living organism "works", but it could be possible... at least we'd learn something.


I don't understand why people blame AI for buying up DDR5 DRAM - aren't AI companies mostly interested in HBM? Or is fab space previously used for DDR DRAM being diverted to manufacture more HBM?


Inference - you don't need GPUs (with their HBM) for inference. Frontier labs are eking out progress by scaling up inference-time compute. Pre-training scaling has kind of stalled / is giving diminishing returns (for now).


Not a dumb question. The links to mesh networking etc seem interesting. It sounds like the insights from descriptive set theory could yield new hardness/impossibility results in computational complexity, distributed algorithms etc.


are there any current or foreseeable practical applications of those results?

the math of infinity isn't very relevant to the finite programs that humans use. Even today's astronomically large computing systems have size approximately equal to 3 compared to infinity.


Cool paper. Their modeling of the temperature response curve seems like a more elegant (albeit non-trivial) solution than burning CPU.


Couldn't you model the effect of temperature on clock drift and try to factor that in dynamically (e.g. using a temperature sensor) instead of burning CPU unnecessarily?


Sure, that's what the chrony closed loop is already doing (the estimated residual frequency is pretty linear with temperature), but no matter how robust your closed loop is, it's strictly better to not have disturbances in the first place.
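
For what it's worth, fitting that roughly linear temperature/frequency relationship (to get the coefficients mentioned below) is only a few lines; a minimal sketch with made-up sample values (nothing here comes from chrony itself):

    # Fit residual clock frequency (ppm) against temperature (deg C)
    # with ordinary least squares; the sample numbers are illustrative.
    import numpy as np

    temps_c  = np.array([35.0, 40.0, 45.0, 50.0, 55.0])  # sensor readings
    freq_ppm = np.array([-1.2, -0.4,  0.5,  1.3,  2.2])  # residual frequency

    k1, k0 = np.polyfit(temps_c, freq_ppm, 1)  # freq ~= k1 * temp + k0
    print(f"slope = {k1:.3f} ppm/degC, offset = {k0:.2f} ppm")

    def compensation(temp_c):
        # Predicted frequency offset to subtract before the closed
        # loop ever sees it as a disturbance.
        return k1 * temp_c + k0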


That's what the chrony tempcomp directive is for. But you would have to figure out the coefficients; it's not automatic.

An advantage of constantly loading at least one core of the CPU might be preventing the deeper power states from kicking in, which should make the RX timestamping latency more stable and improve the synchronization stability of NTP clients.


Chrony does have the ability to do temperature compensation. I've done this and need to do a write-up on it. It's not super easy to keep all the parts working together. Basically you feed chrony a table of temperatures and expected clock frequencies, and it subtracts them out.
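
Roughly, the configuration looks something like this (the sensor path, interval, and numbers below are illustrative, not from a real setup; see the tempcomp entry in chrony.conf(5) for the exact syntax):

    # chrony.conf: read the temperature from a hwmon sensor every 30 s
    # and apply a compensation interpolated from a points file.
    tempcomp /sys/class/hwmon/hwmon0/temp1_input 30 /etc/chrony.tempcomp

    # Alternative form: a direct quadratic model,
    #   comp = k0 + (T - T0)*k1 + (T - T0)^2*k2
    # tempcomp /sys/class/hwmon/hwmon0/temp1_input 30 T0 k0 k1 k2

and the points file is just "temperature compensation-in-ppm" pairs (sysfs hwmon reports millidegrees C):

    35000  -1.2
    45000   0.5
    55000   2.2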


Interesting. It feels like once you have the features defined this is basically dead code elimination. To solve the transitive dependency issue could you not execute a dead code elimination pass once and cache the results?
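
(For intuition, "dead code elimination over features" is essentially graph reachability from the enabled roots; a toy sketch with a hypothetical dependency graph:)

    # Keep everything transitively reachable from the enabled features'
    # entry points, drop the rest. The single traversal result is what
    # could, in principle, be cached between builds.
    deps = {
        "feat_json": ["parse", "serialize"],
        "feat_tls":  ["handshake", "parse"],
        "parse":     ["alloc"],
        "serialize": ["alloc"],
        "handshake": ["crypto"],
        "crypto":    [],
        "alloc":     [],
        "unused_fn": ["alloc"],
    }

    def live_set(enabled):
        live, stack = set(), list(enabled)
        while stack:
            node = stack.pop()
            if node not in live:
                live.add(node)
                stack.extend(deps.get(node, []))
        return live

    keep = live_set(["feat_json"])  # {feat_json, parse, serialize, alloc}
    dead = set(deps) - keep         # feat_tls, handshake, crypto, unused_fn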


Yes, I do think it resembles dead-code elimination at a high level. I don't think that doing it after the fact is particularly desirable though, even with the results cached.

I went into more detail in my response to a sibling comment, but I think there are actually quite a lot of reasons why someone in practice might still care about the experience of doing a completely clean build rather than an incremental one that can reuse the results of a previous one. A lot of it might be overfitting from my own personal experience, but I'm really not convinced that fixing this issue purely by adding additional build steps that assume the earlier ones completed successfully will end up with a better experience in the long term; all it takes is one link in the chain to break in order to invalidate all of the later ones.

I'd feel much more confident in the toolchain if it were designed so that each link was strengthened as much as possible, instead of extending the chain further to mitigate issues with the whole thing. Adding new links might improve the happy path, but it also increases the cost on the unhappy path if the chain breaks, and arguably adds more places where that can happen. I'm worried that focusing too much on the happy path has led to an experience where the risk of getting stuck on the unhappy path is too high, precisely because it's been so much easier for us to keep adding links to the chain than to address its structural integrity.


I'm interested to understand how this works from an IP perspective. This guy is still employed by Meta but is actively fundraising for a new competing startup. Presumably he will have negotiated that Meta forfeits all rights to anything related to his new business? Would be interesting to hear of people's experience/advice for doing this. Or are there some legal entitlements he can avail of?


Even if it’s Meta, they don’t want to antagonize LeCun. Also, they all know it’s a small circle of people who create value. I won’t be surprised if Meta itself invests in his company and gets a share.

