The State of Illinois is already going to lose its "business" for other reasons. Do you think there is a reasonable privacy regime that prevents health systems from knowing where their patients live or using that information to site clinics?
Why is my data freely and instantly available within a centralized "health system" to begin with? Why can't we implement a digital equivalent of clunky paper records? Everything E2EE. Local storage requiring in-person human intervention to access. When a new provider wants my records from an old one, there should be a cryptographic dance involving all three parties: a signed request, a signed patient authorization, and then re-encryption for the receiving party using the key from the request.
What the health system should impose is a standard for interoperability. Not an internal network that presents a juicy target.
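To make the shape of that dance concrete, here's a minimal sketch using PyNaCl. The roles, message layout, and how the requester's public key is conveyed are illustrative assumptions, not a real health-records protocol:

```python
from nacl.public import PrivateKey, SealedBox
from nacl.signing import SigningKey

# Key material for each party (in practice, held on separate systems).
new_provider_signing = SigningKey.generate()   # signs the record request
patient_signing = SigningKey.generate()        # signs the authorization
new_provider_keypair = PrivateKey.generate()   # receives the re-encrypted record

# 1. The new provider issues a signed request for the record.
request = new_provider_signing.sign(b"REQUEST record #123 for patient P")

# 2. The patient countersigns that request as their authorization.
authorization = patient_signing.sign(request.signature + b"|AUTHORIZED")

# 3. The old provider verifies both signatures before releasing anything.
new_provider_signing.verify_key.verify(request)
patient_signing.verify_key.verify(authorization)

# 4. The old provider decrypts its local copy (the in-person step, omitted)
#    and re-encrypts the plaintext to the requester's public key.
record_plaintext = b"...decrypted locally, with human intervention..."
sealed = SealedBox(new_provider_keypair.public_key).encrypt(record_plaintext)

# 5. Only the receiving provider can open it.
assert SealedBox(new_provider_keypair).decrypt(sealed) == record_plaintext
```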
Having seen this world up close, the absolute last place you ever want your medical data to be is on the Windows Server in the closet of your local doctor's office. The public cloud account of a Silicon Valley-type company that hires reasonably competent people is Fort Knox by comparison.
Yeah, but a local private practice is a fairly small target. No one is going to break into my house just to steal my medical records, for example.
This could also be drastically improved by the government spearheading a FOSS project for medical data management (archival, backup, etc). A single offering from the US federal government would have a massive return on investment in terms of impact per dollar spent.
Maybe the DOGE staff could finally be put to good use.
You seem to be confused about how this works. Attackers use automated scripts to locate vulnerable systems. Small local private practices are always targeted because everything is targeted. The notion of the US federal government offering an online data backup service is ludicrous, and wouldn't have even prevented the breach in this article.
> Attackers use automated scripts to locate vulnerable systems.
I'm aware. I thought we were talking about something a bit higher effort than that.
> online data backup service
That isn't what I said. I suggested federally backed FOSS tooling for the specific use case. If nothing else, that would ensure that low-effort scanners came up empty, by providing purpose-built software hardened against the expected attack vectors. Since it seems we're worrying about the potential for broader system misconfiguration, they could even provide a blessed OS image.
The breach in the article has nothing to do with what we're talking about. That was a case of shadow IT messing up. There's not much you can do about that.
This has nothing to do with the "data broker system." Reading between the lines it was more of a "shadow IT" issue where employees were using some presumably third-party GIS service for a legitimate business purpose but without a proper authentication & authorization setup.
Assuming your tea-leaf reading is correct, that particular third party would not even exist in its current form without the 'data broker ecosystem'. It is, genuinely, the original sin.
A website where you can upload POIs to a shareable map seems like one of those things that's so obvious and so useful that it would exist under almost any economic arrangement of the advertising industry.
I get that data brokers and big tech are a much sexier topic, but this breach - like so many of the most pressing threats to our privacy - is a mundane shortage of competence and giving-a-shit in the IT activities of boring old organizations.
Heh. The shareable map is operated by someone, and the information that other people crowdsourced for them for free is even more valuable to that someone. If you want a more relatable example, I would like to point to a defunct effort (karma or something... I can't find the specifics now) where people were invited to crowdsource all sorts of info on other people. It only got shut down because it was too on the nose. On the other hand, a shareable map like the one you mention is more easily defensible...
> I get that data brokers and big tech are a much sexier topic, but this breach - like so many of the most pressing threats to our privacy - is a mundane shortage of competence and giving-a-shit in the IT activities of boring old organizations.
I think OSM would exist regardless of data brokers. Free services ingesting that data and letting a user annotate it would also exist. People create and operate all sorts of little projects for fun.
A line cook makes no more burgers per hour, a hairdresser delivers no more or better haircuts, and a daycare worker watches no more children concurrently than they could have 25 years ago. Meanwhile the Magnificent 7 have emerged. Baumol effects might have raised wages a bit, sure. How could the relative positions of these workers not fall as all these tech-enabled and scale-enabled neighbors come on to the scene?
The whole concept of buying services from people is either that their time is worth less than yours, or that they have special skills that you need and lack. “No such thing as unskilled labor,” ok, but you definitely get a sorting on how useful people’s skills are and how difficult they are to substitute or replicate.
> How could the relative positions of these workers not fall as all these tech-enabled and scale-enabled neighbors come on to the scene?
I’d like to see this worked out for real.
On the one hand, sure, a hairdresser cuts the same amount of hair as they did 25 years ago, and a fancy tech worker produces enormously more output than 25 or 50 years ago.
On the other hand, why does it follow that that tech worker should have an amount of take-home pay equal to vastly more haircuts per month than a comparable worker 25 or 50 years ago? A modern programmer does not actually need more haircuts, or more food, or more lattes, or more housing, or more doctor visits than a comparable worker any other time in the last 50 years.
So maybe something is actually wrong with the profitability of modern non-labor-intensive companies and the tax system such that their owners and employees are wildly overpaid compared to lower-productivity workers.
I’m thinking more of paying people on the margin, or of some kind of tax system that compensates for inequality a bit.
Not fully worked out, but consider: suppose there are 100 people in the population, and a bunch of them are ambivalent between tech work and jobs like hairdressing. If tech work paid 10% more than hairdressing, some would do tech work and some would cut hair. If tech work paid 200% more, then maybe there would be too many applicants and the employers would reduce wages. (I’ve occasionally contemplated that perhaps one reason that the big Silicon Valley employers pay so much is kind of anticompetitive: they can afford it, so they might as well, because it makes it more expensive to compete with them.)
Or alternatively, imagine if taxes were structured so that owning more than one house were highly discouraged (with appropriate provisions to make owning properties to rent them out make sense, which is something that a lot of legislators get wrong), and if permitting to build houses were not absurdly restrictive, then many different jobs with very different salaries could still result in having enough income to afford to live in approximately one house. Some might afford two (!), and some might afford one that is much fancier than someone else’s, but if the pressure that makes someone like a hairdresser need to compete against a highly paid tech worker to pay for a similar house went away, the situation could be much improved.
(California, like many places, has strictly too few residential units in the places that people want to live, so just adjusting prices won’t help much.)
> The whole concept of buying services from people is either that their time is worth less than yours
And if everyone else's time has become more valuable, then so has the time that is being saved by buying services.
If my time as a programmer is worth significantly more now than it was 25 years ago, then the time I save by buying services is worth more.
There's a reason that someone making $1mil/year is going to be willing to pay more for the exact same haircut that someone making $70k/year also gets. The time being saved is worth more to them.
You're only looking at half of the equation here. Following your logic, if my time is worth $100/hr, I should be willing to pay $99/hr for a haircut. But reality is that a haircut isn't just worth some utility value based on time saved, it's worth the lowest amount where suppliers' willingness to provide it at a given quality and buyers' willingness to pay meet.
So while the $99/hr haircut might technically save me money/time, suppliers of haircuts are generally willing to give the same haircut for $30/hr. If one supplier tried to pin their prices to the growth of their customers' income, they would go out of business. That is because the value of the supplier's time isn't increasing at the same rate.
I'm mostly bald... I got tired of paying as much as I did for haircuts and now mostly just use a pair of clippers on myself, since my goal is to take off all of it. I've paid for more beard trims the past few years than haircuts, though I mostly do that myself too.
Note, I usually use clippers on myself about once a week. Sometimes I'll use a shaver to get a closer shave, but it generally doesn't matter, as I don't care if there's a little growth, which is noticeable unless I literally shave daily anyway... which I'm too lazy to do, and definitely not able to pay someone else to do.
> Baumol effects might have raised wages a bit, sure. How could the relative positions of these workers not fall as all these tech-enabled and scale-enabled neighbors come on to the scene?
Supply and demand? If the population of hairdressers were small enough that they could charge more and more, then their wages could keep up as a percentage. And that would be possible if, for example, so many people moved into high-productivity work that only a small percentage remained in low-automation work. But if you have a constant influx of new hairdressers, or a constant influx of people willing to do low-automation work, that doesn’t happen.
But the relative value of that same labor (hairdresser, daycare worker) changes. If my labor as a tech worker was worth $50/hour, I was willing to pay $30/hour for a haircut. But now that I make $100/hour, I'm willing to pay $60 for the same haircut. And in both situations I still need a haircut, and the hairdresser is still the place to go to get one.
Which is fine, right? If both wages just keep up with inflation, the gap will increase by the same amount as well. In fact, in this particular case I wouldn't even expect the hairdresser's wage to increase by the same amount proportionally, which is also fine. Not all wages should increase at the same rate.
The line cook is relatively as valuable as they were in the past; they're just being out-leveraged by people asserting a self-centered entitlement mentality.
I think it is more of a social technology for keeping your ducks in a row. Developers won’t be able to gamble that something “never happens” if we induce it weekly.
Killing instances of load-balanced stateless services is not that interesting anymore in the context of a mature service mesh. What is interesting is injecting failures or latency on specific edges of the call graph to ensure that “fail open” dependencies really are. This is accomplished with context propagation, baggage, middleware, and L7 proxies rather than by killing anything at the VM/container level. Even iptables rules turned out not to be a very good approach, since most destinations would have many, constantly cycling IPs and ports.
In the stateful world, chaos testing is useful, but you really want to be treating every possible combination of failures at every possible application state, theoretically with something like TLA or experimentally with something like Antithesis. The scenarios that you can enumerate and configure manually are just scratching the surface.
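For what it's worth, edge-targeted injection driven by propagated baggage can live almost entirely in application middleware. A rough sketch, where the baggage entry format, edge names, and requests-based client are my own illustrative assumptions rather than any particular mesh's API:

```python
import time
import requests

BAGGAGE_HEADER = "baggage"  # W3C Baggage, propagated on every hop

def parse_faults(baggage: str) -> dict:
    """Turn entries like 'fault.payments=latency:500' into {edge: spec}."""
    faults = {}
    for item in filter(None, (part.strip() for part in baggage.split(","))):
        key, _, value = item.partition("=")
        if key.startswith("fault."):
            faults[key[len("fault."):]] = value
    return faults

def call_downstream(edge: str, url: str, incoming_headers: dict) -> requests.Response:
    """Outbound-call wrapper: apply any fault aimed at this specific edge,
    then forward the baggage so deeper services can do the same."""
    baggage = incoming_headers.get(BAGGAGE_HEADER, "")
    spec = parse_faults(baggage).get(edge)
    if spec:
        kind, _, arg = spec.partition(":")
        if kind == "latency":
            time.sleep(int(arg) / 1000.0)  # delay only this edge
        elif kind == "error":
            fake = requests.Response()
            fake.status_code = int(arg)    # synthesize a failure without calling out
            return fake
    return requests.get(url, headers={BAGGAGE_HEADER: baggage})

# A test request carrying 'fault.payments=error:503' makes only the payments
# edge fail, wherever it appears in the call graph, without killing anything.
```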
At Netflix, when this article was written, Cloud Engineering accomplished failure injection with circuit breakers, which were essentially L7 proxies. Chaos engineering was more than killing instances; there was a whole simian army, after all. They would inject latency, error codes, etc., and simulate tiers of the application failing. It’s not nearly as unsophisticated as you’re making it seem.
Also true of human-written unit tests. You probably also want to have integration or UI automation tests that cover the end-user scenarios in your product requirements, and invariants that are checked against large numbers of examples either taken from production (sanitized of course), in a shadow environment, or generated if you absolutely must.
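For the generated-examples case, a property-based test is the usual shape. A tiny illustration with Hypothesis, where the function and invariant are made up for the example:

```python
from hypothesis import given, strategies as st

def normalize_order_total(line_items: list[int]) -> int:
    """Toy system under test: an order total should never be negative."""
    return max(sum(line_items), 0)

@given(st.lists(st.integers(min_value=-10_000, max_value=10_000)))
def test_total_is_never_negative(line_items):
    # The invariant is checked against many generated examples,
    # not just a handful of hand-written cases.
    assert normalize_order_total(line_items) >= 0
```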
Please make remote development work well in the IntelliJ-based IDEs. It's very difficult to get corporate employers to continue supporting their toolchains locally when VSCode Remote is "good enough" and disposable cloud VMs are so much easier to support/secure/manage/scale.
The development experience in IntelliJ-family IDEs is incomparably superior, but you have got to figure out how to run the code indexing on the remote server and the UI locally. This quasi-VNC thing isn't it.
Service B initiates the connection to Service A in order to receive notifications, and Service B initiates the connection to Service A to query for changed data.
Service A never initiates a connection with Service B. If Service B went offline, Service A would never notice.
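In other words, both loops live entirely on Service B's side. A rough sketch of that shape (the endpoint paths, cursor parameter, and long-poll behavior are assumptions for illustration, not the actual services):

```python
import requests

SERVICE_A = "https://service-a.example.com"  # hypothetical base URL

def listen_for_notifications():
    """B -> A: long-poll for change notifications. A only ever answers requests."""
    while True:
        try:
            # Service A holds the request open until something changes or we time out.
            resp = requests.get(f"{SERVICE_A}/notifications", timeout=60)
        except requests.exceptions.Timeout:
            continue  # nothing changed; reconnect and keep waiting
        if resp.ok and resp.json().get("changed"):
            sync_changed_data(resp.json()["cursor"])

def sync_changed_data(cursor: str):
    """B -> A: a separate connection to query for the data that changed."""
    resp = requests.get(f"{SERVICE_A}/changes", params={"since": cursor}, timeout=30)
    for record in resp.json().get("records", []):
        apply_locally(record)  # Service B's own persistence, not shown

def apply_locally(record):
    pass

# If Service B goes offline, these loops simply stop running; nothing on
# Service A's side fails or even notices, exactly as described above.
```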