People point to the cost of land, but if physical inaccessibility isn’t a problem, there are plenty of cheap places on Earth where you can deploy data centres at far lower cost than launching them into orbit.
Desert land is practically free. So is floating a data centre in the middle of the Pacific.
If a state, or even a rich billionaire, wanted to take out your data centre in low Earth orbit, it's only a few million dollars to launch a retrograde rocket that explodes into 10 tons of shrapnel, or even less if you skip matching the orbit and just launch it straight up.
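A rough back-of-the-envelope sketch of why a retrograde intercept is so devastating (the 400 km altitude is an assumed figure, and the head-on geometry is the worst case):

```python
import math

# Rough numbers for an assumed 400 km circular low Earth orbit.
GM_EARTH = 3.986e14      # m^3/s^2, Earth's gravitational parameter
R_EARTH = 6.371e6        # m, mean Earth radius
altitude = 400e3         # m

r = R_EARTH + altitude
v_orbit = math.sqrt(GM_EARTH / r)   # circular orbital speed
v_closing = 2 * v_orbit             # head-on retrograde intercept

# Kinetic energy per kilogram of shrapnel at that closing speed,
# expressed as kg-of-TNT equivalent (1 kg TNT ~ 4.184e6 J).
ke_per_kg = 0.5 * v_closing**2
tnt_equiv = ke_per_kg / 4.184e6

print(f"orbital speed: {v_orbit/1000:.1f} km/s")
print(f"closing speed: {v_closing/1000:.1f} km/s")
print(f"energy per kg of shrapnel: ~{tnt_equiv:.0f} kg TNT equivalent")
```

At roughly 15 km/s closing speed, each kilogram of shrapnel carries the energy of tens of kilograms of TNT, which is why even tiny debris is lethal to anything in the opposing orbit.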
It likely is in many places, under laws relating to dealing with proceeds of crime, but I’m not aware of any prosecutions having ever been made on this basis.
It’s not a popularly held mindset, either within the security industry or outside of it. This piece seems to be pitched at salespeople whose only job is to extract money from other companies.
Basic security hygiene pretty much removes ransomware as a threat.
> Basic security hygiene pretty much removes ransomware as a threat.
I can't tell if you're being flippant or naive. There is nothing that removes any category of malware as a threat.
Sure, properly isolated backups that run often will mitigate most of the risks from ransomware, but it’s quite a reach to claim that it’s pretty much removed as a threat. Especially since you would still need to clean up and restore.
I'm not really sure what point you're making. Is the point that it is harder to secure more things? Is it that security events happen more frequently the higher your number of employees goes?
If so, I bristle at this way that many developers (not necessarily you, but generally) view security: "It's red or it's green."
Attack surface going up as the number of employees rises is expected, and the goal is to manage the risk in the portfolio, not to ensure perfect compliance, because you won't, ever.
And just as dangerous: 50 employees. Because quite frequently these 50-employee companies have responsibilities that they cannot begin to assume on the budgets they have. Some businesses can really only be operated responsibly above a certain scale.
A law firm with 50 employees who use nothing but Microsoft Word, Outlook and a SaaS practice management application is really easy to button up tight, though they probably don’t have any in-house IT and the quality of MSPs varies wildly.
A company of 50 software developers is an enormous headache.
It's not often presented as "we should be spending more", but it's absolutely true that cybersecurity is dominated by a reflexive "more is better" bias. "Defense in depth" is invoked as an excuse to pile on more shit at least as often as it bears any real relation to the boundaries in the military context the metaphor is drawn from.
The security industry absolutely has a serious "more is better" syndrome.
> Basic security hygiene pretty much removes ransomware as a threat.
It does not. The problem is that as long as a company employs people, some of them will be too trusting and will execute malware, not to mention AI agents. And even if you assumed people and AI agents were perfect, there are all the auto-updaters these days that regularly get compromised because they are such juicy targets.
And no, backups aren't the solution either, they only limit the scope of lost data.
In the end the flaw is fundamental to all major desktop OSes: neither Windows, Linux nor macOS meaningfully limits the access scope of code running natively on the filesystem. Everything in the user's home directory, and every mounted network share where the user has write permission, bar a few specially protected files/folders, is fair game for any malware achieving local code execution.
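A toy demonstration of that point: any code running as the user can rewrite every file the user can write to, with no OS-level barrier. This sketch uses a throwaway temporary directory as a stand-in for the home directory; do not point anything like it at real data.

```python
# Toy illustration: XOR-"scramble" every file under a directory tree,
# standing in for what ransomware does to a home directory. Nothing in
# Windows, Linux, or macOS stops a user-level process from doing this.
import tempfile
from pathlib import Path

def scramble_tree(root: Path) -> int:
    """Overwrite every regular file under `root`; returns files touched."""
    count = 0
    for path in root.rglob("*"):
        if path.is_file():
            data = path.read_bytes()
            path.write_bytes(bytes(b ^ 0xFF for b in data))
            count += 1
    return count

with tempfile.TemporaryDirectory() as tmp:
    home = Path(tmp)                       # stand-in for the user's $HOME
    (home / "docs").mkdir()
    (home / "docs" / "report.txt").write_text("quarterly numbers")
    (home / "notes.txt").write_text("meeting notes")
    touched = scramble_tree(home)
    print(f"scrambled {touched} files without a single permission prompt")
```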
AFAIK the idea is to have backups so good that restoring them is just a minor inconvenience. Then you can just discard encrypted/infected data and move on with your business. Of course that's harder to achieve in practice.
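A minimal sketch of that idea: each snapshot is a timestamped, never-overwritten copy, so an infected machine can corrupt tomorrow's snapshot but not yesterday's. A real setup would add off-site storage, pull-based access, and deduplication; the directory names here are illustrative.

```python
# Minimal write-once snapshot scheme: snapshots are never overwritten,
# and restoring one means throwing away the live (possibly encrypted)
# copy and copying the snapshot back.
import shutil
import tempfile
from datetime import datetime, timezone
from pathlib import Path

def take_snapshot(source: Path, backup_root: Path) -> Path:
    """Copy `source` into a fresh timestamped directory; never overwrite."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S.%f")
    dest = backup_root / stamp
    shutil.copytree(source, dest)   # fails loudly if the name already exists
    return dest

def restore_snapshot(snapshot: Path, target: Path) -> None:
    """Discard the live copy and restore the chosen snapshot."""
    if target.exists():
        shutil.rmtree(target)
    shutil.copytree(snapshot, target)

# Tiny usage example against throwaway directories.
with tempfile.TemporaryDirectory() as tmp:
    live = Path(tmp) / "live"
    backups = Path(tmp) / "backups"
    live.mkdir(); backups.mkdir()
    (live / "ledger.txt").write_text("v1: all good")
    snap = take_snapshot(live, backups)
    (live / "ledger.txt").write_text("ENCRYPTED GARBAGE")  # simulated infection
    restore_snapshot(snap, live)
    restored = (live / "ledger.txt").read_text()
    print(restored)
```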
If the important data is in a web app and the Windows PC is effectively a thin client, this lowers the ransom value of the local drive. Of course, business disruption in the form of downtime and overtime IT labor cannot be mitigated just by putting everything online.
The next step is to move to secure-by-design operating systems like ChromeOS, where the user is not allowed to run any non-approved executables.
If tricking a single employee can cause an entire company to stall out, it's a process issue. Just like how a single employee should not be able to wire out $100,000.
Getting rid of Windows in favor of an OS with a proper application sandbox like Android would solve so, so many security issues, but that's not viable in most cases because so much software depends on the outdated user-based permissions model most desktop OSs are built around.
Please don't. It's bad enough that companies running Windows already have all their data in Microsoft's hands. Dumbing down what the users can do with their machines seems like the end of personal computing.
I don't think Android is "dumber" or less capable than Windows. In many ways the application sandbox actually gives owners a lot more control over their devices than a less locked down OS would, allowing them to restrict what information installed applications are allowed to access.
But what I think you're concerned about (and I agree) is that the flip side of that is that giving device owners more control over their apps also gives the OS developers more control, and Google's interests are not always perfectly aligned with the device owner's. There's a much wider market for apps than there is for operating systems, so sometimes app developers' interests will actually be better aligned with the device owner's than the OS developer's interests are.
One possible saving grace here is AOSP. In theory you could have multiple competing AOSP-based desktop OSs, each catering to a slightly different set of users. This would be close to the ideal situation in my opinion. Either that, or Chrome, Firefox, Edge, and Ladybird all evolve into full-fledged OSs with WASM-based apps.
I see your point, I do. It seems like all external software is going in the SaaS direction, where the vendor keeps all of the data and makes it available over an API. So there are genuinely solid cases for Chromebooks.
The issue is how much power this gives to the vendors. I think we should be able to survive a vendor going poof, taking all our data with them. Having a general computing platform capable of mixing files and privileges seems to me like the only way of keeping this capability.
Sleeper-agent malware is a thing, especially in high-risk situations. If somebody has had a dormant RAT installed since year X-1, it's going to be impossible to solve that in year X by using backups.
That does not work. They just infect you and don't demand a ransom for a few months while they encrypt all your data going into the backups. Now your backups are also encrypted going back multiple months and you have to discard months of work.
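One partial mitigation for that slow-encryption scenario is to screen files at backup time: encrypted data is near-uniformly random, so a sudden jump in Shannon entropy is a cheap red flag. A minimal sketch, where the 7.5 bits/byte threshold is an arbitrary assumption and already-compressed media would false-positive:

```python
# Flag files that look encrypted by measuring Shannon entropy in
# bits per byte: ~8.0 for random/encrypted data, much lower for text.
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte of `data`; 0.0 for empty input."""
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    """Heuristic: entropy above `threshold` suggests ciphertext."""
    return shannon_entropy(data) > threshold

print(looks_encrypted(b"meeting notes, quarterly numbers, todo list" * 50))
print(looks_encrypted(bytes(range(256)) * 50))
```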
Modern ransomware doesn't just encrypt data but uploads it somewhere too; the victim is then threatened with a leak of the data. A backup does not save you from that.
Well yes, if you get breached, you have problems. But at least in the good-backups scenario you can continue to operate, so you have money coming in to fix this.
> all mounted network shares where the user has write permissions
This is very literally what 'basic hygiene prevents these problems' addresses. Ransomware attacks have shown time and again that the way they were able to spread was through highly over-permissioned users and services, because that's the easy way to get someone to stop complaining that they can't do their job.
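A small sketch of auditing for that kind of over-permissioning: given a mapping of shares to principals with write access, flag the broad grants. The share names, the "Everyone" principal, and the threshold of 5 writers are all illustrative placeholders.

```python
# Flag network shares with suspiciously broad write access.
from typing import Dict, Set

def flag_broad_write_access(
    share_acls: Dict[str, Set[str]], max_writers: int = 5
) -> Dict[str, str]:
    """Return share -> reason for every share whose write ACL is too broad."""
    findings = {}
    for share, writers in share_acls.items():
        if "Everyone" in writers:
            findings[share] = "writable by Everyone"
        elif len(writers) > max_writers:
            findings[share] = f"writable by {len(writers)} principals"
    return findings

acls = {
    r"\\files\finance": {"Everyone"},
    r"\\files\hr": {"hr-team"},
    r"\\files\projects": {f"user{i}" for i in range(40)},
}
findings = flag_broad_write_access(acls)
print(findings)
```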
Basic security hygiene in the modern world is "assume your employees can be a threat", either because they're incompetent ("I accidentally deleted the shared spreadsheet, I thought it was my copy"), malevolent ("I will show them all!") or compromised ("I clicked a link in my email and now my computer is slow.")
If you aren't designing your systems to be robust against insider threats, they will fail.
(If you design them to be robust against insider threats, they will probably also fail, so you have to be constantly working to understand how to limit the consequences of any individual failure.)
Yes it does. A little bit of application control, network segmentation and credential hygiene (including phishing resistant MFA) go a long way.
> The problem is, as long as there are people employed in a company, there will be people being too trustful and executing malware,
Why are you letting employees execute arbitrary software in the first place? Application allowlisting, particularly on Windows, is a well-solved problem.
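The core of allowlisting is just default-deny. Real deployments use the OS mechanism (e.g. AppLocker/WDAC publisher or path rules on Windows) rather than raw hashes, but a minimal hash-based sketch shows the idea:

```python
# Default-deny execution: a binary runs only if its SHA-256 hash is
# explicitly on the allowlist. Illustrative sketch, not a real enforcer.
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 in chunks to handle large binaries."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def may_execute(path: Path, allowlist: set[str]) -> bool:
    """Default deny: only binaries whose hash was explicitly approved run."""
    return sha256_of(path) in allowlist

# Usage example against a throwaway file standing in for a binary.
with tempfile.NamedTemporaryFile(delete=False, suffix=".exe") as f:
    f.write(b"pretend this is a signed binary")
    binary = Path(f.name)

approved = {sha256_of(binary)}
allowed = may_execute(binary, approved)   # hash is on the allowlist
blocked = may_execute(binary, set())      # empty allowlist: nothing runs
print(allowed, blocked)
```

Hash rules break on every update, which is why publisher (code-signing) rules are the usual practical choice; the default-deny posture is the part that matters.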
> not to mention AI agents.
Now this is possible only through criminal incompetence.
> And even if you'd assume people and AI agents were perfect, there's all the auto updaters these days that regularly get compromised because they are such juicy targets.
Relatively rare, likely to be caught by publisher rules in application control and even if not, if the compromise of a handful of endpoints can take down the entire business then you have some serious, systemic problems to solve.
> And no, backups aren't the solution either, they only limit the scope of lost data.
> In the end the flaw is fundamental to all major desktop OS'es - neither Windows, Linux nor macOS meaningfully limit the access scope of code running natively on the filesystem. Everything in the user's home directory and all mounted network shares where the user has write permissions bar a few specially protected files/folders is fair game for any malware achieving local code execution.
Why are you giving individual employees such broad access to so many file shares in the first place? We’re in basic hygiene territory again.
That's an ambiguous statement, depending on what exactly you're referring to by "DOS".
If by "DOS" you're specifically referring to the shell (COMMAND.COM), then yes, it didn't know or care about the mouse. But MS introduced DOSSHELL (in '88), which had mouse support (along with other later core applications such as EDIT.COM), and of course, there were other third-party shells too (like Norton Commander) which also had mouse support.
But if by "DOS" you're specifically referring to the kernel (MSDOS.SYS), then you may be surprised to know that even the Windows kernel (NTOSKRNL.EXE) doesn't know or care about the mouse - this is handled by other bits like mouclass.sys and win32k.sys.