I mean yeah, that would check out. Seems like a lot of these facilities are sitting on a bit of a real estate goldmine in which installation of solar panels and storage batteries could offset a lot of that consumption (in addition to being nicely synergistic with emergency power needs when grid power is interrupted).
I think you might be overestimating solar or underestimating data center power usage. A quick search suggests that a data center consumes between 150 W and 750 W per square foot of building space, while solar produces roughly 20 W per square foot.
Even if you assume a single-storey building with the roof covered entirely in solar, and take the lower end of the power estimate, that's only covering 13% of the power requirements. If you take the higher end of the power estimates and assume a 3-storey data center with 50% of the roof covered (a more realistic set of numbers for a hyperscaler), that's only 0.4%.
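Back-of-the-envelope check of those percentages (a quick sketch; the wattage-per-square-foot figures are the thread's rough estimates, not measured data):

```python
# Rough check of the coverage percentages, all figures per square foot.
solar_w_per_sqft = 20.0          # nameplate solar output per sqft of roof

# Low end: single-storey building, entire roof covered in panels.
low_demand = 150.0               # W per sqft of building space
low_coverage = solar_w_per_sqft / low_demand
print(f"low-end coverage: {low_coverage:.0%}")    # ~13%

# High end: 3 storeys (3 sqft of floor per sqft of roof), half the roof covered.
high_demand = 750.0
roof_fraction = 0.5
storeys = 3
high_coverage = (solar_w_per_sqft * roof_fraction) / (high_demand * storeys)
print(f"high-end coverage: {high_coverage:.1%}")  # ~0.4%
```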
As for being synergistic for emergency power, batteries would help here, but peak power usage is probably around dusk when solar won't be that useful but when power is most likely to be disrupted.
Those solar production numbers are exceedingly exaggerated. You need to apply a capacity factor to the solar install: multiply your solar production by 0.2 to 0.3 or so and you’ll be in a better ballpark range.
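Sketch of that adjustment, taking the 20 W/sqft nameplate figure from upthread and the 0.2–0.3 capacity factor range suggested here:

```python
# Capacity factor discounts nameplate output for night, weather, and sun angle.
nameplate_w_per_sqft = 20.0
capacity_factors = (0.2, 0.3)
adjusted = [nameplate_w_per_sqft * cf for cf in capacity_factors]
for cf, avg in zip(capacity_factors, adjusted):
    print(f"capacity factor {cf}: ~{avg:.0f} W/sqft average output")
```

With that adjustment, the best-case 13% coverage from upthread drops to roughly 2.5–4%.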
Peak power usage for large scale facilities is not much different than off peak. Most megawatt plus builds I’ve been part of have less than 10% difference between peak and valley in terms of watts used. Perhaps the hyperscalers see a much larger difference but I have my doubts.
If you want to see exactly the type of solar panels needed for a data center (and why they’re impractical in even ideal situations), check out this mockup of a proposed space data center run off solar: https://www.youtube.com/watch?v=d-YcVLq98Ew
Which of the systems that systemd replaced was so bad that it needed to be replaced by the init system?
I am not against new software, but if adopting a new piece of software requires replacing dozens of other system components, then something is going very wrong.
The thought process of the people at Red Hat who think "we need a new system logger; you know, the init system is the perfect place to develop that" is just inexplicable to me.
counterpoint: those policies are not sustainable and can be easily defeated by someone simply setting up an endpoint somewhere not on any lists. if you have a security worry about devices being compromised by dint of their location, you need to control the location in some physical rather than logical way. if you have an HR worry about residency, I suspect those rules are going to slowly go the way of the dodo anyway.
It doesn't matter if the policies COULD be easily defeated. If you live in a country of 5 million people, and say "only connections from smallstan are allowed into this sensitive infrastructure", you've probably wiped out 99% of automated attacks.
Security measures are judged by how much they cost to implement and how effectively they reduce the threats you will actually face, and geolocation blocking has the amazing one-two punch of being cheap and effective against real-world threats. Realistically, you're going to face far more automated hacking attempts than hackers actively trying to work around the safeguards your company has implemented. It also generates indicators of compromise, so even if it doesn't stop a hostile actor, it can reveal their presence.
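To illustrate how cheap this is to implement, here's a minimal allowlist check in Python. The CIDR ranges are hypothetical placeholders; a real deployment would load the country's ranges from a GeoIP database or enforce this at the firewall or load balancer instead:

```python
import ipaddress

# Hypothetical placeholder ranges standing in for "smallstan"; real ranges
# would come from a GeoIP database or the regional registry's delegation files.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # placeholder national range
    ipaddress.ip_network("198.51.100.0/24"),  # placeholder corporate VPN range
]

def is_allowed(client_ip: str) -> bool:
    """Return True if the client IP falls inside any allowlisted network."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETWORKS)

print(is_allowed("203.0.113.42"))  # True
print(is_allowed("192.0.2.1"))     # False; log the attempt as a potential IoC
```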
Getting to 100% security isn't just too expensive; it's impossible.
Takes like this honestly are not productive. Security is never 100%. If location-based blocking defeats 98% of the low-hanging-fruit threats, it is most likely worth it. You can then focus your more costly countermeasures on the remaining 2% of the pie.
Similar reality exists regarding security through obscurity. Is it perfect? No; nothing is. But if the cost to even understand the system in play is very expensive, that alone is a deterrent to low-effort / drive-by attackers.
just want to thank you for everything you've done for the iOS user community these last 15 years. I left the platform when Apple's success in fighting its own users became too much of a pain point, but before that your work helped enable developers to do some utterly fantastic stuff.