Here in the Northeast, electricity is expensive because we rely heavily on natural gas for power but lack the pipeline capacity to bring in cheaper supply, all while nuclear plants are being retired, politicians have blocked new pipelines from Canada, and the Jones Act makes it costly to transport fuel by sea.
I'm sure AI isn't helping, but we have plenty of problems already.
And all those heat pump and solar panel subsidy programs our governments run are paid for by rolling the costs into our transmission/distribution/delivery fees.
About $300/mo from an app that provides real-time train tracking for Boston's commuter rail. Thankfully for me, the MBTA is a mess, so each system meltdown (which happens about once a week) causes a spike in downloads.
You pointed out a few best practices, but it's a shame that this article is written in the context of jQuery rather than recognizing that these "taxes" apply to every script tag you place on your website.
That is absolutely true. I was looking at cnn.com: http://z.cdn.turner.com/cnn/tmpl_asset/static/intl_homepage/... is a script in the header that chews up 100 KB of compressed JS. This monster must take upwards of 20 ms to parse even in Chrome, and way longer in crappier browsers. The CDN network you need to support this is huge, too.
In defense of sloppily composed news websites: they're not run like normal sites. Despite the obvious benefit of optimizing and cleaning up the site markup, it's quite normal, even encouraged, to sacrifice optimization so that something just works.
Two main differences from normal sites are:
(1) The sheer number of moving parts: the number of hands that have a say in the site's daily operation.
Before anyone says "duh, well, get a handle on your site," you have to understand that the news world is chaotic for the right reasons: content flexibility and innovation. You can't put hundreds of ongoing creative and newsworthy projects on small development teams. That kind of open exploration belongs in the hands of editors who specialize in their particular beats and are willing to pursue newsworthy projects (vendor or agency partnerships, special reports, etc.).
These people understand only rudimentary HTML, but they have enough access to be dangerous.
That's a good thing.
Having more people do this sort of thing is good for any news agency: it gives any random project a chance to be naturally selected by the audience for success. The problem is how to corral the various implementations, band-aids, and so forth so that it all works flawlessly.
However, cutting editors off from writing markup is not the solution, and post-optimization regression testing is impossible due to the sheer volume of content and creative projects. As long as something works within acceptable limits, it's fine.
(2) Politics of how technology intersects with editorial and operations.
There are invisible lines of power in news that outsiders don't see. Bureaucracy alone is tough to navigate, but the compartmentalization of editorial, technology, and operations teams adds to the problem. Most news agencies have transformed over decades (in some cases, centuries) and are pretty set in their ways.
The explosion of internet technology has dramatically sped up this transformation, and it'll take some time to iron out. The good news? The rise of devops in the news biz is happening.
Though I'm a fairly adept developer working for a news agency, I cannot update or fix our favicon file or optimize JS on the main page: for one, template control belongs to a different department that prioritizes its projects differently than I do. The communication overhead is too large for small matters. In their defense, their lean team is supporting 50 languages and 1000 persons' worth of editors and producers at this point.
Secondly, even if something is simple to fix and done for the right reasons, navigating this field of artificial obstructions just to reduce page weight or load time doesn't justify taking time away from my latest editorial project, which has a fast-approaching relevance deadline.
My laptop could be a bit faster, but the cold-cache issue is the major one. Distributing a file this big in a scalable way, without blocking, requires a lot of servers.