> If you can spend $100 BILLION on data centers, you have the budget to stand up a proper school.
That is a good point. It's odd that our reaction, mine included (and I think the AI data center boom is pretty nuts), to that much being spent on data centers is "not surprising," while a company standing up a school sounds like something out of science fiction.
It was normal for large firms to build schools. All of the Big Five accounting consultancies ran a month or two of residential boot camp. Toyota built a school. So did GM. When you're that big, you can make your own weather.
> This is about a certain group of people trying to control AI through OpenAI and other such startups controlled by them, without naming names or calling out agendas.
Not trying to argue; I'd genuinely like to hear who is trying to control AI through OpenAI et al.
Not sure, but I think OP means the investors in OpenAI. Rephrasing: there are too many investors and orgs (e.g. MSFT) that want to "be in AI" and very few outlets that actually deliver the frontier hardcore tech: Google, OpenAI, and Anthropic, with X, Meta, Amzn, and a few others to lesser extents.
> (I do believe that climate change worries are also the root of the resurgence of authoritarianism, but that's a story for another time. In short, the key hypothesis: adopting a hate ideology is just another way of looking away from a problem that has no simple, convenient answers.)
Oh, that's an interesting insight. I'd love to hear more if you're up for sharing...
Really not that complicated, actually: even at the best of times, it's always a struggle between ideas roughly in the corner of "a better world for everybody" (positivity!) and ideas built on some form of "us vs. them" (zero sum, or worse). The latter come in different colors: they can co-opt religious concepts, the idea of community anywhere from the small scale of family to the large scale of nation, or even social constructs orthogonal to those, such as class. Or some combination thereof. At the best of times, it's still close enough to a tie between positivity and the others.
Enter climate change: the positive ideas become a much harder sell, since they can't really ignore it. But the zero-sum ones (or negative-sum; the difference doesn't really matter) remain unaffected, or in some ways even become more attractive.
Meanwhile, he's done a deeper analysis of the large players' financial statements and actual AI costs than anyone else I've seen, and I absolutely don't think he's wrong.
Unfortunately, this seems to strike a chord with you.
Not unlike his blog posts and podcasts: spiteful, insult-laden rants (he even admits as much on occasion). If you're looking for a financial analysis of AI, I'd literally look anywhere else.
That is, unless you get your eyes opened to intuitionistic math, at which point you realize math isn't "true".
Then again... where, in the trillion or so parameters of any LLM, is the Law of the Excluded Middle that classical math requires to be "true"?
Even more comical is that there are certainly embeddings in there _about_ the excluded middle, with thousands of dimensions and billions of values in each one.
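For concreteness, here's a minimal Lean 4 sketch of the distinction being pointed at (purely illustrative; it says nothing about how an LLM might encode any of this): classically, excluded middle is available as an axiom, while intuitionistically only its double negation is provable.

    -- Classical logic: excluded middle comes in as an axiom (Classical.em in Lean core).
    example (P : Prop) : P ∨ ¬P := Classical.em P

    -- Intuitionistic logic: no such axiom. The best you can do constructively
    -- is the double negation of excluded middle, proved by hand below.
    example (P : Prop) : ¬¬(P ∨ ¬P) :=
      fun h => h (Or.inr (fun p => h (Or.inl p)))

Either way, the point stands: whether P ∨ ¬P counts as "true" depends on which logic you commit to.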