Reminder: the chatbot data center processors driving up graphics card prices have no video outputs.
So they can't even be sold as used GPUs to crash the consumer GPU market when the AI bubble pops.
This is a reminder that businesses aren't "money-focused calculation machines that optimize for the maximum possible profit." They don't worry about every little dollar; they just print money and use it to control you.
Raising prices for you is the goal, not a byproduct of some other smarter plan.
Some people don’t need the rest of this post, and it’s very long, so I’ll put it in a comment.


This post assumes way too much and gives the businesses more credit than they deserve. All the AI companies are trying to do is maximize compute per unit of rack space. They're operating under one goal: win the AI race and become wildly profitable and powerful. There's no consideration of bankruptcy proceedings if and when the AI boom comes crashing down, because that won't be their problem at that point.
The GPUs used for these LLMs simply aren't in a form factor that could go into consumer devices; they fit in racks that draw far more power than a home computer could ever supply. Building them any other way would be utterly idiotic.
Years ago, people said the cards used for crypto mining drew too much power and needed too much cooling for gamers, but we later got gaming cards with pretty much the same power and cooling requirements as some of them.
I hope you’re right and that doesn’t happen this time. It’s gone too far already.
You might also be right that they see the "AI race" as more of a sprint than a marathon, so they don't care whether they can liquidate the parts for money later.