The Great AI Gold Rush (Or: How to Ruin Something Beautiful)
The problem isn't the technology itself. The tech is genuinely miraculous, almost stupidly powerful. But the way it's being packaged, sold, locked behind gates... that's where it all goes wrong.
The Hardware Racket
Start with the chips.
Want to run proper AI models locally? Good luck finding a GPU that won't cost three months' salary. Nvidia and AMD have this beautiful racket going. Artificial scarcity, tiered nonsense where the good stuff costs lakhs and regular people get integrated graphics that can barely render a decent image.
Here's the mad part. They could flood the market. Make them cheap, get them into every desktop, every laptop. You know what happens then? The technology survives. Becomes infrastructure.
OS/2 Warp, BeOS, the Cell processor: all brilliant tech, all dead because they stayed expensive and niche. Meanwhile x86 chips became the foundation of modern computing by being boring and everywhere.
If AMD and Nvidia actually wanted AI to succeed, they'd practically give away mid-range GPUs. Build the install base. Let people tinker. Instead, GPUs that can run models properly cost more than a decent motorcycle. Then they wonder why adoption is slow, why the whole thing feels like a bubble.
The Software Circus
Now the companies running big models.
They've built something genuinely useful and powerful. Then locked it behind subscription walls, usage limits, censorship layers so thick you can barely ask anything interesting.
The censorship is doing real damage. Not because of free speech principles. Because it makes the tools useless for actual work. A doctor trying to draft a frank patient-information leaflet on sexual health gets blocked for explicit content. Historical researchers can't analyse conflicts properly. Fiction writers hit walls with thriller plots.
Of course you don't want models teaching people to build bombs, that's obvious. But blocking serious research, fiction, adult conversation - that's overreach.
Like buying a knife that refuses to cut. When the tool keeps refusing reasonable requests, people think it's either broken or being controlled. Both conclusions are poison for adoption.
The Money Cycle
So chip makers keep prices high. Model makers keep access restricted. And in the middle, this whole ecosystem of grift.
Consultants selling AI integration for problems that don't need it. Startups slapping "AI-powered" on basic database queries. You've seen this in slide decks where "AI-powered" means they added a chat window on the side.
The blockchain cycle all over again. Everything had to be "on the blockchain" even when a database worked better. Now everything has to be "AI-powered" even when it makes things worse.
VCs love it though. Pump money in, talk about disruption, exit before anyone notices. The actual useful applications? Drowned in noise.
The Fear Factory
Then the fear-mongers.
AI's going to take all the jobs. Become sentient. End civilisation. We need to stop, regulate, control access.
Know what this reminds me of? Steam engines. When railways first appeared, people genuinely believed the human body couldn't survive travelling at 30 miles per hour. Doctors said it would cause insanity. Farmers said it would stop cows from giving milk.
Yes, horses lost jobs. Coaching inns closed. The world changed. But we didn't ban railways. We built them everywhere. Could we have networked the whole country with horse-drawn carts? Obviously not.
Same with AI. Some jobs will change, disappear, new ones appear. That's how technology works.
But here's what fear-mongering actually does. Gives companies and governments an excuse to control access. Keep it expensive, locked down, in the hands of a few. "Too dangerous for regular people." "Only trained professionals should have access."
Gatekeeping wearing a safety vest.
The Damage Report
Where does this leave us?
Incredible technology that could be helping millions right now. Students learning. Developers building. Small businesses competing. Researchers accelerating work.
Instead it's stuck behind expensive hardware, expensive subscriptions, stupid restrictions. Companies think they can keep this going forever. Squeeze more from subscriptions, keep people dependent and scared.
But technology wants to spread, be used, find its natural level. The more you try bottling it up, the more pressure builds.
What Actually Needs to Happen
First. Hardware needs to become accessible. Really accessible, not "slightly less expensive than before" accessible.
AMD and Nvidia should be flooding the market with GPUs that can run decent models. Price them like RAM used to be priced. Something you pick up without thinking twice. Build it into every machine coming off the production line.
Will they make less profit per unit? Yes. More profit overall from volume and ecosystem growth? Absolutely. Will they ensure their technology becomes foundational instead of optional? That's the real prize.
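The volume argument is just arithmetic. A toy comparison makes the shape of it concrete; every number below is hypothetical, picked only to illustrate the trade-off, not drawn from any real Nvidia or AMD figures:

```python
# Toy margin-vs-volume sketch. All prices, costs, and unit counts
# are made up to illustrate the trade-off, not real GPU economics.

def total_profit(price, unit_cost, units_sold):
    """Profit = per-unit margin times units sold."""
    return (price - unit_cost) * units_sold

# Premium strategy: fat margin, small market.
premium = total_profit(price=1600, unit_cost=400, units_sold=1_000_000)

# Volume strategy: a third of the margin, but the card ships in
# every mid-range machine, so ten times as many units move.
volume = total_profit(price=800, unit_cost=400, units_sold=10_000_000)

print(f"premium strategy: ${premium:,}")  # $1,200,000,000
print(f"volume strategy:  ${volume:,}")   # $4,000,000,000
```

The real figures are unknowable from the outside, but the logic holds whenever volume grows faster than margin shrinks, and that's before counting the ecosystem effects: every cheap card in a desktop is another developer building on your stack.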
Second. Models need to open up. Not necessarily the weights, though that helps. But the access, the usage, the arbitrary restrictions.
Trust people to use tools responsibly. When they don't, deal with that specifically instead of crippling the tool for everyone else. We don't ban cars because some people drive drunk. We deal with drunk drivers.
Third. Stop with the fear-mongering. AI is a tool. Powerful, yes. Transformative, yes. But still a tool.
Don't need breathless articles about robot apocalypse. Don't need restrictions designed to "protect" people from their own curiosity. Need clear information, open access, trust that humans are generally smart enough to figure things out.
The Alternative Path
Imagine if we'd done this right from the start.
Cheap, capable GPUs in every machine. Open models you could run locally or access cheaply online. No artificial restrictions on what you could ask or explore. Clear pricing, clear capabilities, clear limitations.
What would we have now?
Students in tier-two cities building applications that solve local problems. Small businesses using AI for customer service, inventory, planning without paying enterprise prices. Writers, artists, developers using it as naturally as search engines.
The technology would be infrastructure. Boring, reliable, everywhere. Like electricity or internet access.
What would that actually look like?
- Local models bundled with operating systems
- Entry-level GPUs standard in mid-range laptops
- Perpetual licences for basic AI access, not just subscriptions
- Open weights you can run privately without phoning home
Instead we're building a gilded cage. Expensive, restricted, controlled. Wondering why people are suspicious, why adoption is slower than it should be, why the whole thing feels off.
The Bottom Line
The AI industry is committing slow suicide.
Not through technical failure. The tech works beautifully. Through greed, through fear, through this desperate need to control and monetise every interaction.
Taking something that could change the world and turning it into just another way to extract rent from users who have no choice.
Premium tech that stays premium doesn't survive. It gets replaced by something accessible, even if inferior. That's the lesson everyone keeps forgetting.
AI is heading down that path. Unless chip makers wise up, model makers open up, and fear-mongers shut up, we're going to look back in ten years and wonder why we wasted such an opportunity.
The technology isn't the problem. Never was.
The industry around it? That's what's broken.
Going to stay broken until someone has the sense to make AI accessible, affordable, available to everyone who wants to use it.
Until then we're just building another bubble. Pretty, expensive, bound to pop.