Grump
04 May 2026

Ok. Let's talk about it.

In the 90s, I was one of quite a few angry linux-evangelizing teenagers who hated Microsoft with the kind of all-consuming passion usually only available to the blissfully young and inexperienced.

Microsoft were doing Horrible Things with the power they had as owners of the most widely deployed Operating System on the planet. But it could have been worse.

Imagine if Microsoft (and what the hell, maybe also IBM) had basically invented the whole concept of the Computer Operating System in 1991. Imagine if there had been no Unix, and hence no minix, no linux, and no BSD for kids like me who wanted to fight the power.

I would have screamed in the face of anyone who didn't stop me fast enough: Operating systems are Stupid and Evil, and we should all write Proper Programs that Microsoft can't own.

This is a post about AI.


I'm a programmer, so this is necessarily going to be a programming-centric post. The effect of AI on art is really important, but my experience is with AI and programming.

Let's talk about:

AI Works

I can't speak for other fields, but when it comes to professional computer programming with AI:

  • I have seen large numbers of people royally screw up their codebase by using AI in unhelpful ways.
  • I have seen lots of folks make short-term gains with AI, and load up on debt in the process that cripples their productivity later.

However, I've also seen lots of people accomplish these things without AI. I've seen folks manage both these things with poorly-understood attempts at "agile", and with lots of other tools. Maybe not to the enormous extent that I've seen folks do it with AI-based tools, but even so.

I have also seen good programmers use AI really effectively to build things that previously would have felt too outlandish to even attempt.

I don't want to cross the streams between my personal and work online identities, so I'm not going to plug my workplace here. The stuff I've seen that's been most scarily-effective has been "dark factory" stuff like the StrongDM folks talk about. I have not seen anyone doing that kind of stuff really well out in the open. That's a problem for a lot of reasons.

It means that for a lot of people, their only experience of AI programming is grifters attempting to sell them snake-oil.

It means that the only communities getting practice in how to use these tools really well are rich, closed communities. This is made worse by the fact that AI is a subscription service that costs money. It's as if the Personal Computer revolution had never happened, and you needed to rent access to a megacomputer if you wanted to learn to program.

We still need humans who care

I don't think that LLMs are going to replace programmers. I don't think the act of programming has to become a homogeneous gate-kept cyberpunk nightmare. I do think that the job of programming will change.

It's changed before. We're used to that. There will be lots we'll have to adapt to. New programmers will need to learn by walking different paths from the ones we walked. Remember the story of Mel? We are all Mel. We have weird close-to-the-metal programming powers that the fresh coders 10 years from now won't quite believe.

But I strongly believe that the essential thing that makes programmers good will stay basically the same. You have to actually genuinely care about making good software.

AI Centralizes Power

So far, 99% of the Good LLM Programming I've seen has come from Anthropic's Claude, Google's Gemini, and OpenAI's ChatGPT. All of these are centralized services owned by a small number of VC-funded, US-based big-tech companies.

Suppose that LLM Programming becomes a much easier way to make money than hand-crafting our source code. These three companies would love to own the very process of professional programming.

I would hate that.

I've heard some good things about Mistral – but at best, they're a European VC-funded big-tech company. What I want is the linux or BSD of LLM-based programming.

The linux of AI

At the moment, it's probably ollama, and llama-cli and so on. If AI genuinely does change the way most industrial programming happens (and I believe it will), and if we don't want a few big tech companies to effectively own that practice, then we need open source alternatives. We need the linux of AI.

If the AI bubble pops (as many of us hope it will) and the cost of GPUs falls through the floor, the first thing I'm doing is buying some chonky chips with bags of VRAM so I can run open-weight models big enough to do proper LLM Programming in my house.

It's going to really suck for a while though. It's not going to be like linux today, where you can install Mint with a GUI and expect everything to work. We haven't built Mint yet. It's going to be like linux in the 90s, when we had to compile our own kernels if we wanted half a hope of making the damned soundcard work, and there was only one functioning postscript-compatible printer in western europe.

We don't even have the Slackware of open source AI yet.

We have some serious work to do.

My biggest fear about the current AI discourse online is that the Patrick Volkerding of AI is out there somewhere, but that they're still trying to convince everyone to write code on the bare metal instead of helping folks run a non-corporate OS.

AI Burns Electricity

Data centres are absolutely power-and-coolant-hungry things.

It doesn't help that none of the big players are giving us any data on their resource consumption. That lack of data makes it really hard to have an opinion on data-centre resource usage that tells the reader more about the actual truth than about the writer's preconceptions and politics.

But you've read this far. What the hell. Here's my poorly-informed take:

I think AI is a distraction from the real issue: our society likes building stuff it doesn't actually need or want.

I think that, when used properly, AI-assisted programmers burn less resource than artisanal human programmers do to produce a given result. I think that AI programming can be like washing your dishes with a dishwasher instead of by hand. If you do the same amount of dishes, the dishwasher will use less water and power than you will at the sink. However, I think that our society is looking at this awesome new technology and not thinking "Cool! We can have exactly what we had before, but with less footprint!" I think society is thinking "Wow, you mean in my 1-bedroom flat I can now wash enough dishes every day to cater a royal garden party? I'm gonna live like a Queen!"

This is not anyone's fault. This is what a consumption-oriented society does. This is also not the fault of "AI", any more than Facebook is the fault of mobile phones. I am anti-Facebook, but pro-mobile-phone. I am anti-overconsumption, but pro-AI. That's why I want an AI in my house, where I can see how much resource it's consuming, and alter my behaviour accordingly. That's why I think we desperately need the linux of AI.

I must stress:

  • If you spend all day every day running your dishwasher empty, then you will waste power and water. This is also true of ineffective use of AI.
  • If you're in the business of building better dishwashers and you build 4 experimental dishwashers for every one that you sell to someone who actually uses it to do dishes, then it doesn't matter how energy-efficient your dishwashers are. You're wasting resource. I believe this absolutely is happening. But it's not a problem with "AI", it's a problem with VC-funded AI. I believe hobbyist-driven AI would at least be limited by normal human budgets.
  • I do not have any good data for my hot take. I don't think anyone does.
  • But I do think that my hot-take explains the new resource-guzzling data centres as well as any other take.

AI Art

It's going to be really weird.

We've had weird before. Photography changed painting. Film changed theatre. Digital special effects changed film.

This is going to be weirder.

It'll be more dystopian if we allow big AI companies to act like copyright laundries. But even if we fix intellectual property law to actually benefit small creators, it's still going to be really weird.

Like I said up-top, I'm a programmer. You should listen to actual artists about what they want you to do about this stuff.

For myself, for right now, I'm doing things like buying CDs and Zines from local artists.

What do we do about it?

Make Good Stuff.

Support other people who Make Good Stuff.

Provide tech support for your elderly neighbours.

If you're a tinkerer, tinker. Play with llama.cpp and all that stuff. See what you can run on the hardware you already have, without encouraging some megacorp to drain a river to build another data centre. Keep an eye open in case the bubble does burst, and the GPUs get affordable.
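If you want a zero-commitment way to start tinkering, something like the sketch below works on any Unix-ish machine. It just checks which of the local-LLM tools mentioned above are already installed; the commented commands and the model name at the end are only examples, not recommendations.

```shell
# See which local-LLM tools are already on this machine.
# Nothing here touches the network or downloads anything.
for tool in ollama llama-cli llama-server; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found: $tool"
  else
    echo "missing: $tool"
  fi
done

# Once ollama is installed, pulling and chatting with a small
# open-weight model looks like this (llama3.2:3b is just an example
# of something small enough for modest hardware):
#   ollama pull llama3.2:3b
#   ollama run llama3.2:3b "explain GGUF quantization in one paragraph"
```

The point of starting with `command -v` is that you find out what your existing hardware and software can do before spending a penny or a watt.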

If you're a programmer, learn these new tools, and keep your eyes open while you do it. If you're working in a corporate environment with unrestricted access to tokens, then notice what that environment is teaching you. Notice what you can bring back to the community, and notice which of your new skills are only relevant while you're working on hyper-scale projects. Remember that small problems and small communities still exist.

Look out for other local tinkerers and artists who you can support (by buying their art!) and learn from.

If you're a political type – an organizer or a lawyer or something – I'd love to see folks like you holding the megacorps to account. Making sure that the folks whose art got stolen end up getting paid. Making sure that this terrifying new technology doesn't only draw the world's power towards a few massively rich individuals.

I want to live in a world where we can have BSD-like AI "pragmatists" who're happy to use the tools of the megacorps so long as the results meet their definition of "free"; and where we can also have GNU-like AI "fundamentalists" who refuse to use any software that has ever been touched by a megacorp AI. People who insist on all software being written by Free Humans and Free AIs. For their own definition of "free" of course. And yes, I also want space for Human-first folks who want all their software to be AI-free, hand-crafted by Humans Who Care.

Most of all, I want us all to help each other out. I want us to check in on our friends, family, and local communities who aren't tinkerers or programmers. I want us to help them to use computers in ways that actually work for them, rather than in ways that funnel power to the rich. Maybe help your grandma to use Thunderbird instead of GMail, if you think that'd be a better experience for her. If your auntie's been complaining that she'd just gotten Windows 10 to do what she wanted and now they're forcing her to move to Windows 11, maybe she might be interested in Linux Mint? If your uncle's fed up of Google's AI summaries, maybe help him to use duckduckgo. If you're really keen, take that local community group you really care about (maybe a sports club, a games group, a yoga class, a church, or a dojo – whatever it is for you) and help them get an actual web page instead of always posting all their stuff on instagram and facebook.

Whoever they are, and whatever it is, help your people take one empowering step in their digital journey, to balance out some of the digital bullshit.

Because the stuff that makes me grumpy about AI isn't actually the AI. It's the same stuff that makes me grumpy about all kinds of computers.

Tags: programming tech tech-industry philosophy

There's no comments mechanism in this blog, but I welcome emails and fedi posts. If you choose to email me, you'll have to remove the .com from the end of my email address by hand.

You can also follow this blog with RSS and find older posts here.