Selfish AI

This will be a bit more ranty than my usual articles. Fair warning. But I need to put this out there.

Recently, a video by Jeffrey Way of Laracasts fame came across my feed. In less than 15 minutes, he managed to succinctly capture everything I absolutely detest about the current moment in the IT industry.

This isn't a response to Jeffrey per se, but to the whole attitude that he captures. (And it shouldn't need to be said, but this is not an attack on Jeffrey, nor should anyone use this as an excuse to attack or otherwise be a jerk to Jeffrey or anyone else.)

He starts off by talking about how the sudden growth of AI has basically torpedoed his business. He recently had to lay off half his company because AI usage had changed the market behavior enough that their revenue was drying up, and "it is what it is."

But then he pivots into his personal struggle getting used to working with an AI code agent/assistant/whatever, and how he's "done" fighting it and is looking forward to the new future of Autocomplete Code Authorship. "It is what it is." Lamenting that he enjoys writing code (as do I) is fine, but "those days are numbered, it is what it is, you need to get on board." The code quality and style may be poor, but it gets the job done so fast that we need to just give up on caring about that.

You can agree or disagree with him about that point, you can lament or celebrate this tectonic shift in what it even means to be a programmer... but bloody hell I am sick and tired of everyone I know viewing AI coding in such purely selfish terms.

Selfish AI usage

All of this discussion, from Jeffrey's video to 99% of what shows up in forums or Mastodon discussions, is about how all of this will impact "me." Me, the developer. Me, the person writing code, who may not be writing code now. Me, the person who just got laid off because some accountant thinks that Claude means they need only two engineers now instead of 10. Me, the person who just had to lay off half my company. Me, me, me.

What I almost never see is the impact of AI code on our society.

Scarcely a word is said about the fact that essentially all LLMs are built on scraping the Internet, badly, often completely trashing servers as they download the entire site without using any of the polite techniques developed by archivers and search engines over the past 30 years. They're all built on copyright infringement. Which is a felony, as anyone who has crossed Disney well knows. Modern LLMs literally could not exist without violating copyright, as even Sam Altman of OpenAI openly admits. But it's OK, because the ones violating copyright this time have VC backing. The courts are still trying to figure out if Fair Use applies; I and most Free Software developers I know hold that it does not, but that doesn't stop OpenAI and Anthropic from slurping up our open source code to train their models, even if it violates a copyleft license. Nor does it seem to matter that the same companies and VCs have been pushing successfully for years to narrow and circumvent Fair Use, via Digital Restrictions Management (DRM) and other means. What's good for the goose is not good for the gander. But, "it is what it is."

In a few edge cases, large media companies have been able to sue and get a small license fee for scraping their data, but if you're not a billion-dollar company, as usual you're SOL. "It is what it is."

Far too little is said about the fact that training AI models is not an entirely digital process. It is backed by an army of over-worked, low-paid, sweatshop-level workers manually labeling data to feed into the machine. Because why wouldn't we outsource painful grunt work to some person in a poor country we don't care about? It's standard procedure if you're a tech company. They already do it for moderation, may as well do it for AI training. OK, it means one of the main selling points of AI is a lie, but, "it is what it is."

    More than 2 million people in the Philippines perform this type of “crowdwork”, according to informal government estimates, as part of AI’s vast underbelly. While AI is often thought of as human-free machine learning, the technology actually relies on the labour-intensive efforts of a workforce spread across much of the global south and is often subject to exploitation.

Almost none of my colleagues seem to be focusing on the absolutely massive impact on the electrical grid.

From 2005 to 2017, the amount of electricity going to data centers remained quite flat thanks to increases in efficiency, despite the construction of armies of new data centers to serve the rise of cloud-based online services, from Facebook to Netflix. In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023. The latest reports show that 4.4% of all the energy in the US now goes toward data centers.

By 2028, "AI alone could consume as much electricity annually as 22% of all US households."

Some power companies that were planning to go all-solar from now on in order to avoid destroying the planet (even more than they already have) are now saying they will be bringing new natural-gas-fired plants online in order to keep up with the increased demand from LLM data centers.

Essentially none of the new renewable electrical capacity the US has built in recent years is going toward replacing existing CO2-barfing coal, oil, and gas plants. It's all getting eaten up by AI data centers. That of course drives up electrical rates across the country, which in turn means people are less likely to switch to electric appliances, which means even more CO2 produced.

We're probably already past the point of guaranteeing a livable planet by the end of the century. Whatever slim chance we may have to survive as a species gets slimmer with every new data center built. But, you know, "it is what it is."

There's lots of misinformation about the water usage of AI data centers. Most use water for cooling, which can consume millions of gallons per day. "[R]esearchers calculated that writing a 100-word email with AI could consume around 500 mL of water (about one typical drinking bottle’s worth) when you account for both data center cooling and power generation." Which sounds like a lot, and it is, but as that article notes, many industries use vastly more than that. What matters is where the data center is; if it's in an area where water is abundant, it's no big deal. If it's in an area that is already experiencing water shortages -- like, say, the entire western half of the US -- the impact could destroy communities. And in aggregate, data centers now use as much water as the entire bottled water industry. But, of course, "it is what it is."

And of course, none of that is even mentioning the fact that even AI company execs say there's a massive bubble, that they're all losing money faster than a roulette table, that the AI construction boom is basically propping up the American economy and when it pops it's going to implode, or that the price you're paying for these services now is unsustainably low and you can expect the price to skyrocket once the tech bros decide they need to actually make money. That's all true, but that's a different long list of issues.

Quite simply, no one seems to give a fuck about the ethical implications of new technology. That's hardly new, to be fair. The VC and tech bro startup crowd have long been of the belief that ethics are just an annoying road bump that gets in the way of profit. But it feels new that so many people who are otherwise invested in Open Source and Free Software (which, I remind you, is an ethical and political framework whether you like it or not) seem to just... not care. Maybe they'll make a nod on the copyright front, but then go and vibe code everything. OK, so it means my daughter won't have a future to grow up in, and may be the last generation of humans that can live below a certain latitude, but, "it is what it is."

Yes, the planet got destroyed. But for a beautiful moment in time we created a lot of value for shareholders.

No. Fuck no. No it is not. You don't get off that easily.

Collective action

It only "is what it is" because we have collectively, in aggregate, decided that it is. A technology that no one uses can't hurt us; only a technology that we do use.

Every time you, as an individual, shrug your shoulders and say "it is what it is" and do something ethically problematic, you make it that much harder for anyone else to not do so. The pseudo-libertarian lie of "vote with your dollars" is pure bullshit. You have already outvoted me, so I no longer have a choice.

  • You consider Amazon unethical? Good fucking luck not doing business with AWS. You basically can't use the Internet.
  • You consider Walmart unethical? Too bad, it's the only business left in your small town because all of your neighbors went there for the marginally lower prices.
  • You consider Uber unethical, given its long history of employees stalking their exes and a business built on simply ignoring the law? Tough; the cabs have mostly been driven out of business in many towns, so that's all that's left.
  • You don't want the carbon footprint of a car, even an electric one? Sucks to be you if you live in most US cities; without a car you're basically screwed.
  • Don't want to put your small business on Facebook because they've allowed their systems to be used to enact genocide? Well, I guess you won't have a small business, sorry.
  • Don't want to use Apple or Android's Big Brother-in-your-hand? Eh, half the businesses out there basically don't want you unless you're in one of those ecosystems.

Don't want to use AI because it's built on copyright infringement and literally destroying the planet? Well, I guess you can't work in software anymore, sorry. It is what it is.

Every time someone like Jeffrey Way says "it is what it is," it makes it so. It is not inevitable just because Sam Altman tells his over-leveraged investors it is so. It becomes inevitable when you, you personally, decide that you just don't want to think about the externalities or put in the work to find better alternatives.

We are making this choice. But really, that means you have already decided for me. And I curse you and the ground you walk on for it. No, I'm not joking or exaggerating. Burn in hell.

Where do we go from here

I have, to date, not used any AI coding tools, at all. I've actively removed them from my IDE. I don't want them. Not just because of questions of their quality (still lower than a human), or because I will miss writing elegant code (I know I will), but because I want my daughter to have a future, and a planet on which to live. Because I actually do care about respecting copyright.

But, at this point, it's become obvious that I have to either compromise on that, or leave tech entirely. And every time I think about that, I get angry. Angry at you, dear reader, and everyone else who has said "it is what it is" and gone off to spend all day vibe coding.

It's not that the industry is changing that bothers me. I've been a change agent in most organizations I've been in; I switched from nano as a code editor to full-on IDEs with all their auto-refactor glory; I don't mind change. I do mind unethical behavior. I do mind being forced into unethical behavior in order to survive.

At some point soon, I will have to figure out how to work with AI coding tools if I want to stay in the industry I've put my entire adult life into. But more importantly, I will have to figure out how to live with myself every time I consider the tons of CO2 and gallons of water involved in every function I write from now on.

See, I can't just say "it is what it is." I will feel that cost in my gut every fucking time. Whether I'll be able to swallow it and accept it (as I do for Amazon and Android, after long resisting them) in order to find a job, I don't know. If not, it won't be a tight labor market that forces me out of tech, but every fucking one of my colleagues that decided "it is what it is, ethics don't matter, it's not worth fighting the billionaire VCs that have fucked up everything else they touch already."

If you have already shrugged and said "it is what it is," fuck you. It is exactly that attitude, that lack of care for ethics, that lack of interest in the global implications of our work, that is literally dooming our species. And forcing -- yes, forcing -- everyone to join you in your uncaring attitude through sheer force of numbers is abusive. It's despicable.

As I learn how to work with AI coding agents, know that I will be thinking ill of you the entire time. Not because I don't get to write for loops, but because you have made yet another part of the economy impossible to engage with ethically.

This is how societies die. I wish I were being hyperbolic. I really really do. But I have nothing left but contempt for "it is what it is."

No it isn't. It only is what it is because you're OK with what it is, and aren't putting in the work to make it otherwise. Those that don't give a fuck about fair copyright application, about poor people, about our planet, are putting in the work to make it so. And you're letting them.

I do not forgive you.