When I was young, I was diagnosed with eczema (or eggs-uh-muh, as my family referred to it), a condition that leaves your skin dry, inflamed, and itchier than usual. I’ve had access to steroid creams and various remedies for flare-ups my whole life, but it petered out as I became an adult. I still get dry skin in the winter, but nothing near the level of my early eczema, which would see me tearing my skin apart repeatedly.
I like to think eczema taught me something about hedonism. I don’t do drugs, or even drink, and I think one of the reasons why is that I know how I respond to things that feel good and hurt afterwards. Anyone who has had a mosquito bite knows the annoyance of the itch: you know scratching it will only make it worse, but the release feels so good that you do it anyway. It is hard to stop when confronted with things that feel good, even if you know the long-term expected value is negative.
This isn’t just an itch thing though – you can see this pattern all over the world. It is a human quality that when confronted with something enjoyable, we find it hard to resist overindulging, even if it might be harmful in the long term. Right now it is most obvious with phones and social media, but before that we had video games, TV shows, old-school telephones, radio, and even just plain old books. Drugs, whether hard like heroin or seductive like marijuana, also fit in this category. Gambling, an activity where it is spelled out at the outset that the expected value is strongly against you, still has people flocking to it for the short-term enjoyment.
You can make the argument that these addictions fill a hole in our lives, or are ways of hiding from other problems, or result from a genetic predisposition to addictive behaviours. I’m not here to argue the mechanism, only that the behaviours exist and seem pretty consistent across time. When you dangle a big positive reward in front of our faces, we tend to focus on that even if there is plenty of downside. A lot of us at least.
And it also appears like there isn’t much of a middle ground. We can think of people who get by without technology, like the Amish, but they have drawn a line in the sand of sorts. The difference between mainstream society and the Amish is similar to the difference between someone who has never tried alcohol and a currently sober alcoholic. The temptation isn’t there if you never expose yourself to the initial reward. Once you get used to the reward, that is when it gets hard to quit.
There are definitely high-functioning addicts – some people can handle their vices a lot better than others – but it is always a dangerous game to play. Sometimes you can handle your vices quite well until a big life event goes the wrong way, or until you find a new vice that is a little more addictive. Then, suddenly, your self-control goes out the window.
What does this have to do with AI? I talk a lot about ChatGPT on this blog because it is the new hotness, but let’s look a bit further into the future and examine another possible scenario for societal collapse, one that requires slightly more than just text prediction: total hedonic destruction.
In an earlier post, we discussed ineffable human qualities, like creativity. The point GPT is at right now makes its creativity look a little stunted, but you can still get it to write pretty neat little stories if you try. They might not be super internally consistent, but then again neither is a lot of writing by professional writers.
But let’s say we can do a little better and get to the point where we start to be able to train an AI to write pretty satisfying stories, and even start personalizing them. That doesn’t seem too hard to believe, does it? You can make the argument that it will never approach the quality of War and Peace, but I’m betting that you haven’t read that (reading excerpts and what people have to say about it doesn’t count). And anyway, the AI isn’t trying to create beautiful works of art, it is trying to create stories that satisfy a generation used to TikTok and Marvel movies (Full disclosure: I enjoy Marvel movies too).
The main obstacle there is not complexity of story, but graphical quality, and I don’t think video-generating AI is that far off. Image generation is already pretty strong, text generation for stories is getting there, and though it might require some more horsepower in the rendering department, I think we’ll soon have the ability to ask for videos of what we want.
That is where things start to get bad.
AI is already pretty good at giving you what you want – you can see it at work in everything from the simple, like Amazon’s recommended products, all the way up to probably the best example of personalized prediction today: TikTok. TikTok is in that uncomfortable spot right now of being loved by the younger generation for its enjoyability and feared by the older generation for its addictiveness, but almost every social media/technology/storytelling format has sat in that spot, so I expect we’ll stop worrying about it pretty soon. It is, after all, just a bunch of short videos.
But let’s imagine that we get an AI to the point where it can create its own videos. Not long, just TikTok length. Then let’s say the TikTok algorithm starts inserting created videos that match user profiles into its own algorithm. What does that look like?
Well, suddenly there is no longer a point where you run out of TikTok videos. In fact, as you get bored of one type, the algorithm could detect that and serve up some custom-made videos of a different type to keep your interest. Unlike the rest of your life, which requires you to change and fight to get what you want, TikTok could give you exactly what you want with no fuss. How often are people going to want that feeling?
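The loop I’m describing is simple enough to sketch. To be clear, this is a toy illustration, not how TikTok actually works: the content types, the engagement signal, and the boredom threshold are all invented for the example.

```python
import random

# Toy sketch of a "never run out" feed: serve videos of one type,
# watch a (fake) engagement signal decay as the user tires of it,
# and switch to a new type the moment interest dips. Everything
# here is a made-up illustration of the loop, not a real system.

CONTENT_TYPES = ["cooking", "pets", "sports", "comedy"]
BOREDOM_THRESHOLD = 0.4  # engagement below this means "user is bored"

def engagement(fatigue):
    """Fake engagement signal: interest decays the longer we stay on one type."""
    return max(0.0, 1.0 - 0.15 * fatigue) * random.uniform(0.8, 1.0)

def feed(num_videos=20):
    current = random.choice(CONTENT_TYPES)
    fatigue = 0
    served = []
    for _ in range(num_videos):
        score = engagement(fatigue)
        served.append((current, round(score, 2)))
        fatigue += 1
        if score < BOREDOM_THRESHOLD:
            # Boredom detected: a generative model could mint a fresh
            # type here instead of picking from a fixed catalogue.
            current = random.choice([t for t in CONTENT_TYPES if t != current])
            fatigue = 0  # interest resets on the new type
    return served
```

Run it a few times and the pattern is always the same: interest decays, the switch fires, interest resets, forever. The unsettling part is how little intelligence the loop needs – swap the fake `engagement` function for real watch-time signals and the fixed catalogue for a video generator, and there is no natural stopping point.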
My worry is a lot. This problem gets even worse when the videos get longer, more believable, and more immersive and personalized. TikTok gets a lot of flak because of ties to China, but I think a more reasonable concern is anybody possessing this kind of power. I would be happier if Netflix got it than TikTok, but I still wouldn’t be happy that someone had the power to instantly flood my senses with a stream of satisfying content.
That last line makes it seem like I might be worrying about nothing here – who doesn’t want to be satisfied whenever they want? My worry is not that satisfaction on demand is inherently bad – it is that I don’t think we’re really equipped to handle it. Look at how we use the inferior forms of satisfaction we have now – screens, drugs, gambling, food. Already we have problems resisting them, and large swathes of the population end up hurting themselves pursuing them. At least those have biological or financial feedback loops that keep most of us from going off the deep end, but those loops only trigger on overstimulation – running out of money, say, or physical exhaustion. You don’t think an algorithm could figure out how to phase in and out so that it never trips those alarms? I think AI has the potential to become the most addictive technology we’ve seen as a civilization, and I don’t think it requires much actual intelligence to get there.
For a pure libertarian, this wouldn’t be a problem. Let people do as they please, they can always stop. I think that works for certain levels of rewards, but I really worry that with AI we could brush up against the barrier of self-control and end up in a world where the ability to choose whether to engage or not engage isn’t much of a choice anymore.
They say that you can’t tickle yourself, and in a sense that is true. There are certain forms of pleasure that require a second party, and most of the big ones involve one, whether it is drugs, food, or media. I think with AI we run the risk of designing a system that is effectively the ultimate tickler – a dumb name for a very real risk of building something too pleasurable to meaningfully resist. Something better at reading our feedback and enjoyment than any human could ever be, and better at responding to it.
It is also worth keeping in mind that addiction depends heavily on how you define it. You can say you aren’t addicted to screens, but how much time do you spend in front of one in a given day? You’re reading this on one right now. You can make the argument that screens make you more productive, but what is your ratio of consumption to production? How much time do you spend creating with your technology as opposed to consuming content others have made? You think you can resist AI-enabled content because you control all of your vices – a drink a day, calorie counting, no phone in bed – but to truly test whether you could be immune, try imagining a day without any media or vice consumption at all. That includes physical media like books. Can you?
The best way to avoid scratching mosquito bites is not willpower – it is to avoid being bitten by mosquitoes in the first place. And the way our lives are set up makes us vulnerable to exposure to content that is very addictive. Be wary of consuming AI-enhanced content, as I think it could very quickly become a slippery slope. Maybe not everyone will fall into the trap, but those that do will be too distracted to realize how chained to their vices they have become.