Algorithms and the Flattening of Desire
How to reclaim freedom of choice from the bots

If I ask you what you want to make for dinner, your mind will probably cycle through a bunch of factors before you make a decision. You might consider, among other things:
What tastes good to you;
What you’re in the mood for right now;
How tired you are;
How much time you have to prepare dinner;
What you have in your fridge;
How expensive the ingredients are;
And what you had for dinner yesterday.
On a day when you’re feeling particularly tired or lazy, you might eat a bowl of cereal over the sink, or scarf down half a bag of chips in front of Wheel of Fortune. Does that mean that this is what you wanted for dinner?
What your dinner choice — whether it’s from-scratch ravioli with a fresh green salad or a sad handful of stale Girl Scout cookies — reveals is that, in one particular set of circumstances, you settled on one particular choice. Under a slightly different set of circumstances — say, your commute was slightly less tiring or you hadn’t had that greasy slice of free pizza at 3 pm — you might have made a different choice.
Because you are a fully functioning, complex human, you are capable of making very different choices under very different circumstances. You have free will, and you are capable of making decisions based on a wide array of factors, many of which you may not be fully aware of. You may even choose things you don’t really want, that you know you shouldn’t want, or that you don’t want to want.
You contain multitudes!
What you do is not necessarily what you want
Now think about how tech companies and their algorithms try to understand what you want. They don’t have access to your innermost thoughts (thank God), so all they can see is what you do. They assume that what you just did is what you want to do in the future. And then they give you the chance to do more of what you already did. It’s a very simplistic view of the human mind.
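That loop — see what you clicked, then serve more of the same — can be sketched in a few lines of toy Python. (The data and function names here are invented for illustration; real recommender systems are vastly more elaborate, but the core incentive is the same.)

```python
from collections import Counter

def recommend(click_history, catalog, k=3):
    """Naive engagement-based recommender: whatever categories you
    clicked most, you get more of. Why you clicked — tired, bored,
    sad — is invisible to the algorithm and ignored entirely."""
    counts = Counter(item["category"] for item in click_history)
    # Rank the catalog by how often you've clicked that category before.
    return sorted(catalog, key=lambda item: -counts[item["category"]])[:k]

# One tired evening of duck videos outweighs your stated preferences.
history = [
    {"title": "Baby ducks", "category": "animals"},
    {"title": "More ducks", "category": "animals"},
    {"title": "Long-form essay", "category": "news"},
]
catalog = [
    {"title": "Even more ducks", "category": "animals"},
    {"title": "Investigative report", "category": "news"},
    {"title": "Duck compilation", "category": "animals"},
]
print([item["title"] for item in recommend(history, catalog, k=2)])
# → ['Even more ducks', 'Duck compilation']
```

Note what’s missing: any notion of context, mood, or what you wish you wanted. Two clicks on a bad day become a permanent prediction.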
According to the Apple News algorithm, for example, I want to read:
Thinly sourced transfer rumors involving my favorite soccer team;
Articles about deals on tech devices that are usually too good to be true;
Personal-finance articles designed to freak me out about saving for retirement;
Celebrity “news” (was I not going to click on an article about Prince Harry’s frostbitten penis?);
And, strangely, news about airlines and air travel, even though I almost never fly.
And, yes, I guess that I click on this stuff sometimes! But it’s not what I want. I want to be an intelligent consumer of complex news stories, a reader of weighty long-form journalism. Am I happy that the algorithm is serving me junk food? No. Do I want the junk food? Not really. Will I click on it at times? Maybe.
The same goes for social media networks like Twitter, TikTok, and Instagram. These algorithms are programmed to throw stuff in your face and, if you click on any of it, give you more stuff that’s similar to what you’ve already clicked on. But they have no idea why you made the choices you did.
Often, these algorithms are quite clumsy. Amazon keeps trying to get you to buy stuff you already have (how many iPad cases do they think you need?). Twitter may notice that you clicked on a couple of cute animal videos after a hard day at work and then start bombarding you with baby ducks. It will never understand that you don’t want baby ducks all the time — just a couple, and only when you’re sad. You might watch Breaking Bad for any number of reasons — You love desert scenery! You like incredibly tense dramas! You’re embarrassed that you’ve never seen it! — but all Netflix can do is guess that you love hourlong shows about drug dealers, and nudge you to watch Narcos next.
Sometimes I’ll hear someone say that an algorithm like TikTok’s “knows” them. But it doesn’t. It knows a version of you, and probably not the version of you that you would like to be. TikTok, especially, learns what you “want” when your defenses are down, when there’s no friction between you and another ten minutes of mindless videos.
To put it more bluntly, TikTok knows what you choose when you’ve surrendered most of your ability to make choices.
And remember, the goal of these platforms — especially if they cost nothing and are driven by advertising — is to keep your eyeballs glued to the screen for as long as possible, whether that’s what you want or not. If it means making you less happy or more anxious or angrier, well, that’s just good business.
Is that what you want? Or is that what the tech companies want?
Curation and friction
So what can a twenty-first-century human do?
The algorithms provide one useful service: they curate the internet’s firehose of information. There’s been a lot written about the paradox of choice — the idea that more choices paralyze us rather than empower us. And if the mustard aisle at the store has the potential to overwhelm us, then what are our chances in the face of all of the information on the internet?
So we need some curation to make the world of information seem manageable. But I think we should let humans — either ourselves or trusted individuals — do that curation.
It occurs to me that the newspaper, which now feels like a stone-age product, wasn’t a bad idea. Newspapers bundle a bunch of things together — national news, local stories, fluffy feel-good pieces, sports, weather, comic strips, and crosswords. Their expert editors curate the world for us.
Even if newspaper readers really only ever read the sports section, they have to at least glance at some of the other stuff, and probably get a much more well-rounded sense of the world. Even if you’re never going to get a dead-tree newspaper again, you might benefit from subscribing to one or two papers and starting your day with a visit to their homepage rather than firing up Twitter. Alternatively, try subscribing to some newsletters that provide a daily digest of news from a trusted expert.
Or you can curate things yourself. Turn off the algorithms where you can (this is often an option), and thoughtfully cultivate a list of social media accounts that make you feel smarter and better after you’ve read them. Pay attention to how you feel after visiting certain sites. If you don’t like how you feel, stop visiting them.
Another thing you can do is embrace friction. Algorithms do the most damage when you slip into their orbit without even thinking about it — when the app is right there on your phone, or when you can scroll endlessly without a second to think. Try to give yourself moments when you can really make a choice: Should I keep scrolling? Am I enjoying this experience? Do I want this?
I recently quit Twitter and joined Mastodon, which is often criticized for being a little unwieldy compared to Twitter. I guess that’s true to a certain extent. But it’s not all bad.
Twitter is made to be so slick and so seamless that you never have the chance to think about what you’re doing. The developers don’t want you to have a moment to choose — that way, you stay on the platform. Mastodon’s wonkiness is partially the result of not having tens of billions of corporate dollars behind it. But it’s also because the platform is based on the idea that letting you have choice and agency is more important than keeping the experience smooth. On Mastodon, I have more opportunities to make real choices about what I want to see, and those choices have clear and direct results. I feel like I’m getting more of what I actually want.
Mastodon isn’t for everyone, but it might be worth reconsidering how you think about friction, especially when it comes to technology. The tech world thinks friction is a bad thing — nothing, in their eyes, is better than speed and immersive experiences.
But is that because friction is bad for you or because it’s bad for them?
So don’t sell yourself short. You’re a complex human with profound and ever-changing desires. A math equation should not be able to tell you what you want.