The Unique Maladjustment of Social Media
Everybody knows “social media is bad”. But different algorithms are harmful in different ways.
It’s no longer “cool” to be terminally online.
The zeitgeist has changed.
Taylor Lorenz is gone.
The circle of hypereducated elites in New York and San Francisco who live their entire lives online is no longer the vanguard of society.
The Twitter/X/BlueSky split has fractured the delicate echo chamber in which people could retweet endlessly alongside the twelve other people and bots on the planet who happen to share exactly the same views and/or algorithmic programming.
Following TikTok micro trends like “Strawberry Girl” or “Tomato Girl” is just considered sad.
In this vein, choosing right now to write about the effects of social media feels a bit like picking 2021 to decide that Ikat prints are cool.
And yet, having read headline after headline discussing the ills of social media, I’ve noticed that there isn’t much talk about how different algorithms fuel different types of social ills.
Instead, whatever the bogeyman—China, Russia, the alt right, ANTIFA, etc.—the discussion centers on the idea that either (a) all social media is bad, or (b) social media owned by _____ is bad, because ____ is really a propaganda arm of China/Russia/the alt right/ANTIFA/rampaging capybaras.
What both theories forget is that for all of the bots, foreign agents, and propaganda on social media, by and large, those things are simply exploiting existing algorithms.
Algorithms that were designed almost entirely to generate growth and profit for their respective companies.
And, crucially, the levers one pulls to generate growth and profit depend on the nature of the medium.
This means that while every social media algorithm is designed to manipulate users, different social media platforms manipulate users in different ways. Not because of the dastardly plots of CEOs to control world politics, but because of the same profit motive every other company has.
Several years ago, in what could now be referred to as “the before times”, I read a piece about TikTok’s algorithm.
Again, these were simpler times.
But, one thing was clear: TikTok knew that it was going to be a visual platform, and as such, growth was going to come from visuals that the average person found to be attractive and desirable.
What did this mean in practice?
No fat people.
No awkwardly shaped people.
No scars.
No disabled people.
No old people.
No dumpy hausfraus or pasty accountants in pleated khakis. No sales managers with yellowing teeth, or secretaries with frizzy hair and thin lips. No beer guts. No acne.
The regular people who fill neighborhoods and office parks around the country? The people at the grocery store? TikTok didn’t want them.
TikTok didn’t want them in March of 2020, and TikTok probably doesn’t want them now. They’re too hideous for a visual medium.
…
Same for houses.
No slums, ghettos, or shanties.
No rural fields, unless they’re picturesque and have been featured in a tampon commercial.
No construction sites.
No poorly maintained houses, or houses that show too much wear and tear.
No damaged walls.
No outdated decor.
No excessive clutter or mess.
The model home the builder has for everybody to visit? That’s a pretty good house, according to the algorithm.
The house at the end of the same subdivision that was built 15 years ago and now bears the scars of three kids, four dogs, two cats, a misguided sponge painting attempt, and a DIY porch repair from a dad who had never used a hammer before? According to Zillow, it’s still a half million dollar house, but nope. Not up to TikTok standards. The people who live there should try being less poor and ugly.

…
Instagram?
In many ways, the fact that we have the term “Instagram Face” says it all.
Mirroring TikTok’s leaked criteria, “Instagram Face” is most remarkable for what it’s not—flawed.
The Instagram Face has no lines and wrinkles. It has no visible pores. It is unblemished, unscarred, and free of any of the signs of exhaustion. Nobody is ever puffy, bloated, or asymmetrical. Teeth never look like they’ve been exposed to coffee, tea, red wine, cigarettes, or solid food.
Like the model home, Instagram Face is a face that’s never experienced real life. It’s a promise of what faces could be if a person were blessed with perfect genetics and cryogenically frozen at 18.
This also means that only one very specific idea of “beauty” is likely to go viral. Only one type of beauty is going to be making it to the top of a person’s feed, regardless of who he or she follows.
Only follow friends?
Congrats. Instagram is going to move the hot friend with the well-lit new house to the top of the feed.
She’s not the closest friend. She’s not the most interesting friend. She’s not even the wealthiest or most successful friend (those people live in confusing houses, full of weird expensive things that AI doesn’t understand). But, according to the algorithm, her pixels add up—she has white teeth, smooth skin, bouncy hair, and she takes lots of pictures in front of other generically attractive (and thus easy to categorize) backgrounds, often while posing with equally attractive friends.
According to the pixel data, she’s rich, popular, and attractive with an aspirational lifestyle.
And, if anything, this is even more important in Insta world—after all, the potato-shaped lady on TikTok was sharing a recipe for Table Nachos. She had actual information to offer. The lady on Instagram is literally just posting a picture of herself in a bikini, in front of a beach.
This is a visual medium without the pretense of learning how to scatter a bag of chips across a table.
Only follow accounts of cultural and educational significance?
Given the choice between a story about endangered newts and a story about $17M houses, the algorithm has figured out that the Architectural Digest story has the better thumbnail. AD is going to win the top spot on the ‘gram, newts be damned. And since that was such a good story, here’s a promoted story from Veranda. It’s about a perfectly decorated house on the coast of Nantucket.
Cultural. Very cultural.
A public housing project in East St. Louis? A $3M house in Palo Alto? It doesn’t matter. Both houses have leftover pizza boxes on the counter, an ugly chair, and the weird clutter that comes from actually living in a house with people. That’s…not what the Insta wants to see. The Insta wants to see someone posing on a beach in a floppy hat.
Facebook?
It and Instagram may have the same parents, and the algorithms may share some similarities, but as with any siblings, the differences are real.
There is no such thing as Facebook Face. If there were, at this point, it would be a 60-year-old taking a poorly lit selfie. Or possibly a guy in Oakleys, rambling about sheeple from the cab of his truck.
He might be 30. He might be 70. He might be in a Prius, swapping out the Oakleys for an N95 mask. Alone. In his eco-friendly car.
It doesn’t really matter. He has opinions.
Those are the real algorithmic bait on Facebook: opinions that will get a reaction.
“I do believe that some changes could be made to our foreign policy” is a sensible statement, but also boring.
“We should nuke Zimbabwe” is not a sane or sensible statement, but it’s salient. It’ll get a reaction. Every single person who reads that post will have a response.
The flip side is that Facebook’s algorithm isn’t nearly as obsessed with white teeth and uncluttered backgrounds.
Babies are good for clicks. Marriages are good for clicks. Major career milestones (e.g. one’s first job, retirement) are good for clicks. The sorts of announcements that people used to read about in the local paper are all great for garnering responses.
This is actually one of the more wholesome aspects of the algorithm, but at the end of the day, there are only so many marriage announcements to make.
And once the algorithm runs out? It’s time to learn that another high school classmate has opinions about Martians funding ANTIFA to advance Russian interests. Nuking them is the only answer, he says. All of them.
…
Twitter/X/BlueSky?
These platforms are what happens when the local newspaper element goes out the window, and anybody can live out his or her fantasy of being a panelist on CNN.
Nobody else in the office wants to discuss Matt Gaetz’s face for seventeen hours straight? The boss keeps saying insensitive things like “I know Matt Gaetz has a weird face, but we’re firefighters. The building next door is on fire. Shut up and do your job before the city burns!”?
Twitter understands. Twitter is there. One can even dream of being retweeted by a Z-list celebrity who shares one’s obsession of the moment.
It probably won’t be a celebrity anybody cares about, but beggars can’t be choosers.
On an imaginary panel full of imaginary experts, a retweet from the nutritionist who once wrote a book about carbs might as well be a retweet from God himself.
And, fundamentally, no matter who owns what, this is the bread and butter of the model—the opportunity for every person who once earned an A- in ninth grade Civics to join a non-selective think tank of his or her choosing.
The Cato Institute may not be interested in the foreign policy thoughts of a low-level marketing assistant from Columbus, but BlueSky is. Moreover, unlike Facebook, there is no baggage in the form of reality.
One can be unburdened from what has been, far away from any old classmates who could undermine one’s intellectual bona fides by posting “Heeeeeeeeey bytch. Remember that time we got fired from Taco Bell for throwing all of the salsa packets at the custumers cuz they were being dum? Those were the days, girrrrl! Luv you!”
…
Crucially, much like an actual think tank, the algorithms don’t want users to veer too far off track.
Cato doesn’t want its employees using company time to take up projects involving fashion design or mechanical engineering, much less the defense of Donald Trump.
Similarly, once a person has been sorted into his or her unpaid think tank, the algorithm is going to punish anything unrelated.
The algorithm likely doesn’t try to promote extremism nearly as much as either side fears, but at the end of the day, it still feeds a very specific type of content.
For the person who is interested in dogs, music, global warming, baseball, cooking, and rollerblading, once the Twit Tank has decided he’s the most likely to engage with stories about global warming, it’s going to be a 24/7 barrage of stories about global warming.
His posts about global warming will gain more traction, relatively speaking. He’ll see every story there is about global warming, whether he follows a particular account or not. And, when he posts a picture of his dog, he’ll get noticeably fewer responses than he’s used to, leading to the inference that global warming is a much bigger deal in the eyes of the public than pet pictures. He might as well stop going to baseball games, too. They have too large of a carbon footprint.
…
In a way, this already feels like an outdated discussion.
The vanguard era of social media is long over.
But that’s also what makes this relevant.
To say “social media is bad” or “social media is bad if it’s owned by XYZ” is an oversimplification that risks ignoring reality.
Social media exists. It can be used more responsibly or less responsibly; a person can decide which platforms to engage with. But social media exists. Most people are going to engage with some social media platform in some way.
While the precise algorithms are closely guarded secrets that change by the day, understanding their basic contours gives better insight into how different platforms breed different types of obsession. It gives insight into the particular ways different mediums are incentivized to skew reality, not because of any particular political agenda, but because of the desire for growth. Because of the need to compete for attention.
Social media is a hall of mirrors that won’t disappear anytime soon. The best way to defuse the funhouse distortion is by talking about exactly how the mirrors work in the first place.