Since the move to Threads a year ago, I’ve been happy with the product execution and the resulting quality of conversations and community. Generally, there are thoughtful posts about topics I’m interested in: technology, politics, photography, and funny dog bloopers.
Recently, I’ve noticed that low-quality content has been creeping into my feed, and it made me think (again) about the role of product technologists in designing social media platforms.
If I were starting a brand new social network today, would I build features to deterministically produce a desired community, or build features so users can self-organize and vote on the type of community they prefer? It’s a loaded question because I’m not even sure you can do the former, and things can get super chaotic with the latter.
Metaphors often help me a lot when thinking about these things – they’re imperfect but they simplify difficult concepts well enough.
It’s like if you’re designing a brand new government – do you go with pure democracy or do you install a dictatorship?
When I worked on Flickr and Tumblr in the early days of social media, these questions came up a lot as we saw how conversations could grow beyond our initial expectations. We built a photo-sharing community, but the discussions could quickly veer from aperture advice into sexual tastes, political associations, and philosophical debates. Many of us who built these tools came from a Western liberal school of thought, so our maxim was to leave it to users to decide.
If users wanted to see more of something, they’d give us positive signals through favorites, likes, and comments. If users wanted to see less of something, they’d vote by blocking, unfollowing, or muting folks. I think Threads so far has worked within these assumptions.
This way of product design runs up against the reality of false positives fairly quickly. On Threads, it’s easy to see that posing threads as questions, for instance, invites replies. Or that breaking a thread apart into multiple posts forces users to click deeper into the stream to get the complete story. Superficially, these are positive signals that train the For You algorithm to showcase more posts from these users, or more posts with similar engagement velocity.
How would product designers build a social network that wasn’t so reliant on users and their signals for community building? What’s interesting is that Threads has also moved in this direction, by not promoting political news into users’ feeds unless they explicitly follow those posters. This is a sort of have-your-cake-and-eat-it-too moment. Normally, political news is the type of content that generates the most outrage and engagement. On a purely populist platform, that content would rise to the top of recommendations. But Meta is being deliberate: not suppressing the content, but not letting the natural distribution mechanisms run either.
So how is political news – known to bait engagement – different from threadstorms, simplistic questions, and the other formats begging for superficial engagement that are proliferating?
Another metaphor is that platforms are building the town square, where anyone can come and say what they want and talk to whom they want – and the platforms are not responsible for the things that people say. The recent arrest of Telegram’s founder by French authorities challenges this defense. Are tech companies just providing the raw materials – the wood and stone and brick – for communities to build the town square? Or are they designing the buildings and seating and stages for speakers and audiences to interact?
If platforms are responsible for the types of interaction and speech that happen inside their system, how would they start to design more community controls?
Most networks already take down accounts based on keywords and user reporting. Could they suppress conversations based on word or engagement patterns? Build a history profile of users who spark outrage and limit their reach? Go down this route and free speech advocates will quickly cry censorship.
The problems of early social media haven’t gone away in the 20 years since its introduction. The problems aren’t technical – they’re human problems, just like systems of government and shared public spaces – which means the solutions may not be technical either.