Social media is broken, especially X and YouTube
Social media is broken. At least within the range I can observe, X (formerly Twitter) and YouTube are clearly broken.
By “broken,” I mean this:
- their algorithms optimize too strongly for user attention,
- they end up prioritizing extreme and biased information over facts,
- and their structure makes misinformation and conspiracy theories easy to spread.
Of course, social media is not completely useless. It can still be entertaining, and it still has some value as a place for gathering information and communicating with others. But I feel that this very “fun” and “addictiveness” is what makes the brokenness more serious.
The broken part is the top page and recommendations
On X, this means the posts shown in For You. On YouTube, it means recommended videos. Both are content surfaced on a top page optimized for each user's interests, and both are designed so that they inevitably enter your field of view.
The issue is not whether each individual post or video is good or bad. The issue is the mechanism, or architecture, that makes misinformation easy to spread.
What happens on X: when speculation becomes “fact”
On X, misinformation about an incident often spreads through a pattern like this:
- Someone posts about an incident with speculation mixed in.
- Some followers sympathize with that post, and the speculation is retweeted or quoted as if it were established fact.
- The post reaches more people, and new speculation or emotional comments are added as it spreads.
- A tone emerges that says, “If this person did something so bad, attacking them is justice,” and the claims become more extreme.
The algorithmic problem overlaps with this flow.
- These posts are shown in For You according to the user’s interests.
- People cannot help looking, because the content grabs attention, especially when it is extreme.
- Once you watch or read it, the algorithm decides that you like this kind of post and shows similar posts again and again.
- When biased opinions keep appearing regardless of the facts, they start to feel like reality.
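The feedback loop described above can be sketched as a toy simulation. Everything here is invented for illustration, not any platform's actual algorithm: each item gets a made-up "extremeness" score, engagement is assumed to rise with extremeness, and the recommender simply favors items similar to whatever was engaged with last.

```python
# Toy model of an engagement-optimizing feedback loop (all assumptions
# are hypothetical): items have an "extremeness" score in [0, 1], and
# the recommender shows items similar to the user's last click.

def engagement_prob(extremeness):
    # Assumption: more extreme content is more likely to be clicked.
    return 0.2 + 0.6 * extremeness

def recommend(items, last_engaged, k=5):
    # Naive recommender: the k items most similar to the last click.
    return sorted(items, key=lambda x: abs(x - last_engaged))[:k]

items = [i / 99 for i in range(100)]  # 100 items, extremeness 0.00 .. 1.00
last = 0.5                            # the user starts with moderate content
history = [last]

for _ in range(50):
    slate = recommend(items, last)
    # The user clicks the slate item they are most drawn to.
    last = max(slate, key=engagement_prob)
    history.append(last)

print(f"start: {history[0]:.2f}, end: {history[-1]:.2f}")
# prints: start: 0.50, end: 1.00
```

Even though the recommender only ever shows "similar" items, the user's diet drifts steadily toward the extreme end, which is the mechanism the bullets above describe.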
This is the result of social media architecture combining with weaknesses in human cognition.
YouTube has the same problem
YouTube has basically the same structure. It may even be more serious because video is:
- more addictive,
- and easier to perceive as “truth.”
For example:
- you watch one conspiracy-themed video,
- similar videos appear in recommendations one after another,
- and before long you start believing that “this is how the world works.”
Video communicates through both sound and image, so it often feels more persuasive than text. That also makes it harder to escape once you are pulled in.
Effects on children and older people
This problem is especially serious for children and older people. Children whose metacognition is still developing, and older people whose media literacy may be declining, are more vulnerable to this kind of misinformation.
Risks for children
Looking at my own children, who are around ten years old, I see that they are still in a stage where they absorb and learn many things quite directly. They do not yet have enough ability to judge whether information is true.
At that stage, I honestly feel it is dangerous to place them in an environment where biased information keeps flowing through recommendations. That is why I cannot yet let them use social media freely.
In the future, perhaps I can hand social media over to them once they have developed the ability to examine information on their own, view it skeptically, and still avoid becoming overly pessimistic.
But YouTube can be opened with a single click. If parents cannot manage it properly, children keep watching on their own. That is a serious problem.
Risks for aging parents
Social media is also becoming a difficult issue for aging parents.
When overconfidence in one’s own judgment overlaps with limited understanding of YouTube’s algorithm, I feel that more people become absorbed in conspiracy-like videos.
Once someone believes, “This is what the world is like,” it can be very hard for family members to persuade them otherwise. I have heard of cases like this close to me.
So what can we do?
This is a very hard problem, but there are still things we can do.
Things we can do ourselves
- Do not depend too much on For You or recommendations. Build a habit of searching yourself and checking multiple sources.
- When you see extreme claims or emotional posts, pause once and ask, “Is this really true?”
- When you keep seeing only similar claims, intentionally look for information from other positions.
Things society should consider
- We need to ask platforms for greater algorithmic transparency and stronger mechanisms to suppress misinformation.
- It is also important to bring media literacy education more actively into homes and schools.
- We need places where people with different opinions can exchange views and think from diverse perspectives.
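One concrete example of the kind of mechanism platforms could adopt is diversity-aware re-ranking, which trades engagement against exposure to differing positions. The sketch below uses a maximal-marginal-relevance-style score; all items, stances, and numbers are made up for illustration.

```python
# Toy diversity-aware re-ranker (MMR-style). Hypothetical data only.

def mmr_rerank(candidates, relevance, similarity, lam=0.5, k=3):
    """Pick k items, trading relevance against similarity to
    already-picked items, so opposing viewpoints can surface."""
    selected = []
    pool = list(candidates)
    while pool and len(selected) < k:
        def score(item):
            max_sim = max((similarity(item, s) for s in selected), default=0.0)
            return lam * relevance[item] - (1 - lam) * max_sim
        best = max(pool, key=score)
        selected.append(best)
        pool.remove(best)
    return selected

# Hypothetical items: (topic, stance), stance in [-1, 1].
items = {
    "a": ("election", 0.9),
    "b": ("election", 0.8),   # near-duplicate of "a"
    "c": ("election", -0.7),  # opposing stance
    "d": ("health", 0.1),     # different topic
}
relevance = {"a": 0.95, "b": 0.9, "c": 0.6, "d": 0.5}

def similarity(x, y):
    tx, sx = items[x]
    ty, sy = items[y]
    same_topic = 1.0 if tx == ty else 0.0
    return same_topic * (1 - abs(sx - sy) / 2)

print(mmr_rerank(items, relevance, similarity))
# prints: ['a', 'd', 'c']
```

A pure relevance ranking would pick "a" and its near-duplicate "b"; the diversity term instead surfaces the different topic "d" and the opposing stance "c", which is roughly what "places where people with different opinions can exchange views" asks recommendation systems to do.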
Social media is broken. X and YouTube in particular have structures where algorithms and human cognitive biases combine to make misinformation easy to spread.
But completely avoiding social media is not realistic, and it is not a solution by itself. What matters is understanding how the system works and consciously choosing how we and our families relate to it.
It is not only social media that must change. We also need to gradually update how we relate to information. Otherwise, I think our world will keep breaking further.