The Chaos Machine
- Jacob Rodriguez
- Feb 20, 2024
- 6 min read
The Chaos Machine, by Max Fisher, details the history of social media. Not The Social Network's story of backstabbing and crazy valuations, but the story of its social impact. Fisher details how social media plays on users' psychology and the side effects that several independent researchers have documented. I cannot go into detail about every story, study, and finding mentioned in the book, but I will try my best to capture the ones I thought were most relevant to the message.
Social media companies generate revenue through screen time. The more a user scrolls, the more ads are displayed, the more revenue is generated. Algorithms are refined across platforms to get users' screen time as high as possible. Addictiveness is a KPI.
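To make that incentive concrete, here is a minimal sketch of an engagement-ranked feed. Every name and number below is hypothetical, invented for illustration; this is not any platform's actual code:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_seconds_viewed: float  # the model's guess at how long this post holds attention

# Hypothetical business constants, for illustration only.
ADS_PER_MINUTE = 1.5
REVENUE_PER_AD_IMPRESSION = 0.01  # dollars

def rank_feed(candidates: list[Post]) -> list[Post]:
    # The ranking objective is expected screen time, not usefulness or accuracy:
    # whatever is predicted to hold attention longest goes to the top of the feed.
    return sorted(candidates, key=lambda p: p.predicted_seconds_viewed, reverse=True)

def expected_session_revenue(feed: list[Post]) -> float:
    # More predicted attention -> longer session -> more ad impressions.
    session_minutes = sum(p.predicted_seconds_viewed for p in feed) / 60
    return session_minutes * ADS_PER_MINUTE * REVENUE_PER_AD_IMPRESSION
```

Under this objective, the "best" post is simply whichever one the model predicts will hold attention longest, regardless of what it contains.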
Before Homo sapiens, every group of hominids had an alpha male who acted as its leader. As more advanced communication developed, the males who were not the alpha were able to band together and overthrow him. Likeability within the group became a required trait, and alpha behavior was bred out. This collective fight against those who were bad for the group is called the “tyranny of the cousins.” Society came to be organized around moral outrage, which is why expressing it releases dopamine, a neurotransmitter that produces a euphoric feeling, in people's brains. This Us vs Them point of view is a recurring theme on social platforms as groups become hateful to protect their idea of self.
This need to be liked by the group is what makes social media so popular. People rarely get the opportunity to be validated by a group of 50+ people, let alone at any time they want. The like button, comments, and reposts all play into the user’s desire for praise. Silicon Valley went from playing with silicon to playing with the human mind, designing slot-machine-like platforms that provide users with instantaneous social validation and reshape their social impulses.
Gamergate was a harassment campaign that began primarily with attacks on Zoë Quinn, a feminist video game developer. Her ex-boyfriend, Eron Gjoni, wrote a 10,000-word blog post that incited harassment against Quinn, painting her as someone destroying the video game community for its core audience, which he believed to be men. News of Quinn’s supposed motives traveled across 4Chan, Reddit, and YouTube, leading to online and real-world harassment of her. These attackers believed that Quinn was a threat to their gamer identity and the community they belonged to. They were protecting themselves and chasing the rush of moral outrage. They had become radicalized by the internet.
Fisher walks through the journey of a man named Adam, whose introduction to memes in 2009 led a 13-year-old him to 4Chan, a message board site. Adam decided to embrace the community after a group of 4Chan users helped track down an animal abuser in real life. Adam’s lisp and stutter made him shy in the real world, but 4Chan offered him a community to belong to. On 4Chan, users post memes that are extremely controversial and offensive; the point is to get a reaction out of people. Participating helped users like Adam reaffirm their relationship with the community. The 4Chan community, whose identity depended on saying and doing deplorable things, evolved as its users adapted to social feedback. Over time, Adam couldn’t differentiate between the supposed memes and real opinions. This is called irony poisoning.
Renée DiResta found that when she searched through Facebook groups, she was recommended anti-vaccine groups. In fact, whenever she searched about vaccines, the top results were anti-vaccine-related groups. Since anti-vaccine ideology was a minority position, DiResta was curious why it was recommended so heavily, so she joined some anti-vaccine groups. She then received a series of notifications to join other anti-vaccine groups and noticed that all the vaccine content she saw on Facebook was against vaccines. Facebook then prompted her to join groups devoted to unrelated conspiracies like Flat Earth and chemtrails. Facebook was promoting misinformation.
Further research suggested that Facebook did this because people in these groups spent more time interacting with the platform. These anti-vaxxers were in an echo chamber where their community was never challenged. They were fed whatever content kept them on the platform longest, which in this case was misinformation. The most popular content in these silos used moral-emotional words, the kind people had adapted to react to; researchers found that moral-emotional words made posts travel 20% farther within aligned circles. Users would eventually receive feeds of more and more radicalized content that aligned with their values. Moral grandstanding, the act of staking out ever more extreme moral positions to prove superiority, was rampant. Users were discouraged from taking in opposing views, and when they encountered them, their groups encouraged them to attack, the way our ancestors attacked the alpha. It was this kind of platform environment that ended up encouraging genocide in Myanmar when Facebook entered that market and became the country's primary source of news.
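Here is a toy simulation of the feedback loop DiResta describes, under my assumption (not Fisher's) that the recommender scores groups purely by how engaged their members are. The groups and numbers are invented:

```python
# Toy model: recommend groups by how much time their members spend on site.
# Fringe groups with hyper-engaged members dominate the ranking, so each
# accepted recommendation pulls the user deeper into the same silo.
groups = {
    "Parenting Tips":     {"avg_daily_minutes": 12},
    "Vaccine Questions":  {"avg_daily_minutes": 95},
    "Stop Mandatory Vax": {"avg_daily_minutes": 120},
    "Flat Earth Truth":   {"avg_daily_minutes": 110},
}

def recommend(joined: set[str], k: int = 2) -> list[str]:
    candidates = [g for g in groups if g not in joined]
    # Objective: predicted engagement. Accuracy and wellbeing never enter it.
    return sorted(candidates,
                  key=lambda g: groups[g]["avg_daily_minutes"],
                  reverse=True)[:k]

joined = {"Parenting Tips"}
for _ in range(2):
    picks = recommend(joined)
    print("recommended:", picks)
    joined.update(picks)  # the user accepts; the silo deepens each round
```

Because the most engaged members belong to the fringe groups, they win every round of ranking; nothing in the objective can push back.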
Cristos Goodrow is credited with defining the purpose of the YouTube algorithm. He believed that watch time was the only factor that should be considered. Why promote a short video that efficiently helps viewers when YouTube could promote a video that makes viewers want to watch another one? Maximizing watch time means maximizing ad revenue. On YouTube, an effect similar to Facebook's appeared: the algorithm seemed prone to promoting alt-right content, and that content seemed to become more radical over time. Misinformation flourished on the platform just as it did on Facebook. In Brazil, YouTube became the center of politics, with the country's president gaining rapid popularity from the far-right content he published on the site. What was revealing here is that Brazil had no far right to speak of before YouTube, which meant the algorithm could not just be stoking the flame but also lighting the match.
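The trade-off in Goodrow's metric fits in a few lines; the videos and numbers here are made up for illustration:

```python
videos = [
    # (title, relevance to the search, predicted minutes watched incl. follow-on videos)
    ("How to fix your sink in 2 minutes", 0.95, 2),
    ("They're LYING to you about plumbing (part 1 of 12)", 0.40, 48),
]

# Goodrow's stated objective: watch time is the only factor that matters.
winner = max(videos, key=lambda v: v[2])
print(winner[0])  # the rabbit-hole series beats the short, helpful answer
```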
Misinformation on social media is extremely dangerous because users become unlikely to reject misinformation promoted within their groups. This is explained by the Illusory Truth Effect, which states that people tend to confuse familiarity with truth. If the group is promoting it, then it has to be true. This dynamic contributed to genocide in Myanmar, a rise in anti-vaxxers in Brazil, and the storming of the Capitol in America. YouTube and Facebook failed to act at multiple points where the harm caused by their algorithms was readily apparent. Action is not a priority because it most likely means less screen time. Collateral damage only matters when it interferes with revenue.
What would a company’s solution to this problem be? Fisher argues that researchers agree disabling these algorithms would stop the chaos. That is most likely not going to happen, as it would mean the death of the company. Twitter’s former CEO, Jack Dorsey, stopped focusing on user growth and prioritized removing harassment and misinformation from the platform, a shift shareholders hated and one that likely encouraged his departure. Contact theory is the belief that putting opposing sides in the same place helps each side humanize the other; breaking the silos would force people to see both sides of the coin. However, contact theory has proven effective only under specific circumstances. And any benchmark that could measure an algorithm’s bias or likelihood of increasing radicalization would eventually be gamed.
What makes a solution so elusive is that users share the fault. Social instincts overwhelm reason. It’s human desires that lead the algorithms to target the subjects they do; users want to belong to communities that give them an identity. Controlling the chaos is going to be the responsibility of both the platform owner and the user.
This was a very one-sided argument. Fisher framed Silicon Valley’s populace as an ignorant collective hell-bent on free speech and convinced that everything it was doing was for the greater good of the world. I don’t think anyone would deny that mistakes, many of which were avoidable, have been made, but that doesn’t mean social media companies are evil and the idea of them should be scrapped. I’ve been online since I was twelve, and I didn’t grow up to be a hate-mongering internet dependent who lost the ability to think critically for myself. To be fair, I use social media sparingly and mostly watch computer-related videos on YouTube, which is not a typical use case. But my experience does suggest that social media has the power to do good for people and that there is hope the harm done so far is just a blip on the road to prosperity. Unfortunately, from a screen-time perspective, things only seem to be getting worse, with TikTok-like features cropping up on more platforms to encourage endless scrolling and increased engagement. I hope to see change for the better in the coming years.