There is growing consensus that social media's infiltration of our lives has not only fostered divisive behavior but has also harmed mental health in some fairly serious ways, especially among teens. At the heart of the problem, some argue, is a business model that relies on engagement: a market-driven incentive to optimize every feature of the social media experience to “hook” the end user. Neuroscience tells us that this optimization triggers certain addictive tendencies.
According to Dr. Andrew Huberman, frequent use of social media can create dopamine-driven feedback loops. Every like, comment, or share triggers a release of dopamine, a neurotransmitter associated with pleasure and reward. That gratification prompts us to repeat the behavior, forming a habit or even an addiction.
Hmmm. An industry business model that fosters addiction in its customer base in spite of, or even in willful ignorance of, the public health implications. Sound familiar? Let’s see – tobacco?
The tobacco industry went through its own trajectory of innovation, marketing prowess, and public health crisis. It began with the commercialization of tobacco in the 16th century, followed by rapid growth through the 18th and 19th centuries, accelerated by the invention of the cigarette-making machine in the 1880s. By the mid-20th century, however, scientific research began linking smoking to serious health conditions, including lung cancer and heart disease. These findings fueled a widespread public health movement against smoking, resulting in regulations and restrictions on the industry. Regardless of one's personal stance on tobacco, it seems straightforward to agree that the industry had its revealing moments.
The tobacco industry's practices have evolved significantly under public scrutiny, but the business of tobacco hasn't disappeared entirely (a fact with which my husband, who enjoys the occasional cigar, is quite pleased). Even when the nicotine delivery system itself was reinvented (e.g., e-cigarettes and vaping devices like Juul), US regulators were quick to intervene, restricting sales to minors and requiring warning labels, among other enforcement actions.
So are we, or are we not, in a similar situation with social media? With the advent of social media, we've observed a surge in mental health issues, especially among teenagers.
Studies covering the mid-2000s through 2019 found increases of more than 50% in teen depression and suicide. The rise was most prominent in the era of smartphone ubiquity, suggesting a correlation between the digital age and teenagers' mental health.
Let’s just imagine that most people agree in principle with all the above – that social media has gone too far in its ability to instigate a public health crisis, one whose victims are addicted to something that causes their own self-destruction. Then what?
This is where our tobacco industry analogy stops working, if only because tobacco’s problem, and the solutions to regulate it, center on the fact that it is a physical, packaged product with well-known ingredients. Tobacco companies had enormous control over the advertising and public perception of the product they were putting out.
Social media, on the other hand, is on every device, inextricably entangled within a new concept introduced by tech writer David Auerbach, called a Meganet. A Meganet is a network beyond the internet – a conglomeration of data, services, and devices all interconnected, overlapping and interdependent, an amorphous cloud that encompasses all digital activity. Consider the challenge of applying a 'surgeon general' style warning to this amorphous cloud.
Even the tech companies that created the social media platforms have no real control over something whose lifeblood is not physical, but rather the stochastic interactions of billions of humans.
Some might object to this, saying that companies such as Meta have the ability to “twiddle” their algorithms to produce certain outcomes. Is that really the case? Can machine learning curators really have an impact through adjusting weightings, filters, or other aspects of the algorithm to influence content delivery or user experience?
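To make the “twiddling” concrete, here is a minimal, hypothetical sketch – not any real platform's algorithm – of a feed ranker that scores posts as a weighted sum of engagement signals. The signal names and weights are invented for illustration; the point is simply that adjusting the weights changes which content surfaces, even though the underlying user behavior is untouched.

```python
# Toy feed ranker (illustrative only): each post is scored as a weighted
# sum of its engagement signals, and the feed shows the top-scoring post.

def score(post, weights):
    """Weighted sum of a post's engagement signals."""
    return sum(weights[k] * post.get(k, 0) for k in weights)

posts = [
    {"id": "calm",    "likes": 120, "comments": 4,  "shares": 1},
    {"id": "outrage", "likes": 40,  "comments": 90, "shares": 30},
]

# Weighting raw likes surfaces the mild post...
w_likes = {"likes": 1.0, "comments": 0.0, "shares": 0.0}
# ...while weighting comments and shares (common proxies for
# "engagement") surfaces the provocative one.
w_engage = {"likes": 0.1, "comments": 1.0, "shares": 2.0}

def top(weights):
    """Return the id of the highest-scoring post under these weights."""
    return max(posts, key=lambda p: score(p, weights))["id"]

print(top(w_likes))   # -> calm
print(top(w_engage))  # -> outrage
```

The sketch suggests a yes-and-no answer: curators can clearly shift what gets delivered by changing weights, but they cannot directly control the billions of likes, comments, and shares that feed those signals in the first place.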
Many believe that governmental intervention can outmaneuver these technological mechanisms, as with the recent bans on TikTok in various states and institutions. But this raises a host of issues we humans are still arguing about, from data privacy concerns and the potential for foreign interference to infringement on freedom of speech, creativity, and community building. There is no “off” switch for TikTok; even if authorities can thwart its use by removing it from app stores, or through content filtering on institutional networks and IP-address blocking, it doesn’t leave us with a great feeling about what said “authorities” are capable of.
We are increasingly compelled to confront the 'Meganet' elephant in the room, and its effects on our mental health. But how do we address these issues in an era dominated by screens and algorithms? How do we teach the next generation to use these tools responsibly, without sacrificing their well-being? If only it were as “simple” as tobacco.