Are AI-powered digital drugs eating the future?

PISA scores are plummeting in Finland, a country that once served as a respected example of successful education. There has been increasing debate about the underlying reasons for this educational crisis; a lot has changed since the golden years. Teachers, however, seem to be quite clear about the culprit: the rise of mobile devices and increasing screen time, paralleled by less and less time spent reading. As a father of three, I sadly share this observation, and I recognize that, for my part, I have failed to counteract this trend with my own children.

Does this make me a bad parent? Absolutely, and I’m certainly not alone. While I’m not trying to downplay my responsibility as a parent, I would like to elaborate on the fact that we are fighting a very powerful enemy. Children and young adults are being hooked on social media content optimized by engagement-maximizing algorithms. To understand how dangerous these algorithms are, let me take an analogy from the world of narcotics.

When the business interest of enterprises collides with the wellbeing of consumers

In the early 2000s, an American pharmaceutical company, Purdue Pharma (owned principally by the Sackler family), started aggressively marketing its new opioid painkiller, OxyContin. While this was not the first opioid on the market, Purdue claimed its product was safe to use, a claim that was sadly backed up by the US Food and Drug Administration. Fairly rapidly, the country found itself in a deep crisis of opioid abuse, which has since been documented in books and TV series.

To get an idea of the extent of the epidemic, nearly 841,000 people died from drug overdoses in the US between 1999 and 2020 [1]. The reason behind this tragedy is the following: there is a clear conflict between the business interests of a pharma company and the wellbeing of patients, which the authorities failed to regulate. In retrospect, looking at this from Finland (a country of strict legislation), it is unbelievable that anything like this could happen in a democratic Western country. Still, it did.

The race to keep you engaged

This is where the story intersects with certain social media platforms. Just as pharma companies try to sell as much of their products as possible, social media companies try to get users to consume as much content as possible. The aim is that users never put their devices down, so that they can be shown more ads; the business model is to keep people engaged.

In order to keep users engaged, content is personalized in a way that triggers the release of dopamine in the brain (also the pleasure-inducing mechanism behind many addictive drugs [2]), exposing users to mobile addiction. Considering that this is done intentionally, there is a clear conflict between the business interests of social media companies and the wellbeing of users. Sounds familiar, right?
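To make the mechanism concrete, here is a deliberately simplified sketch of engagement maximization; this is my own illustration, not any platform's actual code, and the content categories and watch-time model are invented for the example. A simple epsilon-greedy bandit that rewards watch time will, on its own, learn to serve whatever keeps a user watching longest:

```python
import random

def engagement_bandit(watch_time, categories, rounds=10000, epsilon=0.1):
    """Epsilon-greedy bandit: mostly serve the category with the highest
    average observed watch time, exploring a random one occasionally."""
    totals = {c: 0.0 for c in categories}
    counts = {c: 0 for c in categories}
    for _ in range(rounds):
        if random.random() < epsilon or not any(counts.values()):
            choice = random.choice(categories)  # explore
        else:
            choice = max(categories,  # exploit the current best estimate
                         key=lambda c: totals[c] / counts[c] if counts[c] else 0.0)
        # Observe how long the (simulated) user watched and update estimates.
        totals[choice] += watch_time(choice)
        counts[choice] += 1
    return max(categories, key=lambda c: totals[c] / counts[c] if counts[c] else 0.0)

# Hypothetical user who lingers far longer on short, flashy clips.
prefs = {"books": 2.0, "news": 5.0, "viral_clips": 30.0}
user = lambda c: random.gauss(prefs[c], 1.0)

print(engagement_bandit(user, list(prefs)))  # almost always "viral_clips"
```

Nobody had to program "show viral clips"; the objective alone pushes the feed toward whatever content is most gripping, which is the whole point of the analogy.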

Interestingly, there is even research indicating that Facebook usage can have a similar effect on the brain as cocaine or gambling [3]. Moreover, studies have shown that social media engagement correlates with adverse psychological outcomes such as depression, anxiety, and loneliness, and that it also interferes with sleep and hampers academic success [4]. Even the mere presence of mobile devices has been associated with reduced available cognitive capacity [5].

Drug addiction most likely ruins one’s life. While social media and smartphone addiction does not necessarily have proximate effects on one’s health and overall wellbeing as severe as those of drug abuse, it should still be taken seriously. Spending countless hours consuming social media content means that time is taken away from studies, work, exercise, relationships, and family; in other words, from life, which in the long run will have an impact.

If it really is so, as research indicates, that social media has been intentionally designed to act like an addictive drug, with potentially very similar side effects, why are we, as a society, not doing anything about it? Recreational drugs are illegal in Finland, and any normal parent would do anything to keep their children away from those substances. But if social media can have similar effects on the young brain as cocaine, alarm bells should be ringing loud.

The next step from “just” recommending content

The newest breed of social media products no longer settles for recommending interesting new ideas and content to the user. Instead, they drive the entire user experience with engagement algorithms. The power to select content is taken from the user and given to the algorithm. Products like TikTok, Instagram Stories, and YouTube Shorts are growing at an unprecedented rate.

The problem is not the business model as such, but the application of increasingly effective machine learning algorithms to content recommendation. Dusty old books are no match for the endless source of exciting content social media has to offer. There is no need to look any further to see why reading and literacy are in a deep crisis. With less reading we get less education and culture, and a weaker ability to critically judge the truthfulness of digital information, which is increasingly often AI-generated.

In addition to being legal, engagement-optimized algorithms differ from chemical drugs in one important respect: their behavior is no longer under the full control of the social media platforms. It is better to think of the new algorithms not as pure code, but as the product of a complex interaction between the code, content creators, and user behavior. The code does not exist in isolation, where its behavior could be fully analyzed and understood.

What content will ultimately be presented to each user cannot be predicted. The enterprises running the platforms can compute aggregate metrics and run sample tests, but they have no clear view of what harmful events might be taking place at any given time, as content is presented to users in a constant, never-ending flow.
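The interaction of the three parts can be illustrated with a toy simulation; again, this is an invented sketch with made-up parameters, not a model of any real platform. Each round, users click more on sensational content, the algorithm boosts whatever got clicks, and creators shift their production toward whatever got boosted. None of the three rules alone says where the system ends up; the drift emerges from their interaction:

```python
import random

def simulate_feedback_loop(steps=50, seed=0):
    """Toy model of the code/creator/user loop. Tracks how sensational
    the average piece of content is, on a 0..1 scale."""
    rng = random.Random(seed)
    extremeness = 0.1  # start with mostly mild content
    history = []
    for _ in range(steps):
        # Users (assumed behavior): click probability rises with extremeness.
        clicks = sum(rng.random() < extremeness for _ in range(100)) / 100
        # Algorithm: boost is proportional to last round's click rate.
        boost = clicks
        # Creators: drift toward the boosted style, capped at 1.0.
        extremeness = min(1.0, extremeness + 0.5 * boost * (1 - extremeness))
        history.append(round(extremeness, 2))
    return history

trajectory = simulate_feedback_loop()
print(trajectory[0], "->", trajectory[-1])  # content drifts toward the extreme
```

Reading any one of the three update rules in isolation would not reveal this outcome, which is exactly why auditing the code alone is not enough.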

A dangerous dance between algorithm makers and content creators

Let’s assume that the social media companies are doing their best to filter out harmful and inappropriate content. What the content creators want, however, is to have their content presented to users; after all, many people have made this their primary source of income. As a result, any fix made to the algorithms will be met with instant counter-adaptation by the content creators.

Moreover, the content creators also keep coming up with new ways of getting users hooked on their supply. Thus, not only are the social media platforms doing their best to keep users engaged, but the content itself is also becoming increasingly seductive. If you ask me, this is utterly disgusting.

The EU has for some time been preparing legislation to regulate the application of AI [6]. The use cases in this upcoming legislation are either prohibited (namely those related to surveillance and crowd control) or subject to continuous safety assessment. The latter have been defined quite broadly, but none of them cover the use of AI to induce addiction. This is, in my opinion, an important shortcoming of the AI Act, as this kind of application is far from ethical.

It seems that the future rests on the shoulders of parents and teachers. So, what can we do? I can only hope that schools do their part by banning mobile devices altogether. My own responsibility is to start limiting screen time at home and to try to wean the kids off TikTok. As an AI professional, I am embarrassed that it has taken me so long to realize the danger of social media algorithms. I hope it is not too late to start acting accordingly.

While what I have written above is deliberately exaggerated, I have tried to back my claims up with published research. This topic requires more large-scale studies before we fully understand what mobile devices and social media are doing to developing brains. Still, it is already clear that their role in the wellbeing of children, as well as adults, should not be neglected.