
Psychiatrists confirm that social media can seriously harm your health

What if, one day, before opening Instagram or TikTok, a warning like the one on cigarette packets appeared before our eyes: "Social media seriously harm your health"? In the United States the idea is no longer a provocation; it is already almost a reality. Between class-action lawsuits and demands for massive compensation, digital platforms are entering a judicial season that closely resembles the one that overwhelmed the tobacco industry in the 1990s. The first jury trial ever held on damage caused by social media has just concluded in Los Angeles. It was brought by a young woman, now in her twenties, who claims to have become "addicted" to social media from an early age and then plunged into depression and self-harm to the point of showing suicidal tendencies. Hers is the "pilot case" for a series of legal actions brought in various states against the major platforms.

Snapchat and TikTok have already decided to settle, while the lawyers for Meta and YouTube claim that the young woman's problems derive from a difficult family and personal history. It will be up to the jury to ascertain whether there really is a causal link between the plaintiff's problems and her use of social media, and if so, to establish compensation for the damages suffered.

Addiction design under indictment in US courts

In what has been called "an unprecedented trial," Mark Zuckerberg was also called to testify. The Los Angeles verdict is expected within the next few weeks. But what stands accused, both in the Californian pilot case and in the roughly 1,600 "derivative" trials starting almost everywhere across the States, is not only the presence of harmful content on social media but the very design and functioning of the platforms. According to the complaints filed by various states, schools and families, social networks have developed recommendation and notification systems designed to retain users, especially children and adolescents, for as long as possible, deliberately creating addiction mechanisms and putting engagement ahead of minors' mental health. The objective is not only compensation for damages linked to the rise in anxiety, depression and self-harm among young people, but a structural modification of the mechanisms that govern the visibility of content, notifications and automatic suggestions.

"The impact on the psyche can be considered, to all intents and purposes, a public health problem, and these lawsuits will become increasingly numerous," says psychiatrist Claudio Mencacci, president of the Italian Society of Neuropsychopharmacology. "The platforms are under attack because they can encourage real forms of addiction: their mechanisms work in a similar way to those of gambling. Likes, in a certain sense, are like small wins: they stimulate the brain's dopaminergic circuit, the circuit of immediate rewards. From a neurological point of view, not much changes whether we are talking about physical addictions to substances such as tobacco, or mental and behavioral ones, precisely because the brain circuits involved are substantially the same."

Making this clear is very important, because we often tend to underestimate addictions linked to digital habits, as if they were less serious, with fewer negative effects on our lives and our health, than those caused by substances such as alcohol and tobacco. But this is not the case: it is true that smoking is carcinogenic, but are we sure that "burning" children's minds is less serious? Or that it will have fewer consequences for their future lives?

From instant gratification to the new risks of AI

"When we get used to the very rapid gratification of social media, of likes, we risk entering a logic of instant satisfaction that makes it increasingly difficult to tolerate waiting and frustration, in every area of life," Mencacci continues. "It is a mechanism with cascading effects on everyday life and on relationships with other people. It can lead to deviant, violent and self-harming behavior, as well as sleep disorders that make daily activities difficult, from school to sport."

Consider, for example, all the times when, after cases of femicide or bullying, problems emerge related to pathological narcissism and the inability to tolerate frustration and rejection. What is happening in the United States will probably also happen here, on the Old Continent, where, however, the focus is on preventive legislation rather than on the courtrooms.

"In Europe, the Digital Services Act already requires platforms accessible to minors to take appropriate and proportionate measures to guarantee them a high level of privacy, safety and protection," says Ginevra Cerrina Feroni, vice president of the Italian Data Protection Authority. "The Commission's guidelines call for limits on autoplay, notifications and other features that encourage excessive use, as well as on any type of behavioral profiling. It is not a question of censoring content, but of placing reasonable limits on certain attention-grabbing techniques when they concern vulnerable subjects. Putting warnings on social media, as with tobacco, is of little use if the label is not accompanied by rules on design and, above all, on the profiling of minors. Perhaps, along with initial warnings, timed alerts could also be included to remind users not to spend too much time on an app."

Cerrina Feroni has no doubts about the danger that technology is moving too fast while we try to "stop the wind with our hands." "The European approach certainly has the advantage of trying to intervene before the damage occurs, not after years of litigation. The Digital Services Act's provisions on minors and the AI Act's rules on manipulative practices already offer a more solid basis than the American one for imposing ex ante precautions. The risk is, precisely, that technology moves faster than the law. But I wouldn't call it 'stopping the wind with your hands.' If the rules are well written, that is, if they do not chase individual features but target the opaque logics that exploit vulnerability, they remain effective even when the products change."

The era of affective chatbots and the challenge of awareness

And the products, it must be said, evolve with impressive speed. While we debate social media, which have by now lost their original vocation as virtual town squares for relationships and have become machines designed to keep us glued to the screen (and to sell us "things"), the world moves on, and the dangers multiply. "Reality is already going beyond TikTok. We have entered the era of affective chatbots: artificial intelligence is here to stay," says education scholar Michele Marangi, professor at eCampus University and author of the book AI, Childhood and Algorithms: Growing and Learning in the Post-Digital World (Edizioni Junior). "Think of the universe of connected toys, the so-called 'Internet of toys': with an AI-equipped teddy bear or robot we can chat and tell it about our lives, and it will respond in kind and give us 'advice.' But if this happens as early as age six, we have to ask ourselves what it means for children's growth and development." Nothing positive, so much so that some smart toys have already been withdrawn from the market: this is the case of the Kumma bear, which in some interactions apparently encouraged young users to self-harm or suggested inappropriate content, and of the Miko3 robot, which makes children feel guilty if they stop playing with it.

Added to this is a further, serious problem: parents, teachers and institutions often do not even know children's digital language, starting with emojis. And reality, as often happens, is thrown in our faces by the TV series Adolescence. Lilac hearts, yellow hearts, emojis in chats and on Instagram: to an adult they are harmless or positive messages, but to those who send them they are signals of seduction, exclusion and bullying. It is one of the most disturbing scenes of the series: we look at those messages and understand nothing. "This is the real short circuit: adolescents live within a symbolic ecosystem whose codes we adults do not know how to interpret: the signs of exclusion, the social pressure hidden behind a like that doesn't arrive or a comment that disappears," Marangi concludes. "Moreover, without wanting to criminalize parents, it is true that many children grow up as only children who are constantly put on display: 'you are my light,' 'I show you off everywhere.' If at 15 they feel the need to perform on social media, it is also because they have grown used to doing so."

Perhaps, let's face it, we need to examine our own consciences about the many hours we adults spend scrolling the endless feeds of Instagram or TikTok. "Almost all problems related to social media are generated by a phenomenon called technoference," adds the psychiatrist Mencacci. "The term refers to the intrusion of technology-mediated social interactions into everyday relationships. Technoference affects the parental attention and cues needed to support children's mental health. We now know that if a mother, when alone with her child, spends more than 27 percent of their time together on her cell phone, this can lead to reduced interaction. The child looks into his mother's eyes but does not see them focused on him: this is where the first fracture in the relationship arises."

Giorgia knows this well: she is 16 and is being treated in a large hospital in Rome for behavioral disorders that culminated in anorexia. "Now I understand that the models scrolling in front of me on Instagram and TikTok were wrong and unrealistic," she admits on the phone, after agreeing to speak with us to try to "help the younger ones." "But when I was 13 I couldn't understand it, and I found myself living in a strange reality, which my psychiatrist described as fake and distorted. For me, the cell phone had always been normal, even at the table, and right up until bedtime I would still be scrolling through social media. Many of my friends still do. But after therapy, everyone in my family changed their attitude. Now I can use my smartphone two hours a day, and I often use it even less by choice."

Sometimes a shock is needed, even within the family. But reaching a compromise is certainly not easy, and there are no simple, ready-made solutions. The temptation may be to ban smartphones altogether, but that is pointless, not least because when kids show distress, it never has a single cause. "We rarely see kids who come to our services solely because of a problem related to social media," says Antonella Costantino, director of child and adolescent neuropsychiatry at the IRCCS Policlinico hospital in Milan. "More often they present a mix of difficulties: neuropsychiatric disorders, emotional fragility, relational struggles. The use of social media is intertwined with these conditions, sometimes amplifying them, but sometimes also offering a space for contact. Taking away cell phones is useless: we need to help young people build awareness of how social media works. For some young people who are very withdrawn or have relational difficulties, these tools can even represent a gateway to relationships."

The point lies right here: knowing how to discern and, above all, knowing how to admit that there is a hidden, manipulative power behind what look like harmless feeds to watch before falling asleep, to switch off the brain and not think about the reality around us. The American sociologist Neil Postman said it way back in 1985, when in the essay Amusing Ourselves to Death he explained how television distanced us from true knowledge, manipulating us with misinformation. Today, faced with social media, the danger is the same: it is no longer a question of content, but of who decides how we should think. Maybe we are no longer the ones in control of the screen. Maybe we never were.