Banning children’s social media use: A wave of symbolic regulations, but at what cost?

Pascal Schneiders, Department of Communication, Johannes Gutenberg University Mainz, Germany, pascal.schneiders@uni-mainz.de
Alicia Gilbert, Department of Communication, Johannes Gutenberg University Mainz, Germany, gilbert@uni-mainz.de

PUBLISHED ON: 13 Mar 2024

Recent public discourse on social media sounds somewhat dystopian: Facebook, TikTok, Instagram and co. knowingly use manipulative design features and algorithms to keep users hooked. Children and young people are particularly susceptible: staring at their screens for countless hours, they become addicted, depressed, and plagued by anxiety. Losing control over their own behaviour, they neglect other activities. A problem so serious that politicians need to intervene.

Social media regulation attempts in the US

At least that is the assumption of the governments of the US states of Florida and Utah. Both states have recently passed bills (Florida Social Media Use for Minors bill, Utah Social Media Regulation Act) that would prohibit underage teens from using social media or make it contingent on parents’ consent. When opening an account, social media companies would need to verify the age of the user — for example, by using an ID or AI-based facial recognition. In the case of Florida, existing accounts of under 16-year-olds would have to be deleted. Utah, on the other hand, takes the approach of granting parents or legal guardians access to their children’s accounts. In an effort to curb minors’ screen time, social media companies would need to create a default curfew for minors’ accounts between 10:30 PM and 6:30 AM. Taken together, both bills would intervene in the platform architecture by prohibiting a whole range of presumably addictive practices, designs, or features.
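To illustrate what such a default curfew could mean in practice, here is a minimal, purely hypothetical sketch of a platform-side check. Only the 10:30 PM to 6:30 AM window is taken from the provisions described above; the types, field names, and function names are invented for illustration and are not specified in either bill.

```typescript
// Illustrative sketch only: nothing beyond the 10:30 PM to 6:30 AM window is taken
// from the bills discussed above; all names here are hypothetical.
interface Account {
  isMinor: boolean;
  curfewLiftedByGuardian: boolean; // guardians could opt out of the default
}

// Returns true if access would be blocked under a default night-time curfew.
function isBlockedByDefaultCurfew(account: Account, now: Date): boolean {
  if (!account.isMinor || account.curfewLiftedByGuardian) {
    return false;
  }
  const minutesOfDay = now.getHours() * 60 + now.getMinutes();
  const curfewStart = 22 * 60 + 30; // 10:30 PM
  const curfewEnd = 6 * 60 + 30;    // 6:30 AM
  // The window wraps past midnight, so both sides have to be checked.
  return minutesOfDay >= curfewStart || minutesOfDay < curfewEnd;
}

// Example: a minor's account at 11:15 PM local time would fall inside the window.
console.log(isBlockedByDefaultCurfew(
  { isMinor: true, curfewLiftedByGuardian: false },
  new Date(2024, 2, 13, 23, 15),
));
```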

Florida and Utah are not isolated cases. In other US states such as Arkansas, Texas, Wisconsin, Ohio, Louisiana, and California, but also at the federal level, similar laws are in the works, have already been passed, or have been put on hold. Instead of solid and proportionate rules, rapid-fire, technosolutionist measures are being taken (Angel & boyd, 2024). Their primary aim does not appear to be to protect and balance different fundamental rights and interests, but rather to attract attention and signal the power to act under the noble, ever-recurring cover of “holding social media companies accountable for the harms being caused to our children and young people”. The surge in anti-social-media legislation, however, faces legal backlash. Laws have already been temporarily blocked by federal judges in Ohio, California, and Arkansas, partly on the grounds that they likely unconstitutionally restrict children’s right to free speech. In Utah, the tech industry successfully delayed the proposed law in court, and as a consequence, the state significantly softened its original proposal. Among other changes, the curfew would be dropped before it ever takes effect, and parental consent would no longer be required for account registration. Furthermore, instead of prohibiting any kind of “addictive” design feature, the act would specifically target autoplay, endless scrolling, and push notifications.

The European situation: DSA and beyond

Still, European legislators seem to be taking the US crusade against social media as an inspiration. On the other side of the Atlantic, France has already passed a decree requiring social media platforms to verify ages and obtain parental consent for minors under the age of 15. The UK and the Netherlands recently announced that they will ban smartphones from schools. Moreover, in a recent, vague resolution, the EU Parliament called on the EU Commission to examine whether and how the EU should regulate presumably addictive designs of digital services. The Parliament casts a wide net and lists a variety of practices that are suggested to be addictive or manipulative: infinite scrolling, autoplay of videos, personalised recommendations, push notifications, like buttons, and more. The reason for the call: The Digital Services Act (DSA), which imposes new obligations particularly on large digital platforms and has only been in full force since February, would not be sufficient to protect minors. Article 25(1) DSA states that “Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions”. Just a few days ago, the EU Commission initiated formal proceedings against TikTok to examine whether TikTok’s design or algorithms are addictive, ironically with reference to the DSA.

The trap of technological determinism

What unites all those proposed policies is the belief that the use of social media inherently has negative effects and, consequently, that intense use must in and of itself be “problematic” and qualify as an “addiction”. This trap of technological determinism (Orben, 2020a) obscures the many benefits and the centrality of social media in the personal, social, and cultural lives of children and adolescents. They don’t just consume funny clips or hate speech on social networks. For many young people in particular, social media serves as a major source of news about world events and politics. Half of 14- to 29-year-olds in Germany, for instance, use social media for this purpose every day, a third on Instagram alone (Die Medienanstalten, 2022). Beyond that, social media is a relevant social space for children, where they stay in contact with their peers, encounter new perspectives, try out different identities (a major developmental task in adolescence), and support each other (which is particularly important for marginalised groups; Berger et al., 2021; Escobar-Viera et al., 2018).

Indeed, some users may experience negative effects of social media use on their mental health and well-being. However, high-quality studies show that mental health outcomes do not depend on technological design alone (Orben & Przybylski, 2019), but also on personality factors and the social and cultural context (Masur et al., 2022; Vanden Abeele, 2021). Many studies find, at best, weak and unstable net effects of social media use on well-being (Odgers & Jensen, 2020). For a great majority of children, the amount of social media use does not affect their well-being at all (Valkenburg et al., 2021). Critically, important questions remain unanswered: What role do pre-existing mental health issues play in social media use? What about content? And to what degree do short-term effects of social media use on well-being accumulate over time? What researchers largely agree on, though, is that screen time is not predictive of well-being outcomes (Kaye et al., 2020; Meier, 2022; Orben, 2020b) and that a bad habit is not the same as a clinical addiction (Meier, 2022; Vanden Abeele et al., 2022).

Restrictive regulations risk ineffectiveness and pathologisation

Thus, the radical attempts at regulation that many Western nations are currently implementing, or planning to implement, rest on foundations too shaky to be considered necessary, not just from a constitutional but also from an empirical point of view. Regulations that adopt a one-size-fits-all approach and problematise children’s social media use as “addiction” run the risk of being ineffective (Masur et al., 2022) while simultaneously pathologising social media use and stigmatising young people (Taylor, 2023; Vanden Abeele, 2021; Vanden Abeele et al., 2022). TikTok is not a digital narcotic or the equivalent of cigarettes, however attention-grabbing such proclamations may be.

In their sweeping nature and paternalism, access restrictions, surveillance options, and design interventions are sometimes even more harmful than the status quo. The rights to communicative participation, to access and express facts and opinions, to identity development, and to privacy may be significantly impaired. Furthermore, it is not always a given that parents or guardians will use time limits and monitoring options in the best interests of their children. Not to mention the fact that age verification systems require tech companies to collect sensitive information about teens and families.

In addition, evidence is lacking that curfews and other interventions are suitable for curbing supposedly problematic use. Previous studies suggest that technological interventions have, at best, weak effects in the desired direction (Jesse & Jannach, 2021; Maier et al., 2022; Masur et al., 2021; Taylor et al., 2019). In South Korea, a nightly gaming curfew for adolescents introduced in 2011 proved largely ineffective: adolescents’ internet use went down by an average of a mere two minutes, and no effect on sleeping hours could be demonstrated (Choi et al., 2018). Young people, who are on average more digitally literate, can circumvent such blocks, for example by using VPNs. What’s more, technological interventions can trigger unintended negative effects, such as annoyance and fear of missing out (Lyngs et al., 2020). This also depends on whether users perceive an intervention as self-imposed or externally imposed.

Chances for responsible regulation

The impulse of political actors to take precautionary action is almost understandable — especially when it comes to an issue that unites political parties. However, politicians and other decision-makers should be careful about uncritically fueling technologically deterministic narratives or technology panics (Orben, 2020a). After all, this would obscure other possible causes of mental health issues, such as family problems, a lack of social support, low self-confidence, and stress.

This does not mean that digital platforms should not be held responsible for the various threatening problems they contribute to, and for their part in solving them. Rather, the goal should be to protect those minorities that may be more susceptible to negative mental health outcomes of digital media than others, while fostering freedom of choice and user autonomy for all users. Personalised support for those potentially at risk, media literacy interventions for all children, and more user choice in platform configuration, such as the ability to individually deactivate habit-building features, are not only more proportionate but will also be more effective in the long run than rushed, alarmist regulations. In any case, it is essential to listen to children and young people in the legislative process and to understand their perspectives and needs in order to thoroughly weigh the different interests. The consequences of interventions should be continuously (scientifically) monitored, and the instruments adaptively adjusted if necessary (Orben, 2020a).
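To make this more tangible, here is a minimal sketch, assuming a hypothetical per-user settings object, of what letting users individually deactivate habit-building features could look like. The feature names mirror those discussed above (autoplay, infinite scrolling, push notifications, personalised recommendations); the types and function names are invented for illustration and do not reflect any platform’s actual API.

```typescript
// Hypothetical illustration only: the types and names below are assumptions,
// not any platform's real configuration interface.
interface EngagementSettings {
  autoplayVideos: boolean;
  infiniteScroll: boolean;
  pushNotifications: boolean;
  personalisedRecommendations: boolean;
}

// Platform defaults keep the features on, but every user can opt out per feature.
const defaults: EngagementSettings = {
  autoplayVideos: true,
  infiniteScroll: true,
  pushNotifications: true,
  personalisedRecommendations: true,
};

// Merge a user's stored choices over the defaults.
function resolveSettings(userChoices: Partial<EngagementSettings>): EngagementSettings {
  return { ...defaults, ...userChoices };
}

// Example: a user who switches off autoplay and push notifications but keeps the rest.
const mySettings = resolveSettings({ autoplayVideos: false, pushNotifications: false });
console.log(mySettings);
```

The point of the sketch is the design choice it illustrates: defaults stay open, but the opt-out rests with the individual user rather than with a blanket statutory ban.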

Ultimately, users, regardless of age, should learn to balance the inseparable advantages and disadvantages of digital connectivity (Vanden Abeele, 2021). If we ban children’s access to digital media, we leave them to their own devices to learn how to grapple with ubiquitous digital technologies. Political resources should be used responsibly to prepare our youth for a digital future, rather than to pathologise and patronise them for a symbolic win.

References

Angel, M. P., & boyd, d. (2024). Techno-legal solutionism: Regulating children’s online safety in the United States. Proceedings of the Symposium on Computer Science and Law, 86–97. https://doi.org/10.1145/3614407.3643705

Berger, M. N., Taba, M., Marino, J. L., Lim, M. S. C., Cooper, S. C., Lewis, L., Albury, K., Chung, K. S. K., Bateson, D., & Skinner, S. R. (2021). Social media’s role in support networks among LGBTQ adolescents: A qualitative study. Sexual Health, 18(5), 421–431. https://doi.org/10.1071/SH21110

Choi, J., Cho, H., Lee, S., Kim, J., & Park, E.-C. (2018). Effect of the online game shutdown policy on internet use, internet addiction, and sleeping hours in Korean adolescents. Journal of Adolescent Health, 62(5), 548–555. https://doi.org/10.1016/j.jadohealth.2017.11.291

Die Medienanstalten. (2022). Intermediäre und Meinungsbildung. Gewichtungsstudie zur Relevanz der Medien für die Meinungsbildung in Deutschland, 2022-I [Intermediaries and opinion formation. Study on the relevance of the media for opinion formation in Germany, 2022-I] [Report]. Die Medienanstalten. https://www.die-medienanstalten.de/fileadmin/user_upload/die_medienanstalten/Forschung/Intermediaere_und_Meinungsbildung/Intermediaere_Meinungsbildung_2022-I.pdf

Escobar-Viera, C. G., Whitfield, D. L., Wessel, C. B., Shensa, A., Sidani, J. E., Brown, A. L., Chandler, C. J., Hoffman, B. L., Marshal, M. P., & Primack, B. A. (2018). For better or for worse? A systematic review of the evidence on social media use and depression among lesbian, gay, and bisexual minorities. JMIR Mental Health, 5(3), Article e10496. https://doi.org/10.2196/10496

Jesse, M., & Jannach, D. (2021). Digital nudging with recommender systems: Survey and future directions. Computers in Human Behavior Reports, 3, Article 100052. https://doi.org/10.1016/j.chbr.2020.100052

Kaye, L. K., Orben, A., Ellis, D. A., Hunter, S. C., & Houghton, S. (2020). The conceptual and methodological mayhem of “screen time”. International Journal of Environmental Research and Public Health, 17(10), Article 3661. https://doi.org/10.3390/ijerph17103661

Lyngs, U., Lukoff, K., Slovak, P., Seymour, W., Webb, H., Jirotka, M., Zhao, J., Van Kleek, M., & Shadbolt, N. (2020). ‘I just want to hack myself to not get distracted’: Evaluating design interventions for self-control on Facebook. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–15. https://doi.org/10.1145/3313831.3376672

Maier, M., Bartoš, F., Stanley, T. D., Shanks, D. R., Harris, A. J. L., & Wagenmakers, E.-J. (2022). No evidence for nudging after adjusting for publication bias. Proceedings of the National Academy of Sciences, 119(31), Article e2200300119. https://doi.org/10.1073/pnas.2200300119

Masur, P. K., DiFranzo, D., & Bazarova, N. N. (2021). Behavioral contagion on social media: Effects of social norms, design interventions, and critical media literacy on self-disclosure. PLOS ONE, 16(7), Article e0254670. https://doi.org/10.1371/journal.pone.0254670

Masur, P. K., Veldhuis, J., & Bij de Vaate, N. (2022). There is no easy answer. How the interaction of content, situation, and person shapes the effects of social media use on well-being. In D. Rosen (Ed.), The social media debate. Unpacking the social, psychological, and cultural effects of social media (1st ed., pp. 187–202). Routledge. https://doi.org/10.4324/9781003171270-12

Meier, A. (2022). Studying problems, not problematic usage: Do mobile checking habits increase procrastination and decrease well-being? Mobile Media & Communication, 10(2), 272–293. https://doi.org/10.1177/20501579211029326

Odgers, C. L., & Jensen, M. R. (2020). Annual research review: Adolescent mental health in the digital age: Facts, fears, and future directions. Journal of Child Psychology and Psychiatry, 61(3), 336–348. https://doi.org/10.1111/jcpp.13190

Orben, A. (2020a). The Sisyphean cycle of technology panics. Perspectives on Psychological Science, 15(5), 1143–1157. https://doi.org/10.1177/1745691620919372

Orben, A. (2020b). Teenagers, screens and social media: A narrative review of reviews and key studies. Social Psychiatry and Psychiatric Epidemiology, 55, 407–414. https://doi.org/10.1007/s00127-019-01825-4

Orben, A., & Przybylski, A. K. (2019). Screens, teens, and psychological well-being: Evidence from three time-use-diary studies. Psychological Science, 30(5), 682–696. https://doi.org/10.1177/0956797619830329

Taylor, K. (2023). The social diagnoses of digital addictions: Technophobic ambivalences, the limits of the natural and imperatives of self‐governance in the information age. Sociology of Health & Illness, 1–19. https://doi.org/10.1111/1467-9566.13624

Taylor, S. H., DiFranzo, D., Choi, Y. H., Sannon, S., & Bazarova, N. N. (2019). Accountability and empathy by design: Encouraging bystander intervention to cyberbullying on social media. Proceedings of the ACM on Human-Computer Interaction, 3, 1–26. https://doi.org/10.1145/3359220

Valkenburg, P., Beyens, I., Pouwels, J. L., Van Driel, I. I., & Keijsers, L. (2021). Social media use and adolescents’ self-esteem: Heading for a person-specific media effects paradigm. Journal of Communication, 71(1), 56–78. https://doi.org/10.1093/joc/jqaa039

Vanden Abeele, M. M. P. (2021). Digital wellbeing as a dynamic construct. Communication Theory, 31(4), 932–955. https://doi.org/10.1093/ct/qtaa024

Vanden Abeele, M. M. P., Halfmann, A., & Lee, E. W. J. (2022). Drug, demon, or donut? Theorizing the relationship between social media use, digital well-being and digital disconnection. Current Opinion in Psychology, 45, Article 101295. https://doi.org/10.1016/j.copsyc.2021.12.007
