In the rapidly evolving digital landscape, few realities are more unsettling than the silent epidemic unfolding among our youth. While mounting evidence links social media exposure to deteriorating mental health in adolescents, the major social media platforms—Facebook, Instagram, TikTok, Snapchat, and YouTube—persist in policies and practices that allow, and even encourage, young children to engage with their products. Despite clear laws, public concern, and overwhelming scientific consensus, these companies prioritize profit and market share over the wellbeing of the most vulnerable among us. In doing so, they have set off a race to the bottom, where the mental health of an entire generation becomes collateral damage.
Jonathan Haidt, in The Anxious Generation, outlines in painstaking detail the psychological harms wrought by early exposure to social media. Increased rates of anxiety, depression, self-harm, and suicide attempts among teens correlate almost precisely with the advent of smartphone-based social media around 2012. Platforms designed to maximize engagement exploit developmental vulnerabilities in children and teens, who are biologically primed for peer comparison and social feedback. The addictive designs of these apps—the endless scrolling and algorithmic content tailoring—interact detrimentally with the fragile emotional architecture of young, developing brains.
Haidt’s statistics and research findings are so disturbingly compelling that I often recommend his book as required reading for parents before putting a mobile internet device into the hands of their children. Despite the downward spiral of our children’s mental health, social media companies have failed to implement effective measures to restrict users under the age of 13, as required by laws like the Children’s Online Privacy Protection Act (COPPA). The excuses typically offered—that it is too difficult to verify users’ ages, or that enforcement would be imperfect—are disingenuous and collapse under scrutiny. Modern technology enables us to launch half a dozen celebrity “astronauts” into space for a 10-minute publicity stunt, yet we are supposed to believe that rigorous age verification through biometric checks, digital ID verification, and AI-based profiling is too difficult? Platforms could absolutely introduce friction at the point of entry—they simply choose not to.
“Why” is a good question. The short answer is that the economic incentives to maintain access to underage users are staggering. Pre-teens and early adolescents represent a hidden but enormous market. Studies show that up to 40% of 8- to 12-year-olds are regular users of platforms such as TikTok, Instagram, and Snapchat. This demographic not only engages heavily with content, driving up ad impressions and watch time, but also establishes early brand loyalty that pays dividends for decades. A 2022 study by the Harvard T.H. Chan School of Public Health estimated that if TikTok raised its minimum age requirement to 16 and truly enforced the restriction, it could lose between $2 billion and $3 billion annually—not to mention the long-term loss in user lifetime value.
Several whistleblower cases and investigative reports have revealed that social media executives are acutely aware of these dynamics. Frances Haugen’s 2021 leak of internal Facebook documents showed that the company knew Instagram was harming teenage girls’ mental health but prioritized growth metrics over user safety. TikTok, caught in multiple legal settlements, has been fined for collecting biometric and personal data from minors without parental consent. Snapchat, facing lawsuits alleging the facilitation of drug sales and cyberbullying, has defended features like disappearing messages despite their obvious risks to young users.
The pattern is unmistakable: when forced to choose between protecting children and maintaining growth, platforms consistently opt for the latter. The fines and regulatory actions taken against them sound substantial but in reality amount to little more than rounding errors in corporate budgets. Bestselling author and NYU professor Scott Galloway has summed up Big Tech’s lack of incentive to comply with an analogy: imagine a parking meter outside your house that costs $100 per hour, while the ticket for ignoring it is only 25 cents. Chances are, you’ll opt to break the law. YouTube’s $170 million settlement for COPPA violations in 2019, for instance, was barely a speed bump for a company with billions in annual revenue. Enforcement is too slow, too light, and too isolated to drive real change.
At the same time, arguments for allowing young teens to access social media are not without merit. I have seen firsthand how some engagement can provide vital peer connections, especially for marginalized groups or isolated students. Digital literacy is an essential 21st-century skill, and a complete ban could delay the development of critical online navigation abilities. There are also legitimate fears that strict prohibition would drive usage underground, leading to unsupervised, riskier behaviors. I believe these nuances deserve serious, continued discussion.
Yet the reality remains: today’s social media ecosystems are not built with young users’ wellbeing in mind. Even the best efforts at creating “kid-friendly” versions, like YouTube Kids or the now-paused “Instagram Kids” project, have faced criticism for poor content moderation and hidden dangers. The algorithms that prioritize engagement still do not prioritize safety. Until the fundamental profit structure changes, none of these superficial adjustments is likely to be enough. Platforms must be held financially accountable at a scale that makes negligence unprofitable. Regulatory bodies should require transparent reporting on user demographics, content exposure, and mental health impacts. Educational initiatives should prepare children for digital life, but they must complement—not substitute for—serious corporate responsibility.
At this point, the stakes could not be higher. We are witnessing the first generation to grow up fully immersed in algorithmically curated realities, where the internet age of adulthood is effectively 13. Their anxieties, their self-perceptions, and even their suicides are being shaped in part by systems designed without their welfare in mind. It is a moral failure of disturbing proportions, and one that history will not judge kindly.
In the end, the complicity of social media platforms in the mental health crisis among youth is not a side effect; it is the business model. Until that model changes, every parent, educator, policymaker, and citizen has a role to play in holding these companies accountable. Our children’s future—and perhaps our culture’s future—depends on publicly recognizing that we are not facing a technological inevitability but a series of human choices. Social media giants have the tools to make their platforms safer for children; they simply refuse to use them. Until we demand better, they will continue to pursue engagement at any cost, racing each other to the bottom while our children’s mental health deteriorates at the top of the ledger.

####

Cindi is President and Founder of The Noble Path Foundation, a 501(c)(3) located in San Clemente, CA, dedicated to helping the youth of our communities reach their highest potential via healthy nutrition and lifestyle choices, safe and fun social activities, and motivational mentoring. For sources mentioned in this article, please visit our website and search MEDIA at www.thenoblepathfoundation.org.
SOURCES: The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness, by Jonathan Haidt. Penguin Press, 2024. ISBN-13: 978-0593655030
Whistle-Blower Says Facebook ‘Chooses Profits Over Safety’: https://www.nytimes.com/2021/10/03/technology/whistle-blower-facebook-frances-haugen.html
TikTok: Influence Ops, Data Practices Threaten U.S. Security:
https://www.cisecurity.org/insights/blog/tiktok-influence-ops-data-practices-threaten-us-security
New Trends May Help TikTok Collect Your Personal, Unchangeable Biometric Identifiers:
https://www.aclu.org/news/privacy-technology/new-trends-may-help-tiktok-collect-your-personal-unchangeable-biometric-identifiers#:~:text=TikTok’s%20parent%20company%20ByteDance%2C%20for,might%20be%20applied%20or%20abused.
Snapchat is Harming Children on an Industrial Scale:
https://www.afterbabel.com/p/industrial-scale-snapchat
Google and YouTube Will Pay Record $170 Million for Alleged Violations of Children’s Privacy Law: https://www.ftc.gov/news-events/news/press-releases/2019/09/google-youtube-will-pay-record-170-million-alleged-violations-childrens-privacy-law
TikTok User Age, Gender, & Demographics (2025):
https://explodingtopics.com/blog/tiktok-demographics
TikTok Ad Revenue (2020-2027):
https://www.oberlo.com/statistics/tiktok-ad-revenue
YouTube Statistics 2025 (Demographics, Users By Country):
https://www.globalmediainsight.com/blog/youtube-users-statistics/
25 Essential Snapchat Statistics You Need To Know in 2025:
https://thesocialshepherd.com/blog/snapchat-statistics
Social Media and Youth Mental Health:
https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-advisory.pdf
Targeting Kids Generates Billions in Ad Revenue for Social Media: https://news.harvard.edu/gazette/story/2024/01/social-media-platforms-make-11b-in-ad-revenue-from-u-s-teens/