“We’re making body image issues worse for 1 in 3 teenage girls.”
“Teens Blame Instagram for Rising Rates of Anxiety and Depression.”
These findings from the company’s internal research were among many alarming revelations made public last year when former Facebook product manager Frances Haugen blew the whistle on the social media giant.
She revealed that the company knew its Instagram platform was harmful to the mental health of a large proportion of young users, who said they felt addicted to scrolling through images on the app even though it made them feel bad about themselves and, in some cases, contributed to eating disorders and self-harm. Yet the company pursued teen engagement as a key to boosting its profits.
“Left alone, Facebook will continue to make choices that run counter to the common good, our common good,” Haugen said in congressional testimony in October, imploring the government to regulate social media as it did with other consumer products that pose risks to the public, such as cars and cigarettes.
The dangers lie in the algorithms and other features designed to draw people in and keep them on a platform by showing content the user is likely to react to, even if it is more extreme or dangerous than what they first sought, or simply not appropriate for their age. Haugen told Congress that Facebook’s own research showed a search for healthy recipes could steer a user toward posts promoting anorexia. A Wall Street Journal investigation found that TikTok’s algorithms were serving videos of sexual violence and drug use to accounts belonging to 13-year-olds.
Elected officials, lawyers and child protection advocates are asking how to make the internet safer for children and teens while respecting 1st Amendment rights and without impeding the benefits of technological innovation. California lawmakers have an opportunity to keep this debate alive with two bills that will be voted on in the Assembly this week.
Assembly Bill 2408 would hold social media companies liable for harm to children who become addicted to their platforms, essentially pressing companies to remove addictive features from minors’ accounts and giving parents new rights to sue if they don’t. Although the legislation does not list specific features, these could include autoplay features that serve up a continuous stream of videos, notifications that pop up around the clock, algorithms that deliver engaging but harmful content, or endless-scroll designs intended to keep users on the site.
This bill is controversial because it creates a new avenue of litigation. Social media companies lobbying against the bill say it is so onerous they would end up kicking minors off their platforms rather than risk being sued. They also argue that the bill could be deemed unconstitutional for restricting how platforms present information.
Assembly Bill 2273 is more focused on data privacy. It would require online services used by children to be designed in an “age-appropriate” manner — for example, by prohibiting location tracking and defaulting social media accounts to the most private settings — but it does not target addictive features or include an enforcement mechanism. It would create a task force within the new state data privacy agency to work out many details, such as how platforms should verify the age of users and how to present privacy information in child-friendly terms.
If the bills are approved by the Assembly this week, they will move on to the Senate, where they can continue to be debated and refined throughout the summer. That leaves time to settle the important issues raised by the legislation, provided Assembly members act this week to keep the debate alive.
Of course, regulating global platforms one state at a time is far from ideal. We would much rather see Congress take action to make the Internet safer for all American children, which it can do by passing the Kids Online Safety Act. This bipartisan legislation would require social media platforms to create tools for parents to tweak algorithms and eliminate features, such as autoplay, that prolong online time. And it establishes an important obligation for social media companies to act in the best interests of minors by preventing the promotion of self-harm, suicide, eating disorders, substance abuse and sexual exploitation.
But California should not wait for Washington to act. As the birthplace of Silicon Valley, the state that brought life-changing technologies to the world has an obligation to help address their pitfalls. There is too much at stake to allow Congress to drag its feet.
Serious mental health issues among high school students skyrocketed in the same decade that teens’ use of cellphones and social media became pervasive. From 2009 to 2019, the proportion of high school students reporting “persistent feelings of sadness or hopelessness” rose 40%, to more than 1 in 3 students, according to the U.S. Surgeon General’s advisory last year on the national youth mental health crisis. During the same decade, the share of high school students contemplating suicide jumped 36%, to about 1 in 5 students.
The effect of technology “almost certainly varies from person to person,” the surgeon general’s report says, citing research that shows both negative and positive consequences of teens’ use of social media. While some research shows that time spent online leads to depression and anxiety, other research shows that it helps people form meaningful connections with friends and family.
“Even if technology does not harm young people on average, some types of online activity are likely to harm some young people,” the report concludes. In particular, passive social media activities — such as scrolling through posts and watching auto-playing videos — are more strongly linked to lower well-being than active uses such as commenting on posts or recording videos.
More recent research has shown that social media platforms, including Instagram and Snapchat, have made it easy for teens to find and buy deadly drugs, such as fentanyl-containing pills.
Meta, the renamed company that owns Facebook and Instagram, says it has developed new tools to help parents supervise their kids on its platforms, such as seeing how long they spend on those sites, and has defaulted new teen accounts to more private settings. That’s good. But it shouldn’t stop lawmakers from pushing companies to go further.
California should keep up the pressure to make it clear to Congress and the tech industry that American families want stronger protections for children online. If the nation won’t act to limit dangerous social media features, states must.