In January, Seattle Public Schools sued social media companies, including TikTok and Meta, alleging that they’d helped cause the youth mental health crisis. Florida and New Jersey followed suit in February. In March, California’s San Mateo County Board of Education and Pennsylvania’s Bucks County also hopped on the Big Tech lawsuit bandwagon.
“For too long these companies have exploited developing minds without consequence, exchanging our children’s mental wellbeing for billions of dollars in ad revenue,” said Bucks County Commissioner Chair Bob Harvie in a statement. “The negative effects these platforms have are real, they are serious, they are quantifiable, and they cannot be allowed to continue.”
We are most definitely in the midst of a youth mental health crisis: Emergency-room visits for suicide attempts among people ages 12 to 25 increased 31% in 2020 compared with 2019. However, exactly how much of that can be laid at the feet of Big Tech is unclear. In arguing that social media platforms are causing or exacerbating the crisis, school districts bear the legal burden of proving that link in court. While research shows that young adults are spending more time on social media, whether this increased use is related to rising rates of depression and anxiety remains open for debate.
Will our current understanding of the issues at play be enough for schools to win actual court cases? We're in a moment of "techlash," says Thomas Kadri, an assistant law professor at the University of Georgia School of Law. Historically, he points out, major tech companies have enjoyed a lot of leeway under federal law: because they are not considered publishers of most content that people post on their platforms, they are immune from legal liability for that content.
But this doesn’t apply if a platform was responsible for creating or developing illegal content. For example, Kadri points out, the website Roommates.com was sued for allegedly violating anti-discrimination laws by helping people find roommates based on information about their sex, sexual orientation, and whether they would bring children.
Recently, Kadri says, courts have started to rule against tech companies, indicating that we might be in a moment of change. A couple of years ago, a court allowed a case to move forward in which parents sued Snap Inc., the maker of Snapchat, for offering a "speeding" filter that users mistakenly thought would reward them for driving over 100 mph.
“Courts have long been used as venues to engage in public debate about harms and wrongs,” Kadri says.
‘A function for social change’
Karin Swope, an attorney representing San Mateo's Board of Education, sees the lawsuit as a form of accountability. Typically, legal accountability comes either through legislation—changing the existing laws—or through the courtroom, where a legal victory forces the defendant to make changes. She couldn't comment specifically on the case but said that, in general, cases like this are handled on contingency, meaning the lawyers receive a percentage of whatever their clients recover through a verdict or settlement.
"The law can be a function for social change," she says. "One of the most important issues we're facing today is the regulation of Big Tech and how they impact our youth's mental health."
Tech companies say they invest heavily in the wellbeing of users, especially the younger ones. Both Snap and Google, the only tech companies that responded to our request for comment, said in emails that they are trying to protect young audiences.
Snap noted that "nothing is more important to us than the wellbeing of our community." It said it supports youth mental health in a number of ways: using human moderators to reduce the spread of harmful content, creating an in-app support system for users experiencing a mental health crisis, and offering a tool that lets parents see who their teens are talking to without revealing the content of the messages.
A spokesperson for Google wrote, “We have invested heavily in creating safe experiences for children across our platforms and have introduced strong protections and dedicated features to prioritize their wellbeing. For example, through Family Link, we provide parents with the ability to set reminders, limit screen time, and block specific types of content on supervised devices.”
Regardless of who wins or loses the cases, it's clear that children and young adults continue to suffer as America's mental health crisis deepens. A court battle may take years to resolve: Swope says a timeline of two to five years is typical.
In the meantime, healthcare advocates say young people need more immediate solutions. Aja Chavez, executive director of Adolescent Healthcare at Mission Prep, a teen mental health treatment center, says she’s seen the number of young adults with mental health issues caused or exacerbated by tech dramatically increase over the past decade. “Almost every single client we see is struggling with an inability to set boundaries,” Chavez says. “They are constantly drawn to their devices without realizing the impacts they can have.”
She says a little education could go a long way. For instance, classes could teach people about the risks of tech addiction to the adolescent brain and could encourage more discussions about how to effectively limit use and set boundaries, whether that’s only following content that makes you feel good or limiting the amount of time spent on devices.
“Social media is here to stay,” she says. “We need to teach young adults how to have a sustainable relationship with it.”