As much as the advent of social media has opened up new avenues for discourse, interaction, connection and commerce, there is a flip side to its takeover of popular culture. Left unchecked, social media platforms are costing us economically, socially and, most alarmingly, in the mental health of young people.
Given this, it comes as no surprise that people are launching locally-led and individual efforts to hold social media platforms accountable in the absence of any strong U.S. federal policy or oversight.
January saw two notable events that are indicative of the frustration parents and local leaders are feeling about the state of social media in the United States. First, New Jersey became the first state to require public schools to teach media literacy at every grade level, K-12. The bill’s legislative sponsor said the law is needed in order to give kids the tools to “weigh the flood of news, opinion, and social media they are exposed to” every day. Second, Seattle Public Schools filed a lawsuit against five social media companies, accusing them of fueling a mental health crisis for children. The district wants the companies to pay for the costs of dealing with children who are in distress or distracted from their schoolwork.
Both of these events signal that parents and local leaders are done waiting for the federal government to rein in Silicon Valley. Rather, in the absence of federal action, concerned parents, educators and local leaders are seeking their own remedies to a public health crisis that is disproportionately affecting children.
These aren’t the only instances of local efforts to take on Big Tech. In December, 60 Minutes aired a segment, “Suing Social Media,” that profiled families who say social media algorithms put their kids in danger. Some “1,200 families are pursuing lawsuits against social media companies including TikTok, Snapchat, YouTube, Roblox and Meta – the parent company of Facebook and Instagram,” reported the news program. More than 150 of those lawsuits are poised to move forward this year, with parents speaking out about kids “addicted to social media” and how it has become a problem “much bigger than us.”
The newly formed Social Media Victims Law Center is leading a federal case against Meta and other social media companies – another example of how citizens are using the courts to address what Congress has been unable, or unwilling, to take on. This year the center is starting the discovery process against Meta and others. The work is part of a multimillion-dollar lawsuit aimed at changing policy by arguing that social media is a health risk, with algorithms designed to make it addictive and platform products that are “designed to evade parental authority.”
This is something Reboot has long been researching and writing about. Practically speaking, the problem with social media platforms goes far deeper than the content on them. The overall design of these platforms makes them hotbeds for misinformation, bias and fruitless bickering. To increase revenue, algorithms are weighted toward stories that get the most engagement. Often, those stories are inflammatory, misleading or outright false. Social media companies may show well-intentioned concern, but they have proven that they will stop short of reforms that threaten their business models.
It is why parents and others are increasingly questioning who is holding social media companies accountable.
This is not the case in Europe, where a new EU law (the Digital Services Act) has tightened content moderation requirements on major social media platforms, requiring them to develop concrete plans to remove illegal content, to publicize details about their algorithms, and to give users the option to opt out of the recommendation systems entirely.
Furthermore, the U.K. is toughening its new social media law to threaten CEOs with jail time for violations. The law is aimed at better protecting children and others from viewing certain online content. Such moves are placing additional pressure on social media companies to be more responsible about the content they are allowing, or even pushing out, on their platforms.
No such action is being mirrored in the United States. Just the opposite, in fact.
Last year, after conservatives railed against it, the Department of Homeland Security officially disbanded its short-lived “Disinformation Governance Board” – an interagency team that was to coordinate government activities related to fighting disinformation. And although Republicans and Democrats under President Donald Trump agreed that Sec. 230 of the Communications Decency Act of 1996 should be reformed, the two sides never came close to taking action.
Now, it appears Sec. 230 might get reformed after all – by the courts. An upcoming U.S. Supreme Court case seeks a ruling on whether platforms can be held liable for damages related to algorithmically generated content recommendations. Google has filed a brief in the case saying such a ruling could “upend the Internet.”
The United States’ current piecemeal approach to social media is clearly not as effective as it could or should be – creating a void that parents, educators, and lawyers are trying to fill. It’s past time for federal officials to join them in working to craft policies that both protect the use of social media and better manage its potential for harm. The work and collaboration must be nuanced; it will certainly be contested and controversial, but it can no longer go undone.
The toll on the nation’s youth and their families – mentally, emotionally, and economically as more resources must be thrown at countering social media’s negative impact – cannot be back-burnered any longer.