Is the Jig Finally Up for Social Media Giants?

By Rafael Hoffman

Meta founder and CEO Mark Zuckerberg (left) and TikTok CEO Shou Zi Chew (right).

There are not many subjects that still bring consensus across America’s political spectrum. Yet one unifying enemy has emerged: social media platforms and the tech giants that own them.

Now, after more than a decade in which these companies escaped public scrutiny, the harm they knowingly cause has attracted increased attention. With that has come a multifront attempt to rein in the damage Silicon Valley’s creations have inflicted on America.

Currently most in focus are the ill effects these forums have on children and teenagers, who spend an inordinate amount of time on these applications.

Amid that realization, big tech is facing more challenges than ever before. Several states are in the process of establishing controls like age limits for social media use and data gathering. The Senate seems on the cusp of passing its first significant bill to protect young online users.

At the same time, social media companies face antitrust regulatory pressure from the Federal Trade Commission (FTC) and a pending Supreme Court case over liability for content regulation.

Companies like Meta and TikTok are the objects of lawsuits from a long list of states, school districts, and cities alleging they bear financial responsibility for widespread mental health problems, addiction, eating disorders, and other forms of self-harm. The latest to join these suits is New York City, whose Mayor Eric Adams said the suit’s goal was to be part of “a larger reckoning.”

Citing a list of disturbing mental health consequences linked to teen social media use, the Mayor said that “this is not a reality we can accept or normalize.”

While pressures continue to pile up, so far none has had much impact, leaving many wondering whether big tech will ever have to pay the piper.

New York City Mayor Eric Adams announces the filing of a lawsuit to hold five social media platforms — TikTok, Instagram, Facebook, Snapchat, and YouTube — accountable for fueling the nationwide youth mental health crisis, at City Hall on February 14. (Michael Appleton/Mayoral Photography Office)

The harms linked to social media have become increasingly apparent. Studies have linked the platforms to rising political polarization and terror threats. The companies’ left-wing political bias has played a significant role in promoting narratives favoring Democrats and in censoring stories and information promoted by conservatives.

Now, governments have set their sights on social media’s effects on young people. There is good reason for concern: well over 50% of Americans ages 13-17 use social media platforms, with multifaceted harms to show for it. Companies are effectively barred by law from allowing users under 13, but absent any enforceable age verification system, millions of younger children use the forums as well.

A major accusation against social media companies is that, using the data they collect about users’ interests, their algorithms feed users content calculated to maximize screen time. The inordinate amount of time this leaves many teens attached to screens is widely blamed for impeding socialization, learning, and other aspects of healthy development.

More concerning is that much of the material used to keep teens on forums promotes eating disorders, fosters low self-esteem, and encourages dangerous risk-taking and other forms of self-harm, up to and including suicide. An equally harmful feature of several forums is the “endless scroll,” which feeds users a never-ending stream of content.

“Companies know that they’re peddling harmful content to kids and have done very little to mitigate that,” said Anne Tutor, an analyst at the Heritage Foundation’s Tech Policy Center. “They are designed to get people to spend more time on the app, because the longer you’re there, the more money they make.”

A Gallup survey showed that 51% of teens spend at least four hours daily on social media apps. TikTok’s internal data shows its teen users spending 2–4 hours a day on the platform, checking it an average of 17 times per day.

Among the myriad ills is the forums’ effect on teen sleep. TikTok’s data also shows that 20% of its teen users are active on the platform between 12 a.m. and 5 a.m. Twenty-nine percent of teen girls who use Instagram and 31% who use Snapchat said using the forums regularly disrupts their sleep.

Many researchers feel social media use is closely linked to teen mental health trends, which have worsened considerably since the COVID pandemic, when screen use skyrocketed. New York City’s suit notes that, nationwide, emergency room visits for anxiety disorders are up 117% and youth suicide rates are up 57%.

Social media companies continue to promote themselves as vehicles for greater connectivity, allowing families and friends to communicate more easily. Yet studies show their platforms have become dominated by advertisers, promotional campaigns, and other entities vying for users’ attention, many through inappropriate and damaging content. The millions of teens using these forums are posting less of their own material and spending more of their online time consuming content, much of it taking them down roads of negativity.

“It’s become dominated by businesses,” said Haley Hinkle, policy counsel at Fairplay, a group that advocates for more regulation of social media platforms. “The reality has become that most content is not promoted by people you know and chose to follow, but by advertisers out to make money.”

While some shrewd observers detected the deleterious effects of social media early in its proliferation, popular forums mostly gained millions of users and financial dominance without pushback. At points, rising extremism, some mass shootings, and the like were blamed on these platforms, yet little attention was paid to their destructive influence on many if not most users.

In recent years, however, several events pulled back the protective veil, revealing far more of Silicon Valley’s ugliness to the American public.

The first of these was “The Facebook Files,” a series of articles published by The Wall Street Journal in fall 2021, which revealed internal data showing the company was well aware its products were being used by terrorist groups, were sowing social divisions, and were being used to plan criminal activity. The company also had clear data showing the harmful effects its forums, especially Instagram, had on the mental health of young people. The Journal’s source eventually revealed herself as former Facebook data analyst Frances Haugen. In media interviews and Congressional testimony, she made clear that, despite holding reams of information detailing the harm its products cause, Facebook, now rebranded as Meta, was doing very little about it.

Since then, several Congressional hearings have focused on different aspects of the dangers posed by social media. Over the past two years, TikTok, owned by a Chinese company with close ties to that nation’s ruling Communist Party, came to be viewed by many not only as a conscious attempt to erode young minds but also as a potential national security threat.

Many states banned their employees from using the app, citing the risk of user data ending up in the hands of the Chinese government. Eventually, the federal government also prohibited its use on official devices, with the FCC calling it “a sophisticated surveillance tool.” Recently, the Biden campaign opened a TikTok account, prompting a letter from 18 Republican lawmakers asking the campaign to suspend its use, citing “well-established national security risks.”

“The CCP does have access to TikTok’s data,” said Mrs. Tutor. “It can be a manipulative tool for influence and espionage from a foreign adversary. The national security concerns have been laid out for several years, so it’s a little alarming that the Biden campaign is ignoring that.” 

Last year, former Facebook engineering director Arturo Bejar told a Senate Committee that the company’s data showed a fifth of 13- to 15-year-olds were bullied on the site, one of several harms he said the platform facilitated without much effort at mitigation.

This year, the Senate Judiciary Committee summoned the CEOs of several leading social media companies to a hearing on child safety. Senators’ hostility to the companies was punctuated by the presence of parents whose children had died, by suicide or otherwise, or suffered other severe consequences as a result of social media use. At one point, after prodding from Senator Josh Hawley, Meta chief Mark Zuckerberg offered an apology to the parents.

“This is a very powerful industry, but I think what we saw in the hearing is that Washington has run out of good will to trust them to address these problems,” said Mrs. Hinkle.

As in prior hearings, social media bosses touted safety features their companies have in place and committed to step up efforts. Yet after years of similar promises producing few results, few in that room were convinced.

“After years of working on this issue with you and others, I’ve come to conclude the following: Social media companies as they’re currently designed and operate are dangerous products,” said the committee’s ranking Republican, Senator Lindsey Graham.

“Companies have demonstrated time and again, they can’t regulate themselves,” said Mrs. Hinkle. “They’ve had their chance and as much as they’ve made some small tweaks, it’s not going to be enough.”

Sen. Ted Cruz (R-Texas) reads from a poster as he questions TikTok CEO Shou Zi Chew during a Senate Judiciary Committee hearing to discuss child safety on Capitol Hill, Jan. 31. (AP Photo/Susan Walsh)

No shortage of bills aimed at reining in big tech has been introduced in Congress. None became law, and few even made their way to floor votes.

The first that might be poised to break this impasse is the Kids Online Safety Act (KOSA). The bill’s main provisions require social media companies to give young users options to protect their data and to opt out of algorithm-driven content suggestions. It also saddles companies with a responsibility “to prevent and mitigate harms to minors,” such as content encouraging suicide, eating disorders, and other dangerous activities, and requires companies to reveal how their algorithms work.

The bill was introduced in 2022 by Connecticut Democratic Senator Richard Blumenthal and Tennessee Republican Senator Marsha Blackburn, but failed to rally sufficient support in Congress amid opposition from progressive groups, who argued it would give state attorneys general too much power to remove content promoting left-wing social agendas. The recently introduced version shifts most oversight responsibility to the FTC. With that change, many organizations dropped their objections, and the bill garnered the support of more than 60 senators, including Majority Leader Charles Schumer. Amid the pressure and the poor public image created by the Senate’s recent hearing, even some tech companies have endorsed the measure, which would give parents tools to opt out of features like the infinite scroll. President Joseph Biden has said he would sign the bill into law.

While most agree that even with KOSA, social media will still pose serious dangers to users, the bill would be an important first move toward regulating the industry.

“KOSA would be a big step in starting to put some guardrails and give parents more control over how teens use these apps,” said Mrs. Tutor. “The hurdle right now is getting anything passed, so if KOSA can become law, hopefully, it would pave the way for future legislation.”

Even with wide bipartisan Senate support, KOSA’s path is far from certain. The House of Representatives has several tech bills of its own focused on data protection for users of all ages. There have been no public signals as to the measure’s prospects there, or whether Speaker Mike Johnson would bring it to a vote if it passes the Senate.

With KOSA still many steps from becoming law, those hoping for it to pass and bring more tech regulation are reminded of the myriad challenges other such attempts faced.

Most place the blame at the feet of Silicon Valley’s lobbying efforts, a multimillion-dollar operation supported by several Washington firms.

In 2022, OpenSecrets revealed that big tech lobbyists blocked two major regulatory bills from getting Senate votes with a $277 million lobbying campaign against them; groups supporting the bills spent just $45 million. The same tech companies also contributed more than $2.3 million to Congressional campaigns that year.

“We know the industry does not want potential liability; its current model is very profitable,” said Mrs. Hinkle.

These campaigns included public efforts to promote a narrative that regulation would violate free speech and endanger groups portrayed in media as “disadvantaged.” Many of those same messages are still being promoted by those hoping to block KOSA.

“Big tech has a massive lobbying arm, which uses creative tactics,” said Mrs. Tutor. “They claim this will harm privacy or that they will need even more user data to comply. They find ways to convince the public not to regulate, but I would caution a lot of skepticism. They are not altruistic.”

Big Tech’s hold on Congress runs in several directions. Members of both parties take financial contributions from Silicon Valley companies. Many California Democrats, including former Speaker Nancy Pelosi, have deep connections to the industry. Mr. Schumer has one daughter who works as a registered lobbyist for Amazon and another who works as a marketing manager at Meta.

“There’s a libertarian faction on the right grappling with their distaste for regulations. On the left, there’s some sympathy, since these companies lean left and they tend to benefit more from them directly. That’s created a good deal of hesitancy to act,” said Mrs. Tutor.

With Washington paralyzed in addressing social media threats, a growing number of states have attempted to fill the gap.

Bills to protect children’s data and force companies to minimize the risk of young people’s exposure to harmful content have passed in Connecticut and are being considered in Vermont, Illinois, New Mexico, and Maryland.

Some conservative-leaning states are taking a harder line. Texas and several others implemented age verification systems to protect young people from inappropriate material, which is often linked through social media advertising. Utah and Arkansas passed laws requiring parental consent for minors to access social media and Florida’s legislature is in the process of passing a similar law.

Increasingly, efforts to tame social media companies have made their way to the courthouse. Tech giants are currently awaiting a Supreme Court ruling in a case that would set a more stringent standard of responsibility for posted content; it pits the family of a woman killed by ISIS against Google, with the family claiming the terror group was allowed to use the company’s video platform to recruit. Another pending case pits the FTC and 17 states against Amazon, alleging monopolistic control of the industry.

This year, a long list of states, municipalities, and school districts sued social media giants to hold them responsible for an acute mental health crisis.

More than 40 state attorneys general are suing Meta and some other companies, accusing them of knowingly perpetuating an epidemic of youth mental health problems. Some of the suits also accuse companies of failing to live up to their legal obligation to protect the data of users under the age of 13. The suits cite data showing a sharp spike in teen depression and suicide, presenting reams of studies linking the disturbing trends to the ways social media companies have designed their platforms to ensnare young users.

“When you see so many AGs agreeing about the scope of this problem, that forces Meta to come to the table,” said Mrs. Hinkle. “This could be impactful in showing these platforms that they can’t get away with whatever they want.”

Plaintiffs argue that Silicon Valley bears financial responsibility to help schools and governments cover the millions spent treating these mental health issues.

These suits bear similarities to past actions against tobacco companies, which were held responsible for knowingly designing a dangerous, addictive product and marketing it to young people.

This month, New York City filed its own suit against Meta, Snapchat, TikTok, and YouTube, saying their platforms are designed to “exploit children and adolescents.”

“Meta has every incentive to — and knowingly does — addict users to Facebook and Instagram. It accomplishes this through the algorithms that power its apps, which are designed to induce compulsive and continuous scrolling for hours on end,” reads the suit.

New York’s action makes it the first major city to sue big tech over the harm it allegedly causes to young people.

“Just as the surgeon general did with tobacco and guns, we are treating social media like other public health hazards,” said Mayor Adams. “Not only do we aim to hold TikTok, Meta, Snapchat and YouTube accountable for their role in creating the youth mental health crisis in New York City by purposely manipulating and addicting children and teens to social media applications, but we seek to hold them financially responsible for what they cost our city year after year. … We spend over $100 million on youth mental health programs each year alone, even as these corporations reap billions of dollars of profit at the cost of young people’s emotional, mental and physical health.”
