SC attorney general joins multistate coalition lawsuit against Meta

Officials claim the company’s social platforms harm children’s and teens’ mental health
South Carolina is one of several states across the country suing Meta, Facebook’s parent company.
Published: Oct. 24, 2023 at 1:19 PM EDT|Updated: Oct. 24, 2023 at 4:43 PM EDT

GREENVILLE, S.C. (FOX Carolina) - South Carolina Attorney General Alan Wilson announced that 42 attorneys general, including those of North Carolina and Georgia, have sued Meta in federal and state courts, alleging the company knowingly builds features that harm children.

According to the attorney general, the company knowingly designed and deployed harmful features on Instagram and its other social media platforms that purposefully addict children and teens.

Officials said Meta falsely assured the public that these features are safe and suitable for young users.

Attorney General Wilson said Meta’s business practices violate state consumer protection laws and the federal Children’s Online Privacy Protection Act (COPPA). These practices have harmed and continue to harm the physical and mental health of children and teens, and have fueled what the U.S. Surgeon General deemed a “youth mental health crisis” that has ended lives, devastated families, and damaged the potential of a generation of young people.

“Protecting our children is one of our most important jobs and that’s exactly what we’re trying to do with these lawsuits,” Attorney General Wilson said in a release. “We can’t stand by and do nothing while Big Tech continues to engage in behavior that knowingly harms our children and breaks the law.”

The other states joining the lawsuit include:

Arizona, California, Colorado, Connecticut, Delaware, Georgia, Hawaii, Idaho, Indiana, Kansas, Kentucky, Louisiana, Maine, Maryland, Michigan, Minnesota, Missouri, Nebraska, New Jersey, New York, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Dakota, Virginia, Washington, West Virginia and Wisconsin

Florida is filing its own federal lawsuit in the U.S. District Court for the Middle District of Florida.

Lawsuits are also being filed in their own courts by the District of Columbia, Idaho, Massachusetts, Mississippi, New Hampshire, Oklahoma, Tennessee, Utah and Vermont.

FOX Carolina reached out to Meta, which provided the following statement:

“We share the attorneys general’s commitment to providing teens with safe, positive experiences online, and have already introduced over 30 tools to support teens and their families. We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”

Meta Spokesperson

Meta also provided additional details on safety steps the company has taken since October 2020, including implementing age verification technology and sensitive content controls and banning content that promotes suicide, self-harm or eating disorders. Those details are provided below:

  • We use age verification technology to help teens have experiences that are appropriate for their age, including limiting the types of content they see and who can see and interact with them.
  • We automatically set teens’ accounts (under 16) to private when they join Instagram. We also don’t allow people who teens don’t follow to tag or mention them, or to include their content in Reels Remixes or Guides. These are some of the best ways to help keep young people from hearing from adults they don’t know, or that they don’t want to hear from.
  • We’ve developed technology to help prevent suspicious adults from engaging with teens. We work to avoid showing young people’s accounts in Explore, Reels or Accounts Suggested For You to these adults. If they find young people’s accounts by searching for their usernames, they won’t see an option to follow them. They also won’t be able to see comments from young people on other people’s posts, nor will they be able to leave comments on young people’s posts.
  • We limit the types of content teens can see in Explore, Search and Reels with our Sensitive Content Control. The control has only two options for teens: Standard and Less. New teens on Instagram who are under 16 years old are automatically placed into the Less state. For teens who are already on Instagram, we send prompts encouraging them to select the Less experience.
  • We don’t allow content that promotes suicide, self-harm or eating disorders. Of that content we take action on, we identify over 99% before it is reported to us.
  • We show expert-backed, in-app resources when someone searches for, or posts, content related to suicide, self-harm, eating disorders or body image issues. They see a pop-up with tips and an easy way to connect to organizations like NEDA in the US. We also have a dedicated reporting option for eating disorder content.