Elektrine
In reply to an earlier post by @marx@piefed.social on piefed.social:
The plaintiffs’ brief alleges that Meta was aware that its platforms were endangering young users, including by exacerbating adolescents’ mental health issues. According to the plaintiffs, Meta frequently detected content related to eating disorders, child sexual abuse, and suicide but refused to remove it. For example, one 2021 internal company survey found that more than 8 percent of respondents aged 13 to 15 had seen someone harm themself or threaten to harm themself on Instagram during the past week. The brief also makes clear that Meta fully understood the addictive nature of its products, with plaintiffs citing a message by one user-experience researcher at the company that Instagram “is a drug” and, “We’re basically pushers.” [emphasis mine] Perhaps most relevant to state child endangerment laws, the plaintiffs have alleged that Meta knew that millions of adults were using its platforms to inappropriately contact minors. According to their filing, an internal company audit found that Instagram had recommended 1.4 million potentially inappropriate adults to teenagers in a single day in 2022. The brief also details how Instagram’s policy was to not take action against sexual solicitation until a user had been caught engaging in the “trafficking of humans for sex” a whopping 17 times. As Instagram’s former head of safety and well-being, Vaishnavi Jayakumar, reportedly testified, “You could incur 16 violations for prostitution and sexual solicitation, and upon the seventeenth violation, your account would be suspended.”
lmmarsano in !technology
@lmmarsano@lemmynsfw.com · Dec 07
“Instagram had recommended 1.4 million potentially inappropriate adults to teenagers in a single day in 2022”

What does that even mean? That all still seems like catastrophizing over videos, images, and text on a screen that can’t compel action or cause credible harm. I expect that lawsuit to go nowhere.

About Community

Technology (!technology@lemmy.world)

This is a most excellent place for technology news and articles.


Our Rules
  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; this includes bots posting AI responses and summaries. To ask whether your bot can be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Posts from accounts 7 days old or younger will be automatically removed.

Approved Bots
  • @L4s@lemmy.world
  • @autotldr@lemmings.world
  • @PipedLinkBot@feddit.rocks
  • @wikibot@lemmy.world
Members: 83,897
Posts: 18,814
Created: June 11, 2023
