!technology @Devial
In reply to 5 earlier posts
@themachinestops@lemmy.dbzer0.com on lemmy.dbzer0.com
@Devial@discuss.online on discuss.online
The article headline is wildly misleading, bordering on a straight-up lie. Google didn’t ban the developer for reporting the material; they didn’t even know he reported it, because he did so anonymously, and to a child protection org, not Google. Google’s automatic tools correctly flagged the CSAM when he unzipped the data and subsequently nuked his account. Google’s only failure here was not unbanning him on his first or second appeal. And whilst that is absolutely a big failure on Google’s part, I find it very understandable that the appeals team generally speaking won’t accept “I didn’t know the folder I uploaded contained CSAM” as a valid ban appeal reason. It’s also kind of insane how this article somehow makes a bigger deal out of this developer being temporarily banned by Google than it does of the fact that hundreds of CSAM images were freely available online and openly shareable by anyone, and to anyone, for god knows how long.
@cupcakezealot@piefed.blahaj.zone on piefed.blahaj.zone
so they got mad because he reported it to an agency that actually fights csam instead of them so they can sweep it under the rug?
@Devial@discuss.online on discuss.online
They didn’t get mad. Did you even read my comment?
@cupcakezealot@piefed.blahaj.zone on piefed.blahaj.zone
they obviously did if they banned him for it; and if they’re training on csam and refuse to do anything about it then yeah they have a connection to it.
Devial in !technology
@Devial@discuss.online · Dec 12
Also, the data set wasn’t hosted, created, or explicitly used by Google in any way. It was a common data set used in academic papers on training nudity detectors. Did you seriously just read the headline, guess what happened, and are now arguing, based on that guess, that I, who actually read the article, am wrong about its content? Because that’s sure what it feels like reading your comments…

About Community

Technology (!technology@lemmy.world)

This is a most excellent place for technology news and articles.


Our Rules
  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below; this includes bots using AI responses and summaries. To ask for your bot to be added, please contact a mod.
  9. Check for duplicates before posting; duplicates may be removed.
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots
  • @L4s@lemmy.world
  • @autotldr@lemmings.world
  • @PipedLinkBot@feddit.rocks
  • @wikibot@lemmy.world
Members: 83897
Posts: 18814
Created: June 11, 2023
