Experts say case highlights well-known dangers of automated detection of child sexual abuse images

Google has refused to reinstate a man's account after it wrongly flagged medical images he took of his son's groin as child sexual abuse material (CSAM), the New York Times first reported. Experts say it's an inevitable pitfall of trying to apply a technological solution to a societal problem.

Experts have long warned about the limitations of automated child sexual abuse image detection systems, particularly as companies face regulatory and public pressure to help address the existence of sex ...
Continue reading the above news article by clicking here.
Administration and Moderation NOTE: A reminder that the above news is provided for free through an RSS feed. The quote above does NOT include the full story. To continue reading past what is quoted, please click through to the article via the "continue reading" link provided above.
Please be sensible and follow our terms of service and rules when replying to and debating the news article.