2018-09-04 08:15
Google’s latest attempt to battle the spread of child sexual abuse material (CSAM) online comes in the form of an AI that can quickly identify images that haven’t been previously catalogued. It’s part of the company’s Content Safety API, which is available to NGOs and other bodies working on this issue.
By automating the process of sifting through images, the AI not only speeds up the work but also reduces the number of people who have to be exposed to the material, a job that can take a serious psychological toll. Given that the UK’s Internet Watch Foundation (IWF) found nearly 80,000…