Bumble open sources nude-detecting tech: Here’s how that could thwart cyber flashing
To combat the sending of unsolicited nudes online, known as cyber flashing, and to make the internet a safer place for everyone, the popular women-first dating app Bumble is open-sourcing its AI tool, Private Detector.
The tool works by automatically blurring a potentially lewd image shared within a chat on Bumble. The recipient is notified and can then decide whether to view or block the image.
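That flow maps to a straightforward pattern: run each incoming image through the classifier, blur anything the model flags, and let the recipient reveal it on request. The sketch below is a hypothetical illustration of that pattern in Python; the classifier call, threshold, and function names are placeholders for the example, not Bumble's actual code.

```python
# Hypothetical sketch of a blur-on-detect flow in a chat client.
# The classifier call is a stand-in; the real model and its inputs
# come from the open-sourced Private Detector repository.

from dataclasses import dataclass
from PIL import Image, ImageFilter

BLUR_THRESHOLD = 0.8  # assumed probability cut-off, not from the white paper


@dataclass
class ModeratedImage:
    image: Image.Image
    is_blurred: bool


def predict_lewd_probability(image: Image.Image) -> float:
    """Placeholder for the actual model inference call."""
    raise NotImplementedError("Plug in the open-sourced detector model here")


def moderate_incoming_image(path: str) -> ModeratedImage:
    """Blur the image if the classifier flags it; the recipient can still
    choose to reveal the original later."""
    image = Image.open(path).convert("RGB")
    score = predict_lewd_probability(image)
    if score >= BLUR_THRESHOLD:
        blurred = image.filter(ImageFilter.GaussianBlur(radius=25))
        return ModeratedImage(image=blurred, is_blurred=True)
    return ModeratedImage(image=image, is_blurred=False)
```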
"Bumble's Data Science team has written a white paper explaining the technology of Private Detector and has made an open-source version of it available on GitHub," the company said in a blogpost.
"It is our hope that the feature will be adopted by the wider tech community as we work in tandem to make the internet a safer place," it added.
This version of Private Detector is released under the Apache License, making it available for anyone to adopt as a standard for blurring lewd images, either as-is or after fine-tuning it with additional training samples.
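As an illustration of what fine-tuning with additional training samples could look like in practice, the sketch below trains a binary lewd/not-lewd classification head on top of a pre-trained image backbone. The backbone choice, input size, directory layout, and hyperparameters are assumptions for the example, not details taken from Bumble's white paper or repository.

```python
# Hypothetical fine-tuning sketch: adapt a pre-trained image backbone
# to a lewd/not-lewd classification task using extra labelled samples.

import tensorflow as tf

IMAGE_SIZE = (480, 480)  # assumed input resolution

# Additional samples organised as data/train/<class_name>/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train",
    image_size=IMAGE_SIZE,
    batch_size=32,
    label_mode="binary",
)

# Assumed backbone; the actual architecture is documented in Bumble's repo.
base = tf.keras.applications.EfficientNetV2S(
    include_top=False, weights="imagenet", pooling="avg"
)
base.trainable = False  # freeze the backbone and train only the new head

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # lewd vs. not-lewd
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-4),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
model.fit(train_ds, epochs=5)
model.save("private_detector_finetuned.keras")
```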
To help address the wider issue of cyber flashing, Bumble said it teamed up with legislators from across the aisle in Texas in 2019 to pass a bill that made sending unsolicited lewd photos a punishable offence.
Since the passage of HB 2789 in Texas in 2019, Bumble has continued to advocate successfully for similar laws across the US and around the globe.
In 2022, Bumble reached another milestone in public policy by helping to pass SB 493 in Virginia and, most recently, SB 53 in California, adding another layer of online safety in one of the most populous states in the US.
(Except for the headline and cover image, the rest of this IANS article is unedited)