Cyberflashing is the act of sending someone non-consensual nude photos via their phone. It is an act of digital sexual violence that is often minimised through terms like "unsolicited dick pics". Dating app Bumble is trying to combat the violation on its own app, while also campaigning for the act to be made illegal in the UK and U.S.
In 2019, Bumble launched its artificial intelligence tool 'Private Detector', which alerts users when they've been sent an obscene photo and automatically blurs the image.
Now, the dating app is making a version of the tool available to the wider tech community.
So, how does the tool work? AI detects when a lewd image has been sent; the photo is then automatically blurred, and the recipient can choose to view, delete, or report it. The tool launched on both Bumble and Badoo in response to the rise of cyberflashing.
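In rough terms, that detect-then-blur flow looks something like the sketch below. This is a minimal illustration, not Bumble's implementation: the classifier is stubbed out, and the threshold, function names, and blur radius are all hypothetical.

```python
from PIL import Image, ImageFilter

BLUR_THRESHOLD = 0.8  # illustrative cutoff; Bumble's real threshold isn't public

def lewd_probability(image: Image.Image) -> float:
    # Stand-in for a trained nudity classifier. A real deployment would
    # run the image through a model and return P(lewd) in [0, 1].
    return 0.95  # stubbed so the sketch runs end to end

def screen_incoming_photo(path: str) -> tuple[Image.Image, bool]:
    """Return the image to display and whether it was flagged."""
    image = Image.open(path)
    if lewd_probability(image) >= BLUR_THRESHOLD:
        # Blur heavily so nothing explicit is visible until the
        # recipient actively chooses to view, delete, or report it.
        return image.filter(ImageFilter.GaussianBlur(radius=40)), True
    return image, False
```

The key design point is that the flagged image is never shown by default; the client renders the blurred version alongside the view/delete/report prompt, leaving the decision with the recipient.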
Research from a UK government report from March found that 76 percent of girls aged 12–18 had been sent unsolicited nude images of boys or men. Bumble's own research found that almost half of people aged 18–24 have received a non-consensual sexual image, and that 95 percent of people under 44 in England and Wales say more should be done to stop cyberflashing.
Other apps have also moved to combat this form of image-based sexual violence: in 2017, OKCupid announced it would make all users take an anti-harassment pledge stating that they wouldn't send unsolicited nude photos on the app. More recently, Mashable has reported that Instagram is working on a feature called "Nudity protection."
Bumble has made a version of Private Detector widely available on GitHub, so that other tech companies can adapt it and build features of their own to improve safety and accountability online in the fight against abuse and harassment.
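For teams adapting the release, inference over a standard TensorFlow SavedModel might look roughly like the sketch below. The model path, input resolution, preprocessing, and output layout here are all assumptions, not the repository's documented interface; its README covers the real entry points.

```python
import tensorflow as tf

# Hypothetical path; check the repo for where the model actually lives.
model = tf.saved_model.load("private_detector/saved_model")
infer = model.signatures["serving_default"]  # standard SavedModel signature

def lewd_score(image_path: str) -> float:
    raw = tf.io.read_file(image_path)
    img = tf.io.decode_jpeg(raw, channels=3)
    img = tf.image.resize(img, (480, 480))  # assumed input resolution
    img = tf.cast(img, tf.float32) / 255.0  # assumed [0, 1] scaling
    batch = tf.expand_dims(img, 0)
    # Signatures take keyword arguments; read the real input name off
    # the signature rather than hard-coding a guess.
    (input_name,) = infer.structured_input_signature[1].keys()
    outputs = infer(**{input_name: batch})
    # Assume a single output tensor holding the lewd probability.
    return float(next(iter(outputs.values())).numpy().ravel()[0])
```

A score from a function like this would then feed the same threshold-and-blur logic sketched earlier, with each platform free to pick its own cutoff and user prompts.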
Rachel Haas, Bumble’s VP of member safety, said in a statement, "Open-sourcing this feature is about remaining firm in our conviction that everyone deserves healthy and equitable relationships, respectful interactions, and kind connections online."
Bumble has been campaigning against cyberflashing in both the U.S. and UK in recent years. In 2019, the app's founder and CEO Whitney Wolfe Herd helped pass a law in Texas (HB 2789) making the sending of non-consensual nude images a punishable offence. Since then, the dating app has helped pass similar bills in Virginia (SB 493) and California (SB 53).
In England and Wales, Bumble has been campaigning for the criminalisation of cyberflashing, and in March 2022, the government announced that it would become a criminal offence under new laws set to be introduced, with perpetrators facing up to two years in prison.
It's time for the wider tech community to take this violation more seriously and to do more in the fight against it.