The dating app Bumble has introduced a tool called Private Detector, built with an artificial intelligence model trained to detect nude images and other types of inappropriate content sent within the platform's chats.
The feature, which had been in development since the end of April this year, has been integrated into profiles inside the dating app and will notify users when someone sends an inappropriate photo by direct message to another person without their consent.
How Private Detector works
According to the statement issued by the company, the tool uses machine learning to allow the app to understand a photograph and detect whether it contains an image that could be harmful to the user who receives it.
To achieve a high level of precision in detecting this content, as indicated in the post on the dating platform's official website, the artificial intelligence tool was also exposed to non-sexual images labeled as "borderline content" (body parts such as arms or legs), so that these are allowed and such photographs are not classified as sensitive or inappropriate.
"Repeatedly adding samples to the training database to reflect actual user behavior, or to address misclassifications, turned out to be a successful exercise that we have applied for years across all our machine learning efforts," the statement reads.
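The feedback loop the statement describes can be sketched in a few lines. This is a hypothetical illustration, not Bumble's actual pipeline: the function names, file names, and labels (`safe`, `borderline`, `sensitive`) are assumptions made for the example. The idea is that samples the current model misclassifies, such as a photo of an arm wrongly flagged as sensitive, are fed back into the training set for the next iteration.

```python
# Hypothetical sketch of the training-data feedback loop: misclassified
# samples are added back to the training set so the next model iteration
# reflects real user behavior. All names here are illustrative.

def collect_feedback(predictions, ground_truth):
    """Return the samples the current model got wrong, with their true labels."""
    return [
        (sample, true_label)
        for (sample, predicted), (_, true_label) in zip(predictions, ground_truth)
        if predicted != true_label
    ]

def augment_training_set(training_set, feedback):
    """Append misclassified samples (e.g. a 'borderline' arm/leg photo
    wrongly flagged as 'sensitive') so the next training run sees them."""
    return training_set + feedback

# Toy example: the model wrongly flags a photo of an arm as sensitive.
preds = [("arm_photo.jpg", "sensitive"), ("explicit.jpg", "sensitive")]
truth = [("arm_photo.jpg", "borderline"), ("explicit.jpg", "sensitive")]

training_set = [("explicit_old.jpg", "sensitive")]
training_set = augment_training_set(training_set, collect_feedback(preds, truth))
# The misclassified borderline sample is now part of the training data.
print(training_set)
```

In a real pipeline the feedback would come from human review of reported or flagged images rather than a hard-coded list, but the augmentation step works the same way.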
Similarly, photographs containing weapons are also labeled as sensitive content and can be reported through the reporting tools integrated into the app.
"When its performance was analyzed under different conditions, both online and offline, we obtained an accuracy of about 98% (in identifying cases of sensitive content)," the dating app's post stated.
If a person receives this type of content on Bumble, the Private Detector system automatically blurs the image by default. If the recipient wishes to see the photograph's contents, they only need to long-press it and confirm that they want to view it. Otherwise, they can report it using the mechanisms integrated into the application.
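The receive-side behavior described above (blur by default, reveal only on explicit confirmation) can be sketched as follows. This is a minimal illustration assuming a flag from the classifier; the class and method names are hypothetical, not Bumble's actual client code.

```python
# Hypothetical sketch of the blur-by-default flow: images the classifier
# flags as sensitive are rendered blurred, and are only revealed after
# the recipient long-presses and explicitly confirms.

class IncomingImage:
    def __init__(self, data, is_sensitive):
        self.data = data
        self.is_sensitive = is_sensitive
        self.blurred = is_sensitive  # flagged images start out blurred

    def render(self):
        return "[blurred image]" if self.blurred else self.data

    def long_press_reveal(self, user_confirmed):
        # The image is unblurred only after the recipient opts in.
        if self.blurred and user_confirmed:
            self.blurred = False
        return self.render()

img = IncomingImage("photo bytes", is_sensitive=True)
print(img.render())                 # [blurred image]
print(img.long_press_reveal(True))  # photo bytes
```

The key design point is that the default path never shows the flagged image; viewing it requires a deliberate action by the recipient, while reporting remains available either way.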
This is how Bumble is trying to prevent cyberflashing, a form of digital sexual violence that consists of the unsolicited sending of obscene images or sexual videos through digital media. Although the company says the practice accounts for less than 0.1% of conversations within the application, it also occurs on other platforms and social networks.
In addition, with the intention that the tool be adopted on other websites and social media platforms, Bumble has announced that the AI-based system is open source and available to anyone interested, making its implementation in other digital services easier.
The post on its website states that "the standard service for blurring sexual images is available as-is, but can be enhanced with additional test samples. Any improvement to the code's architecture or the quality of its structure is welcome."