The dating app Bumble, known for allowing women to choose who they want to talk to, will soon be able to ban users found to be using body-shaming language in their profile details or in their conversations with prospective partners.
New guidelines to protect against body shaming
In its recently updated guidelines, Bumble announced that its moderators would keep track of any body-shaming rhetoric as a way to protect women from harassment.
The update was put into effect following growing concern about abuse on dating apps. A survey conducted by Bumble among 1,003 respondents revealed that about a quarter of Brits had, at some point in their lives, been subjected to online body shaming through dating apps and/or social media.
The survey also found that 54% of people reported being less likely to feel good after spending extended amounts of time online, with body shaming making 35% feel self-conscious, 33% insecure and 25% angry.
The algorithm will specifically target language that is fatphobic, racist or homophobic. Offending users will be given a first warning, with repeated incidents or pointedly harmful comments warranting an immediate and permanent ban.
Added measures to keep women safe
Bumble's head of UK and Ireland, Naomi Walkland, said:
[The aim is to create] a kinder, more respectful and more equal space on the internet. Key to this has always been our zero-tolerance policy for racist and hate-driven speech, abusive behaviour and harassment. Body shaming [is] not acceptable on Bumble.
This builds on an existing feature, introduced in 2019, in which artificial intelligence is used to automatically detect and censor unsolicited nude images, letting the recipient choose whether to view, delete or report the image.
The company is now looking into applying further automated intelligence to review and update its photo guidelines, in an attempt to filter out toxic users and remove violating content.