
Facebook says it now uses machine learning to proactively detect revenge porn


Facebook has revealed its latest efforts to curb the spread of so-called “revenge porn” across the social network.

Revenge porn, the sharing of intimate photos or videos without the subject's consent, has become far easier in the age of camera phones, messaging apps, and an always-on society in which anything can be shared with anyone at any time. While many countries now have laws specifically addressing revenge porn, legislation alone doesn't stop the practice, and by the time an intimate image has been shared, legal recourse often comes too late for the victim.

Like other technology companies, Facebook has introduced a number of tools over the years designed to tackle revenge porn. In 2017, it launched a new Report button to make it easier to flag intimate content shared on Facebook, Messenger, or Instagram, and said at the time that it would use image recognition technology to prevent a photo identified as revenge porn from being reshared.

Later, Facebook gave users a more proactive option: a potential victim could submit a digital copy of an intimate image in advance, allowing the company to automatically block any future attempts to share it publicly.
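
Facebook hasn't published the details of its matching system, but the general idea resembles perceptual hashing: the submitted image is reduced to a compact fingerprint, and new uploads whose fingerprints are sufficiently similar are blocked. The sketch below is purely illustrative, using a simple average hash and a hypothetical `is_blocked` check rather than any real Facebook API.

```python
# Illustrative only: a toy perceptual-hash flow, not Facebook's actual system.
# Assumes Pillow is installed; the image paths and blocklist are hypothetical.
from PIL import Image


def average_hash(path: str, hash_size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint: grayscale, shrink, threshold."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > avg else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


# Fingerprints of images a victim has pre-emptively submitted (hypothetical store).
blocked_hashes = {average_hash("submitted_image.jpg")}


def is_blocked(upload_path: str, threshold: int = 5) -> bool:
    """Block an upload if its fingerprint is close to any submitted one."""
    upload_hash = average_hash(upload_path)
    return any(hamming_distance(upload_hash, h) <= threshold for h in blocked_hashes)
```

Production systems of this kind use hashes far more robust to crops and re-encodes than an average hash, but the flow of submit, fingerprint, and compare is the same.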

Burden

With its latest endeavor, Facebook is now striving to remove the burden from the victim altogether by automatically detecting “near nude” images or videos that are shared without permission across Facebook or Instagram.

“This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared,” noted Antigone Davis, Facebook’s global head of safety, in a blog post.

Moving forward, Facebook's machine learning systems will flag visual content that may constitute revenge porn, and a human employee will then review it.
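
Facebook hasn't described the internals of this pipeline, but the division of labour it outlines, where a model flags likely violations and a person makes the final call, can be sketched roughly as follows. The classifier, threshold, and review queue here are hypothetical placeholders, not Facebook's implementation.

```python
# Rough sketch of a flag-then-review flow; the scorer and queue are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Post:
    post_id: str
    media_url: str


@dataclass
class ReviewQueue:
    """Holds machine-flagged posts until a human reviewer decides."""
    pending: List[Post] = field(default_factory=list)

    def flag(self, post: Post) -> None:
        self.pending.append(post)

    def review(self, decide: Callable[[Post], bool]) -> List[str]:
        """Apply a human decision to each flagged post; return IDs to remove."""
        removals = [p.post_id for p in self.pending if decide(p)]
        self.pending.clear()
        return removals


def triage(posts: List[Post], score: Callable[[Post], float],
           queue: ReviewQueue, threshold: float = 0.9) -> None:
    """Send posts the model scores above the threshold to human review."""
    for post in posts:
        if score(post) >= threshold:
            queue.flag(post)
```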

“If the image or video violates our Community Standards, we will remove it, and in most cases we will also disable an account for sharing intimate content without permission,” Davis added. “We offer an appeals process if someone believes we’ve made a mistake.”

In addition to the new auto-detection technology, Facebook is also launching a support hub called Not Without My Consent, which brings together tools and resources for victims of revenge porn.


