
Facebook's going to block revenge smut with AI. Or humans. Or both


Comment Well, this is awkward. Facebook's head of global safety and CEO Mark Zuckerberg on Wednesday gave differing descriptions of the advertising network's just-launched "AI"-powered “online safety” initiative.


The idea is that if someone's intimate pictures are shared without permission as “revenge smut”, the site's systems will take them down as quickly as possible to stop them spreading.

Zuckerberg said this worthy aim is going to be achieved by artificial intelligence; head of global safety Antigone Davis says the work will be done by “specially trained representatives from our community operations” team.

It's pretty clear how it has been explained to Mark, as far as we can tell. He wrote: “We can currently use AI and image recognition to stop [revenge porn] being shared across all of our platforms.”

In her corporate blog post, Davis goes into somewhat more detail.

Facebookers seeing revenge smut images are asked to use the “report” button; community operations staffers will then decide whether the image violates Facebook's famously rigorous community standards, and if it does, it (and probably the poster) will get blocked.

It's at that point that technology gets involved, Davis explained: “We then use photo-matching technologies to help thwart any attempts to share the image on Facebook, Messenger and Instagram.”

Someone attempting to share a reported-and-removed image will get a warning and the share will be blocked. Facebook also promises to “partner with safety organisations” to support victims.

So on The Register's reading, there's a little bit of AI in the photo-matching (well, maybe; it's probably simple file hash checks or similar), and all in all, rather less than Zuckerberg seems to believe.
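If it really is just file hash checks, the simplest possible version looks something like the Python sketch below. To be clear, this is purely our illustration and not anything Facebook has published: keep a set of hashes of images that reviewers have already pulled, and refuse any upload whose bytes match.

```python
import hashlib

# Illustrative only: a blocklist of SHA-256 digests of image files that
# human reviewers have already confirmed as violating the rules.
blocked_hashes: set[str] = set()

def file_hash(image_bytes: bytes) -> str:
    """Hex digest of the raw image file."""
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes: bytes) -> None:
    """Called once a reviewer upholds a report: remember the file's hash."""
    blocked_hashes.add(file_hash(image_bytes))

def is_blocked(image_bytes: bytes) -> bool:
    """True if an upload is byte-for-byte identical to a reported image."""
    return file_hash(image_bytes) in blocked_hashes

# Example: the exact same file is caught, but any altered copy slips through.
original = b"\x89PNG...raw image bytes..."
report_image(original)
assert is_blocked(original)
assert not is_blocked(original + b"\x00")  # any change defeats a plain file hash
```

The sketch also shows the obvious weakness of the approach: re-encoding, resizing or cropping a picture changes the digest entirely, which is presumably where Facebook's unspecified “photo-matching technologies” (perceptual hashing of the PhotoDNA variety, one assumes) earn their keep.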

Reg comment
While we applaud the aims of the programme, The Register is concerned about the process. First, there is a risk that having revenge smut images reviewed by humans might deter victims, already upset by having intimate pictures shared online. Second, we hope that the humans involved in making classification decisions are properly supported against the likely psychological trauma they may suffer.