Child abuse poll a ‘mistake’ – Facebook

In what has become a seemingly regular occurrence, Facebook is being asked: how on earth did this happen?

On Sunday Facebook asked an unspecified number of users their thoughts on how child abuse images should be handled on the network.

It gave a scenario in which an “adult man” asks a 14-year-old girl for “sexual images”, and then a list of possible answers.

One option read: “This content should be allowed on Facebook, and I would not mind seeing it.”

A follow-up question offered options on how policies should be enforced, such as “Facebook decides the rules on its own” or “external experts decide the rules and tell Facebook”.

As noted by the Guardian, none of the options allowed the user to suggest that the proper course of action in this scenario would be to inform child protection agencies or call the police.

“We run surveys to understand how the community thinks about how we set policies,” said Guy Rosen, Facebook’s head of product.

“But this kind of activity is and will always be completely unacceptable on FB. We regularly work with authorities if identified. It shouldn’t have been part of this survey.

“That was a mistake.”

Data set
The company is no longer running the survey.

The BBC understands Facebook’s team was instructed to find out how users felt illegal content on the network should best be dealt with. The site is not, of course, considering changing how it deals with child abuse imagery.

Over the network’s head looms the prospect of more regulation.

Asking users whether they feel more comfortable with Facebook determining the rules on how unacceptable content is handled could be an attempt to build data to back up its likely argument that it can regulate itself.

Another option, whereby experts advise the network, is also a possibility. In the past, Facebook has turned to outside experts when developing new technologies, particularly those aimed at younger users.

However, the site was criticised over its choice of experts, many of whom it had funded, as reported by Wired magazine.

Facebook has had a tough time dealing with negative publicity recently. At an event for US conservatives last month, in the wake of another school shooting, the company demonstrated a virtual reality shooting game set in a train station.

It later pulled the demonstration, saying it regretted its inclusion at its stand.
