Facebook Working With UC Berkeley Psychologists On Inappropriate Posts


Facebook is swamped with complaints about “inappropriate” posts, each of which must be manually reviewed by an employee. Yet rather than take down the offending content, the social network has tapped the emotional intelligence of UC Berkeley psychologists, among other top minds, to resolve disputes over posts that don’t clearly violate the company’s community standards.

And their efforts are paying off: “By working with experts in developmental psychology and the science of human emotion, we’re proud to see Facebook’s social resolution tools help, on average, 3.9 million people each week,” said Facebook spokesman Matt Steinfeld.

In its 10th year, Facebook is thriving with more than 1 billion active monthly users. But it’s also a magnet for uncomfortable interactions over everything from unflattering snapshots to cyber-bullying. That’s why the company’s engineers have teamed up with university scholars to create messaging tools to make the social network a safer and more empathetic space.

Building a more ‘tender’ Facebook community

Neuroscientist Emiliana Simon-Thomas is on Facebook’s “compassion research team” along with UC Berkeley psychologists Paul Piff and Dacher Keltner. Among other things, they’ve helped tweak pop-up text and create emoticons to encourage Facebook users to communicate their “authentic” feelings about posts that do not fall into the categories of bullying, hate speech or pornography, but are nonetheless upsetting to them.

“It’s amazing how intimate Facebook can quickly get, and how the written word can be trouble,” said Keltner, who has helped to shape Facebook emoticons based on naturalist Charles Darwin’s study of how facial muscles are used to express emotions. “That’s why we’ve had to build in visuals that express the apologetic, the ironic, and all those emotions that you can’t convey in a smiley face or in words alone.”

Complicating this endeavor is that different cultures find different things funny or offensive. For example, “People who use Facebook in India are offended by different things, like a photo that mocks a favorite cricket player or Bollywood actor,” Simon-Thomas said. “They tend not to use Facebook as much for personal sharing.”

Meanwhile in Britain and northern Europe, Facebook users’ displays of humor can be quite brutal. “But Facebook’s hands are tied around humor; they’re completely committed to freedom of speech,” she said.

Controversial posters appreciate the feedback

Facebook used to offer empty message boxes to encourage private conversations among users over controversial posts. But only one in five users actually wrote a message. That’s changed since Facebook began providing tailored messages such as “Hey, there’s something about this photo that bothers me. Would you mind taking it down? It’s a little embarrassing to me.”

With the help of these suggested messages, Facebook users were found to be more willing to remove photos when contacted (85 percent of requests are honored). Moreover, 63 percent reported feeling positive toward people who sent such a message, and around the same number said they felt fine about being asked to remove a post.

“We’ve found that the people who created status updates and shared links that annoyed or offended others are generally happy to hear feedback,” Steinfeld said.

People, not an algorithm, used to screen complaints

In addition to improving online discourse among Facebook friends, the messaging tools can save the company time and labor because Facebook uses people, not an algorithm, to screen complaints about posts. This is because the social network is on the lookout for bullying, suicide threats and other matters that may require intervention.

Still, the bulk of complaints are not about overt breaches of the company’s community standards, but are grievances of a more subjective or personal nature, which is why it makes sense for Facebook users to work out their friend-to-friend conflicts without involving a third party, Simon-Thomas said.

“Facebook is no different from real life. You’re going to have conflicts with people that are not easy to resolve,” Simon-Thomas said. “Facebook can’t take down a post just because you don’t like it, but it can provide the words and images to help you convey the way you feel in a sensitive way.”

[Image courtesy: Facebook]
