An MIT Technology Review story with an unforgettable GIF of a beating heart gave us a firsthand look at how Facebook polices images—and how that system will need to improve if the social network is going to be a reliable partner for news organizations.
Not long after we published the story Tuesday, Facebook blocked people younger than 18 from seeing a post about it on our page on the social network. The post remained visible for adults, but it was emblazoned with this notice: “WARNING: Graphic Photo. Photos that contain graphic content can shock, offend and upset people. Are you sure you want to see this?”
Is it graphic? Well, even a static version of the image does meet Merriam-Webster’s definition of graphic as “vividly or plainly shown.” There’s no getting around the fact that this is a heart pulsing away outside a body. It’s in a box developed by a startup company whose technology might significantly expand the availability of organs that can be used in life-saving transplants.
Could it shock, offend, or upset people? Surely the answer is yes. Just about anything that is interesting could upset someone. Indeed, amid the thousands of “likes” and dozens of comments about the substance of the story and the potential importance of the technology, one person took issue with the image: “OMG! What a terrible selection!!! … Not all readers are OK with blood and parts of the body full exposed on their FB timeline.”
I am not trying to scold anyone who finds the picture gross or upsetting. I also know I’m far from the first person to point out that Facebook, in an effort to maintain a chipper atmosphere, appears to err on the side of censoring images related to the human body. It took years for Facebook to get comfortable with images of mothers breastfeeding. And finally, I recognize that policing images, especially exploitative ones, is vital work.
The issue, though, is whether Facebook really should host more news stories, which, if they are any good, will often be shocking and upsetting. If news organizations are going to have a fruitful relationship with Facebook, it will need an image-analysis system that is not too quick to deem something beyond the pale.
A single complaint from anyone about the content of a post triggers a review. In the case of the disembodied heart, Facebook put a canned message on our page that said “someone reported your photo for containing graphic violence.”
Facebook says all such reviews are made by people, not image-detecting computers—people who, in the aggregate, check out millions of posts every week. In this instance, someone determined that the heart was unacceptable, even in the context of the biomedical news story it accompanied.
After I queried the company for details about its image-policing process, Facebook spokesman Will Nevius said the reviewer made the wrong call about the bloody heart. The warning label came down.
Nonetheless, the fact that a human heart could even prompt a judgment call is a reminder of how awkward a partner Facebook can be for news organizations. If publishers, desperate for the audience Facebook offers, post more of their stories directly and perhaps exclusively to the site, will they deliver only a sanitized subset, or risk having a single reader complaint shield an article from readers under 18? In explaining Facebook’s review process, Nevius said in a statement: “We aim to find the right balance between giving people a place to express themselves and promoting a welcoming and safe environment for our diverse, global community.” That’s an admirable spirit for a social network, but if it requires being on hair-trigger alert for potentially upsetting images, maybe Facebook’s heart can’t ever truly be in the news business.