Facebook’s Algorithm Is Dumb and Cruel

Happy new year! I’m back to daily writing, working, and what have you. I hope you enjoyed the excerpts from the book the robots posted for me while I was roaming around California over the holidays. I saw two different copies of In Real Life in person while I was away, and I think you’re going to be pleased.

My decision to stop blogging and enjoy the holidays made me very late to the news I’m linking today, but I don’t mind even a little bit.

If you use Facebook, you may have been irritated by its Year In Review feature in recent weeks. Or maybe your experience was perfectly pleasant. Either way, you probably saw lots of pop-ups telling you to check out Your Year In Review, and a bunch of your friends and relatives probably shared theirs.

Facebook automatically generates the Year In Review report by applying mysterious criteria to judge which events and stories from a user’s past year were most important. The feature has existed for years, but it works differently each time, so it’s a great snapshot of the colossal new misunderstandings of humanity that Facebook developed in the preceding year. The algorithm is nowhere near a perfect picture of humanity, so its choices about which stories to show can seem dumb or odd. And sometimes they’re unforgivably awful.
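Facebook has never said what those criteria actually are, so this is strictly a guess, but the failure mode is easy to reproduce. Here’s a deliberately naive, hypothetical highlight picker that ranks posts by raw engagement; the Post fields, weights, and scoring are all my invention, not Facebook’s code. It shows how the worst day of someone’s year can score as its “top” story:

```python
# Hypothetical sketch: an engagement-ranked "highlight" picker.
# Nothing in the score can tell celebration from grief.
from dataclasses import dataclass

@dataclass
class Post:
    caption: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # A naive "importance" heuristic: weight each interaction type.
    # Condolences generate enormous engagement, so grief scores high.
    return post.likes + 2 * post.comments + 3 * post.shares

def pick_highlights(posts: list[Post], n: int = 5) -> list[Post]:
    # The n highest-engagement posts become the "Year In Review."
    return sorted(posts, key=engagement_score, reverse=True)[:n]

year = [
    Post("Beach trip!", likes=40, comments=5, shares=1),
    Post("New job :)", likes=120, comments=30, shares=4),
    Post("Our daughter passed away this morning.",
         likes=900, comments=600, shares=200),
]
print(pick_highlights(year, n=1)[0].caption)
# -> the most painful post of the year, chosen as the top highlight
```

A grieving family’s post draws far more likes, comments, and shares than a beach photo, so any purely engagement-driven score will surface it first, dressed in whatever party clip art the designers picked.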

Eric Meyer’s Year In Review, for example, begged him to publish the report using garish ’60s party clip art. The photo Facebook placed in the center was the face of his daughter, Rebecca, who died this past year on her sixth birthday.

Meyer’s a respected web designer, so he’s well aware that this is “inadvertent algorithmic cruelty,” in his words. But the dumbness of the algorithm doesn’t soften its cruelty at all. His post suggests ways this mistake could have been prevented, and Facebook damn well better listen. (Year In Review product manager Jonathan Gheller immediately apologized to Meyer and did a little apology press tour.)
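I won’t reproduce Meyer’s suggestions wholesale, but the core of them, as I read it, is “ask before you show, and remember a no.” Here’s a minimal sketch of that consent-first flow, reusing the hypothetical pick_highlights from above; every name here is my invention, not Facebook’s actual API:

```python
# A minimal sketch of an ask-first flow, assuming the Post and
# pick_highlights definitions from the sketch above. Hypothetical names.
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    posts: list = field(default_factory=list)
    opted_out: bool = False

def offer_year_in_review(user: User, accepted: bool) -> str:
    """Return what the user sees; `accepted` stands in for their click."""
    if user.opted_out:
        return ""                     # never re-ask after a "no"
    if not accepted:
        user.opted_out = True         # one "no thanks" is enough
        return ""
    # Only after explicit consent does the ranking run and a photo render.
    highlights = pick_highlights(user.posts, n=1)
    if not highlights:
        return ""                     # nothing to show, show nothing
    return f"Your Year In Review: {highlights[0].caption}"
```

The crucial difference from what shipped: the invitation carries no algorithmically chosen photo, and a decline is remembered instead of re-prompted.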

Both Meyer’s initial post and his follow-up are beautiful reads, both for his compassion and for his suggestions for more compassionate technology.

There’s a chapter in In Real Life about a similar incident in my 2013 Year In Review on Facebook. Mine wasn’t anywhere near as heinous as Meyer’s, but Facebook crossed a privacy line with me that I hope they won’t cross again.

As software continues to embed itself in our emotional lives, it’s going to have to learn to be sensitive. I guess it isn’t surprising that Silicon Valley engineering culture is learning this from mistakes, rather than starting off prepared. The incentives of advertising companies with social ambitions are totally upside-down. But my Internet friend Drew is right; they’ll have to learn to care eventually: