While personalized feeds are taking off, they still fall short of good human editors in some important ways. Here are seven of them:
Anticipation. As it turns out, algorithms at sites like Technorati and Mediagazer are quite good at figuring out what the Internet is buzzing about right now, but they're quite bad at predicting what's going to be news tomorrow. Artificial intelligence simply isn't good enough yet to know that while there are only a few info-drips about "Obama middle east speech" on Tuesday, by Thursday it's all anyone will be able to talk about.

Risk-taking. Chris Dixon, the co-founder of personalization site Hunch, calls this "the Chipotle problem." As it turns out, if you are designing a where-to-eat recommendation algorithm, it's hard to avoid sending most people to Chipotle most of the time. People like Chipotle, there are lots of them around, and while it never blows anyone's mind, it's a consistent three-to-four-star experience. Because of the way many personalization and recommendation algorithms are designed, they'll tend to be conservative in this way — those five-star experiences are harder to predict, and they sometimes end up being one-star ones. Yet, of course, they're the experiences we remember.
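The Chipotle problem falls out of the math almost immediately. Here's a minimal sketch (the restaurants and ratings are invented for illustration): a recommender that simply maximizes average past rating will always pick the safe, consistent option over the risky one with memorable highs.

```python
# A toy illustration of the "Chipotle problem": ranking by expected
# (average) rating favors low-variance, reliably-decent options over
# places that are sometimes five-star and sometimes one-star.
# All names and numbers below are made up.

from statistics import mean

past_ratings = {
    "Chipotle":            [4, 4, 3, 4, 4, 3, 4],  # never amazing, never bad
    "Experimental Bistro": [5, 1, 5, 2, 5, 1, 5],  # unforgettable highs, painful lows
}

def recommend(ratings_by_place):
    # Rank purely by average past rating -- the conservative choice wins.
    return max(ratings_by_place, key=lambda place: mean(ratings_by_place[place]))

print(recommend(past_ratings))  # -> Chipotle (avg ~3.7 beats avg ~3.4)
```

The bistro's five-star nights never get surfaced, because its one-star nights drag the average below Chipotle's steady 3.7. Real recommenders are far more sophisticated, but as long as the objective rewards predicted satisfaction and penalizes variance, the pull toward the safe middle remains.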
The whole picture. A newspaper front-page does a lot of informational heavy lifting. Not only do the headlines have to engage readers and sell copies, but most great front pages have a sense of representativeness — a sampling of "all the news that's fit to print" for that day. The front page is a map of the news world. That sense of the zoomed-out big picture is often missing from algorithmically tailored feeds: you get the pieces that are of most interest to you, but not how they all fit together.
Pairing. As any restaurateur worth his sea salt knows, it's not just the ingredients, it's how you blend them together. Great media does the same thing, bringing together complementary and contrasting pieces into a whole that's greater than its parts (think of great issues of your favorite magazine, or your favorite album). Even those of us with a wicked sweet tooth can't survive on dessert alone. Algorithms are pretty clumsy about this — they lack a sense of which flavors pair well together.
Social importance. A recent study found that stories about Apple got more play on Google News than stories about Afghanistan. Few of us would argue that Steve Jobs' latest health gossip is as important as the war we have soldiers fighting on our behalf, but that's a hard signal for algorithms to pick up on — after all, people click on the stories about Apple more. Maybe it's time for a Facebook "Important" button to go next to the "Like" button.
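To see why clicks alone misrank the news, consider this toy sketch (the headlines, click counts, and "Important" votes are all invented, and the hypothetical button is the one proposed above): a click-driven score surfaces the gadget story, while blending in even a crude importance signal flips the ranking.

```python
# A toy illustration of click-driven ranking vs. blending in an
# explicit "Important" signal. All data below is fabricated.

stories = [
    {"headline": "New iPhone rumored",
     "clicks": 9000, "important_votes": 40},
    {"headline": "Afghanistan offensive escalates",
     "clicks": 1200, "important_votes": 800},
]

def rank_by_clicks(stories):
    # Pure engagement ranking: whatever gets clicked most wins.
    return sorted(stories, key=lambda s: s["clicks"], reverse=True)

def rank_with_importance(stories, weight=20):
    # Hypothetical blend: clicks plus a weighted "Important" vote count.
    return sorted(stories,
                  key=lambda s: s["clicks"] + weight * s["important_votes"],
                  reverse=True)

print(rank_by_clicks(stories)[0]["headline"])        # the Apple story leads
print(rank_with_importance(stories)[0]["headline"])  # the war story leads
```

The hard part, of course, isn't the arithmetic — it's collecting an honest importance signal in the first place, which is exactly what clicks fail to provide.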
Mind-blowingness. While we're adding buttons to Facebook, what about an "it was a hard slog at first, but then it changed my life" button? So many of the media experiences that change our lives, that we remember 5 or 10 years later — the things that keep us coming back to our favorite periodicals and websites — are hardly the most clickable. They may not even be that shareable — James Joyce's Ulysses wouldn't fare very well if it had to compete for attention on Facebook with cat photos and celebrity gossip. Great human editors can see beyond the clicks; they sometimes pick pieces that may not be the most accessible but that stay with the readers who take the journey.
Trust. Even if we can't put our finger on it, most of us can feel when the preceding qualities are lacking. And it means that we don't trust these algorithms very much: sure, we'll glance at what Netflix recommends, but our faith in it extends no further than its latest batch of suggestions. That trust is critical because it's what allows editorial institutions to push us out of our comfort zones — "you might not think you'd be interested in this fashion industry kingpin/new cooking fad/small country in southeast Asia, but trust me, you will be." And that's how new interests are born.