Humans make recommendation engines work

The Google News recommendation engine exposes one of the more interesting problems facing the new media models.  Though machines can make reasonably good editors, the better solutions for surfacing relevant things to people are those that combine the power of social actions with machine learning.

Justin Fox of Fortune (who happens to be a good family friend, actually) asks: if the role of the traditional editor is being hijacked on the Internet, then what is going to replace the need for social reference points?

"There is nothing natural or inherently superior about the monolithic media institutions of the mid-to-late 20th century.  But there is still a need for the community-building, consensus-shaping role that the best of the media gatekeepers can play. The question is, who's going to play it? And how are they going to make it work economically?"

Google's PageRank algorithm was a landmark differentiator in the web-based world of machine-driven editorial decision-making because it placed value on one very important social action, the hyperlink.  The existence of a hyperlink to a web site implied human value because it was assumed that the link was created by a human who valued that destination page.  Therefore, that page should be ranked higher than other pages valued less by humans.  That factor more than any other made the Google search results better than everything else out there at the time.
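Google's production system is vastly more elaborate, of course, but the core idea — every link is a human vote for the page it points to — can be sketched in a few lines of Python. (The toy graph, damping constant, and iteration count below are my own illustration, not anything from Google.)

```python
# Minimal PageRank sketch: each inbound hyperlink acts as a human "vote"
# for the target page.  Hypothetical toy graph, not Google's actual code.

DAMPING = 0.85      # probability a surfer follows a link vs. jumping anywhere
ITERATIONS = 50     # enough for this tiny graph to converge

def pagerank(links):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(ITERATIONS):
        # Everyone starts with the small "random jump" share...
        new_rank = {p: (1.0 - DAMPING) / len(pages) for p in pages}
        # ...then each page passes its rank along its outbound links.
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = DAMPING * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Toy web: pages "b", "c", and "d" all link to "a", so "a" ranks highest.
toy = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a"]}
ranks = pagerank(toy)
print(max(ranks, key=ranks.get))  # prints "a" — the page with the most votes
```

The point of the sketch is that the machine never judges content; it only aggregates link-creation decisions that humans already made.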

It's not obvious to me that Google applied the lesson of that breakthrough to the new recommendations feature in Google News.  It appears to be based entirely on machine learning.  Just like all the content in Google News, the results are pretty good in a sort of categorical way and in terms of immediacy, but the machine context isn't enough to create any kind of emotive response to the recommendations they give me.

(Update: Google News posted some info on what's happening: "Google News has no human editors selecting stories or deciding which ones deserve top placement. Our headlines are selected by computer algorithms, based on factors including how often and on what sites a story appears online...Google News can automatically recommend relevant stories just for you by using smart algorithms that analyze your selections.")

The New York Times published a Recommendations 101 piece this week where they described why recommendations matter, and they noted a key flaw in the concept:

"Earlier this month, issued a public apology and took down its entire cross-selling recommendation system when customers who looked at a boxed set of movies that included 'Martin Luther King: I Have a Dream' and 'Unforgivable Blackness: The Rise and Fall of Jack Johnson' were told they might also appreciate a 'Planet of the Apes' DVD collection, as well as 'Ace Ventura: Pet Detective' and other irrelevant titles."

If recommendation engines are to replace traditional editors, they need to have a more direct connection to the human-powered web.  This is one of the reasons I like so much.  The output and display of content there is very purely machine-driven, but all of the input is human-driven.  It's a fantastic combination of the two forces working together.
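That combination — machine-driven output, human-driven input — is simple enough to sketch. The toy recommender below (my own illustration; hypothetical users and data, not any real site's implementation) does nothing clever at all: it just counts how often two items were saved by the same person and recommends the most frequent pairings.

```python
# Toy human-powered recommender: the only signal is which users chose to
# save which items; the machine merely counts co-occurring human actions.
from collections import Counter
from itertools import combinations

saves = {  # user -> items that user deliberately saved (hypothetical data)
    "alice": {"story1", "story2", "story3"},
    "bob":   {"story1", "story2"},
    "carol": {"story2", "story3"},
}

# Count how often each pair of items was saved by the same human.
co_counts = Counter()
for items in saves.values():
    for a, b in combinations(sorted(items), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(item, k=2):
    """Return the items most often saved alongside `item` by real people."""
    scored = Counter({b: n for (a, b), n in co_counts.items() if a == item})
    return [i for i, _ in scored.most_common(k)]

print(recommend("story1"))  # ['story2', 'story3'] — story2 was co-saved twice
```

Everything editorial in that sketch comes from the saves themselves; the code is just plumbing.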

Noah Brier agrees that the role of the editor in a media company is changing quickly:

"In many ways, recommendation systems spell the end of the editor as we know it. Of course there will always be a place for human editors somewhere, but increasingly technology is going to find ways to deliver information without their help."

Our social actions create the editorial filter through which we will discover the things that matter to us.  Recommendations are very trendy, and for good reason, but the juice that makes them a powerful force is not the machines; it's the people behind them.

