
Crowd Sourced Knowledge: Netflix Prize Won?

Log in to Netflix and rate a movie you watched on a scale of one to five, and Netflix’s computers will try to suggest other titles you’re likely to rank the same way.  This algorithmic pairing process has been marketed as a key differentiator between Netflix’s movie rental service and those of its competitors.  It’s been touted as an achievement.

For two and a half years, Netflix challenged the public to build a better mousetrap.   A million-dollar prize was dangled as bait for the first person (or team) to create a program capable of beating Netflix’s Cinematch algorithm by a margin of ten percent or better.   Nobody succeeded.  Developers inched close but couldn’t quite hit the mark: seven percent, eight percent, nine…but not ten.  The so-called “Netflix Prize” went unclaimed. Until now, that is.

On Friday, a group formed from four independent teams that had been vying for the prize submitted a solution that they claim achieves a 10.05% improvement over Netflix’s Cinematch ranking algorithm.

Once the result is validated, a thirty-day countdown begins in which other participants can submit competing solutions.  If no better-performing solution is delivered by the end of those thirty days, the seven-member team calling itself “BellKor’s Pragmatic Chaos” will split the cool million – the “Netflix Prize.”

Like the X Prize Foundation which came before it, and the Orteig Prize which came long before that, the Netflix Prize contest began as an experiment in collaborative, crowd-sourced research.   In October 2006, Netflix offered the grand prize and $50k “progress prizes” as incentives to lure participants.  A data set of 100 million movie ratings from 480,000 Netflix users was provided to get things started, and scientists, researchers, even garage tinkerers were given free rein to figure out how to analyze the data.

The challenge: a 10% or better improvement.   Given a customer’s past ratings – say, four stars for a particular movie – a new algorithm had to predict how that customer would rate other titles, with accuracy measured by the root mean squared error (RMSE) between predicted and actual ratings.  The winning solution had to beat Netflix’s existing process (called Cinematch) by no less than ten percent on that measure.
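The arithmetic behind that threshold is simple. Below is a minimal sketch of how the contest’s yardstick works; the 0.9525 baseline is Cinematch’s published RMSE on the contest’s quiz set, while the function names and sample figures are illustrative, not from Netflix’s actual scoring code.

```python
import math

def rmse(predicted, actual):
    """Root mean squared error between predicted and actual ratings."""
    return math.sqrt(
        sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)
    )

def improvement_over_baseline(candidate_rmse, baseline_rmse):
    """Percentage reduction in RMSE relative to the baseline (Cinematch)."""
    return 100.0 * (baseline_rmse - candidate_rmse) / baseline_rmse

# Cinematch's quiz-set RMSE was 0.9525, so the 10% target worked out
# to an RMSE of roughly 0.857 -- predictions off by well under one star
# on average, across millions of held-out ratings.
print(improvement_over_baseline(0.857, 0.9525))
```

Because RMSE squares each miss, shaving off those last tenths of a percent means fixing the hardest-to-predict ratings, which is one reason progress slowed so dramatically near the 10% line.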

Almost immediately, submissions rolled in.  It took only days before one topped Cinematch, but the margin of improvement was barely better than 1%.  Ten percent was far off.  It took a year before competitors started to get close, reaching as high as 8%.  But the closer the teams got to the 10% threshold, the smaller their incremental improvements became.  Two years in, leading entrants toiled for improvements in the range of tenths of a percentage point.  2007’s best solution topped Cinematch by 8.43%.  2008’s top entry managed to improve to 9.44%.  It took the first six months of 2009 for BellKor’s Pragmatic Chaos to push the bar another 0.61% and cross the mythic milestone.

More than 50,000 entries were submitted in pursuit of the award.

The tentatively victorious team, whose seven members span the globe, will know for certain whether they’ve won on July 26th.   Assuming they have, and tabulating the costs, it’s probably safe to say Netflix got a bargain: seven elite researchers, including two from AT&T Research and one from Yahoo Research, all working for two and a half years?  (UPDATE: the team’s profile is here)

If these researchers had been working in-house at a salary of $100k each, plus benefits and operational overhead, Netflix could easily have paid two to five times what it’s doling out in prize money (and with no guarantee of results).
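That back-of-the-envelope claim checks out under the article’s own figures. A quick sketch (the 2x overhead multiplier is an assumption standing in for benefits and operational overhead, not a number from the article):

```python
# Assumed figures: seven researchers, $100k salary each, 2.5 years,
# and a common rule of thumb that benefits plus overhead roughly
# double a salary's fully loaded cost.
researchers = 7
salary = 100_000
years = 2.5
overhead_multiplier = 2.0  # assumption: fully loaded cost is about 2x salary

in_house_cost = researchers * salary * years * overhead_multiplier
prize = 1_000_000
print(in_house_cost / prize)  # in-house cost as a multiple of the prize money
```

With those assumptions the in-house bill lands at 3.5x the prize, squarely inside the article’s “two to five times” range.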

Crowd-sourced solutions won’t work for every problem.  There are security concerns and licensing issues to consider in a for-profit environment.   But here, in the tradition of the Orteig Prize and X Prize, Netflix found a way to harness the insight and innovation of many, for the price of a few.   The company bought the insight of thousands of researchers and a winning solution for far less than the cost of a proprietary team focused solely on the problem.

In exchange for the money it pays out, Netflix will get a non-exclusive license to use the discovery.

Other Related Articles from Metue
Disney and Hulu Make Pact
How Long is the Blu Ray Runway
Netflix and LG Go Further With Connected TVs
Netflix and Tivo finally Hook Up
Blockbuster and Sonic Solutions Hook Up to Compete with Netflix Digital
Netflix and LG Reveal Streaming Blu Ray Player
Cracking Release Windows: Studios Embracing “Day and Date”

Prior Metue Coverage of Netflix Earnings
Netflix Q1 2009
Netflix Q4 2008 Earnings
Netflix Q3 2008 Earnings
Netflix Resets Earnings Expectations
Netflix Q2 ‘08 Earnings: Better than Expectations
Netflix Q1 ’08 Earnings: Short Term Sell? Long Term Buy?
Netflix Q4 ’07 Earnings
