Mobs Rule!

by Andrew McAfee on July 6, 2009

A June 25 post to the New York Times Bits blog starts with “To get the best predictions about when your company’s latest product will ship or how it will sell, you might try asking your employees — anonymously.” A skeptical but fair response to this is something like “Sure, I might, but why should I? Our company has professionals who make these kinds of forecasts for a living. Why on earth should I place more trust, or any trust, in the predictions that come from an anonymous crowd?”

A convincing answer to this question is “You should place more trust in the crowd if it gives you better answers than your current methods do.” And the post in the NYT presents evidence that corporate prediction markets do just that.

Prediction markets are a technology that harnesses collective intelligence / the wisdom of crowds to gaze into the future and generate a consensus forecast (in other words, the crowd’s aggregate best guess) about what will happen. I’ve written about them here, and here’s a nice overview paper about them from Justin Wolfers and Eric Zitzewitz.
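
The post doesn’t go into the market mechanics themselves, so here is a minimal sketch, in Python, of one standard mechanism from the research literature: Hanson’s logarithmic market scoring rule (LMSR), in which an automated market maker quotes prices that always sum to one and can be read as the crowd’s consensus probabilities. This is purely an illustration of the general idea, not a description of Crowdcast’s (or any other vendor’s) implementation.

```python
import math

# Minimal sketch of Hanson's logarithmic market scoring rule (LMSR), one common
# prediction-market mechanism from the research literature. The liquidity
# parameter b controls how much prices move per share traded.

def lmsr_cost(quantities, b=100.0):
    """Market maker's cost function over the outstanding shares of each outcome."""
    return b * math.log(sum(math.exp(q / b) for q in quantities))

def lmsr_prices(quantities, b=100.0):
    """Current prices; they sum to 1 and read as the crowd's consensus probabilities."""
    exps = [math.exp(q / b) for q in quantities]
    total = sum(exps)
    return [e / total for e in exps]

def trade_cost(quantities, outcome, shares, b=100.0):
    """What a trader pays to buy `shares` of `outcome` in the current state."""
    after = list(quantities)
    after[outcome] += shares
    return lmsr_cost(after, b) - lmsr_cost(quantities, b)

# Two-outcome market, e.g. "Will the product ship by Q4?" -> [yes, no]
q = [0.0, 0.0]                       # no shares outstanding: prices start at 50/50
print(lmsr_prices(q))                # [0.5, 0.5]
cost = trade_cost(q, 0, 60)          # an optimistic employee buys 60 "yes" shares
q[0] += 60
print(round(cost, 2), [round(p, 3) for p in lmsr_prices(q)])
# about 34.43 paid; the "yes" price rises to about 0.646
```

As participants buy shares in the outcome they think is more likely, the quoted price for that outcome rises, which is exactly the kind of knowledge aggregation Hayek describes later in this post.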

On the Internet they’ve been shown to deliver more accurate predictions about political elections and movie revenues than other techniques like polls and statistical forecasting methods. Pioneering efforts to use them within companies show that they’re also highly accurate when deployed behind the firewall (see, for example, the case I wrote with Karim Lakhani and Peter Coles about Google’s internal prediction market and this paper written by Google’s Bo Cowgill and his colleagues).

But are they more accurate than the other techniques companies use to forecast future events of interest? The NYT post presents data from the prediction market startup Crowdcast (disclosure: I am on Crowdcast’s advisory board, and have a small amount of stock in the company), which has deployed its technology within multiple large companies.  As Claire Cain Miller writes in the Times:

At a media company with a new product to ship, 1,200 employees predicted a ship date and sales figures that resulted in 61 percent less error than executives’ previous prediction, according to Crowdcast. A pharmaceutical company asked a panel of scientists and doctors to predict regulatory decisions and new drug sales using Crowdcast, and they were more accurate than the company’s original prediction 86 percent of the time.
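
A quick aside on how to read a claim like “61 percent less error”: the Times piece doesn’t define the metric, but one natural reading is the reduction in absolute forecast error relative to the official forecast. The sketch below just illustrates that arithmetic with made-up numbers; the metric definition is my assumption, not Crowdcast’s published methodology.

```python
# Hypothetical illustration of "X percent less error": reduction in absolute
# forecast error relative to the official forecast. The metric definition and
# the numbers are assumptions for illustration, not Crowdcast's reported data.
def error_reduction(official_forecast, crowd_forecast, actual):
    return 1 - abs(crowd_forecast - actual) / abs(official_forecast - actual)

# e.g. official forecast 1.00M units, crowd forecast 1.122M, actual sales 1.20M
print(round(error_reduction(1.00, 1.122, 1.20), 2))   # 0.61 -> 61% less error
```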

Crowdcast chief scientist Leslie Fine, a deceptively normal-looking person with what I’m pretty sure is a 4-digit IQ (she has a doctorate in experimental economics from Caltech, and they don’t just hand those out), gave me some more information about the setting in which error relative to the ‘official’ forecast was reduced by 61%:

For a major media and gaming company, we predicted total hardware and software sales, as well as Metacritic scores for games. Metacritic is an aggregate review score, and is by far the #1 predictor of sell-through. Our client there is the Senior Director for Publishing Intelligence. He starts the markets off at the company’s ‘official’ best guesses for future sales and scores, as soon as the publishing intelligence guys come up with them. For Metacritic scores, they do this by ranking the game on a number of axes (out of 10): playability, graphics, etc. They add them up, which yields the score. For hardware and software sales, the official prediction comes from applying growth percentages to historical data.

There are also some great stats coming out of a survey we did at this company.  They’re a real turn on (ok, if you’re a total nerd):
- 80.6% say they participate because it’s fun.
- 35.5% say they do so because it “Gets me thinking about stuff that helps me do my job”
- 44.1% say “I believe my input is valuable”

- 52.1% say they usually research the forecast before betting; another 38% say at least sometimes
- 54.7% believe that business leaders value the knowledge
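
To make the ‘official’ baselines Fine describes above a bit more concrete, here is a rough sketch of the two starting points she mentions: summing per-axis ratings to get a Metacritic-style estimate, and applying a growth percentage to historical sales. The axis names, the number of axes, and the growth figure are invented for illustration; they are not the company’s actual inputs.

```python
# Rough sketch of the two 'official' baselines described above. Axis names,
# axis count, and the growth rate are invented for illustration only.

def official_metacritic_estimate(axis_scores):
    """Rate the game on several axes (each out of 10) and add them up."""
    return sum(axis_scores.values())

def official_sales_forecast(last_period_sales, growth_rate):
    """Apply a growth percentage to historical sales data."""
    return last_period_sales * (1 + growth_rate)

axes = {"playability": 8, "graphics": 9, "sound": 7, "story": 8, "controls": 8,
        "value": 7, "online": 8, "longevity": 7, "polish": 9, "fun": 9}
print(official_metacritic_estimate(axes))          # 80 (on a 0-100 scale)
print(official_sales_forecast(1_200_000, 0.05))    # 1260000.0 units
```

These are the numbers the markets are seeded with; the crowd then trades the predictions away from them as new information arrives.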

Crowdcast has had two important insights; the first is common to all prediction market vendors, while the second is unique to them (as far as I know). The common one is captured well in a couple paragraphs from the Times post:

“It’s a huge problem in corporations — by the time information gets to the top, it’s meaningless or too late,” said Mat Fogarty, Crowdcast’s founder and chief executive. Mr. Fogarty spent a decade doing corporate forecasting at Electronic Arts and Unilever, where he saw how inaccurate official corporate forecasts could be. “There is a lot of information not getting to decision-makers, and that’s expensive,” he said.

At Electronic Arts, Mr. Fogarty said, the most junior people often had the most insight into the quality of new video games, and he picked up the best information about the latest version of Madden while playing soccer games on the company lawn.

Fogarty saw the same thing that the great Austrian economist Friedrich von Hayek noted in his seminal 1945 article “The Use of Knowledge in Society”:

The… problem of a rational economic order is determined precisely by the fact that the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess. The economic problem of society is… a problem of the utilization of knowledge which is not given to anyone in its totality.

Hayek also saw how beautifully prices established by markets solved this vexing problem:

We must look at the price system as such a mechanism for communicating information if we want to understand its real function… The most significant fact about this system is the economy of knowledge with which it operates, or how little the individual participants need to know in order to be able to take the right action. In abbreviated form, by a kind of symbol, only the most essential information is passed on and passed on only to those concerned. It is more than a metaphor to describe the price system as a kind of machinery for registering change…

The marvel is that in a case like that of a scarcity of one raw material, without an order being issued, without more than perhaps a handful of people knowing the cause, tens of thousands of people whose identity could not be ascertained by months of investigation, are made to use the material or its products more sparingly; i.e., they move in the right direction…

I have deliberately used the word “marvel” to shock the reader out of the complacency with which we often take the working of this mechanism for granted. I am convinced that if it were the result of deliberate human design, and if the people guided by the price changes understood that their decisions have significance far beyond their immediate aim, this mechanism would have been acclaimed as one of the greatest triumphs of the human mind.

(sorry for the long quote, but I really love this paper)

Fogarty started Crowdcast because he believed, as did Hayek, that markets were marvelous devices for aggregating and sharing important knowledge. Fogarty further believed that they could be used to address the problems of poor intra-company information flows he had witnessed firsthand. In the era of high bandwidth, powerful browsers, and software as a service, he saw an opportunity to bring Hayek’s marvel inside the firewall, and to apply it to a range of tasks.

The company’s second insight was that most people are not stock traders by nature, and the typical trader’s interface of current prices, bid-ask spreads, and order books is off-putting and alienating to most (it certainly is to me). So the Crowdcast team replaced it with a much simpler interface – one that lets participants make their trades graphically, by moving sliding bars across a bell curve. They make a prediction by selecting a range of values; a tight range indicates a confident prediction (and a high payout if it turns out to be correct), while a large range corresponds to a less confident prediction. This demo from Crowdcast shows how the system works. I find it simple and highly intuitive, and Fine has worked to make sure all the underlying math is solid.
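
The post describes the range-based interface but not the payout math behind it. For a concrete picture of how a system can reward tight ranges that turn out to be right, here is a sketch using the standard “interval score” from the forecasting literature (lower is better); it illustrates the idea of confidence-weighted range predictions and is not Crowdcast’s actual scoring rule.

```python
# Sketch of scoring range-based predictions so that tight, correct intervals
# beat wide or wrong ones. This is the standard "interval score" (lower is
# better) from the forecasting literature, NOT Crowdcast's actual payout math.

def interval_score(lower, upper, actual, alpha=0.1):
    """Interval score for a central (1 - alpha) prediction interval."""
    score = upper - lower                         # penalty for hedging with a wide range
    if actual < lower:
        score += (2 / alpha) * (lower - actual)   # penalty for missing low
    elif actual > upper:
        score += (2 / alpha) * (actual - upper)   # penalty for missing high
    return score

actual_sales = 118  # in thousands of units; a made-up outcome for illustration
print(interval_score(110, 125, actual_sales))   # confident and right: 15.0
print(interval_score(80, 160, actual_sales))    # right but vague: 80.0
print(interval_score(130, 140, actual_sales))   # confident and wrong: 250.0
```

The point of a rule like this is that hedging with an enormous range is penalized, but so is false confidence, which is exactly the trade-off the slider interface asks participants to make.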

The evidence is mounting that corporate prediction markets work as advertised, delivering quick, accurate, and decisive predictions in areas of great interest. Furthermore, the evidence so far also suggests that they work better than current corporate forecasting techniques, at least in some circumstances. So are there any good reasons left for not using them, or at least experimenting with them? I ask seriously: why would any enlightened company not avail itself of this technology? Can you come up with any legitimate reasons not to jump in with prediction markets?  Leave a comment, please, and let us know.

  • http://www.darmowe-mp3-wyszukiwarka.pl darmowe mp3

    In general it’s better to be safe than sorry, so spending a bit more on predictions before launching a product is a good thing in my book. Asking employees is a great move. At first I simply assumed their predictions would be worthless because they are all biased. But then it came to me – they are biased with, so to speak, knowledge! They know the product (since they made it) and know its faults better than anyone else. That’s the reason for the disturbingly high statistics.
    The only reason I can see not to predict is when you’re occupying a niche market and have your faithful customers.

  • http://www.intuitlabs.com Tad Milbourn

    A very interesting post and fascinating data. The question running through my head though is the extent or limit to which prediction markets can be used. At first glance, it seems that those participating in the prediction market need to be a good proxy for the customer the company running the market hopes to target.

    In the Electronic Arts example, their front-line workers are often gamers (EA’s target customer), so they serve as a good proxy. The often-cited Best Buy “Blue Shirt Nation” example also fits this pattern. Best Buy employees are reasonable proxies for the electronics enthusiasts who shop at Best Buy stores.

    Does this resonate with the data you’re seeing? Do participants need to be proxies or “informed enough” to make valid decisions in these markets? Is the crowd more accurate even when making predictions outside its collective expertise?

  • http://twitter.com/ArikJohnson Arik Johnson

    Since reading the Times story I’ve been wondering: Why anonymous? If it is to maximize participative liquidity in any given market where insider information would prove salient, that’s answer enough; however…

    With government regulation looking like it’ll only increase going forward (at least in the U.S.) it strikes me that corporations seeking to aggregate knowledge in the enterprise using PMs as a method will need to know “who knew what and when” more than ever before.

    The Times piece and subsequent comments linger over this point of anonymity as a primary asset and differentiator (which I acknowledge, by the way), but the upside escapes me if it’s impossible to answer the who-knew-when question.

    Other than removing disincentives to participate (where insider knowledge that could game the market makes it compelling for the insider to stay anonymous), the reason regulators will seek to uncover them is to have a paper trail for calling out those insiders in the event of impropriety. Ask any legal discovery team why IM and email are archived and you’ll know what I mean.

  • http://blog.crowdcast.com/?p=49 McAfee Blog Post – Andy Rules!

    [...] McAfee, father of Enterprise 2.0, wrote a great blog post today about prediction market use, and our technology in [...]

  • http://www.allenvarney.com Allen Varney

    You’re absolutely right that prediction markets can be uncannily accurate. In January 2007 I wrote about them (in the context of online games) in the article “Buzz Games” in the online gaming magazine The Escapist:

    http://www.escapistmagazine.com/articles/view/issues/issue_82/467-Buzz-Games

    The Oddhead blog by Dr. David Pennock, a computer scientist at Yahoo, frequently covers prediction markets:

    http://blog.oddhead.com/

  • http://www.midasoracle.org/2009/07/07/surprise-surprise-andrew-mcafee-is-pulling-for-crowdcast/ Surprise, Surprise: Andrew McAfee is pulling for CrowdCast. | Midas Oracle .ORG

    [...] Surprise, Surprise: Andrew McAfee is pulling for CrowdCast. Written by Chris F. Masse on July 7, 2009 — Leave a Comment He loves it. [...]

  • http://www.midasoracle.org/2009/06/25/crowdcast-collective-forecasting-collective-intelligence-that-predicts/ CrowdCast = Collective Forecasting = Collective Intelligence That Predicts | Midas Oracle .ORG

    [...] Andrew McAfee on CrowdCast [...]

  • Wilson

    Thanks for the article. I certainly don’t disagree that prediction markets are an important method of information gathering for any enterprise. However, I think we need more guidance, more best practices on how to use those results, when we should be casting for predictions, and just what sort of outcomes and circumstances we’re encouraging predictions for. Prediction markets have an inherent limitation of being grounded in conventional thinking and in lowest-common-denominator thinking, even within an expert community. We need to reconcile those limitations with their power in order to identify when and how best to use prediction markets. I would be the last person to deny the power of the effective use of prediction markets, but the issue I see is that, in evangelizing for prediction markets, we seem to imply that they are always accurate, when clearly they are not, or not as such, depending upon the point in time, the level of expertise of the crowd involved, and the visibility and circumstances of “today” which blind us to tomorrow.

    The easiest example of the limitation of predictive markets lies in the public space. Had we simply relied on predictive markets sometime around October or November 2007, those markets generally gave McCain no chance of getting the nomination and indicated Obama had little chance, and by simplistic usage of predictive markets we’d have said we would not see an Obama-McCain scenario. But any political analyst understood those results were rooted in the (mistaken and limited) conventional wisdom and general knowledge available at that moment, and held by larger and less sophisticated crowds in this knowledge space. And to be fair to the amateur analysts of that moment, even a crude predictive market among so-called pundits, I am pretty sure, would have shown similar results, even if with some marked uptick for Obama and a marginal uptick (but still a call for a loss of the nomination) for McCain.

    Now here is also where intelligent crowdsourcing can overcome or refine predictive-market aggregation outcomes. Crowdsourcing mechanisms by which we identify experts with good track records and isolate the TREND among particular crowd elements could well inform us of particularities that would suggest the extreme flux of the moment, the greater chances that McCain enjoyed all along, and the limited parameters of the (at the time) remote-but-real probabilities/scenarios for an Obama breakthrough.

    Similarly, in the economic prediction crowd, there was a growing trend of people spotting a crisis coming before it hit, but it seems no prediction market came close to giving us an accurate picture that a meltdown was imminent. On the other hand, we did see a growing number of experts suggest exactly that in the 1-2 years preceding the meltdown. Again, a mixture of predictive markets and careful trend monitoring of qualitative crowdsourcing feedback seems more useful here than predictive markets alone.

    So we have to understand when, where, and how to use predictive markets and not turn them into tomorrow’s surveys and polls, mechanisms which are too often abused, too often underexplained, and thus (mistakenly) too often dismissed. I suggest the greater frontier is that explanation, rather than simply selling the idea that they are useful.

  • http://www.asynchronoussynchrony.org bob roan

    It sounds great, but I’m too confused to jump. Which way?
    What does a good application look like? Could a retailer use it to decide if a new product line would make sense? What size, in dollars, would be necessary? Or could an entrepreneur use it to check out ideas? Or is this just for big companies? And then, who do I ask to be part of the prediction markets? Employees, customers, the general public?
    The science is compelling, but how does a small business person figure out how to apply it?

  • glenn morgan

    Enjoyed your post.
    1) As a base contribution, “crowdcasting” provides additional data against which you may compare more traditional predictive models; if the two sets of predictions diverge, it is time to start digging into the assumptions and weights associated with your traditional variables, as well as the weight assigned to a costly error

    2) Evaluations such as those provided by Crowdcast demonstrate that the rational actor is joined in the marketplace by the rational crowd, assimilating data and optimizing likely outcomes via a much larger set of inputs and possible outcomes than those available to any individual

    3) Traditional predictions are often costly. Crowdcasting techniques are cost effective; at a minimum they provide a powerful directional starting point. Why would you not employ a cost effective technique to confirm your more traditional forecasts are marching in the correct direction?

  • http://www.odzyskiwanie.com.pl/ Odzyskiwanie Danych

    The crowd gives you a statistically better chance of getting the right answer. People who are wrong will simply be in the minority. When asking “professionals” for opinions you’re risking getting the wrong answer as “the only true one.”
