22  Predictions Affecting Outcomes


Rahul Shah, Jacob Steinhardt

22.1 How Predictions Can Affect Outcomes

Predictions can affect outcomes in two primary ways:

  1. Prediction provides information, which affects behavior
  2. Prediction market creates incentives, which influence actions

22.1.1 Yelp Reviews – Case Study

Anderson and Magruder (2012) studied the effect of Yelp reviews on restaurant behavior.

Specifically, they asked: does an extra half-star rating increase affect the rate at which restaurants sell out?

They found that it does. This is an example of how predictions can affect outcomes by providing information.

22.2 Predictions That Affect Behavior

  • Child is predicted to be a good soccer player, gets additional training, coaches, etc. -> more likely to become a good soccer player
  • Primary candidate is predicted to not be competitive -> people vote for someone else to avoid “wasting their vote”
    • Some politicians pay for polls to reveal this information
  • If a bank is predicted to be insolvent, it can cause a bank run
  • If a crime is predicted to occur, police may stop the crime

Brainstorm additional examples

(whitespace to avoid spoilers)

  • Stock going up -> people buy it -> stock goes up
  • AI progress predicted to be fast -> more investment -> faster progress
  • Forecast of disaster -> reduce deaths
  • Prediction of revolt -> policies to prevent revolt
  • Prediction of rain -> umbrella -> don’t get wet
  • State-specific election probabilities -> candidate behavior -> state-specific outcomes
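Several of the examples above are feedback loops: the predicted probability changes behavior, which changes the true probability. A toy way to see this is to iterate a response curve until the market price is self-consistent. The steep curve below is purely a hypothetical assumption, chosen to show how the same dynamics can settle into either a low or a high equilibrium, as in the stock example.

```python
import math

def outcome_prob(price: float, steepness: float = 10.0) -> float:
    """Hypothetical response curve: the higher the predicted probability,
    the more people act in ways that make the event happen (e.g. buying a
    stock because it is predicted to go up)."""
    return 1.0 / (1.0 + math.exp(-steepness * (price - 0.5)))

def self_consistent_price(start: float, steps: int = 100) -> float:
    """Iterate price -> induced probability until they agree."""
    p = start
    for _ in range(steps):
        p = outcome_prob(p)
    return p

# The same dynamics support two stable equilibria:
low = self_consistent_price(0.1)    # pessimism is self-fulfilling
high = self_consistent_price(0.9)   # optimism is self-fulfilling
```

With these assumed parameters, a market that starts low stays low and a market that starts high stays high, even though the underlying "world" is identical in both runs.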

22.2.1 Theoretical Scenario

  • If a crime is predicted to occur, police may stop the crime.

  • Suppose we had a prediction market on “Will the Blue Hope Diamond be stolen before December 31st, 2024?”

    • Suppose you know of a secret plot to steal the diamond next week.
    • What should you bet on this market? What are your strategies for making money?
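One way to frame the question above: a large, visible "Yes" bet can itself alert the diamond's security and foil the very plot you are betting on. Every number below is an illustrative assumption, but the sketch shows the tension: the more your bet moves the market, the more it undermines your own edge.

```python
# All parameters are illustrative assumptions, not data.
P_THEFT_UNNOTICED = 0.9   # the plot succeeds if nobody reacts
P_THEFT_IF_ALERTED = 0.1  # the plot is probably foiled if security reacts
ALERT_THRESHOLD = 0.3     # security steps up if the price exceeds this

def yes_profit(stake: float, price: float, resolves_yes: bool) -> float:
    """Profit from staking `stake` on 'Yes' at probability-price `price`
    (simplified: the whole order fills at the final price)."""
    return stake * (1.0 - price) / price if resolves_yes else -stake

def strategy_ev(final_price: float, stake: float = 100.0) -> float:
    """Expected value of a 'Yes' bet that moves the market to `final_price`."""
    p = P_THEFT_IF_ALERTED if final_price > ALERT_THRESHOLD else P_THEFT_UNNOTICED
    return p * yes_profit(stake, final_price, True) + \
        (1 - p) * yes_profit(stake, final_price, False)

quiet = strategy_ev(0.2)   # small bet: market barely moves, plot proceeds
loud = strategy_ev(0.6)    # big bet: guards notice, plot is foiled
```

Under these assumptions the quiet bet has positive expected value while the loud one loses money on average: your information is only worth something if revealing it does not change the outcome.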

22.3 Information Boosting

  • Kevin created this market to encourage developers to read his argument

  • If the market is low, developers might not read it -> outcome likely to be “No”

  • If the market is high, developers probably read it -> outcome “Yes” if the argument is actually good

  • Suppose the market is at 5% and you think Kevin’s argument is actually good. Should you bid on “Yes”?
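Under the assumed numbers below (all of which are hypothetical), the low price is partly self-fulfilling: at 5% few developers bother to read the argument, so "No" is likely. Bidding "Yes" can be profitable precisely because your bet makes developers read it.

```python
# Toy model; every number here is an assumption for illustration.
P_READ_IF_HIGH = 0.8   # developers read the argument if the market is high
P_READ_IF_LOW = 0.05   # they mostly ignore it if the market stays at 5%
P_GOOD = 0.9           # your credence that the argument persuades once read

def p_yes(market_high: bool) -> float:
    """Probability the market resolves 'Yes' given the attention level."""
    return (P_READ_IF_HIGH if market_high else P_READ_IF_LOW) * P_GOOD

def ev_per_dollar_on_yes(buy_price: float = 0.30) -> float:
    """EV per $1 on 'Yes', assuming your buying itself pushes the market high."""
    p = p_yes(market_high=True)
    return p * (1.0 - buy_price) / buy_price - (1.0 - p)

# At 5% the event only happens ~4.5% of the time, so the low price is
# roughly self-consistent -- yet bidding it up still has positive EV.
```

The interesting feature is that both the 5% price and a much higher price can be "correct": the market price partly determines the probability it is trying to estimate.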

22.4 Information Elicitation

  • Austin is CEO of Mantic

  • The market is basically a prediction of what he will think

  • Suppose you had useful arguments about whether “Dynamic Parimutuel” was a good system—how could you make money?

22.5 Prediction Markets Create Incentives

We have now seen two examples of predictions affecting outcomes by providing information (information boosting and information elicitation). Let’s look at a third mechanism: prediction markets create incentives.


In 2003, DARPA proposed a “Policy Analysis Market” for predictions on a number of geopolitical events. One of their included examples was “assassination of Yasser Arafat”.

  • How could a (very) unscrupulous person make money off of this?
  • How big of a problem is this in reality?

(whitespace to avoid spoilers)

  • Assassinate Yasser Arafat -> bet on “Yes” -> profit
  • In principle, this is a big problem because it creates incentives for people to do bad things. In practice, it is less of a problem: the lack of liquidity on these markets makes it negative expected value to assassinate a well-known figure for only 300 USD of potential profit.
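The liquidity point can be made with back-of-the-envelope arithmetic (all numbers below are assumptions): a thin market caps the attacker's profit, while the downside of committing the crime is enormous.

```python
# Illustrative assumptions only.
MAX_MARKET_PROFIT = 300.0      # USD extractable from a thin market
P_CAUGHT = 0.95                # assumed chance of being caught
COST_IF_CAUGHT = 10_000_000.0  # crude dollar stand-in for life in prison

def crime_expected_value() -> float:
    """EV of committing the crime just to win the bet."""
    return MAX_MARKET_PROFIT - P_CAUGHT * COST_IF_CAUGHT

# Hugely negative: the market's incentive is dwarfed by the downside.
```

The qualitative conclusion survives almost any reasonable choice of parameters: as long as the market is thin, the bet's upside is trivial next to the crime's expected cost.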

However, the market was shut down due to public outcry, and the official in charge of the program resigned over it.

22.6 Creating Useful Incentives

The sex researcher Aella created a market on whom she would date next. She wanted to incentivize people to give her good advice on whom to date, and she ended up dating the person who was most recommended.

22.7 Incentives in Data Science Settings

Many data science algorithms make predictions:

  • Loans / credit scores
  • YouTube recommendation algorithm
  • Fraud detection

Each of these creates incentives for end users! (And those incentives can render the predictions inaccurate.)

  • What are other examples?


If you want your Twitter post to be recommended by the algorithm, what should you do?

(whitespace to avoid spoilers)

  • Post something that will get a lot of engagement. Controversial and hateful content tends to get the most engagement, so the algorithm creates an incentive to post it.
  • This leads to a feedback loop: more hateful content gets recommended, which leads to more hateful content being posted, and so on.

If you are interested in learning more about creating aligned incentives, as well as differentially private recommendation systems and fraud detection, take DATA C102 and Michael Jordan’s class.

22.8 Conclusion

  • Calibration – “I’m 80% sure” now means something :)
  • Fermi estimates
  • Finding and integrating information
  • Evaluating the trustworthiness of sources
  • Understanding the inherent uncertainty in the world
    • SF half-marathon
    • Club tables
    • “Other” options
  • Glimpse into mathematical tools
    • Martingales, Kelly betting, independence / copulas, winner’s curse
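As a refresher on one of the tools listed above, here is the standard Kelly criterion for a binary bet; the example numbers are illustrative, not from the course.

```python
def kelly_fraction(p: float, b: float) -> float:
    """Optimal fraction of bankroll to stake on a bet that wins with
    probability p and pays b-to-1 on a win: f* = p - (1 - p) / b."""
    return p - (1.0 - p) / b

# Even-odds bet you believe wins 60% of the time: stake 20% of bankroll.
f = kelly_fraction(0.6, 1.0)
# A fair coin at even odds has no edge, so Kelly says stake nothing.
g = kelly_fraction(0.5, 1.0)
```

Negative values of f* mean you should not take the bet at all (or should take the other side, if that is available).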