AI Calibration Training

Test your knowledge of AI facts while practicing calibrated uncertainty. Give 90% confidence intervals — aim to be both calibrated and precise.

How it works

  • You'll be asked questions about AI — compute, dates, benchmarks, costs, and more.
  • For each question, provide a 90% confidence interval: a range where you believe there's a 90% chance the true answer falls.
  • You're scored on both calibration (did ~90% of your intervals contain the truth?) and precision (how narrow were your intervals?).
  • Wider intervals are safer but score lower. The best strategy is tight intervals that still capture the truth ~90% of the time.
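The mechanic above can be sketched in a few lines. This is a minimal illustration, not the app's actual code — the tuple layout and the relative-width metric are assumptions chosen to mirror the description:

```python
def calibration_stats(answers):
    """answers: list of (low, high, truth) tuples, one per question.

    Returns the hit rate (ideally ~0.90 for 90% intervals) and the
    average interval width relative to the truth's magnitude
    (a rough precision measure -- narrower is better).
    """
    hits = sum(1 for low, high, truth in answers if low <= truth <= high)
    hit_rate = hits / len(answers)
    avg_rel_width = sum((high - low) / max(abs(truth), 1e-9)
                        for low, high, truth in answers) / len(answers)
    return hit_rate, avg_rel_width

# Example: three questions, two intervals contain the truth -> hit rate 2/3.
rate, width = calibration_stats([(100, 200, 150), (10, 20, 25), (1, 5, 3)])
```

A player whose hit rate sits well above 0.90 is likely giving intervals that are wider than necessary; well below 0.90 means overconfidence.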

Scoring: Each question is scored 0–100. You get a base score for capturing the true value, with a bonus for tighter intervals. Missing the interval incurs a penalty proportional to how far off you were. Your final score is the average across all questions.
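One plausible shape for such a scoring rule is sketched below. The constants (`base`, `bonus`, `penalty_rate`) and the relative-distance penalty are assumptions for illustration, not the app's published formula:

```python
def score_question(low, high, truth, base=60.0, bonus=40.0, penalty_rate=100.0):
    """Score one question on a 0-100 scale (illustrative sketch only).

    Hits earn a base score plus a bonus that grows as the interval
    tightens; misses lose points in proportion to how far the nearest
    bound is from the truth.
    """
    scale = max(abs(truth), 1e-9)  # normalise by the truth's magnitude
    if low <= truth <= high:
        tightness = max(0.0, 1.0 - (high - low) / scale)
        return min(100.0, base + bonus * tightness)
    miss = min(abs(truth - low), abs(truth - high)) / scale
    return max(0.0, base - penalty_rate * miss)
```

With these assumed constants, a tight hit like `score_question(149, 151, 150)` scores close to 100, a wide hit scores nearer the base, and a distant miss falls toward 0; the final game score would then be the mean of the per-question scores.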

Settings

Leaderboard

Game Complete

Overall Score — combines calibration & precision (0–100)

Calibration Analysis

Question Review

Leaderboard