Good Judgment Open - FAQ
gjopen.com/faq
GJO is relevant to AI safety as a tool for developing forecasting skills and for producing crowd-aggregated predictions about AI and geopolitical risks; its methodology informs how AI risk forecasting platforms are designed and evaluated.
Metadata
Importance: 35/100 · documentation · reference
Summary
Good Judgment Open (GJO) is a crowd-forecasting platform derived from the Good Judgment Project, where users make probabilistic forecasts about future events and are scored for accuracy using Brier Scores and Relative Brier Scores. The FAQ explains platform mechanics including scoring methodology, challenge competitions, and the wisdom-of-the-crowd philosophy underpinning the site.
Key Points
- GJO is a public crowd-forecasting platform built on research from the Good Judgment Project, which demonstrated that crowd wisdom can improve geopolitical forecasting.
- Forecasters compete in themed 'challenges' and are ranked by accuracy using Brier Scores (lower = better), similar to golf scoring.
- The Relative Brier Score compares individual forecaster accuracy against the crowd median, serving as the primary performance metric.
- Unlike prediction markets, GJO allows forecasters to share reasoning and challenge each other's assumptions, promoting calibrated thinking.
- The platform is designed for skill development in probabilistic forecasting, open to anyone regardless of background.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| Good Judgment (Forecasting) | Organization | 50.0 |
Cached Content Preview
HTTP 200 · Fetched Mar 20, 2026 · 34 KB
### Frequently Asked Questions (FAQ)
Last updated: 3 June 2024
**Please read our FAQ below. If you still have questions about Good Judgment Open (GJ Open/GJO), [email us at beta@goodjudgment.com](mailto:beta@goodjudgment.com).**
**1. What is GJ Open?**
GJ Open is a crowd-forecasting site where you can hone your forecasting skills, learn about the world, and engage with other forecasters. On GJ Open, you can make probabilistic forecasts about the likelihood of future events and learn how accurate you were and how your accuracy compares with the crowd. Unlike prediction markets and other forecasting sites, you can share your reasoning with other forecasters to challenge your assumptions.
GJ Open taps into the [Wisdom of the Crowd](https://en.wikipedia.org/wiki/Wisdom_of_the_crowd). We believe in the wisdom of the crowd and hope to use that wisdom to better understand, and predict, the complex and ever-evolving world that we live in.
GJ Open was born out of the [Good Judgment Project](https://www.doncio.navy.mil/chips/ArticleDetails.aspx?ID=5976), a multi-year research project which showed that the wisdom of the crowd could be applied to forecasting. Good Judgment Inc was founded to bring the science of forecasting to the public. GJ Open is designed for anyone and everyone to improve their forecasting skills and is not itself a scientific research project.
**2. What is forecasting?**
If you're new to forecasting, we encourage you to watch a [short video](https://www.youtube.com/watch?v=1SpqQQDoDA4) about probability forecasting.
When you're ready to begin, look at our active [questions](https://www.gjopen.com/questions) and start forecasting!
**3. How do I compete against other forecasters?**
Competitions on GJ Open are called [challenges](https://www.gjopen.com/challenges). Challenges are collections of questions organized by a theme or topic. Each challenge has its own leaderboard, which ranks forecasters by comparative accuracy against the forecasters participating in a specific challenge.
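The FAQ does not spell out how a leaderboard aggregates accuracy beyond ranking by "comparative accuracy" against other participants in the challenge. As a rough, hypothetical sketch (not GJ Open's actual implementation), one could average each forecaster's per-question relative scores within a challenge and sort ascending, since lower scores mean better accuracy:

```python
# Hypothetical leaderboard sketch: rank forecasters by the mean of their
# per-question relative scores within a challenge (lower = more accurate).
# The data shape and aggregation are illustrative assumptions, not GJ Open's API.

def leaderboard(relative_scores_by_forecaster):
    """relative_scores_by_forecaster: {forecaster: [score_q1, score_q2, ...]}"""
    averages = {
        name: sum(scores) / len(scores)
        for name, scores in relative_scores_by_forecaster.items()
        if scores  # skip forecasters with no scored questions
    }
    # Ascending sort, as in golf: the lowest average score ranks first.
    return sorted(averages.items(), key=lambda item: item[1])

scores = {
    "alice": [-0.10, 0.05, -0.02],
    "bob": [0.20, 0.00, 0.10],
    "carol": [-0.05, -0.15, 0.00],
}
for rank, (name, avg) in enumerate(leaderboard(scores), start=1):
    print(rank, name, round(avg, 3))  # carol, then alice, then bob
```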
**4. How are my forecasts scored for accuracy?**
We encourage all forecasters to watch our short video about scoring at [https://goodjudgment.io/Training2/KeepingScore/index.html](https://goodjudgment.io/Training2/KeepingScore/index.html).
We report three different numbers to quantify your forecasting accuracy and compare it to other users on the site: Brier Score, Median Score, and Relative Brier Score. **Lower scores always indicate better accuracy**, like in golf. Our primary measure of accuracy is called the Relative Brier Score (formerly known as Accuracy Score), which compares your score to the crowd. Scoring a question doesn’t occur until the outcome is known and the question has resolved.
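To make the arithmetic concrete, here is a minimal Python sketch, assuming the classic multi-category Brier score (sum of squared errors across all answer options) and treating the Relative Brier Score as the difference between a forecaster's Brier score and the crowd median's, as the key points above describe. GJ Open's full scoring pipeline includes details not shown in this truncated FAQ, so treat this as an illustration rather than the site's exact method:

```python
# Sketch of Brier-style scoring; GJ Open's exact pipeline may differ.
from statistics import median

def brier_score(forecast_probs, outcome_index):
    """Sum of squared differences between forecast probabilities and the
    realized outcome (1 for the option that happened, 0 otherwise)."""
    return sum(
        (p - (1.0 if i == outcome_index else 0.0)) ** 2
        for i, p in enumerate(forecast_probs)
    )

def relative_brier_score(own_probs, crowd_probs_list, outcome_index):
    """Own Brier score minus the median of the crowd's Brier scores.
    Negative values indicate better-than-crowd accuracy (lower is better)."""
    own = brier_score(own_probs, outcome_index)
    crowd_median = median(
        brier_score(probs, outcome_index) for probs in crowd_probs_list
    )
    return own - crowd_median

# Example: a binary question ("Yes" = index 0) that resolved "Yes".
print(brier_score([0.80, 0.20], outcome_index=0))  # 0.08
print(relative_brier_score(
    [0.80, 0.20],
    [[0.60, 0.40], [0.50, 0.50], [0.90, 0.10]],
    outcome_index=0,
))  # -0.24: better than the crowd median of 0.32
```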
On your profile page, next to each question you’ll see several columns. Here are more detailed explanations of each:
**Brier Score**: The [Brier score](https://en.wikipedia.org/wiki/Brier_score) was originally proposed to quantify the accuracy of weath
... (truncated, 34 KB total)
Resource ID: 57d7e92f3c3d30d4 | Stable ID: NjljOWNkMj