Scaling Laws
Status: active

Empirical research on how AI capabilities scale with compute, data, and parameters; crucial for forecasting future capabilities.
Organizations: 3 · Key Papers: 2 · Grants: 2 · Total Funding: $89K
First Proposed: 2020 (Kaplan et al., OpenAI)
Cluster: Capabilities Research
Tags: scaling, capabilities, forecasting
Organizations (3)
| Organization | Role |
|---|---|
| Anthropic | active |
| Google DeepMind | active |
| OpenAI | pioneer |
Grants (2)
| Name | Recipient | Amount | Funder | Date |
|---|---|---|---|---|
| 4-month stipend: Research on agent scaling laws—relationships between training compute and agent capabilities of LLMs | Axel Højmark | $70K | Long-Term Future Fund (LTFF) | 2024-07 |
| 10-month salary for research on AI safety/alignment, scaling laws, and potentially interpretability | Benedikt Hoeltgen | $19K | Long-Term Future Fund (LTFF) | 2021-10 |
Funding by Funder
| Funder | Grants | Total Amount |
|---|---|---|
| Long-Term Future Fund (LTFF) | 2 | $89K |
Key Papers & Resources (2)
| Paper | Authors | Year | Status |
|---|---|---|---|
| Scaling Laws for Neural Language Models | Kaplan et al. (OpenAI) | 2020 | Seminal |
| Training Compute-Optimal Large Language Models | Hoffmann et al. (DeepMind) | 2022 | Seminal |
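The compute-optimal result from the Hoffmann et al. (2022) paper above can be sketched numerically. The sketch below assumes the paper's parametric loss form L(N, D) = E + A/N^α + B/D^β, approximate fitted constants taken from that paper (treat the exact values as illustrative), and the common C ≈ 6·N·D approximation for training FLOPs; it is a minimal illustration, not the paper's implementation.

```python
# Sketch of the Chinchilla-style parametric loss (Hoffmann et al., 2022):
#   L(N, D) = E + A / N**alpha + B / D**beta
# where N = parameter count and D = training tokens.
# Constants are approximate fits reported in the paper (illustrative here).
E, A, B = 1.69, 406.4, 410.7
alpha, beta = 0.34, 0.28


def loss(N, D):
    """Predicted training loss for N parameters and D tokens."""
    return E + A / N**alpha + B / D**beta


def compute_optimal(C):
    """Split a FLOP budget C ~= 6*N*D to minimize the predicted loss.

    Minimizing L(N, D) subject to N*D = C/6 gives a closed form:
      N_opt = G * (C/6)**(beta/(alpha+beta))
      D_opt = (1/G) * (C/6)**(alpha/(alpha+beta))
    with G = (alpha*A / (beta*B))**(1/(alpha+beta)).
    """
    G = (alpha * A / (beta * B)) ** (1.0 / (alpha + beta))
    N_opt = G * (C / 6) ** (beta / (alpha + beta))
    D_opt = (1.0 / G) * (C / 6) ** (alpha / (alpha + beta))
    return N_opt, D_opt


# Roughly Chinchilla's training budget (~5.76e23 FLOPs).
N, D = compute_optimal(5.76e23)
```

Because α ≈ β, the closed form implies that parameters and tokens should grow in near-equal proportion with compute, which is the paper's headline finding.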