10²⁸ FLOP training run completed
Agent-2 trained at 2×10²⁸ FLOP (Apr 2026 – Mar 2027), representing a 1000× increase over GPT-4 scale.
What AI 2027 Predicted
The scenario’s Compute Forecast table shows Agent-2 trained at 2×10²⁸ FLOP, with training running from April 2026 through March 2027. GPT-4 was trained at roughly 2×10²⁵ FLOP, making Agent-2 a 1000× scale-up over GPT-4 (an earlier version of this page incorrectly said 100×). Note that the scenario’s “Late 2025” section describes datacenter capacity being built, not a completed 10²⁸ training run; the 10²⁸ run itself completes in 2027.
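The scale ratio above follows directly from the two compute figures; a minimal sketch of the arithmetic (using the FLOP estimates quoted in this section):

```python
import math

GPT4_FLOP = 2e25    # estimated GPT-4 training compute
AGENT2_FLOP = 2e28  # Agent-2 training compute per the scenario's Compute Forecast

# Ratio of training compute: 2e28 / 2e25 = 1000x, i.e. 3 orders of magnitude.
ratio = AGENT2_FLOP / GPT4_FLOP
print(f"Agent-2 / GPT-4 scale: {ratio:.0f}x")
print(f"Orders of magnitude: {math.log10(ratio):.1f}")
```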
How We Track This
We monitor:
- Epoch AI’s tracking of largest training runs
- Lab announcements of model training details
- Hardware deployment timelines (e.g., Blackwell clusters becoming operational)
- Analyst estimates of training compute for frontier models
Current Evidence
No confirmed 10²⁸ FLOP training run yet.
- Epoch AI estimates put the frontier at roughly 10²⁶·⁵–10²⁷ FLOP (GPT-4.5, Claude Opus 4.5).
- Epoch projects open models will surpass 10²⁶ FLOP by November 2025 (90% CI: August 2025 – November 2026).
- The largest confirmed run at GPT-4 scale (~2×10²⁵ FLOP) dates to early 2023; GPT-4.5 is estimated at ~10²⁶·⁵ FLOP.
- Infrastructure buildout is ongoing: Stargate Abilene has 2 buildings operational with 8 more under construction, and NVIDIA’s Blackwell Ultra (B300) with 288 GB HBM3e is now shipping.
- If compute scaling proceeds at roughly 70% of the pace the scenario assumed, a 10²⁸ FLOP run is expected in mid-to-late 2026 rather than late 2025.
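The gap between the current frontier and the 10²⁸ target can be framed as a doubling count. A small illustrative sketch, assuming a hypothetical 6-month frontier-compute doubling time (this rate is an assumption for illustration, not a figure from the sources above):

```python
import math

current_frontier_flop = 10**27  # upper end of Epoch's frontier estimate
target_flop = 10**28

# Doublings needed to close a 10x gap: log2(10) ~= 3.32
doublings_needed = math.log2(target_flop / current_frontier_flop)

doubling_time_months = 6  # assumed doubling time, for illustration only
months_to_target = doublings_needed * doubling_time_months

print(f"Doublings needed: {doublings_needed:.2f}")
print(f"Months to 10^28 at a 6-month doubling time: {months_to_target:.0f}")
```

Actual timing depends on hardware deployment and lab decisions, which is why this tracker leans on the infrastructure signals listed above rather than a fixed growth rate.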
Sources:
- Frontier open models may surpass 1e26 FLOP — Epoch AI
- Tracking Large-Scale AI Models — Epoch AI
- Grading AI 2027’s 2025 Predictions — AI Futures Project
- Top 12 NVIDIA GPUs for AI Training in 2026 — Atlantic.net
Counterevidence & Limitations
- Labs are increasingly opaque about training compute, making verification difficult
- Algorithmic improvements may reduce the need for raw compute scaling
- Some labs may have completed large runs without publicly disclosing details
- The 10²⁸ threshold is somewhat arbitrary — continuous improvement matters more than crossing a specific number
What Would Change Our Assessment
- Upgrade to “on-track”: Credible reports of a training run at or near 10²⁸ FLOP
- Downgrade further: If Blackwell deployment delays push the timeline into 2027
Update History
| Date | Update |
|---|---|
| 2025-12 | No confirmed 10²⁸ FLOP training run. Frontier estimated at ~10²⁶·⁵–10²⁷, well below the target. |
| 2026-03 | 10²⁸ FLOP run now expected mid-to-late 2026 based on infrastructure buildout timelines. Prediction was roughly 6–12 months too aggressive. |