Validation Data

Benchmark Results — April 2026

Post-calibration sweep results across 74 matched benchmarks. Metrics, per-cluster bias, and methodology. Last updated 24 April 2026 (post-DB-reseed pass).

Aggregate Metrics (74 matched benchmarks)

Metric               Value
Mean |Δ|             0.726s
Median |Δ|           0.240s
Bias (sim − real)    +0.550s
Top-10 mean error    ±0.032s
Within ±0.10s        15/74
Within ±0.20s        30/74
Within ±0.50s        54/74
Within ±1.00s        58/74

Reference Baselines (hand-calibrated)

Three reference bikes from defaults.py are hand-calibrated and excluded from the ALL_KNOWN_BIKES auto-reseed. They serve as regression anchors — if any code change moves these by more than ±0.02s, something is wrong.

Bike                    Target     Sim        Drift vs. last pass
GSX-R 1000 K5 (2005)    9.970s     10.034s    +0.004s
YZF-R1 4C8 (2007)       10.300s    10.286s    -0.001s
Hayabusa Gen1 (1999)    10.350s    10.471s    +0.002s
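The ±0.02s anchor rule can be checked mechanically after each sweep. This is a sketch, not the project's actual regression test; the baseline values below are illustrative placeholders, and the function name is invented:

```python
ANCHOR_TOLERANCE = 0.02  # seconds, per the regression-anchor rule above

# Previous-pass sim times for the three hand-calibrated reference
# bikes. Values here are illustrative, not the real stored baselines.
BASELINE_SIMS = {
    "GSX-R 1000 K5": 10.030,
    "YZF-R1 4C8": 10.287,
    "Hayabusa Gen1": 10.469,
}

def check_anchors(current_sims):
    """Return {bike: drift} for anchors that moved beyond tolerance."""
    return {
        name: current_sims[name] - baseline
        for name, baseline in BASELINE_SIMS.items()
        if abs(current_sims[name] - baseline) > ANCHOR_TOLERANCE
    }
```

An empty result means no anchor drifted; any entry is a signal that a code change touched the calibrated physics.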

Top 10 Most Accurate Bikes

From the 74 matched catalog bikes. Sources: Cycle World, Sport Rider, MCN, Motorcyclist, BikeWale, MotoStatz. Conditions: race fuel, professional rider, ideal strip.

#    Bike                        Year   Target    Sim       Δ         Engine · Source
1    Kawasaki Ninja ZX-10R       2021   10.00s    10.01s    +0.011s   998cc I4 · Cycle World 2021
2    Kawasaki Z 650              2017   12.40s    12.41s    +0.012s   649cc I2 · MotoStatz 2017
3    Suzuki TL1000R              1998   10.50s    10.52s    +0.018s   996cc V-twin · Cycle World 1998
4    Honda CBR1000RR-R           2020   9.95s     9.93s     -0.022s   999cc I4 · Cycle World 2020
5    Kawasaki Ninja ZX-9R        2000   10.50s    10.53s    +0.026s   899cc I4 · Sport Rider 2000
6    Yamaha YZF-R1               2009   10.20s    10.17s    -0.028s   998cc I4 CF · Sport Rider 2009
7    BMW M1000RR                 2021   9.80s     9.84s     +0.037s   999cc I4 · Cycle World 2021
8    Ducati Streetfighter V4     2020   10.10s    10.14s    +0.043s   1103cc V4 · Cycle World 2020
9    MV Agusta Brutale 1000 RR   2019   10.15s    10.10s    -0.047s   998cc I4 · Cycle World 2019
10   TVS Apache RR 310           2020   15.46s    15.54s    +0.077s   312cc I1 · BikeWale 2020

Per-Cluster Bias

Positive bias means the simulator predicts a slower time than reality. The positive bias in the small-cc clusters is a known residual of per-bike calibration work that is not yet complete; it is not a regression.

Cluster                   Bias      Status   Note
Litre sport ≥195 hp       -0.11s    Closed   Bias flipped; previously +0.25s slow
600–1000cc sport          +0.15s    Good     Closed from +0.23s pre-fix
500–700cc twin            +0.16s    Good     Closed from +0.64s; Z650 enters top-10
Hayabusa/ZX-14R 1300+     +0.21s    Good     Closed from +0.46s
300–500cc twin            +1.06s    Open     Was +2.88s; 63% closure
200–300cc entry           +2.53s    Open     Per-bike calibration work pending
150–200cc entry           +2.86s    Open     Per-bike calibration work pending
2-stroke vintage          +2.70s    Open     Was +5.6s; 52% closure

Methodology

Benchmark sources: Published magazine timeslips from Cycle World, Sport Rider, MCN, Motorcyclist, and BikeWale; Indian community strip records from MotoStatz, Facebook drag groups, and published drag-event results. All benchmarks assume professional rider, race fuel, and ideal strip conditions unless explicitly noted.

Simulation conditions: All simulations run at ISA sea-level conditions (15°C, 101.325 kPa, 0% RH) to match the implicit conditions of most magazine tests. Bikes use their stock BikeConfig from the seeded database — no manual per-bike tuning of Cd or μ.
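Fixing ISA sea-level conditions pins down air density, which drives aerodynamic drag. A minimal sketch of the density those conditions imply, using the ideal gas law for dry air (the function name is illustrative, not the simulator's API):

```python
def air_density(temp_c=15.0, pressure_kpa=101.325):
    """Dry-air density via the ideal gas law (0% RH, as in the sims)."""
    R_SPECIFIC = 287.05  # J/(kg·K), specific gas constant of dry air
    return (pressure_kpa * 1000.0) / (R_SPECIFIC * (temp_c + 273.15))

rho = air_density()  # ISA sea level: roughly 1.225 kg/m³
```

A hot day at 35°C drops density by about 6.5%, which is why holding conditions constant matters when comparing against magazine numbers.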

Matching: Benchmark-to-bike matching uses word-boundary regex to prevent false positives (YZF-R1 ≠ YZF-R15). Year ranges are respected — a 2009 benchmark does not match a 2015 model that shares a name but has different internals.
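The word-boundary idea can be sketched in a few lines; this is an illustrative stand-in for the matcher, not the project's actual code:

```python
import re

def matches_model(benchmark_name, model_name):
    """Word-boundary match so 'YZF-R1' does not hit 'YZF-R15'."""
    pattern = r"\b" + re.escape(model_name) + r"\b"
    return re.search(pattern, benchmark_name) is not None
```

A plain substring test would pass both names; the trailing `\b` fails on 'YZF-R15' because '5' is a word character, so there is no boundary after 'R1'.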

Known limitations: Residual small-cc and 2T bias is a calibration gap, not a physics gap. The correct fix is per-bike Cd and μ calibration using real strip data — the auto-calibrator in /api/calibrate/dragy handles this once you supply a Dragy CSV for your specific bike.
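Before uploading a Dragy export, it helps to pull out just the quarter-mile runs. This is a hedged sketch: the column names ('Type', 'Time') are assumptions about the export layout, not a documented Dragy schema, so adjust them to match your file before posting it to /api/calibrate/dragy:

```python
import csv
import io

def extract_quarter_mile(dragy_csv_text):
    """Pull 1/4-mile ETs (seconds) from a Dragy-style CSV export.

    The 'Type' and 'Time' column names are assumptions about the
    export format -- verify against your own file.
    """
    reader = csv.DictReader(io.StringIO(dragy_csv_text))
    return [
        float(row["Time"])
        for row in reader
        if row.get("Type", "").strip() == "1/4 Mile"
    ]

sample = "Type,Time\n1/4 Mile,10.42\n0-60 mph,3.10\n1/4 Mile,10.38\n"
```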

Regression testing: 10 dedicated regression tests in tests/test_power_boost_regression.py run on every commit to catch DB drift. The test helper copies motoquant.db to a temp directory to avoid WAL journal-mode errors on mounted filesystems.
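The copy-to-temp pattern could look like the following; this is a sketch of the idea (function name invented), not the actual test helper. SQLite in WAL mode writes -wal/-shm sidecar files next to the database, which fails on read-only or some network-mounted paths, so tests open a local copy instead:

```python
import shutil
import sqlite3
import tempfile
from pathlib import Path

def copy_db_for_tests(db_path):
    """Copy the SQLite DB into a fresh local temp dir before opening.

    WAL journal mode needs a writable directory for the -wal/-shm
    sidecar files; a local tmp copy sidesteps failures on mounted
    filesystems. The caller is responsible for cleaning up the dir.
    """
    tmpdir = Path(tempfile.mkdtemp(prefix="motoquant-test-"))
    dest = tmpdir / Path(db_path).name
    shutil.copy2(db_path, dest)
    return dest
```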