Hyperion-DeepSpace-218M

Emergence Labs • 2025

"Reasoning is not emergent.
It is engineered in 64 parallel paths."
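
The card does not explain what a "path" is here. As a loose illustration only (the block name, the width, and the average-then-residual combination are all assumptions, not the released Hyperion architecture), 64 parallel paths over a shared hidden state could be sketched as:

```python
import torch
import torch.nn as nn

class ParallelPathBlock(nn.Module):
    """Hypothetical 64-path block: every path reads the same hidden state,
    and their outputs are averaged into one residual update.
    Illustrative only; not the released Hyperion architecture."""

    def __init__(self, d_model: int = 512, n_paths: int = 64):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.paths = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_model),
                nn.GELU(),
                nn.Linear(d_model, d_model),
            )
            for _ in range(n_paths)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.norm(x)
        # Run all paths on the shared input and average their proposals.
        update = torch.stack([path(h) for path in self.paths]).mean(dim=0)
        return x + update
```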

  • 218 million parameters
  • Trained on TinyStories in under 1 hour on a single GPU
  • Final perplexity (PPL) on TinyStories: ~7.3 (better than most 1B+ parameter models)
  • No RLHF • No synthetic data • No preference tuning
  • Pure next-token prediction with explicit symbolic decomposition (see the training sketch below)
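
"Pure next-token prediction" is the standard causal language-modeling objective, and perplexity is the exponential of the mean cross-entropy per token. A minimal sketch under those standard definitions (the tensors below are random placeholders, not Hyperion outputs):

```python
import math
import torch
import torch.nn.functional as F

def next_token_loss(logits: torch.Tensor, tokens: torch.Tensor) -> torch.Tensor:
    """Causal LM objective: predict token t+1 from tokens up to t."""
    # logits: (batch, seq, vocab); tokens: (batch, seq)
    return F.cross_entropy(
        logits[:, :-1].reshape(-1, logits.size(-1)),  # predictions at positions 0..T-2
        tokens[:, 1:].reshape(-1),                    # targets shifted one step left
    )

# Sanity check with random logits: loss is close to ln(vocab).
vocab, batch, seq = 100, 2, 16
logits = torch.randn(batch, seq, vocab)
tokens = torch.randint(0, vocab, (batch, seq))
print(next_token_loss(logits, tokens))  # roughly ln(100), about 4.6

# Perplexity is exp(mean loss), so a reported PPL of ~7.3 implies
# a mean loss of ln(7.3), about 1.99 nats per token.
print(math.log(7.3))
```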

This is the first public release from Emergence Labs.

We are just getting started.
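
The card ships no usage snippet, and it does not say whether the checkpoint is transformers-compatible. Assuming it is (the prompt string below is arbitrary), loading it would follow the usual Hugging Face pattern:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumes the repo ships a transformers-compatible config and tokenizer;
# the model ID is taken from this card.
model_id = "WebEssentz/hyperion-deepspace-218m"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Once upon a time", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```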

WebEssentz / Onyerikam Godwin Gideon
December 2025

Dataset used to train WebEssentz/hyperion-deepspace-218m: TinyStories