Algorithmic Intelligence Research

What happens when algorithms converge?

Compress 10 different algorithms to their Kolmogorov minimum. The structures they share are what mathematical intelligence actually is. AttractorForge maps them.

Open the Vault. Compress algorithms. Discover convergence.

If you independently compress 10 different algorithms to their Kolmogorov-complexity (KC) minimum, what structure do they share? Those shared structural attractors are what mathematical intelligence actually is.

They're the invariants that survive under information-theoretic pressure. Not designed top-down by a human, but discovered bottom-up by compression. AttractorForge is the first platform built to find them, vault them, and map the space between them.

From code to convergence

01

Multi-Strategy Compression

Three parallel refactoring paths per algorithm: minimize tokens, find the recursive invariant, express as pure mathematics. Score all three, keep the best. The KC search becomes genuinely multi-directional.

KC SEARCH
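A minimal sketch of that three-way search, assuming a zlib-based KC proxy and a placeholder rewrite step; the strategy names and function signatures are illustrative, not AttractorForge's actual interfaces.

```python
# Sketch of multi-strategy KC search: generate one candidate per strategy,
# score each with a compression-length proxy, keep the cheapest.
import zlib

STRATEGIES = ["minimize_tokens", "recursive_invariant", "pure_math"]  # assumed names

def kc_proxy(source: str) -> int:
    """Upper-bound the Kolmogorov complexity of source with a generic compressor."""
    return len(zlib.compress(source.encode()))

def rewrite_with_strategy(source: str, strategy: str) -> str:
    """Placeholder for a refactoring pass (LLM- or rule-driven in a real system)."""
    return source  # identity stand-in; a real pass would return rewritten code

def best_candidate(source: str) -> str:
    candidates = [rewrite_with_strategy(source, s) for s in STRATEGIES]
    return min(candidates, key=kc_proxy)  # keep the candidate with the smallest KC proxy
```

Only the winning candidate would move on to the joint-loss scoring of the next stage.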
02

Joint Loss Optimization

Every candidate is scored against four simultaneous constraints: task correctness, Kolmogorov complexity, Fisher information stability, and proof success. The joint loss forces a candidate to satisfy all four desiderata at once.

OPTIMIZATION
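One way to read the four constraints as a single objective. The weights, field names, and the assumption that every term is normalized to [0, 1] are choices made for this sketch.

```python
# Joint loss over the four constraints; lower is better.
from dataclasses import dataclass

@dataclass
class Scores:
    correctness: float   # fraction of task tests passed (higher is better)
    kc: float            # normalized compressed length (lower is better)
    fisher_drift: float  # deviation from Fisher isometry (lower is better)
    proof: float         # 1.0 if the formal proof checks, else 0.0

def joint_loss(s: Scores, w=(1.0, 0.5, 0.5, 1.0)) -> float:
    # All four desiderata enter at once, so a candidate cannot trade one away for free.
    return (w[0] * (1.0 - s.correctness)
            + w[1] * s.kc
            + w[2] * s.fisher_drift
            + w[3] * (1.0 - s.proof))
```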
03

Equilibrium Vault

When an algorithm reaches Nash Equilibrium, its KC-minimal form is preserved permanently. Over time, this becomes a library of proven minimal forms, with pairwise structural similarity (Jensen-Shannon divergence, JSD) computed between all entries.

PRESERVATION
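A sketch of the pairwise similarity computation over vault entries, using Jensen-Shannon divergence between token-frequency distributions; treating each vaulted form as a bag of whitespace-separated tokens is a simplification for illustration.

```python
# Pairwise JSD between vaulted minimal forms, each represented as a
# token-frequency distribution.
from collections import Counter
from math import log2

def token_dist(source: str) -> dict:
    counts = Counter(source.split())
    total = sum(counts.values())
    return {tok: c / total for tok, c in counts.items()}

def jsd(p: dict, q: dict) -> float:
    """Jensen-Shannon divergence (base 2), bounded in [0, 1]."""
    m = {k: 0.5 * (p.get(k, 0.0) + q.get(k, 0.0)) for k in set(p) | set(q)}
    def kl(a):
        return sum(v * log2(v / m[k]) for k, v in a.items() if v > 0)
    return 0.5 * kl(p) + 0.5 * kl(q)

def pairwise_jsd(vault: dict) -> dict:
    """vault maps algorithm name -> KC-minimal source; returns {(a, b): divergence}."""
    dists = {name: token_dist(src) for name, src in vault.items()}
    names = sorted(dists)
    return {(a, b): jsd(dists[a], dists[b])
            for i, a in enumerate(names) for b in names[i + 1:]}
```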
04

Convergence Map

Visualize which algorithms cluster together structurally at their KC minimum. This is the AGI-relevant output: an empirical map of the attractors in algorithmic space, discovered by compression pressure.

DISCOVERY
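A minimal way to turn the vault's JSD matrix into such a map, assuming SciPy's hierarchical clustering is available; the 0.2 distance threshold is an arbitrary value for illustration.

```python
# Cluster algorithms whose KC-minimal forms are structurally close.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def convergence_map(names, jsd_matrix, threshold=0.2):
    """Group algorithms whose pairwise JSD stays under `threshold` (average linkage)."""
    condensed = squareform(np.asarray(jsd_matrix), checks=False)  # square -> condensed form
    tree = linkage(condensed, method="average")
    labels = fcluster(tree, t=threshold, criterion="distance")
    clusters = {}
    for name, label in zip(names, labels):
        clusters.setdefault(label, []).append(name)
    return list(clusters.values())  # e.g. [["algo_a", "algo_b"], ["algo_c"], ...]
```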
05

Reflective Closure

The pipeline components themselves become differentiable objects within the system: self-improving cycles of mutation, validation, and autonomous commit. It is the logical endgame where architecture search meets meta-learning.

EVOLUTION
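A sketch of the self-improvement cycle at its simplest: greedy hill-climbing over pipeline mutations with validation before every commit. The mutate, validate, and commit hooks are hypothetical, and nothing here is differentiable yet; the full system would go further by treating the components as differentiable objects.

```python
# One reflective cycle: mutate a pipeline component, validate it, and commit
# the change only if the joint loss improves on the benchmark.
import copy

def reflective_cycle(pipeline, benchmark, mutate, validate, commit, steps=100):
    best_loss = validate(pipeline, benchmark)
    for _ in range(steps):
        candidate = mutate(copy.deepcopy(pipeline))   # propose a mutation
        loss = validate(candidate, benchmark)          # gate it with validation
        if loss < best_loss:                           # accept strict improvements only
            pipeline, best_loss = candidate, loss
            commit(pipeline)                           # autonomous commit of the new form
    return pipeline
```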
Core Principles

The theoretical foundation

Kolmogorov Complexity

The length of the shortest program that produces a given output. KC minimization is the compression of an algorithm to its irreducible form. What remains is structure, not implementation.
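The standard formal definition, with U a fixed universal Turing machine and |p| the length of program p:

```latex
K_U(x) = \min \{\, |p| : U(p) = x \,\}
```

K is uncomputable in general, so any search like the one described above can only ever tighten an upper bound; a compressor or a token count is a common practical stand-in.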

Fisher Stability

The Fisher Information Matrix measures how sensitive a model is to parameter changes. Preserving Fisher isometry during compression ensures the algorithm's functional identity survives intact.
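For reference, the Fisher Information Matrix of a parameterized model p_θ:

```latex
F_{ij}(\theta) = \mathbb{E}_{x \sim p_\theta}\!\left[ \frac{\partial \log p_\theta(x)}{\partial \theta_i} \, \frac{\partial \log p_\theta(x)}{\partial \theta_j} \right]
```

Read this way, preserving Fisher isometry means the metric F induces on parameter space is left approximately unchanged by a compression step: directions the model is sensitive to before compression stay sensitive after it.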

Nash Equilibrium

The point where no further compression improves the joint loss. When the Proposer can't reduce KC without triggering the Critic's safety gates, the algorithm has reached its minimal stable form.
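A sketch of that stopping condition as a Proposer/Critic loop; propose_reductions, joint_loss, and passes_gates are hypothetical hooks named for this illustration.

```python
# Compress until no proposed reduction both passes the Critic's gates and
# lowers the joint loss: the Nash-equilibrium stopping condition.
def compress_to_equilibrium(algorithm, propose_reductions, joint_loss, passes_gates):
    while True:
        current_loss = joint_loss(algorithm)
        admissible = [c for c in propose_reductions(algorithm)
                      if passes_gates(c) and joint_loss(c) < current_loss]
        if not admissible:                            # no admissible improving move: equilibrium
            return algorithm
        algorithm = min(admissible, key=joint_loss)   # take the best improving move
```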

Assembly Index

Weights a candidate's survival by its assembly complexity: the minimum number of steps needed to construct an object from basic parts, reusing already-assembled subcomponents. Algorithms with a low assembly index at their KC minimum reveal fundamental computational primitives.
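A toy version of the definition for strings under concatenation: single characters are the basic parts, and any previously assembled piece can be reused. The exhaustive search is only meant to make the definition concrete, not to suggest how a production system would estimate the index.

```python
# Assembly index of a short string: minimum number of join steps needed to
# build it from single characters, reusing anything already assembled.
def assembly_index(target: str, max_depth: int = 8):
    basics = frozenset(target)  # single characters come for free

    def search(pool, depth):
        if target in pool:
            return depth
        if depth >= max_depth:
            return None
        best = None
        for a in pool:
            for b in pool:
                joined = a + b
                if joined in target and joined not in pool:  # only keep useful substrings
                    result = search(pool | {joined}, depth + 1)
                    if result is not None and (best is None or result < best):
                        best = result
        return best

    return search(basics, 0)

# assembly_index("abab") == 2: "a" + "b" -> "ab", then "ab" + "ab" -> "abab"
```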

Compression reveals connection, not isolation. The invariants that survive are shared. Mathematical intelligence is what algorithms converge to when you strip everything else away.

I am because we are, and we are because I am.