Evolutionary Turing Machine - The Classics Revival
Neural Turing Machines That Evolve Their Own Architectures
Experimental Research Code - Functional but unoptimized; expect rough edges
What Is This?
Evolutionary Turing Machine combines Neural Turing Machines with evolutionary algorithms to create memory-augmented networks that evolve their own architectures. Instead of hand-designing memory systems, populations of NTMs compete under selection pressure and evolve effective memory configurations.
Core Innovation: NTMs that mutate their memory slots, read/write heads, and controller architectures through evolutionary pressure, discovering novel memory patterns.
Architecture Highlights
- Self-Evolving Memory: Memory slots and dimensions adapt through evolution
- Adaptive Read/Write Heads: Number and behavior of memory heads evolve
- Controller Architecture Search: LSTM controller dimensions discovered evolutionarily
- Task-Driven Fitness: Evolution guided by performance on memory tasks
- Population Diversity: Maintains genetic diversity in memory architectures
- Crossover Operations: Recombines successful memory strategies (see the genome sketch after this list)
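As a rough illustration of what actually evolves, the hypothetical sketch below bundles these parameters into a genome and recombines two parents with uniform crossover. The class and function names are illustrative, not the repo's actual API.

from dataclasses import dataclass, fields
import random

@dataclass
class NTMGenome:
    # Hypothetical genome: the evolvable pieces named in the highlights above.
    controller_dim: int   # LSTM controller hidden size
    memory_slots: int     # number of memory rows
    memory_dim: int       # width of each memory row
    read_heads: int       # number of read heads
    write_heads: int      # number of write heads

def crossover(a: NTMGenome, b: NTMGenome) -> NTMGenome:
    # Uniform crossover: each field is inherited whole from one parent at random.
    return NTMGenome(**{f.name: getattr(random.choice((a, b)), f.name)
                        for f in fields(NTMGenome)})

Inheriting each field whole keeps every child inside the valid parameter ranges while still mixing the parents' memory strategies.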
Quick Start
from evolutionary_turing import EvolutionaryTuringMachine, EvolutionaryTuringConfig
# Create evolutionary NTM system
config = EvolutionaryTuringConfig(
    population_size=50,
    input_dim=8,
    output_dim=8,
    max_generations=100
)
evolution = EvolutionaryTuringMachine(config)
# Evolve population on memory tasks
history = evolution.run_evolution()
# Get the best evolved model
best_ntm = evolution.get_best_model()
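A quick way to inspect progress, assuming history is a per-generation sequence of best-fitness values (an assumption about run_evolution's return type; matplotlib is an extra dependency):

import matplotlib.pyplot as plt

# Assumes history holds one best-fitness value per generation.
plt.plot(history)
plt.xlabel("generation")
plt.ylabel("best fitness")
plt.title("Evolution progress")
plt.show()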
Current Status
- Working: Population evolution, architecture mutations, memory task evaluation, crossover operations
- Rough Edges: No distributed evolution, limited task variety, basic fitness functions
- Still Missing: Advanced mutation operators, multi-objective optimization, neural architecture search integration
- Performance: Functional on toy problems; needs scaling for complex tasks
- Memory Usage: High due to population storage; optimization needed
- Speed: Sequential evaluation; parallelization would help significantly (see the sketch below)
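For the parallelization gap noted above, one possible direction is process-based evaluation. This is a minimal sketch assuming a picklable, top-level fitness_fn; it is not the repo's current behavior, and the usual PyTorch caveats about CUDA tensors and process start methods apply.

from multiprocessing import Pool

def evaluate_population(population, fitness_fn, workers=4):
    # Score each genome in its own worker process instead of sequentially.
    # fitness_fn stands in for the repo's memory-task evaluation loop.
    with Pool(processes=workers) as pool:
        return pool.map(fitness_fn, population)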
Mathematical Foundation
The evolutionary process optimizes the NTM architecture space through genetic algorithms:
Fitness(NTM) = Performance(copy_task) × 0.5 + Performance(recall_task) × 0.3 + Efficiency × 0.2
Mutations modify:
- Controller dimensions: d_new = d_old + N(0, σ_controller)
- Memory parameters: M_slots_new ~ U[16, 256], M_dim_new ~ U[8, 64]
- Head configurations: heads_read_new ~ U[1, 4], heads_write_new ~ U[1, 3]
Selection pressure favors architectures that balance task performance with computational efficiency.
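A minimal Python sketch of the fitness blend and mutation operators, reusing the hypothetical NTMGenome above. The ranges and σ_controller mirror the formulas; the per-group mutation probability p is an assumption, since the document does not specify one.

import random
from dataclasses import replace

def fitness(copy_score, recall_score, efficiency):
    # Weighted blend from the formula above.
    return 0.5 * copy_score + 0.3 * recall_score + 0.2 * efficiency

def mutate(genome, p=0.3, sigma_controller=16.0):
    # p is an assumed per-group mutation probability (not specified above).
    g = replace(genome)  # shallow copy of the parent genome
    if random.random() < p:
        # d_new = d_old + N(0, sigma_controller), clamped to stay positive
        g.controller_dim = max(8, int(g.controller_dim + random.gauss(0, sigma_controller)))
    if random.random() < p:
        # M_slots_new ~ U[16, 256], M_dim_new ~ U[8, 64]
        g.memory_slots = random.randint(16, 256)
        g.memory_dim = random.randint(8, 64)
    if random.random() < p:
        # heads_read_new ~ U[1, 4], heads_write_new ~ U[1, 3]
        g.read_heads = random.randint(1, 4)
        g.write_heads = random.randint(1, 3)
    return g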
Research Applications
- Neural architecture search for memory systems
- Adaptive memory allocation strategies
- Meta-learning through evolutionary computation
- Automated machine learning for sequence tasks
- Evolutionary neural network design
Installation
pip install torch numpy
# Download evolutionary_turing.py from this repo
The Classics Revival Collection
Evolutionary Turing Machine is part of a larger exploration of foundational algorithms enhanced with modern neural techniques:
- Evolutionary Turing Machine ← You are here
- Hebbian Bloom Filter
- Hopfield Decision Graph
- Liquid Bayes Chain
- Liquid State Space Model
- Möbius Markov Chain
- Memory Forest
Citation
@misc{evolutionaryturing2025,
  title={Evolutionary Turing Machine: Self-Evolving Memory Architectures},
  author={Jae Parker 𓅸 1990two},
  year={2025},
  note={Part of The Classics Revival Collection}
}