CAX: Cellular Automata Accelerated in JAX


CAX is a high-performance and flexible open-source library designed to accelerate artificial life research. 🧬

Overview 🔎

Are you interested in emergence, self-organization, or open-endedness? Whether you're a researcher or just curious about the fascinating world of artificial life, CAX is your digital lab! 🔬

Designed for speed and flexibility, CAX allows you to easily experiment with self-organizing behaviors and emergent phenomena. 🧑‍🔬

Get started with the introductory notebook on Colab.

Why CAX? 💡

CAX supports discrete and continuous models, including neural cellular automata, across any number of dimensions. Beyond traditional cellular automata, it also handles particle systems and more, all unified under a single, intuitive API.

Rich 🎨

CAX ships with a comprehensive collection of 15+ ready-to-use systems. From simulating one-dimensional elementary cellular automata to training three-dimensional self-autoencoding neural cellular automata, or creating beautiful Lenia simulations, it offers a versatile platform for exploring the rich world of self-organizing systems.

Flexible 🧩

CAX makes it easy to extend existing models or build custom ones from scratch for endless experimentation and discovery. Design your own experiments to probe the boundaries of artificial open-ended evolution and emergent complexity.

Fast 🚀

CAX is built on top of the JAX/Flax ecosystem for speed and scalability. The library benefits from vectorization and parallelization across hardware backends such as CPU, GPU, and TPU. This allows you to scale your experiments from small prototypes to massive simulations with minimal code changes.
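To illustrate the kind of scaling JAX enables, the sketch below batches a toy local update rule over many independent grids with `jax.vmap` and compiles it with `jax.jit`. This is plain JAX, not CAX's API; the rule itself (neighbor averaging) is only illustrative:

```python
import jax
import jax.numpy as jnp

def step(grid):
	# Toy local rule: average each cell with its four neighbors (toroidal grid).
	return (grid
		+ jnp.roll(grid, 1, axis=0) + jnp.roll(grid, -1, axis=0)
		+ jnp.roll(grid, 1, axis=1) + jnp.roll(grid, -1, axis=1)) / 5.0

# Vectorize over a leading batch axis, then compile the whole batched step.
batched_step = jax.jit(jax.vmap(step))

grids = jnp.ones((32, 64, 64))  # 32 independent 64x64 states
out = batched_step(grids)       # all 32 simulations advance in one call
```

The same pattern applies to full CAX models: wrapping a step function in `vmap` runs many simulations in parallel on whichever backend JAX targets.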

Tested & Documented 📚

The library is thoroughly tested and documented with numerous examples to get you started! Our comprehensive guides walk you through everything from basic cellular automata to advanced neural implementations.

Implemented Systems 🦎

| Cellular Automata | Reference | Example |
| --- | --- | --- |
| Elementary Cellular Automata | Wolfram (2002) | Colab |
| Conway's Game of Life | Gardner (1970) | Colab |
| Lenia | Chan (2020) | Colab |
| Flow Lenia | Plantec et al. (2022) | Colab |
| Particle Lenia | Mordvintsev et al. (2022) | Colab |
| Particle Life | Mohr (2018) | Colab |
| Boids | Reynolds (1987) | Colab |
| Growing Neural Cellular Automata | Mordvintsev et al. (2020) | Colab |
| Growing Conditional Neural Cellular Automata | Sudhakaran et al. (2022) | Colab |
| Growing Unsupervised Neural Cellular Automata | Palm et al. (2021) | Colab |
| Diffusing Neural Cellular Automata | Faldor et al. (2024) | Colab |
| Self-classifying MNIST Digits | Randazzo et al. (2020) | Colab |
| Self-autoencoding MNIST Digits | Faldor et al. (2024) | Colab |
| 1D-ARC Neural Cellular Automata | Faldor et al. (2024) | Colab |
| Attention-based Neural Cellular Automata | Tesfaldet et al. (2022) | Colab |
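For a flavor of the simplest system in the table, a one-dimensional elementary cellular automaton (here Wolfram's rule 30) fits in a few lines of plain JAX. This is a standalone sketch, independent of CAX's own API:

```python
import jax.numpy as jnp

def eca_step(state, rule=30):
	# Encode each cell's 3-cell neighborhood as a 3-bit index (periodic boundary).
	left = jnp.roll(state, 1)
	right = jnp.roll(state, -1)
	idx = 4 * left + 2 * state + right
	# The rule number's bits form the lookup table: bit k gives the
	# next state for neighborhood pattern k.
	table = jnp.right_shift(rule, jnp.arange(8)) & 1
	return table[idx]

# Start from a single live cell and iterate the automaton.
state = jnp.zeros(11, dtype=jnp.int32).at[5].set(1)
history = [state]
for _ in range(4):
	state = eca_step(state)
	history.append(state)
```

The CAX implementations follow the same perceive-then-update structure, but wrapped in the library's unified API.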

Getting Started 🚦

import jax
from flax import nnx

from cax.core.ca import CA
from cax.core.perceive.conv_perceive import ConvPerceive
from cax.core.update.nca_update import NCAUpdate

seed = 0

# Model hyperparameters.
channel_size = 16
num_kernels = 3
hidden_layer_sizes = (128,)
cell_dropout_rate = 0.5

key = jax.random.key(seed)
rngs = nnx.Rngs(seed)

# Perception: depthwise convolution gathering num_kernels features per channel.
perceive = ConvPerceive(
	channel_size=channel_size,
	perception_size=num_kernels * channel_size,
	rngs=rngs,
	feature_group_count=channel_size,
)

# Update: a small per-cell MLP with stochastic cell updates (dropout).
update = NCAUpdate(
	channel_size=channel_size,
	perception_size=num_kernels * channel_size,
	hidden_layer_sizes=hidden_layer_sizes,
	rngs=rngs,
	cell_dropout_rate=cell_dropout_rate,
	zeros_init=True,
)
ca = CA(perceive, update)

# Initialize a random 64x64 grid of cells and run the CA for 128 steps.
state = jax.random.normal(key, (64, 64, channel_size))
state = ca(state, num_steps=128)

Installation ⚙️

You will need Python 3.10 or later, and a working JAX installation.

Then, install CAX from PyPI:

pip install cax

To upgrade to the latest development version directly from GitHub, you can use:

pip install --upgrade git+https://github.com/maxencefaldor/cax.git

Citing CAX 📝

If you use CAX in your research, please cite the following paper:

@inproceedings{cax,
	title       = {{CAX}: Cellular Automata Accelerated in {JAX}},
	author      = {Maxence Faldor and Antoine Cully},
	booktitle   = {The Thirteenth International Conference on Learning Representations},
	year        = {2025},
	url         = {https://openreview.net/forum?id=o2Igqm95SJ},
	keywords    = {cellular automata, emergence, self-organization, neural cellular automata},
}

Contributing 👷

Contributions are welcome! If you find a bug or are missing your favorite self-organizing system, please open an issue or submit a pull request following our contribution guidelines 🤗.
