This package was created to facilitate the manipulation of neural networks: understanding how they work, creating custom models, and training them. It is written in Python entirely from scratch, with no dependencies beyond the standard library.
Note: This project is currently in Beta and still under active development. Many features are planned for future releases.
```
pip install tona-ai
```
A neural network consists of neurons and synapses. The neurons are organized into different layers:
- The input layer (INPUT): mandatory and unique. The neurons in this layer receive the input values of the neural network when it is executed.
- The hidden layers (HIDDEN): optional, and there can be several of them. They allow the neural network to perform more complex calculations and be more precise.
- The output layer (OUTPUT): mandatory and unique. The neurons in this layer produce the output values of the neural network when it is executed.
Neurons have an activation function and a bias. The bias is added to the sum of the neuron's inputs, and the activation function is then applied to that sum to produce the neuron's output.
Available activation functions are:
- Sigmoid function
- ReLU (Rectified Linear Unit)
- Hyperbolic tangent (TANH)
Synapses serve as connections between neurons. Each synapse has an input neuron and an output neuron, as well as an adjustable weight that is multiplied by the input neuron's output before the result is passed to the output neuron.
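As a rough illustration of this model (plain Python, not the package's internal code), a neuron's output is its activation function applied to the weighted sum of its incoming synapses plus the bias:

```python
import math

# Illustrative activation functions, matching the three listed above
def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def relu(x: float) -> float:
    return max(0.0, x)

def neuron_output(inputs, weights, bias, activation):
    # Each synapse weight multiplies the corresponding input neuron's output;
    # the bias is added to the sum, then the activation function is applied.
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return activation(total)

out = neuron_output([1.0, 0.0], [2.0, -2.0], bias=-2.0, activation=sigmoid)
print(out)  # sigmoid(2*1 + (-2)*0 - 2) = sigmoid(0) = 0.5
```

The `neuron_output` helper is hypothetical; it only sketches the arithmetic that the package's neurons and synapses perform.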
Tona AI allows you to create a custom neural network. You have access to each neuron and each synapse.
To create a neural network, you can simply use the create() method of the NeuralNetwork class:
```python
from tona_ai import ActivationFunction, NeuralNetwork

# The layered parameter is required: it defines whether the network
# is organized into strict layers
nn = NeuralNetwork(layered=False)

nn.create(
    input_size=2,
    output_size=1,
    layers=[
        (4, ActivationFunction.TANH),
    ],
    dense=True,
    output_activation_function=ActivationFunction.TANH,
)
```
Example code available here!
Or you can manually create the neurons and the connections between them:
```python
from tona_ai import (
    ActivationFunction,
    Layer,
    LayerType,
    NeuralNetwork,
    Neuron,
    Synapse,
)

# Define the layers
input_layer = Layer(layer_type=LayerType.INPUT)
output_layer = Layer(layer_type=LayerType.OUTPUT)
# There can be several hidden layers, so you need to specify the index
hidden_layer = Layer(layer_type=LayerType.HIDDEN, layer_index=0)

# Define the neurons
neurons = [
    # Input neurons do not have a bias or an activation function
    Neuron(id=0, layer=input_layer),  # x1
    Neuron(id=1, layer=input_layer),  # x2
    # Hidden neurons
    Neuron(
        id=2,
        layer=hidden_layer,
        bias=0.0,
        activation_function=ActivationFunction.RELU,
    ),  # h1
    Neuron(
        id=3,
        layer=hidden_layer,
        bias=0.0,
        activation_function=ActivationFunction.RELU,
    ),  # h2
    # Output neuron
    Neuron(
        id=4,
        layer=output_layer,
        bias=-2.0,
        activation_function=ActivationFunction.SIGMOID,
    ),  # o
]

# Define the synapses
synapses = [
    Synapse(in_neuron=neurons[0], out_neuron=neurons[2], weight=2.0),  # x1 -> h1
    Synapse(in_neuron=neurons[0], out_neuron=neurons[3], weight=-2.0),  # x1 -> h2
    Synapse(in_neuron=neurons[1], out_neuron=neurons[2], weight=-2.0),  # x2 -> h1
    Synapse(in_neuron=neurons[1], out_neuron=neurons[3], weight=2.0),  # x2 -> h2
    Synapse(in_neuron=neurons[2], out_neuron=neurons[4], weight=2.0),  # h1 -> o
    Synapse(in_neuron=neurons[3], out_neuron=neurons[4], weight=2.0),  # h2 -> o
]

# Register each synapse with its output neuron
for synapse in synapses:
    synapse.out_neuron.inputs_synapses.append(synapse)

# Create the neural network
# The layered parameter is required: it defines whether the network
# is organized into strict layers
nn = NeuralNetwork(layered=True, neurons=neurons, synapses=synapses)
```
Example code available here!
To run your neural network, use the forward() method, passing the input values as a list of floats. It returns the output values as a list of floats.
```python
result_1 = nn.forward([0.0, 0.0])  # [0.11920292202211755]
result_2 = nn.forward([0.0, 1.0])  # [0.8807970779778823]
result_3 = nn.forward([1.0, 0.0])  # [0.8807970779778823]
result_4 = nn.forward([1.0, 1.0])  # [0.11920292202211755]
```
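These numbers can be checked by hand: the manually built network above (ReLU hidden layer, sigmoid output with bias -2.0) computes a soft XOR. A plain-Python recomputation, independent of the package, reproduces the same values:

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def xor_net(x1: float, x2: float) -> float:
    # Hidden layer: two ReLU neurons with bias 0.0 and the weights used above
    h1 = max(0.0, 2.0 * x1 - 2.0 * x2)
    h2 = max(0.0, -2.0 * x1 + 2.0 * x2)
    # Output neuron: sigmoid with bias -2.0, both incoming weights 2.0
    return sigmoid(2.0 * h1 + 2.0 * h2 - 2.0)

for x1, x2 in [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]:
    print(x1, x2, xor_net(x1, x2))
# (0, 0) and (1, 1) give sigmoid(-2) ≈ 0.119;
# (0, 1) and (1, 0) give sigmoid(2) ≈ 0.881
```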
Example code available here!
```python
nn.save("my_network.pkl")
```
Example code available here!
```python
loaded_nn = NeuralNetwork.load("my_network.pkl")
```
Example code available here!
The algorithm implementation is still under development. Currently, only an EXTREMELY simplified version is available.
Tona AI provides a very simplified but functional version of the NEAT algorithm (see the disclaimer above). The principle is simple: you define an environment in which NEAT evaluates each individual of the population, keeps the top 50%, and creates a mutated copy of each surviving individual.
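One generation of that evaluate / keep-top-half / mutate-copies loop can be sketched in plain Python. This is a stand-in illustration, not the package's implementation; here an individual is just a list of weights and `evaluate` is any fitness function you supply:

```python
import random

def one_generation(population, evaluate, mutation_rate=0.1, mutation_range=(-0.5, 0.5)):
    # Evaluate every individual and rank them by fitness (highest first)
    ranked = sorted(population, key=evaluate, reverse=True)
    # Keep the top 50%...
    survivors = ranked[: len(ranked) // 2]
    # ...and create one mutated copy of each survivor
    children = []
    for weights in survivors:
        child = [
            w + random.uniform(*mutation_range) if random.random() < mutation_rate else w
            for w in weights
        ]
        children.append(child)
    return survivors + children

# Toy usage: fitness is higher when the weights are close to zero
population = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(10)]
next_gen = one_generation(population, evaluate=lambda ws: -sum(w * w for w in ws))
print(len(next_gen))  # the population size is preserved: 10
```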
The environment is essential as it defines how individuals are evaluated and how their efficiency is calculated.
To create an environment, simply create a class inheriting from the Environment class:
```python
from tona_ai import Environment, Individual

# Create a simple XOR environment
class XorEnvironment(Environment):
    def __init__(self):
        super().__init__()

    # Implement the run method.
    # It is called once per generation to evaluate an individual.
    def run(self, individual: Individual) -> float:
        inputs = [[0, 0], [0, 1], [1, 0], [1, 1]]
        expected_outputs = [0, 1, 1, 0]

        fitness = 0
        for index, input_values in enumerate(inputs):
            output = individual.genome.forward(input_values)
            fitness += self.fitness_calculation(
                outputs=output, expected_output=expected_outputs[index]
            )
        # Square the summed score to favor individuals that do well on all cases
        fitness = fitness * fitness

        individual.fitness = fitness
        return fitness

    # Implement the fitness calculation method
    def fitness_calculation(self, outputs: list[float], **kwargs: dict) -> float:
        # Squared error between the single output and the expected value
        mse = (outputs[0] - kwargs["expected_output"]) ** 2
        # Map the error into (0, 1]: 1 means a perfect output
        fitness = 1 / (1 + mse)
        return fitness
```
Example code available here!
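To get a feel for the scale of these scores: a perfect individual earns 1 / (1 + 0) = 1.0 on each of the four XOR cases, so its total fitness is 4 squared = 16. The same arithmetic, recomputed standalone (assuming the per-case formula shown above):

```python
def case_fitness(output: float, expected: float) -> float:
    # Same formula as fitness_calculation above: 1 / (1 + squared error)
    return 1 / (1 + (output - expected) ** 2)

expected = [0, 1, 1, 0]

# Perfect predictions on all four XOR cases
perfect = sum(case_fitness(e, e) for e in expected) ** 2
print(perfect)  # 16.0

# A constant output of 0.5 scores 1 / 1.25 = 0.8 per case: (4 * 0.8) ** 2 ≈ 10.24
constant = sum(case_fitness(0.5, e) for e in expected) ** 2
print(constant)
```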
To run NEAT, you first need to create a population. Start by creating a base Genome, which is equivalent to a NeuralNetwork with a few changes. Then create a population with the create() method, passing it the initial genome. Finally, create a NEAT object by passing it the population and the environment, and by defining the mutation rate and range.
```python
from tona_ai import NEAT, ActivationFunction, Genome, Population

# Create a simple genome with two inputs, one output, and one hidden layer of 4 neurons
genome = Genome()
genome.create(
    input_size=2,
    output_size=1,
    layers=[
        (4, ActivationFunction.TANH),
    ],
    dense=True,
    output_activation_function=ActivationFunction.TANH,
)

# Create a population of 100 individuals based on the genome
pop = Population()
pop.create(population_size=100, initial_genome=genome)

# Create the NEAT object
neat = NEAT(
    population=pop,
    environment=XorEnvironment(),
    mutation_rate=0.1,
    mutation_range=(-0.5, 0.5),
)

# Run the NEAT algorithm for 100000 epochs
neat.run(epochs=100000)
```
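The two mutation parameters deserve a note. A plausible reading (an assumption about the package's internals, not something it documents here) is that each weight mutates with probability mutation_rate, and a mutated weight is perturbed by a value drawn uniformly from mutation_range:

```python
import random

def mutate_weights(weights, mutation_rate=0.1, mutation_range=(-0.5, 0.5)):
    # Hypothetical sketch: each weight has a mutation_rate chance of being
    # nudged by a uniform draw from mutation_range; others are left unchanged
    low, high = mutation_range
    return [
        w + random.uniform(low, high) if random.random() < mutation_rate else w
        for w in weights
    ]

random.seed(0)
print(mutate_weights([2.0, -2.0, 2.0, -2.0]))
```

With rate 0.1, roughly one weight in ten changes per generation, and each change is at most 0.5 in magnitude, so the search moves in small steps.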