moo-rs
moo-rs is a project for solving multi-objective optimization problems with evolutionary algorithms, combining:
- moors: a pure-Rust crate for high-performance implementations of genetic algorithms
- pymoors: a Python extension crate (via pyo3) exposing moors algorithms with a Pythonic API
Inspired by the amazing Python project pymoo, moo-rs delivers both the speed of Rust and the ease of use of Python.
moo-rs/
├── moors/ # Rust crate: core algorithms
└── pymoors/ # Python crate: pyo3 bindings
moors is a pure-Rust crate providing multi-objective evolutionary algorithms.
- NSGA-II, NSGA-III, R-NSGA-II, Age-MOEA, REVEA, SPEA-II (many more coming soon!)
- Pluggable operators: sampling, crossover, mutation, duplicates removal
- Flexible fitness & constraints via user-provided closures
- Built on ndarray and faer
[dependencies]
moors = "0.1.1"
use ndarray::{Array1, Axis, stack};
use moors::{
    algorithms::{MultiObjectiveAlgorithmError, Nsga2Builder},
    duplicates::ExactDuplicatesCleaner,
    genetic::{PopulationConstraints, PopulationFitness, PopulationGenes},
    operators::{
        crossover::SinglePointBinaryCrossover, mutation::BitFlipMutation,
        sampling::RandomSamplingBinary,
    },
};

// problem data
const WEIGHTS: [f64; 5] = [12.0, 2.0, 1.0, 4.0, 10.0];
const VALUES: [f64; 5] = [4.0, 2.0, 1.0, 5.0, 3.0];
const CAPACITY: f64 = 15.0;

/// Compute multi-objective fitness [-total_value, total_weight]
/// Returns an Array2<f64> of shape (population_size, 2)
fn fitness_knapsack(population_genes: &PopulationGenes) -> PopulationFitness {
    let weights_arr = Array1::from_vec(WEIGHTS.to_vec());
    let values_arr = Array1::from_vec(VALUES.to_vec());
    let total_values = population_genes.dot(&values_arr);
    let total_weights = population_genes.dot(&weights_arr);
    // stack two columns: [-total_values, total_weights]
    stack(Axis(1), &[(-&total_values).view(), total_weights.view()]).expect("stack failed")
}
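/// Compute the capacity constraint total_weight - CAPACITY for each individual
/// (a non-positive value means the knapsack capacity is respected)
/// Returns an array of shape (population_size, 1)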
fn constraints_knapsack(population_genes: &PopulationGenes) -> PopulationConstraints {
    let weights_arr = Array1::from_vec(WEIGHTS.to_vec());
    (population_genes.dot(&weights_arr) - CAPACITY).insert_axis(Axis(1))
}
fn main() -> Result<(), MultiObjectiveAlgorithmError> {
    // build the NSGA-II algorithm
    let mut algorithm = Nsga2Builder::default()
        .fitness_fn(fitness_knapsack)
        .constraints_fn(constraints_knapsack)
        .sampler(RandomSamplingBinary::new())
        .crossover(SinglePointBinaryCrossover::new())
        .mutation(BitFlipMutation::new(0.5))
        .duplicates_cleaner(ExactDuplicatesCleaner::new())
        .num_vars(5)
        .num_objectives(2)
        .num_constraints(1)
        .population_size(100)
        .crossover_rate(0.9)
        .mutation_rate(0.1)
        .num_offsprings(32)
        .num_iterations(2)
        .build()?;

    algorithm.run()?;
    let population = algorithm.population()?;
    println!("Done! Population size: {}", population.len());

    Ok(())
}
pymoors uses pyo3 to expose moors algorithms in Python.
pip install pymoors
import numpy as np
from pymoors import (
    Nsga2,
    RandomSamplingBinary,
    BitFlipMutation,
    SinglePointBinaryCrossover,
    ExactDuplicatesCleaner,
)
from pymoors.typing import TwoDArray
PROFITS = np.array([2, 3, 6, 1, 4])
QUALITIES = np.array([5, 2, 1, 6, 4])
WEIGHTS = np.array([2, 3, 6, 2, 3])
CAPACITY = 7
def knapsack_fitness(genes: TwoDArray) -> TwoDArray:
    # Calculate total profit
    profit_sum = np.sum(PROFITS * genes, axis=1, keepdims=True)
    # Calculate total quality
    quality_sum = np.sum(QUALITIES * genes, axis=1, keepdims=True)
    # We want to maximize profit and quality,
    # so in pymoors we minimize the negative values
    f1 = -profit_sum
    f2 = -quality_sum
    return np.column_stack([f1, f2])
def knapsack_constraint(genes: TwoDArray) -> TwoDArray:
    # Calculate total weight
    weight_sum = np.sum(WEIGHTS * genes, axis=1, keepdims=True)
    # Inequality constraint: weight_sum <= capacity
    return weight_sum - CAPACITY
algorithm = Nsga2(
    sampler=RandomSamplingBinary(),
    crossover=SinglePointBinaryCrossover(),
    mutation=BitFlipMutation(gene_mutation_rate=0.5),
    fitness_fn=knapsack_fitness,
    constraints_fn=knapsack_constraint,
    duplicates_cleaner=ExactDuplicatesCleaner(),
    n_vars=5,
    num_objectives=2,
    num_constraints=1,
    population_size=32,
    num_offsprings=32,
    num_iterations=10,
    mutation_rate=0.1,
    crossover_rate=0.9,
    keep_infeasible=False,
)
algorithm.run()
pop = algorithm.population
# Get genes
>>> pop.genes
array([[1., 0., 0., 1., 1.],
       [0., 1., 0., 0., 1.],
       [1., 1., 0., 1., 0.],
       ...])
# Get fitness
>>> pop.fitness
array([[ -7., -15.],
       [ -7.,  -6.],
       [ -6., -13.],
       ...])
# Get constraints evaluation
>>> pop.constraints
array([[ 0.],
       [-1.],
       [ 0.],
       ...])
# Get rank
>>> pop.rank
array([0, 1, 1, 2, ...], dtype=uint64)
# Get best individuals
>>> pop.best
[<pymoors.schemas.Individual object at 0x...>]
>>> pop.best[0].genes
array([1., 0., 0., 1., 1.])
>>> pop.best[0].fitness
array([ -7., -15.])
>>> pop.best[0].constraints
array([0.])
In this small example, the algorithm finds a single solution on the Pareto front: selecting items A, D, and E, which gives a total profit of 7 and a total quality of 15. This means that, within the knapsack capacity of 7, no other combination of items can improve one objective without worsening the other.
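A quick arithmetic check of that solution, reusing the problem data defined above (a standalone sketch, not part of the pymoors API):

```python
import numpy as np

# Problem data from the example above
PROFITS = np.array([2, 3, 6, 1, 4])
QUALITIES = np.array([5, 2, 1, 6, 4])
WEIGHTS = np.array([2, 3, 6, 2, 3])

# Genes of the best individual found: items A, D and E selected
best_genes = np.array([1, 0, 0, 1, 1])

print(PROFITS @ best_genes)    # 7  -> total profit
print(QUALITIES @ best_genes)  # 15 -> total quality
print(WEIGHTS @ best_genes)    # 7  -> total weight, exactly the capacity
```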
Once the algorithm finishes, it stores a population attribute that contains all the individuals evaluated during the search.
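For instance, a minimal sketch of inspecting that attribute after `algorithm.run()`, using only the `genes`, `fitness`, and `constraints` fields shown in the output above:

```python
pop = algorithm.population

# `pop.best` is a list of Individual objects (see the output above)
print(f"{len(pop.best)} best individual(s) found")
for ind in pop.best:
    print(ind.genes, ind.fitness, ind.constraints)
```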
Contributions are welcome! Please read the contribution guide and open issues or PRs in the relevant crate’s repository.
This project is licensed under the MIT License.