
A high-performance DSPy rewrite in Rust for building LM-powered applications
Documentation • API Reference • Examples • Issues • Discord
DSRs (DSPy Rust) is a ground-up rewrite of the DSPy framework in Rust, designed for building robust, high-performance applications powered by Language Models. Unlike a simple port, DSRs leverages Rust's type system, memory safety, and concurrency features to provide a more efficient and reliable foundation for LM applications.
Add DSRs to your `Cargo.toml`:
```toml
[dependencies]
# Option 1: Use the shorter alias (recommended)
dsrs = { package = "dspy-rs", version = "0.5.0" }

# Option 2: Use the full name
dspy-rs = "0.5.0"
```
Or use cargo:
```shell
# Option 1: Add under the alias (recommended)
cargo add dspy-rs --rename dsrs

# Option 2: Add with the full name
cargo add dspy-rs
```
Here's a simple example to get you started:
```rust
use anyhow::Result;
use dsrs::prelude::*;

#[Signature]
struct QASignature {
    /// You are a helpful assistant that answers questions accurately.
    #[input]
    pub question: String,

    #[output]
    pub answer: String,
}

#[tokio::main]
async fn main() -> Result<()> {
    // Configure your LM (Language Model)
    configure(
        LM::builder()
            .api_key(SecretString::from(std::env::var("OPENAI_API_KEY")?))
            .build(),
        ChatAdapter {},
    );

    // Create a predictor
    let predictor = Predict::new(QASignature::new());

    // Prepare input
    let example = example! {
        "question": "input" => "What is the capital of France?",
    };

    // Execute prediction
    let result = predictor.forward(example).await?;
    println!("Answer: {}", result.get("answer", None));

    Ok(())
}
```
DSRs follows a modular architecture with clear separation of concerns:
```text
dsrs/
├── core/        # Core abstractions (LM, Module, Signature)
├── adapter/     # LM provider adapters (OpenAI, etc.)
├── data/        # Data structures (Example, Prediction)
├── predictors/  # Built-in predictors (Predict, Chain, etc.)
├── evaluate/    # Evaluation framework and metrics
└── macros/      # Derive macros for signatures
```
```rust
#[Signature(cot)] // Enable chain-of-thought reasoning
struct TranslationSignature {
    /// Translate the text accurately while preserving meaning
    #[input]
    pub text: String,

    #[input]
    pub target_language: String,

    #[output]
    pub translation: String,
}
```
```rust
#[derive(Builder)]
pub struct CustomModule {
    predictor: Predict,
}

impl Module for CustomModule {
    async fn forward(&self, inputs: Example) -> Result<Prediction> {
        // Your custom logic here
        self.predictor.forward(inputs).await
    }
}
```
```rust
// Create a predictor from a signature
let predict = Predict::new(MySignature::new());

// Configure the LM with OpenAI
let lm = LM::builder()
    .api_key(secret_key)
    .model("gpt-4")
    .temperature(0.7)
    .max_tokens(1000)
    .build();
```
```rust
impl Evaluator for MyModule {
    async fn metric(&self, example: &Example, prediction: &Prediction) -> f32 {
        // Define your custom metric logic
        let expected = example.get("answer", None);
        let predicted = prediction.get("answer", None);

        // Example: exact-match metric
        if expected.to_lowercase() == predicted.to_lowercase() {
            1.0
        } else {
            0.0
        }
    }
}
```
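The comparison at the heart of this metric can be tried standalone, without any dsrs types (the function name here is illustrative, not part of the dsrs API):

```rust
/// Case-insensitive exact match, mirroring the metric logic above:
/// 1.0 on a match, 0.0 otherwise.
fn exact_match(expected: &str, predicted: &str) -> f32 {
    if expected.to_lowercase() == predicted.to_lowercase() {
        1.0
    } else {
        0.0
    }
}

fn main() {
    println!("{}", exact_match("Paris", "paris")); // prints 1
    println!("{}", exact_match("Paris", "Lyon"));  // prints 0
}
```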
```rust
// Evaluate your module
let test_examples = load_test_data();
let module = MyModule::new();

// Automatically runs predictions and computes the average metric
let score = module.evaluate(test_examples).await;
println!("Average score: {}", score);
```
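The aggregation `evaluate` performs is just a mean over the per-example metric scores. A minimal sketch of that reduction, independent of the dsrs API (function names are illustrative):

```rust
// Average per-example metric scores into a single evaluation score.
// This mirrors what an evaluation loop reduces to; it is not the dsrs API.
fn average_score(scores: &[f32]) -> f32 {
    if scores.is_empty() {
        return 0.0;
    }
    scores.iter().sum::<f32>() / scores.len() as f32
}

fn main() {
    // e.g. exact-match scores over four test examples
    let scores = [1.0, 0.0, 1.0, 1.0];
    println!("{}", average_score(&scores)); // prints 0.75
}
```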
```rust
#[derive(Optimizable)]
pub struct MyModule {
    #[parameter]
    predictor: Predict,
}
```
```rust
// Create and configure the optimizer
let optimizer = COPRO::builder()
    .breadth(10) // Number of candidates per iteration
    .depth(3)    // Number of refinement iterations
    .build();

// Prepare training data
let train_examples = load_training_data();

// `compile` optimizes the module in place
let mut module = MyModule::new();
optimizer.compile(&mut module, train_examples).await?;
```
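These two knobs also set the rough optimization budget: each of the `depth` refinement iterations proposes `breadth` candidates, so the number of candidate prompts the LM must generate and score grows on the order of their product (a back-of-the-envelope estimate, not a dsrs API; the exact count depends on the implementation):

```rust
// Hypothetical budget arithmetic: candidates proposed per iteration (breadth)
// times refinement iterations (depth) gives a rough total candidate count.
fn candidate_budget(breadth: u32, depth: u32) -> u32 {
    breadth * depth
}

fn main() {
    println!("{}", candidate_budget(10, 3)); // prints 30
}
```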
Component Freezing:
```rust
// The derive macro implements the `Optimizable` trait and marks which
// fields of the Module the optimizer may tune
#[derive(Builder, Optimizable)]
pub struct ComplexPipeline {
    #[parameter] // Mark optimizable components
    analyzer: Predict,

    // Non-parameter fields won't be optimized
    summarizer: Predict,
    config: Config,
}
```
```rust
use dsrs::prelude::*;

#[Signature]
struct AnalyzeSignature {
    #[input]
    pub text: String,

    #[output]
    pub sentiment: String,

    #[output]
    pub key_points: String,
}

#[Signature]
struct SummarizeSignature {
    #[input]
    pub key_points: String,

    #[output]
    pub summary: String,
}

#[derive(Builder)]
pub struct AnalysisPipeline {
    analyzer: Predict,
    summarizer: Predict,
}

impl Module for AnalysisPipeline {
    async fn forward(&self, inputs: Example) -> Result<Prediction> {
        // Step 1: Analyze the text
        let analysis = self.analyzer.forward(inputs).await?;

        // Step 2: Summarize the key points
        let summary_input = example! {
            "key_points": "input" => analysis.get("key_points", None),
        };
        let summary = self.summarizer.forward(summary_input).await?;

        // Combine results
        Ok(prediction! {
            "sentiment" => analysis.get("sentiment", None),
            "key_points" => analysis.get("key_points", None),
            "summary" => summary.get("summary", None),
        })
    }
}
```
Run the test suite:
```shell
# All tests
cargo test

# Specific test
cargo test test_predictors

# With output
cargo test -- --nocapture

# Run examples
cargo run --example 01-simple
```
```rust
#[Signature(cot)] // Enable CoT with an attribute
struct ComplexReasoningSignature {
    #[input(desc = "Question")]
    pub problem: String,

    #[output]
    pub solution: String,
}
```
We welcome contributions! Please see our Contributing Guide for details.
```shell
# Clone the repository
git clone https://github.com/krypticmouse/dsrs.git
cd dsrs

# Build the project
cargo build

# Run tests
cargo test

# Run the examples
cargo run --example 01-simple

# Check formatting
cargo fmt -- --check

# Run clippy
cargo clippy -- -D warnings
```
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
- Inspired by the original DSPy framework
- Built with the amazing Rust ecosystem
- Special thanks to the DSPy community for the discussion and ideas
Star ⭐ this repo if you find it useful!