Commit cd33934

Add DSPy GEPA optimization tutorial for mathematical reasoning
Introduce a comprehensive notebook demonstrating automated prompt optimization with DSPy's GEPA (Genetic-Pareto) optimizer on the NuminaMath-1.5 dataset.

Key features:
- Complete setup guide for both local (Ollama) and cloud (OpenRouter) LLMs
- Dataset processing and filtering for mathematical problems with numeric answers
- Baseline Chain-of-Thought implementation achieving 42.3% accuracy
- GEPA optimization workflow with an error-driven feedback mechanism
- Performance improvement to 64.0% accuracy (a gain of 21.7 percentage points)
- Detailed evaluation and metrics tracking

The notebook showcases how GEPA automatically refines prompts by analyzing errors and generating targeted feedback, making it particularly effective for complex reasoning tasks where prompt quality significantly impacts model performance. It includes documentation, code examples, and performance benchmarks demonstrating the value of automated prompt engineering for mathematical reasoning.
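For context, here is a minimal sketch of the workflow the commit message describes, using DSPy's GEPA optimizer. The model names, the toy dataset, and the exact feedback wording are illustrative assumptions, not the notebook's actual code; the DSPy calls shown (dspy.LM, dspy.ChainOfThought, dspy.GEPA) follow the library's documented usage as best understood here.

import dspy

# Assumption: a local Ollama model is the student LM and a cloud model (via
# OpenRouter) acts as GEPA's reflection LM; both model names are illustrative.
student_lm = dspy.LM("ollama_chat/qwen2.5:7b", api_base="http://localhost:11434")
reflection_lm = dspy.LM("openrouter/openai/gpt-4o-mini")
dspy.configure(lm=student_lm)

# Baseline program: plain Chain-of-Thought over question -> answer.
program = dspy.ChainOfThought("question -> answer")

# Toy stand-in for the filtered NuminaMath-1.5 examples with numeric answers.
trainset = [dspy.Example(question="What is 12 * 13?", answer="156").with_inputs("question")]
valset = trainset

def metric_with_feedback(example, prediction, trace=None, pred_name=None, pred_trace=None):
    # Score exact numeric match and return textual feedback that GEPA's
    # reflection step can use to rewrite the prompt (error-driven feedback).
    correct = str(prediction.answer).strip() == str(example.answer).strip()
    feedback = (
        "Correct final answer."
        if correct
        else f"Expected {example.answer} but got {prediction.answer}. "
             "Re-check the arithmetic and output only the final number."
    )
    return dspy.Prediction(score=float(correct), feedback=feedback)

# GEPA iteratively proposes improved instructions based on the feedback above.
optimizer = dspy.GEPA(metric=metric_with_feedback, auto="light", reflection_lm=reflection_lm)
optimized_program = optimizer.compile(program, trainset=trainset, valset=valset)

In the notebook, trainset and valset would come from the filtered NuminaMath-1.5 split, and the baseline (42.3%) and optimized (64.0%) programs are scored on the same held-out evaluation set, presumably with dspy.Evaluate or a similar harness.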
1 parent 77f51c0 commit cd33934

File tree

3 files changed: +61 −3,809 lines


notebooks/en/_toctree.yml

Lines changed: 5 additions & 3 deletions
@@ -88,9 +88,11 @@
       title: Hyperparameter Optimization with Optuna and Transformers
     - local: function_calling_fine_tuning_llms_on_xlam
       title: Fine-tuning LLMs for Function Calling with the xLAM Dataset
-
-
-
+    - local: dspy_gepa
+      title: Optimizing Language Models with DSPy GEPA
+
+
+
 - title: Computer Vision Recipes
   isExpanded: false
   sections:
