Fine-tune with TEA
Updated
Mar 27, 2025 - Python
End-to-end Python implementation of Semantic Divergence Metrics (SDM) for LLM hallucination detection. Uses ensemble paraphrasing, joint embedding clustering, and information-theoretic measures (Jensen-Shannon divergence, KL divergence, Wasserstein distance) to quantify semantic consistency between prompts and responses. Based on Halperin (2025).
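The information-theoretic measures named above all compare two probability distributions, e.g. distributions of cluster assignments for paraphrased prompts versus responses. A minimal sketch of computing them with SciPy is below; the cluster labels and the `cluster_distribution` helper are hypothetical illustrations, not the repository's actual code:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import entropy, wasserstein_distance

# Hypothetical cluster assignments (labels 0..3) produced by a joint
# embedding clustering step over paraphrased prompts and responses.
prompt_clusters = np.array([0, 0, 1, 1, 2, 0])
response_clusters = np.array([0, 1, 1, 3, 3, 3])

def cluster_distribution(labels, n_clusters):
    """Empirical distribution of cluster assignments."""
    counts = np.bincount(labels, minlength=n_clusters).astype(float)
    return counts / counts.sum()

n = 4
p = cluster_distribution(prompt_clusters, n)
q = cluster_distribution(response_clusters, n)

# SciPy returns the Jensen-Shannon *distance* (the square root of JSD),
# so square it to recover the divergence; base=2 bounds it in [0, 1].
jsd = jensenshannon(p, q, base=2) ** 2

# KL divergence, with additive smoothing so q has full support
# (otherwise KL(p || q) is infinite wherever q is zero and p is not).
eps = 1e-9
q_smooth = (q + eps) / (q + eps).sum()
kl = entropy(p, q_smooth)

# 1-D Wasserstein distance over cluster indices, weighted by p and q.
wd = wasserstein_distance(np.arange(n), np.arange(n), p, q)
```

Larger values of any of the three metrics indicate greater semantic divergence between the prompt ensemble and the responses.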
End-to-end quantitative decision-support system in Python for optimizing economic resilience against disasters. Implements an updated MRIA model using multi-regional supply-use tables, a three-step optimization algorithm, and comprehensive impact assessment to identify vulnerabilities arising from production concentration and logistical constraints.
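MRIA builds on multi-regional input-output accounting, whose core quantity relation (gross output equals intermediate use plus final demand) can be sketched as follows. The coefficient matrix and demand vector here are illustrative numbers, not data from the repository, and the supply-use and optimization layers of MRIA are not reproduced:

```python
import numpy as np

# Hypothetical technical-coefficient matrix A for 2 regions x 2 sectors
# (rows/columns ordered region-by-region) and a final-demand vector f.
A = np.array([
    [0.15, 0.10, 0.05, 0.00],
    [0.05, 0.20, 0.00, 0.10],
    [0.10, 0.00, 0.15, 0.05],
    [0.00, 0.05, 0.10, 0.20],
])
f = np.array([100.0, 80.0, 120.0, 90.0])

# Leontief quantity model: gross output x satisfies x = A x + f,
# i.e. (I - A) x = f. Solve the linear system directly.
x = np.linalg.solve(np.eye(4) - A, f)
```

In a disaster scenario, capping one sector's output and re-solving reveals how shortfalls propagate through interregional linkages, which is the kind of vulnerability the optimization steps in the repository are designed to surface.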