
Releases: raga-ai-hub/RagaAI-Catalyst

2.2.3

02 Jun 11:22
ab67893

Changes

  • Bug-fix: Fix cost calculation coming from litellm
  • Bug-fix: Safeguard the application workflow
  • Bug-fix: Exclude vital columns (model_name, cost, latency, span_id, trace_id, etc.) from masking (see the sketch below)
  • Feat: Set model_cost as a no-op function
  • Bug-fix: Export all columns without any filter
  • Bug-fix: Fix the total cost value in the trace details
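
A rough sketch of the masking behaviour described above: string fields are redacted, while the vital columns named in the bullet are left intact. The helper and the span layout are illustrative assumptions, not the library's internal code.

```python
# Hypothetical sketch: mask free-text span fields while preserving vital columns.
VITAL_KEYS = {"model_name", "cost", "latency", "span_id", "trace_id"}

def mask_span_fields(span: dict, mask: str = "***") -> dict:
    """Mask string values in a span dict, leaving vital columns untouched."""
    return {
        key: (value if key in VITAL_KEYS or not isinstance(value, str) else mask)
        for key, value in span.items()
    }

# IDs, cost and latency survive masking; free-text fields do not.
span = {"trace_id": "t-1", "span_id": "s-1", "model_name": "gpt-4o",
        "cost": 0.0021, "latency": 1.4, "prompt": "user text with PII"}
print(mask_span_fields(span))
```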

2.2.1

16 May 17:46
ab67893

Changes

  • Feat: Unify the trace format for RAG and agentic traces
  • Feat: Automatically refresh the auth token every 6 hours (see the sketch below)
  • Feat: Improve error capturing
  • Bug: Fix CSV upload of numerical and categorical values
  • Bug: Fix metric execution error with "_" in column names
  • Bug: Fix external_id inconsistencies
  • Bug: Add proper span hash IDs

Full Changelog: v2.1.7.4...v2.2.1
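
The 6-hour token refresh can be pictured as a small timer loop; fetch_token() below is a placeholder for whatever call obtains a fresh token, not a RagaAI-Catalyst API.

```python
import threading

REFRESH_INTERVAL_SECONDS = 6 * 60 * 60  # refresh every 6 hours

def fetch_token() -> str:
    """Placeholder for the real token-acquisition call."""
    return "new-token"

def schedule_token_refresh() -> None:
    """Refresh the token now, then re-arm the timer for the next cycle."""
    token = fetch_token()
    print(f"refreshed token: {token[:8]}...")
    timer = threading.Timer(REFRESH_INTERVAL_SECONDS, schedule_token_refresh)
    timer.daemon = True  # do not block interpreter shutdown
    timer.start()

schedule_token_refresh()
```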

2.1.7.4

05 May 07:32
14ac5a6

Changes:

  • Support for add_metadata
  • Support for masking traces
  • Support for error capturing in RAG tracing
  • Improve the fallback for token counting (see the sketch below)

Full Changelog: v2.1.7.1...v2.1.7.4
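
One way to picture a token-counting fallback: use tiktoken when it is installed and a rough character-based estimate otherwise. Both choices are assumptions for illustration, not necessarily the library's exact fallback.

```python
def count_tokens(text: str, model: str = "gpt-4o") -> int:
    """Count tokens with tiktoken when available, else use a rough estimate."""
    try:
        import tiktoken
        try:
            encoding = tiktoken.encoding_for_model(model)
        except KeyError:
            encoding = tiktoken.get_encoding("cl100k_base")
        return len(encoding.encode(text))
    except ImportError:
        # Crude fallback: assume roughly 4 characters per token.
        return max(1, len(text) // 4)

print(count_tokens("How many tokens is this sentence?"))
```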

2.1.7.1

17 Apr 09:12
14ac5a6

Changes

  • Feat: Support adding external_id
  • Feat: Add post-processing and PII-removal hooks (see the sketch below)
  • Feat: Trace upload consistency on load
  • Feat: RAG tracing using OpenInference
  • Feat: Test cases and CI/CD pipeline
  • Bug-fix: Make list_dataset() work for a large number of datasets
  • Bug-fix: Indexing error in agentic tracing
  • Bug-fix: Fix crash when defining a tracer without a metadata key
  • Bug-fix: add_context not working for LangChain RAG

Full Changelog: 2.1.6.4...v2.1.7.1
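
A regex-based scrubber of the kind a PII-removal hook might run before upload; the patterns and function name are illustrative, and the hook registration call itself is not shown.

```python
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def scrub_pii(text: str) -> str:
    """Redact obvious email addresses and phone numbers from trace text."""
    text = EMAIL_RE.sub("<EMAIL>", text)
    return PHONE_RE.sub("<PHONE>", text)

# A function like this would be attached via the post-processing / PII-removal hook;
# the registration API itself is not reproduced here.
print(scrub_pii("Contact jane.doe@example.com or +1 (555) 123-4567"))
```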

2.1.6.4

01 Apr 05:11
5989645

What's Changed

  1. fix: Corrected total_cost and total_token calculation in custom agentic traces

    • Previously, these values were being incorrectly calculated or displayed.
    • Now, the logic ensures accurate display of total cost and token usage.
  2. feat: Associate model with response and add model_name metadata

    • Associated the LLM model used with its corresponding response to align with backend changes and provide a more complete data structure.
    • Introduced a new metadata field, model_name, specifically for analysis purposes. This will allow for easier tracking and reporting of model usage.
  3. fix: Graceful handling of missing LLM response data

    • Improved error handling to prevent crashes when model cost, token count, or latency cannot be identified from the LLM response.
    • The system now handles missing values gracefully, ensuring stability and continued operation (see the sketch below).
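
The third fix amounts to defensive extraction: missing cost, token, or latency values fall back to defaults instead of raising. The response shape below is illustrative, not the exact object the tracer receives.

```python
def extract_usage(response: dict) -> dict:
    """Pull cost, token and latency figures, defaulting when a field is absent."""
    usage = response.get("usage") or {}
    return {
        "total_tokens": usage.get("total_tokens", 0),
        "cost": response.get("cost", 0.0),
        "latency": response.get("latency"),  # None when the provider omits it
    }

# Missing fields no longer raise; they simply default.
print(extract_usage({"usage": {"total_tokens": 42}}))
print(extract_usage({}))
```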

2.1.6.3

28 Mar 06:52
da99592

What's Changed

  • Bump litellm from 1.42.12 to 1.61.15 by @dependabot in #193
  • Bump langchain-core from 0.2.11 to 0.2.43 by @dependabot in #194
  • Make timeout configurable for agentic/<framework> and add support for setting a custom model cost for LangChain RAG by @kiranscaria in #204 (see the sketch below)

Full Changelog: 2.1.6.2...2.1.6.3
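
A custom model cost boils down to a per-token price table; the field names and helper below are illustrative assumptions, not the exact interface added in #204.

```python
# Hypothetical per-million-token price table for a custom or self-hosted model.
CUSTOM_MODEL_COST = {
    "my-finetuned-llm": {"input_cost_per_million": 0.50, "output_cost_per_million": 1.50},
}

def compute_cost(model: str, prompt_tokens: int, completion_tokens: int) -> float:
    """Compute a span's cost from the custom price table."""
    rates = CUSTOM_MODEL_COST[model]
    return (prompt_tokens * rates["input_cost_per_million"]
            + completion_tokens * rates["output_cost_per_million"]) / 1_000_000

print(compute_cost("my-finetuned-llm", prompt_tokens=1200, completion_tokens=300))
```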

2.1.6.2

28 Mar 05:35
8a169ca

Changes

  • Add support for tracing OpenAI Agents SDK
  • Update the trace schema:
    • Move recorded_on to a schema_type timestamp
    • Add total_cost and total_tokens as numerical metadata
    • Add model_name as categorical metadata (see the sketch below)
  • Bug-fix in input_guardrails when trace_id is None
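
The schema change promotes a few fields into typed metadata; the toy layout below only illustrates that grouping and is not the actual schema definition.

```python
# Illustrative grouping of the updated trace metadata (not the real schema object).
trace_metadata = {
    "recorded_on": "2025-03-28T05:35:00Z",            # now a schema_type timestamp
    "numerical": {"total_cost": 0.0042, "total_tokens": 1536},
    "categorical": {"model_name": "gpt-4o-mini"},
}
print(trace_metadata["categorical"]["model_name"])
```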

2.1.6

19 Mar 03:36
e0015cc

What's Changed

  • Add auto-instrumentation support (see the sketch below) for:
    • Langgraph
    • Langchain
    • CrewAI
    • Haystack
    • SmolAgents
  • Add workflow (data collection) support for auto-instrumentation
  • Improve the guardrails flow
  • Relax dependency pins and remove stale dependencies
  • Add examples for multiple agentic frameworks
  • Multiple bug-fixes
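
A rough sketch of how auto-instrumentation for one of the frameworks above might be switched on. The class, function, and parameter names used here (RagaAICatalyst, Tracer, init_tracing, tracer_type) are assumptions based on the project's usual usage pattern and should be checked against the installed version.

```python
# Assumed usage pattern; verify names against the installed ragaai_catalyst version.
from ragaai_catalyst import RagaAICatalyst, Tracer, init_tracing

catalyst = RagaAICatalyst(access_key="...", secret_key="...")
tracer = Tracer(
    project_name="my_project",
    dataset_name="langchain_runs",
    tracer_type="agentic/langchain",   # one of the auto-instrumented frameworks
)
init_tracing(catalyst=catalyst, tracer=tracer)  # instruments supported frameworks automatically
```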

v2.1.5

11 Mar 10:37
cc3c3c6

Changes:

  • Improve synthetic data generation
  • Update redteaming
  • Multiple bug-fixes
  • Improve support for LlamaIndex tracing

v2.1.4

23 Jan 16:58
1bb309c

What's Changed

  • Support for trace_custom
  • Add workflow component
  • Add support for Azure OpenAI in LLM tracing
  • Bug-fix: Resolve issues with cost and token values
  • Make metadata & pipeline optional in the trace definition
  • Support dynamic update of the dataset name after initialisation
  • Support adding metrics locally via add_metrics
  • Bug-fix: Resolve issue where the code zip produced the same code hash ID
  • Bug-fix: Resolve duplicate metrics being added
  • Add support to debug using DEBUG=1 (see the sketch below)
  • Add support to save code from Colab & Jupyter notebooks
  • Add support to trace file input/output interactions
  • Add support to append _{index} to duplicate metric names in a span
  • Unified time format across all files
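
DEBUG=1 is an environment variable, so it can be enabled from the shell (for example, DEBUG=1 python app.py) or, as a minimal sketch, set from Python before initialising the tracer:

```python
import os

# Enable verbose debug output before importing / initialising the tracer.
os.environ["DEBUG"] = "1"
```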

New Contributors