Releases: MervinPraison/PraisonAI

v2.2.75

24 Jul 09:01

v2.2.74

24 Jul 00:54

Full Changelog: v2.2.73...v2.2.74

v2.2.73

23 Jul 22:57

What's Changed

New Contributors

  • @github-actions[bot] made their first contribution in #1032

Full Changelog: v2.2.72...v2.2.73

v2.2.72

20 Jul 02:50

What's Changed

Full Changelog: v2.2.71...v2.2.72

v2.2.71

18 Jul 21:29

What's Changed

Full Changelog: v2.2.70...v2.2.71

v2.2.70

17 Jul 00:32

What's Changed

  • fix: ensure consistent Task/Response formatting across all LLM providers by @MervinPraison in #961
  • fix: properly shutdown telemetry aiohttp sessions to prevent ImportError during Python shutdown by @MervinPraison in #965
  • fix: improve Ollama sequential tool execution to prevent redundant calls by @MervinPraison in #966
  • fix: make TelemetryCollector.stop() consistent with shutdown behavior by @MervinPraison in #974

Full Changelog: v2.2.69...v2.2.70

v2.2.69

16 Jul 23:46

What's Changed

  • fix: resolve task_name undefined error in async execution by @MervinPraison in #959
  • fix: accumulate tool results across iterations in Ollama sequential execution by @MervinPraison in #960
  • fix: enable real-time streaming regardless of verbose setting by @MervinPraison in #962

Full Changelog: v2.2.68...v2.2.69

v2.2.68

16 Jul 23:20

What's Changed

Full Changelog: v2.2.67...v2.2.68

v2.2.67

16 Jul 13:16

What's Changed

  • fix: resolve task_name undefined error in LLM callback execution by @MervinPraison in #952

Full Changelog: v2.2.66...v2.2.67

v2.2.66

16 Jul 13:13

What's Changed

  • fix: Resolve Ollama infinite tool call loops by improving response handling by @MervinPraison in #943
  • feat: Add 7 comprehensive examples for advanced PraisonAI agent features by @MervinPraison in #942
  • fix: prevent Ollama tool summary from being overwritten by empty response by @MervinPraison in #944
  • fix: prevent premature termination in Ollama sequential tool execution by @MervinPraison in #945
  • Revert "fix: prevent premature termination in Ollama sequential tool execution" by @MervinPraison in #953
  • feat: Enable imports from PraisonAI package for issue #950 by @MervinPraison in #951 (see the usage sketch below)

Full Changelog: v2.2.65...v2.2.66
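
The import change in #951 is aimed at making package-level imports usable directly. The sketch below shows the kind of minimal agent run such imports enable; it is not taken from the release itself and follows the quickstart interface documented in the project README (`Agent(instructions=...)`, `agent.start(...)`). The exact symbols re-exported by the top-level package in #951 are not listed above, so the `praisonaiagents` import path here is an assumption.

```python
# Minimal sketch, not an official example: assumes the quickstart
# interface from the PraisonAI README (praisonaiagents.Agent) and an
# OPENAI_API_KEY set in the environment.
from praisonaiagents import Agent

agent = Agent(instructions="You are a helpful assistant")
agent.start("Summarise the v2.2.66 release notes in one sentence")
```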