Releases: Sinapsis-AI/sinapsis-chatbots
sinapsis-chatbots v0.4.0
This release adds two new packages to the sinapsis-chatbots monorepo, sinapsis-anthropic and sinapsis-mem0, designed to enhance your AI integration experience.
sinapsis-anthropic: Claude Model Support
Expand your AI capabilities with support for Anthropic's Claude models.
AnthropicTextGeneration: Streamline text and code generation tasks using Claude models via the Anthropic API.
Customize with parameters like llm_model_name, role, prompt, system_prompt, context_max_len, extended_thinking, web_search, and more.
AnthropicMultiModal: Enable multimodal chat processing with Claude models, with the same flexibility and customization as AnthropicTextGeneration.
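For orientation, the sketch below shows the kind of Anthropic Messages API call these templates wrap. The model name, token budgets, and prompts are placeholders, and the templates' actual attribute-to-API mapping may differ.

```python
# Sketch of the underlying Anthropic Messages API call (not the Sinapsis template code).
# Assumes ANTHROPIC_API_KEY is set; model name, budgets, and prompts are placeholders.
import anthropic

client = anthropic.Anthropic()

response = client.messages.create(
    model="claude-3-7-sonnet-latest",               # roughly corresponds to llm_model_name
    max_tokens=4096,                                 # must exceed the thinking budget below
    system="You are a concise coding assistant.",   # roughly corresponds to system_prompt
    thinking={"type": "enabled", "budget_tokens": 2048},  # extended-thinking analogue
    messages=[{"role": "user", "content": "Write a Python function that reverses a string."}],
)

# With extended thinking enabled, the response interleaves thinking and text blocks;
# keep only the text blocks for display.
print("".join(block.text for block in response.content if block.type == "text"))
```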
sinapsis-mem0: Advanced Memory Management
Enhance memory handling with Mem0's powerful tools (a usage sketch follows the attribute lists below):
Mem0Add: Organize AI interactions into structured memories, supporting both individual facts and full conversation histories.
Mem0Delete: Efficiently manage memories with selective or bulk deletion options.
Mem0Get: Retrieve memories, from individual entries to comprehensive conversation histories.
Mem0Reset: Clear memory storage entirely or within specific scopes (e.g., user, agent, run).
Mem0Search: Dynamically inject relevant memories into prompts, with configurable formatting options.
Common Attributes:
use_managed: Choose between the managed Mem0 API and self-hosted infrastructure.
memory_config: Configure MemoryClient or Memory with custom parameters.
Template-Specific Attributes:
Mem0Add: Use add_kwargs and generic_key for tailored memory addition.
Mem0Delete: Control deletion with delete_all and delete_kwargs.
Mem0Get: Retrieve memories efficiently with get_all and get_kwargs.
Mem0Search: Customize memory injection with enclosure and search_kwargs.
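For reference, here is a rough sketch of the underlying Mem0 calls these templates orchestrate, using the self-hosted Memory client; swapping in MemoryClient corresponds to the managed API selected via use_managed. The IDs, messages, and query below are placeholders, and the exact mapping to attributes such as add_kwargs or search_kwargs may differ.

```python
# Sketch of the Mem0 operations wrapped by the templates (self-hosted Memory shown).
# Memory() with defaults expects an LLM/embedder backend to be available (e.g. an OpenAI key);
# memory_config would customize the vector store, LLM, and embedder.
from mem0 import Memory

memory = Memory()  # swap for MemoryClient(api_key=...) to use the managed Mem0 API

# Mem0Add: store a conversation (or a single fact) as structured memories.
memory.add(
    [{"role": "user", "content": "I prefer answers with code examples."},
     {"role": "assistant", "content": "Noted, I'll include code examples."}],
    user_id="alice",
)

# Mem0Search: fetch relevant memories to inject into a prompt.
hits = memory.search("How does this user like answers formatted?", user_id="alice")

# Mem0Get: list everything stored for a scope.
all_memories = memory.get_all(user_id="alice")

# Mem0Delete / Mem0Reset: selective or bulk deletion, or a full reset.
memory.delete_all(user_id="alice")
memory.reset()
```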
Improved Local Model Loading
Templates can now load models from local files, complementing the existing support for HuggingFace Hub models. This update simplifies model deployment and extends the capabilities of the LLaMA templates.
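A minimal sketch of the two loading paths in llama-cpp-python, which the LLaMA templates build on; the file path and repo id are placeholders rather than tested defaults.

```python
# Sketch of the two loading paths supported by llama-cpp-python.
# The local path and the HuggingFace repo id below are placeholders.
from llama_cpp import Llama

# Local model loading: point directly at a GGUF file on disk.
local_llm = Llama(model_path="/models/my-llama-model.gguf", n_ctx=4096)

# Existing HuggingFace Hub loading: download a GGUF from a repo.
hub_llm = Llama.from_pretrained(
    repo_id="some-org/some-llama-gguf",  # placeholder repo id
    filename="*Q4_K_M.gguf",             # glob selecting the desired quantization
    n_ctx=4096,
)
```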
sinapsis-chatbots v0.3.0
Release Notes for sinapsis-chatbots: Version 0.3.0
New Features and Enhancements
- Support for Meta's Llama 4 Models
sinapsis-llama-cpp: Introduced a new template for text-to-text generation using Llama 4 models. The template integrates seamlessly with the existing framework, illustrating how easily new models and features can be added.
sinapsis-chatbots-base: Core functionality has been extracted and consolidated, allowing for easier integration of various LLM frameworks and models, including Llama 4.
- Modularization and Improved Compatibility
sinapsis-llama-cpp: Updated with a new template for context-aware queries, enhancing flexibility and integration capabilities.
The sinapsis-chatbots framework is designed to make incorporating new models and features straightforward, and the new Llama 4 text-to-text template follows the same pattern as the existing llama-cpp templates.
Hardware Requirements: Llama 4 may require specific hardware configurations for optimal performance. Refer to the documentation for setup guidelines.
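To illustrate the kind of llama-cpp call such a text-to-text template makes, here is a hedged sketch; the GGUF repo id, quantization pattern, and generation settings are placeholders, not the template's actual defaults.

```python
# Illustrative llama-cpp-python call for Llama 4 text-to-text generation (sketch only).
# The repo id, quantization pattern, and settings below are placeholders.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="some-org/Llama-4-GGUF",  # placeholder: any Llama 4 GGUF repo
    filename="*Q4_K_M.gguf",
    n_ctx=8192,
    n_gpu_layers=-1,  # offload all layers to the GPU if one is available
)

result = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what a monorepo is in two sentences."},
    ],
    max_tokens=256,
)
print(result["choices"][0]["message"]["content"])
```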
sinapsis-chatbots v0.2.0
This release introduces significant updates to the sinapsis-chatbots monorepo, including the addition of two new packages, sinapsis-chatbots-base and sinapsis-llama-index, as well as updates to sinapsis-llama-cpp. The changes aim to modularize functionality, improve compatibility with different LLM frameworks, and enhance the capabilities of the chatbot ecosystem.
Key Features
- sinapsis-chatbots-base
The sinapsis-chatbots-base package has been introduced to host core functionality for working with various LLM frameworks. This package extracts and consolidates the essential components previously contained within sinapsis-llama-cpp, making it easier to integrate with different LLM providers and frameworks.
- sinapsis-llama-index
The sinapsis-llama-index package introduces templates for working with embedding generation, retrieval, and insertion in vector databases using the core functionality of llama-index. This package is designed to streamline RAG (Retrieval-Augmented Generation) workflows.
Embedding Generation: Templates for generating embeddings from text using Llama models.
Vector Database Integration: Simplifies the process of inserting and retrieving embeddings in vector databases.
Retrieval-Augmented Generation (RAG): Enables developers to build RAG-based chatbots by leveraging vector database capabilities.
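As a point of reference, the llama-index flow these templates build on can be sketched as follows; the documents and query are placeholders, and llama-index's default embedding model and LLM are assumed unless configured otherwise.

```python
# Minimal llama-index sketch: embed documents, index them, and run a RAG query.
# Assumes llama-index's default embedding model/LLM settings; content is placeholder text.
from llama_index.core import Document, VectorStoreIndex

documents = [
    Document(text="Sinapsis templates are reusable building blocks for agents."),
    Document(text="RAG combines retrieval from a vector store with LLM generation."),
]

# Embedding generation and vector store insertion happen when the index is built.
index = VectorStoreIndex.from_documents(documents)

# Retrieval-augmented query: relevant chunks are retrieved and passed to the LLM.
query_engine = index.as_query_engine()
print(query_engine.query("What does RAG combine?"))
```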
- sinapsis-llama-cpp
The sinapsis-llama-cpp package has been updated with a new template for running queries with context drawn from generic data sources. This enhancement allows developers to build more context-aware chatbots by incorporating relevant data into the query process. In addition, the core functionality previously hosted here was moved to sinapsis-chatbots-base.
Context-Aware Queries: New template for making queries with context from generic data sources.
Improved Flexibility: Enhances the ability to integrate with various data sources and use cases.
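The pattern behind such context-aware queries is roughly the following, shown here as a hand-rolled sketch rather than the template's implementation; the model path, context strings, and question are placeholders.

```python
# Sketch of a context-aware query: retrieved or user-provided context is prepended
# to the question before it is sent to the model. Not the template's actual code.
from llama_cpp import Llama

llm = Llama(model_path="/models/my-llama-model.gguf", n_ctx=4096)  # placeholder path

context_chunks = [
    "Sinapsis agents are assembled from reusable templates.",
    "Each template exposes attributes that control its behavior.",
]
question = "How are Sinapsis agents assembled?"

messages = [
    {"role": "system", "content": "Answer using only the provided context."},
    {"role": "user",
     "content": "Context:\n" + "\n".join(context_chunks) + f"\n\nQuestion: {question}"},
]
print(llm.create_chat_completion(messages=messages)["choices"][0]["message"]["content"])
```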
- New WebApp for RAG Chatbot
In addition to the existing llama_cpp_simple_chatbot app, this release adds a new web application that provides a user-friendly interface for interacting with RAG-based chatbots. The webapp simplifies building and testing RAG chatbots, making it easier for developers to experiment with different configurations.
User-Friendly Interface: Easy-to-use interface for interacting with RAG chatbots.
Context-Aware Chatting: Supports queries with context, enabling more accurate and relevant responses.
Multi-Source Support: Works seamlessly with various vector databases and data sources.
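An app of this kind reduces to a thin Gradio layer over a RAG query function; the sketch below is an illustration under that assumption, not the shipped webapp's code, and uses a toy in-memory index in place of a real vector database.

```python
# Minimal Gradio chat UI around a RAG backend (sketch, not the shipped webapp).
# Assumes llama-index's default embedding model/LLM settings; corpus is placeholder text.
import gradio as gr
from llama_index.core import Document, VectorStoreIndex

# Tiny in-memory corpus stands in for a real vector database.
index = VectorStoreIndex.from_documents(
    [Document(text="Sinapsis chatbots support RAG workflows.")]
)
query_engine = index.as_query_engine()

def answer(message, history):
    # Retrieve relevant context and generate a response for each chat turn.
    return str(query_engine.query(message))

gr.ChatInterface(fn=answer, title="RAG chatbot (sketch)").launch()
```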
sinapsis-chatbots v0.1.0
We present sinapsis-chatbots, a comprehensive monorepo designed to simplify the development and deployment of Large Language Model (LLM)-based applications. This release marks the beginning of a powerful new toolset for building AI-driven chatbots.
The sinapsis-chatbots monorepo provides a suite of templates and utilities to streamline the configuration and operation of LLM applications.
Key features include:
Chat-Based Interaction Templates: Predefined templates for building conversational interfaces, enabling developers to quickly create chatbots with natural, AI-driven interactions.
Model Agnostic Design: Built to support integration with a wide range of LLM models, providing flexibility and adaptability for diverse use cases.
Gradio Web App: A pre-built web interface powered by Gradio, allowing users to quickly try out and experiment with chatbot functionality without additional setup.
- sinapsis-llama-cpp package:
This release introduces the sinapsis-llama-cpp subpackage, a specialized module designed to simplify the deployment of LLMs using the llama-cpp library. Key features include:
Optimized Templates for Llama-CPP: Ready-to-use templates that leverage the performance and efficiency of the llama-cpp library for running LLMs.
Easy Integration: Straightforward solutions for integrating llama-cpp-based models into chatbot applications.
Enhanced Performance: Designed to maximize the capabilities of llama-cpp, enabling faster inference and more efficient resource utilization.
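As a rough idea of what these templates take care of, a bare-bones multi-turn chat loop over llama-cpp-python looks like the following; the model path and system prompt are placeholders, and the templates wrap this kind of loop behind their configuration attributes.

```python
# Bare-bones multi-turn chat loop over llama-cpp-python (sketch only).
# The model path and system prompt are placeholders.
from llama_cpp import Llama

llm = Llama(model_path="/models/chat-model.gguf", n_ctx=4096)
history = [{"role": "system", "content": "You are a friendly assistant."}]

while True:
    user_input = input("you> ")
    if not user_input:
        break
    history.append({"role": "user", "content": user_input})
    reply = llm.create_chat_completion(messages=history)["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    print("bot>", reply)
```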
Why This Matters
The sinapsis-chatbots package is a game-changer for developers working with LLMs. By providing a unified, modular framework for building and deploying chatbots, it reduces the complexity of getting started with AI-driven applications. The addition of the sinapsis-llama-cpp subpackage and the Gradio web app further strengthens the package’s versatility, offering developers a powerful toolset for experimentation and deployment.
What’s Next?
We’re already planning future updates to expand the capabilities of the sinapsis-chatbots package. Upcoming features will include:
- Additional templates for advanced chatbot functionalities.
- Retrieval-Augmented Generation (RAG) Support: Stay tuned for the addition of RAG capabilities, enabling developers to combine LLMs with document retrieval for even more powerful applications.