Using Ollama instead of ChatGPT #1

@Billy-Mitchell

Description

I came across your video 'Forget CrewAI & AutoGen, Build CUSTOM AI Agents!' and thought you made many good points, so I wanted to try the code myself. I don't have an OpenAI account, so I wanted to adapt the code to use Ollama and llama3 instead. Most of it seemed quite manageable, but the part I'm stuck on is the `tool_calls` object that the OpenAI API returns in its response JSON, and how best to work around or reimplement it.

Do you have any suggestions or insights on how best to work around this? I tried following the 'solution' here (ollama/ollama-python#39 (comment)), but I can't get the response to include `tool_calls`. Any ideas or suggestions?
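For reference, a common workaround when a backend doesn't return an OpenAI-style `tool_calls` object is to prompt the model to reply with a JSON object describing the tool invocation, then parse that text into the same shape the OpenAI API would return. The sketch below is a minimal, hypothetical illustration of that parsing step; the tool name `get_weather` and the exact output shape are assumptions for the example, not part of any Ollama API.

```python
import json

def extract_tool_calls(message_content: str):
    """Parse a model's raw text reply into an OpenAI-like tool_calls list.

    Assumes the model was prompted to answer with a JSON object of the form
    {"name": "<tool>", "arguments": {...}}. Returns [] when the text is not
    a well-formed tool-call object, so plain answers pass through unchanged.
    """
    try:
        data = json.loads(message_content)
    except json.JSONDecodeError:
        return []
    if not isinstance(data, dict) or "name" not in data:
        return []
    return [{
        "type": "function",
        "function": {
            "name": data["name"],
            # OpenAI serialises arguments as a JSON string, so mirror that
            "arguments": json.dumps(data.get("arguments", {})),
        },
    }]

# Example: text a model might emit when instructed to reply in JSON only
raw = '{"name": "get_weather", "arguments": {"city": "Oslo"}}'
calls = extract_tool_calls(raw)
```

With a shim like this, the rest of the agent code can keep consuming `tool_calls` unchanged regardless of which backend produced the reply. (Note that newer versions of the ollama Python client have since added native tool-calling support via a `tools` parameter on `chat`, which may remove the need for this workaround entirely.)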
