Hello there,

It would be good to add a version of this that uses Ollama to run a local LLM, e.g. Mixtral. Is there any interest in this? @emrekiciman @amit-sharma @RoseDeSicilia26

Cheers!
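
P.S. For reference, here is a minimal sketch of what the Ollama side could look like, assuming the `ollama` Python client (`pip install ollama`) and a locally pulled model (`ollama pull mixtral`); the function name and prompt are just illustrative, not anything from this repo:

```python
# Minimal sketch: one chat turn against a local Mixtral served by Ollama.
# Assumes the Ollama server is running and `ollama pull mixtral` was done.
import ollama


def query_local_llm(prompt: str, model: str = "mixtral") -> str:
    """Send a single user message to a locally running Ollama model."""
    response = ollama.chat(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]


if __name__ == "__main__":
    print(query_local_llm("Briefly explain what a confounder is."))
```

Since Ollama also exposes an OpenAI-compatible endpoint at `http://localhost:11434/v1`, pointing an existing OpenAI client at that `base_url` (with a placeholder `api_key`) might be an even smaller change.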