```
result = lx.extract(
    text_or_documents=input_text,
    prompt_description=prompt,
    examples=examples,
    model_id="Qwen3:0.6b",
    model_url="http://localhost:11434",
    fence_output=False,
    use_schema_constraints=False,
)
```

Inference outside of langextract works perfectly, but this throws the API key error when it is run. I also tried this:

```
result = lx.extract(
    text_or_documents=input_text,
    prompt_description=prompt,
    examples=examples,
    model_id="gemma2:2b",
    model_url="http://ollama:11434",
    fence_output=False,
    use_schema_constraints=False,
)
print(result)
```

Following #7, I also tried setting the OLLAMA_HOST environment variable, but that didn't fix it either.
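For reference, I set the variable roughly like this before calling extract (the host/port here just reflects my local setup and may differ in the containerized one):

```
import os

# Per #7: point Ollama clients at the local server before calling langextract.
os.environ["OLLAMA_HOST"] = "http://localhost:11434"
```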