This repository demonstrates the integration of OpenShift Dev Spaces with the Continue AI coding assistant and IBM Granite models for enhanced cloud development workflows. It serves as a proof of concept (POC) for leveraging AI tools in collaborative development environments.
RamaLama is a CLI tool for managing and serving AI models locally, similar to Ollama and Goose CLI. Its distinguishing feature is that it runs LLMs in secure, isolated container environments. It can pull models from well-known registries such as Hugging Face and the Ollama registry.
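For example, a typical RamaLama workflow might look like the following sketch. The model name and registry prefix are illustrative; check `ramalama --help` for the options available in your version:

```bash
# Pull the granite-code model from the Ollama registry
# (ollama:// is RamaLama's default transport; huggingface:// also
# works for models hosted on Hugging Face).
ramalama pull ollama://granite-code

# Chat with the model interactively; RamaLama runs it inside a container.
ramalama run granite-code

# Serve the model over an OpenAI-compatible REST API.
ramalama serve granite-code
```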
- A Red Hat Developer Sandbox account
- Access to OpenShift Dev Spaces
- GitHub OAuth access for repository integration
- Basic knowledge of cloud development environments and the devfile format (a minimal sketch follows this list)
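If you have not worked with devfiles before, the sketch below shows the general shape of one. The workspace name, container image, and command are hypothetical and do not reflect the actual devfile.yaml in this repository:

```yaml
# Illustrative devfile sketch; names, image, and command are hypothetical.
schemaVersion: 2.2.0
metadata:
  name: devspaces-continue-poc        # hypothetical workspace name
components:
  - name: tools
    container:
      image: quay.io/devfile/universal-developer-image:latest  # hypothetical tooling image
      memoryLimit: 4Gi
commands:
  - id: serve-model
    exec:
      component: tools
      commandLine: ramalama serve granite-code   # hypothetical startup command
```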
- Go to the Red Hat Developer Sandbox and register for an account.
- Once you have an account, you can access Red Hat OpenShift Dev Spaces at https://workspaces.openshift.com.
- On the User Dashboard, navigate to the Create Workspace tab and provide the URL of this repository.
- Wait for the cloud development environment to start.
- Once the environment is ready, you'll see that the Continue extension is installed automatically. The workspace will also notify you about a process running in the cluster: that is the granite-code LLM served by RamaLama (you can verify the endpoint with the curl sketch below).
- The Continue extension is automatically configured to connect to the LLM; click the Continue icon and you should be able to query the model with various coding tasks (see the illustrative config entry after this list).
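To verify that the model server is reachable from a workspace terminal, you can query its OpenAI-compatible API directly. The port and model name below are assumptions based on RamaLama's defaults, not values taken from this repository:

```bash
# Assumed endpoint: ramalama serve exposes an OpenAI-compatible API,
# here assumed to be listening on port 8080 with the model "granite-code".
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "granite-code",
        "messages": [{"role": "user", "content": "Write a hello-world program in Go."}]
      }'
```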
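The wiring between Continue and the local model roughly corresponds to a model entry like the following in Continue's config.json. The `apiBase` and `model` values are assumptions for illustration; in this workspace the configuration is generated for you automatically:

```json
{
  "models": [
    {
      "title": "Granite Code (local, illustrative)",
      "provider": "openai",
      "model": "granite-code",
      "apiBase": "http://localhost:8080/v1"
    }
  ]
}
```

The `openai` provider works here because RamaLama serves the model behind an OpenAI-compatible API, so Continue can talk to it like any other OpenAI-style endpoint.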