redhat-developer-demos/cde-ramalama-continue


Cloud Development Environment with IBM Granite Code, Continue and Ramalama


This repository demonstrates the integration of Red Hat OpenShift Dev Spaces with the Continue AI coding assistant and IBM Granite models for enhanced cloud development workflows. It serves as a proof of concept (POC) for leveraging AI tools in collaborative development environments.

What is Ramalama?

Ramalama is a CLI tool for managing and serving AI models locally, similar to Ollama and the Goose CLI. Its distinguishing feature is that it runs LLMs in secure, isolated container environments. It can pull models from well-known registries such as Hugging Face and the Ollama registry.
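As an illustration of that workflow, the commands below sketch pulling a model from a registry and serving it locally with Ramalama. The model reference and port are assumptions for this example, not values taken from this repository; check `ramalama --help` in your environment for the exact options.

```shell
# Pull a Granite code model from the Ollama registry (model tag is an example)
ramalama pull ollama://granite-code

# Serve it in an isolated container; this exposes a local REST endpoint
# (the port is an assumed default -- verify with your Ramalama version)
ramalama serve granite-code
```

Because the model runs inside a container, the host system never executes the model runtime directly, which is the isolation property Ramalama emphasizes.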

Prerequisites

  • A Red Hat Developer Sandbox account.
  • Access to Red Hat OpenShift Dev Spaces.
  • GitHub OAuth access for repository integration.
  • Basic knowledge of cloud development environments and devfiles.
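Since the prerequisites assume familiarity with devfiles, here is a minimal sketch of how a devfile can pair a developer workspace container with a model-serving container. The component names, images, command, and port below are illustrative assumptions, not the contents of this repository's actual devfile.

```yaml
schemaVersion: 2.2.0
metadata:
  name: cde-ramalama-continue        # illustrative workspace name
components:
  - name: tools                      # the developer's editor/tooling container
    container:
      image: quay.io/devfile/universal-developer-image:latest  # assumed image
      memoryLimit: 4Gi
  - name: ramalama                   # sidecar container serving the model
    container:
      image: quay.io/ramalama/ramalama:latest                  # assumed image
      command: ["ramalama", "serve", "granite-code"]
      endpoints:
        - name: llm
          targetPort: 8080           # assumed port for the model endpoint
```

When Dev Spaces starts a workspace from such a devfile, both containers come up together, which is why the workspace can report a model-serving process running alongside the editor.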

How to try it?

  1. Go to the Red Hat Developer Sandbox and register for an account.
  2. Once you have an account, access Red Hat OpenShift Dev Spaces at https://workspaces.openshift.com.
  3. On the User Dashboard, open the Create Workspace tab and provide the URL of this repository.
  4. Wait for the cloud development environment to start.
  5. Once the environment is ready, the Continue extension is installed automatically. The workspace also notifies you about a process running in the cluster: the granite-code LLM served by Ramalama.
  6. The Continue extension is automatically configured to connect to the LLM. Click the Continue icon and you can query the model with various coding tasks.

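For readers curious what "automatically configured" means in step 6: Continue reads model connections from its config.json. The fragment below is a sketch of such an entry pointing at a locally served, OpenAI-compatible endpoint; the endpoint URL, port, and model name are assumptions for illustration, not values extracted from this repository.

```json
{
  "models": [
    {
      "title": "Granite Code (Ramalama)",
      "provider": "openai",
      "model": "granite-code",
      "apiBase": "http://localhost:8080/v1",
      "apiKey": "none"
    }
  ]
}
```

In this setup the devfile pre-provisions an equivalent configuration, so no manual editing is needed inside the workspace.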