diff --git a/README.md b/README.md
index 63c8961d..2bc7f785 100644
--- a/README.md
+++ b/README.md
@@ -54,20 +54,55 @@ Read [**README.md**](docs/architecture) for the detailed documentation.
 
 ## Getting Started
 
-
-### Requirements
-```
-Version's requirements
-  - Python >= 3.10 and < 3.12
-  - NodeJs >= 18
-  - bun
-```
-
+- Nvidia driver installation
+
+  https://linuxconfig.org/how-to-install-nvidia-drivers-on-ubuntu-24-04
+
+  Use nvtop to see the installed Nvidia cards.
+
+- Python >= 3.10 and < 3.12
+
+  sudo apt update && sudo apt upgrade
+
+  sudo apt install python3 python-is-python3
+
+- NodeJs >= 18 installation
+  [NodeJs >= 18 installation](https://linuxconfig.org/how-to-install-node-js-on-ubuntu-24-04)
+
+  sudo apt install nodejs
+
+  sudo apt install npm
+
+  node -v
+
+- Pip installation
+
+  sudo apt install pip
+
+  python3 -m pip config set global.break-system-packages true
+
+  As an alternative, uv can be installed via pip (see below).
+
+  sudo apt install python3-pip
+
 - Install uv - Python Package manager [download](https://github.com/astral-sh/uv)
+
+  $ pip install uv
+
 - Install bun - JavaScript runtime [download](https://bun.sh/docs/installation)
 - For ollama [ollama setup guide](docs/Installation/ollama.md) (optional: if you don't want to use the local models then you can skip this step)
+
+  curl -fsSL https://ollama.com/install.sh | sh
+
+  ./get-ollama-models.sh
+
+- For API models, configure the API keys via the settings page in the UI.
+  Enter the keys without the <> brackets.
+
 
 ### Installation
 
@@ -99,6 +134,7 @@ To install Devika, follow these steps:
    ```
 5. Start the Devika server:
    ```bash
+   npm install @sveltejs/adapter-node
    python devika.py
    ```
 6. If everything is working fine, you will see the following output:
@@ -113,6 +149,27 @@ To install Devika, follow these steps:
    ```
 8. Access the Devika web interface by opening a browser and navigating to `http://127.0.0.1:3001`
+9. Install xterm in WSL
+
+   sudo apt install xterm
+
+10. On Windows, run the Xming server from http://www.straightrunning.com/XmingNotes/#head-16
+
+11. Change the font size in xterm
+
+    sudo apt install x11-xserver-utils
+
+    vi ~/.Xresources
+
+    xterm*font: *-fixed-*-*-*-18-*
+
+    Run xrdb -merge ~/.Xresources
+
+12. Install Visual Studio Code
+
+    sudo snap install --classic code
+
 
 ### how to use
 
 To start using Devika, follow these steps:
diff --git a/devika.py b/devika.py
index 961b792a..f1baa71e 100644
--- a/devika.py
+++ b/devika.py
@@ -26,17 +26,13 @@
 
 app = Flask(__name__)
-CORS(app, resources={r"/*": {"origins": # Change the origin to your frontend URL
-                [
-                    "https://localhost:3000",
-                    "http://localhost:3000",
-                ]}})
+CORS(app)
 
 app.register_blueprint(project_bp)
 socketio.init_app(app)
 
 log = logging.getLogger("werkzeug")
-log.disabled = True
+log.disabled = False
 
 TIKTOKEN_ENC = tiktoken.get_encoding("cl100k_base")
diff --git a/docs/architecture/README.md b/docs/architecture/README.md
index 7fa3bc16..d81dbae1 100644
--- a/docs/architecture/README.md
+++ b/docs/architecture/README.md
@@ -12,5 +12,5 @@ Devika's system architecture consists of the following key components:
 8. **Knowledge Base**: Stores and retrieves project-specific information, code snippets, and learned knowledge for efficient access.
 9. **Database**: Persists project data, agent states, and configuration settings.
 
-Read [ARCHITECTURE.md](https://github.com/stitionai/devika/Docs/architecture/ARCHITECTURE.md) for the detailed architecture of Devika.
-Read [UNDER_THE_HOOD.md](https://github.com/stitionai/devika/Docs/architecture/UNDER_THE_HOOD.md) for the detailed working of Devika.
+Read [ARCHITECTURE.md](https://github.com/stitionai/devika/blob/main/docs/architecture/ARCHITECTURE.md) for the detailed architecture of Devika.
+Read [UNDER_THE_HOOD.md](https://github.com/stitionai/devika/blob/main/docs/architecture/UNDER_THE_HOOD.md) for the detailed working of Devika.
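The README hunks above require Python >= 3.10 and < 3.12. A small shell sketch of that bound check (the version range comes from the README; the `version_ok` helper and its output messages are illustrative, not part of the repo):

```shell
#!/bin/bash
# Check the installed python3 against the README's supported range: >= 3.10, < 3.12.
version_ok() {
    # $1 = major version, $2 = minor version
    local major="$1" minor="$2"
    [ "$major" -eq 3 ] && [ "$minor" -ge 10 ] && [ "$minor" -lt 12 ]
}

ver="$(python3 -c 'import sys; print("%d %d" % sys.version_info[:2])')"
if version_ok $ver; then
    echo "python3 ($ver) is in the supported range"
else
    echo "python3 ($ver) is outside the supported range >=3.10,<3.12"
fi
```

A check like this could run before `python devika.py` to fail fast on Ubuntu 24.04, whose default python3 may fall outside the range.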
diff --git a/get-ollama-models.sh b/get-ollama-models.sh
new file mode 100644
index 00000000..c5e3ece6
--- /dev/null
+++ b/get-ollama-models.sh
@@ -0,0 +1,43 @@
+#!/bin/bash
+echo "/usr/local/bin/ollama pull llama3.2"
+echo
+/usr/local/bin/ollama pull llama3.2
+echo
+echo "/usr/local/bin/ollama pull llama3.1"
+/usr/local/bin/ollama pull llama3.1
+echo
+echo "/usr/local/bin/ollama pull llama3"
+/usr/local/bin/ollama pull llama3
+echo
+echo "/usr/local/bin/ollama pull gemma"
+/usr/local/bin/ollama pull gemma
+echo
+echo "/usr/local/bin/ollama pull qwen"
+/usr/local/bin/ollama pull qwen
+echo
+echo "/usr/local/bin/ollama pull qwen2"
+/usr/local/bin/ollama pull qwen2
+echo
+echo "/usr/local/bin/ollama pull mistral"
+/usr/local/bin/ollama pull mistral
+echo
+echo "/usr/local/bin/ollama pull phi3:14b"
+/usr/local/bin/ollama pull phi3:14b
+echo
+echo "/usr/local/bin/ollama pull phi3:3.8b"
+/usr/local/bin/ollama pull phi3:3.8b
+echo
+echo "/usr/local/bin/ollama pull codellama"
+/usr/local/bin/ollama pull codellama
+echo
+echo "/usr/local/bin/ollama pull qwen2.5"
+/usr/local/bin/ollama pull qwen2.5
+echo
+echo "/usr/local/bin/ollama pull llama2"
+/usr/local/bin/ollama pull llama2
+echo
+echo "/usr/local/bin/ollama pull gemma2"
+/usr/local/bin/ollama pull gemma2
+echo
+echo "/usr/local/bin/ollama pull llama3.1:70b"
+/usr/local/bin/ollama pull llama3.1:70b
diff --git a/requirements.txt b/requirements.txt
index 91666960..a0a18437 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -31,3 +31,5 @@ orjson
 gevent
 gevent-websocket
 curl_cffi
+vite
+shutils
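The new get-ollama-models.sh repeats the same echo/pull pair once per model. An equivalent loop-based sketch (the model list is copied from the script; the `pull_all` helper and its runner argument are illustrative, not part of the repo):

```shell
#!/bin/bash
# Loop-based sketch of get-ollama-models.sh.
MODELS="llama3.2 llama3.1 llama3 gemma qwen qwen2 mistral phi3:14b phi3:3.8b codellama qwen2.5 llama2 gemma2 llama3.1:70b"

pull_all() {
    # $1 = the command used to pull, e.g. /usr/local/bin/ollama
    # (pass "echo" for a dry run that only prints the pull commands).
    local runner="$1" model
    for model in $MODELS; do
        echo "$runner pull $model"
        "$runner" pull "$model"
        echo
    done
}

# Dry run: show what would be pulled without downloading anything.
pull_all echo
```

Running `pull_all /usr/local/bin/ollama` would perform the actual downloads; note that the list includes llama3.1:70b, which alone is roughly a 40 GB pull.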