
The objective is to get your project working as an overlay on a LocalAI instance that runs separately. I commented out the LocalAI service in docker-compose.yaml:
```yaml
❯ cat docker-compose.yaml
version: '3.6'
services:
  frontend:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - 3000:3000
```
```
❯ netstat -an | grep LISTEN
tcp46      0      0  *.3000                 *.*                    LISTEN
```
The Docker container is up and running, as the output above shows.
The LocalAI API is running separately as its own autonomous project and responding independently. See below:
```
❯ curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
  "model": "llama-2-7b-chat",
  "prompt": "What is the expected population of Ghana by the year 2100",
  "temperature": 0.7
}'
{"object":"text_completion","model":"llama-2-7b-chat","choices":[{"index":0,"finish_reason":"stop","text":"?\nlazarus May 3, 2022, 1:49pm #1\nThe population of Ghana is projected to continue growing in the coming decades. According to the United Nations Department of Economic and Social Affairs Population Division, Ghana’s population is expected to reach approximately 47 million by the year 2100. This represents a more than fivefold increase from the country’s estimated population of around 8.5 million in 2020.\nHowever, it is important to note that population projections are subject to uncertainty and can be influenced by various factors such as fertility rates, mortality rates, and migration patterns. Therefore, actual population growth may differ from projected values."}],"usage":{"prompt_tokens":0,"completion_tokens":0,"total_tokens":0}}
```
My question is: how can "Select Model" and "Model Gallery" be made to talk to a LocalAI instance that runs separately, rather than to the one bundled into your docker-compose setup? Is this possible?
I love the project concept of being able to switch models and browse model galleries.
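For what it's worth, the kind of configuration I was hoping to find would look roughly like the sketch below. The environment variable name is purely my guess (I could not find one documented); the point is just "tell the frontend where the external LocalAI lives":

```yaml
version: '3.6'
services:
  frontend:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - 3000:3000
    environment:
      # Hypothetical variable name -- pointing the frontend at the
      # LocalAI instance already listening on the host's port 8080.
      # host.docker.internal resolves to the host on Docker Desktop.
      - API_BASE_URL=http://host.docker.internal:8080
```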