
Failed running localai-frontend with kubernetes #8

@chenxinlong

Description


I installed LocalAI inside Kubernetes using the Helm chart; the local-ai service is exposed to the cluster at the FQDN local-ai.default.svc.cluster.local:80. So I wrote the following YAML to deploy localai-frontend:

# k8s deployment
apiVersion: apps/v1
kind: Deployment
metadata:
  name: localai-frontend
  namespace: default
spec:
  replicas: 1
  selector:
    matchLabels:
      app: localai-frontend
  template:
    metadata:
      labels:
        app: localai-frontend
    spec:
      containers:
      - name: localai-frontend
        image: dhruvgera/localai-frontend
        imagePullPolicy: IfNotPresent
        ports:
        - containerPort: 3000
        env:
          - name: API_HOST
            value: http://local-ai.default.svc.cluster.local:80
      restartPolicy: Always
---

# k8s service
apiVersion: v1
kind: Service
metadata:
  name: localai-frontend
  namespace: default
spec:
  type: NodePort
  ports:
  - port: 80
    targetPort: 3000
    nodePort: 31000
  selector:
    app: localai-frontend

Since the localai-frontend service is exposed via a NodePort, I visit a random node (say 172.16.33.21) on port 31000. It doesn't seem to work:

  1. the web page opens a websocket connection to ws://172.16.33.21:3000/ws, which fails
  2. it still lists models from http://localhost:8080/v1/models

Does this project support running inside a Kubernetes cluster?
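
For what it's worth, my guess is that API_HOST is used by the browser (the model list appears to be fetched client-side), so a cluster-internal FQDN can never resolve there. Below is a rough sketch of what I imagine would be needed instead: expose the LocalAI API itself outside the cluster and point API_HOST at that address. The service name, selector labels, and nodePort here are my own assumptions, not values taken from the Helm chart.

# hypothetical: expose the LocalAI API on a NodePort so the browser can reach it
apiVersion: v1
kind: Service
metadata:
  name: local-ai-external          # illustrative name, not from the chart
  namespace: default
spec:
  type: NodePort
  ports:
  - port: 8080
    targetPort: 8080               # assuming LocalAI listens on 8080 inside the pod
    nodePort: 31001                # illustrative, any free NodePort
  selector:
    app.kubernetes.io/name: local-ai   # assumed label; check the chart's actual labels

With something like this, API_HOST in the frontend deployment would point at http://172.16.33.21:31001 (a node IP) rather than the in-cluster FQDN, though I am not sure that is the intended setup.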
