# chatbot-ui

Example of integration with [mckaywrigley/chatbot-ui](https://github.com/mckaywrigley/chatbot-ui).

*(Screenshot: the chatbot-ui web interface, 2023-04-26)*

## Setup

```bash
# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI

cd LocalAI/examples/chatbot-ui

# (optional) Checkout a specific LocalAI tag
# git checkout -b build <TAG>

# Download gpt4all-j to models/
wget https://gpt4all.io/models/ggml-gpt4all-j.bin -O models/ggml-gpt4all-j

# start with docker-compose
docker-compose up -d --pull always
# or you can build the images with:
# docker-compose up -d --build
```
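
Before opening the web UI, it can be useful to confirm that the LocalAI API container is up. A minimal check, assuming LocalAI is exposed on port 8080 as in this example's `docker-compose.yaml`, is to query its OpenAI-compatible models endpoint:

```bash
# Lists the models LocalAI found in the local models/ directory;
# a JSON response whose "data" array includes ggml-gpt4all-j means the API is ready.
curl http://localhost:8080/v1/models
```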

## Pointing chatbot-ui to a separately managed LocalAI service

If you want to use the chatbot-ui example with an externally managed LocalAI service, you can alter the docker-compose file so that it looks like the one below. The file is smaller because the section that would normally start the LocalAI service has been removed. Take care to update the IP address (or FQDN) that the chatbot-ui service tries to reach (marked `<<LOCALAI_IP>>` below):

```yaml
version: '3.6'

services:
  chatgpt:
    image: ghcr.io/mckaywrigley/chatbot-ui:main
    ports:
      - 3000:3000
    environment:
      - 'OPENAI_API_KEY=sk-XXXXXXXXXXXXXXXXXXXX'
      - 'OPENAI_API_HOST=http://<<LOCALAI_IP>>:8080'
```

Once you've edited the docker-compose file, you can start it with `docker-compose up -d`, then browse to http://localhost:3000.
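
If the UI comes up but chat requests fail, it can help to verify that the external LocalAI instance is reachable and serving a model. As a rough check, assuming you kept the `ggml-gpt4all-j` filename from the setup step above (adjust the model name to match whatever is in your models directory), you can send a test request to LocalAI's OpenAI-compatible chat endpoint:

```bash
# Replace <<LOCALAI_IP>> with the same address used in docker-compose.yaml above.
curl http://<<LOCALAI_IP>>:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "ggml-gpt4all-j",
        "messages": [{"role": "user", "content": "Say hello"}],
        "temperature": 0.7
      }'
```

A JSON response containing a `choices` array indicates the model loaded; a connection error or empty reply usually points at the address, port, or model name.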

## Accessing chatbot-ui

Open http://localhost:3000 for the Web UI.