From 96267d94377da36e30c34158c32aae5ebc16efc5 Mon Sep 17 00:00:00 2001
From: Dhruv Gera
Date: Thu, 4 May 2023 21:57:58 +0530
Subject: [PATCH] localai: Include the WebUI project example (#130)

Co-authored-by: Ettore Di Giacinto
---
 README.md                                 |  4 +---
 examples/README.md                        |  1 +
 examples/localai-webui/README.md          | 26 +++++++++++++++++++++++
 examples/localai-webui/docker-compose.yml | 20 +++++++++++++++++
 4 files changed, 48 insertions(+), 3 deletions(-)
 create mode 100644 examples/localai-webui/README.md
 create mode 100644 examples/localai-webui/docker-compose.yml

diff --git a/README.md b/README.md
index c584609..b51fc79 100644
--- a/README.md
+++ b/README.md
@@ -162,8 +162,6 @@ To build locally, run `make build` (see below).
 
 ### Other examples
 
-![Screenshot from 2023-04-26 23-59-55](https://user-images.githubusercontent.com/2420543/234715439-98d12e03-d3ce-4f94-ab54-2b256808e05e.png)
-
 To see other examples on how to integrate with other projects for instance chatbot-ui, see: [examples](https://github.com/go-skynet/LocalAI/tree/master/examples/).
 
@@ -572,7 +570,7 @@ Not currently, as ggml doesn't support GPUs yet: https://github.com/ggerganov/ll
 
 ### Where is the webUI?
-We are working on to have a good out of the box experience - however as LocalAI is an API you can already plug it into existing projects that provides are UI interfaces to OpenAI's APIs. There are several already on github, and should be compatible with LocalAI already (as it mimics the OpenAI API)
+The localai-webui and chatbot-ui projects are available in the examples section and can be set up by following their instructions. As LocalAI is an API, you can also plug it into any existing project that provides a UI for OpenAI's APIs. There are several already on GitHub, and they should already be compatible with LocalAI (as it mimics the OpenAI API).
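To make the compatibility claim in the new FAQ answer concrete, here is a minimal sketch of the kind of request an OpenAI-compatible frontend issues under the hood. It assumes the API container from this example is listening on localhost:8080 and that a ggml-gpt4all-j.bin model has been placed in the mounted models/ directory; both the port and the model name are taken from this example and are otherwise assumptions.

```bash
# Chat completion request against LocalAI's OpenAI-style endpoint;
# any UI that can point at a custom OpenAI base URL can issue the same call.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "ggml-gpt4all-j",
        "messages": [{"role": "user", "content": "How are you?"}],
        "temperature": 0.7
      }'
```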
diff --git a/examples/README.md b/examples/README.md
index 7c93955..66a9b18 100644
--- a/examples/README.md
+++ b/examples/README.md
@@ -8,6 +8,7 @@ Here is a list of projects that can easily be integrated with the LocalAI backen
 - [discord-bot](https://github.com/go-skynet/LocalAI/tree/master/examples/discord-bot/) (by [@mudler](https://github.com/mudler))
 - [langchain](https://github.com/go-skynet/LocalAI/tree/master/examples/langchain/) (by [@dave-gray101](https://github.com/dave-gray101))
 - [langchain-python](https://github.com/go-skynet/LocalAI/tree/master/examples/langchain-python/) (by [@mudler](https://github.com/mudler))
+- [localai-webui](https://github.com/go-skynet/LocalAI/tree/master/examples/localai-webui/) (by [@dhruvgera](https://github.com/dhruvgera))
 - [rwkv](https://github.com/go-skynet/LocalAI/tree/master/examples/rwkv/) (by [@mudler](https://github.com/mudler))
 - [slack-bot](https://github.com/go-skynet/LocalAI/tree/master/examples/slack-bot/) (by [@mudler](https://github.com/mudler))
 
diff --git a/examples/localai-webui/README.md b/examples/localai-webui/README.md
new file mode 100644
index 0000000..8e36f40
--- /dev/null
+++ b/examples/localai-webui/README.md
@@ -0,0 +1,26 @@
+# localai-webui
+
+Example of integration with [dhruvgera/localai-frontend](https://github.com/Dhruvgera/LocalAI-frontend).
+
+![image](https://user-images.githubusercontent.com/42107491/235344183-44b5967d-ba22-4331-804c-8da7004a5d35.png)
+
+## Setup
+
+```bash
+# Clone LocalAI
+git clone https://github.com/go-skynet/LocalAI
+
+cd LocalAI/examples/localai-webui
+
+# (optional) Checkout a specific LocalAI tag
+# git checkout -b build
+
+# Download any desired models to models/ in the parent LocalAI project dir
+# For example: wget https://gpt4all.io/models/ggml-gpt4all-j.bin
+
+# start with docker-compose
+docker-compose up -d --build
+```
+
+Open http://localhost:3000 for the Web UI.
+
diff --git a/examples/localai-webui/docker-compose.yml b/examples/localai-webui/docker-compose.yml
new file mode 100644
index 0000000..a40bc39
--- /dev/null
+++ b/examples/localai-webui/docker-compose.yml
@@ -0,0 +1,20 @@
+version: '3.6'
+
+services:
+  api:
+    image: quay.io/go-skynet/local-ai:latest
+    build:
+      context: .
+      dockerfile: Dockerfile
+    ports:
+      - 8080:8080
+    env_file:
+      - .env
+    volumes:
+      - ./models:/models:cached
+    command: ["/usr/bin/local-ai"]
+
+  frontend:
+    image: quay.io/go-skynet/localai-frontend:master
+    ports:
+      - 3000:3000
\ No newline at end of file
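Once `docker-compose up -d --build` has finished, a quick sanity check of the two services defined in the compose file above can help before opening the frontend. This is a hedged sketch: it only assumes the ports and service names from the docker-compose.yml in this patch and that at least one model file is present in the mounted models/ directory.

```bash
# Follow the API container's logs while it scans the models directory
# ("api" is the service name from docker-compose.yml above)
docker-compose logs -f api

# List the models LocalAI has picked up from the ./models volume;
# the frontend on port 3000 talks to this same OpenAI-compatible API
curl http://localhost:8080/v1/models
```

If the models endpoint returns an empty list, the ./models volume is mounted but no model file was downloaded into it; if it returns your model name, open http://localhost:3000 and the frontend should be able to chat through it.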