🤖 Self-hosted, community-driven, local OpenAI-compatible API with a Flask app frontend and Keycloak auth. 🏠

README.md

LOCAL AI

USAGE

  • Installation and startup:
# Clone LocalAI
git clone https://github.com/go-skynet/LocalAI

cd LocalAI

# (optional) Checkout a specific LocalAI tag
# git checkout -b build <TAG>

# Download gpt4all-j to models/
wget https://gpt4all.io/models/ggml-gpt4all-j.bin -O models/ggml-gpt4all-j

# Use a template from the examples
cp -rf prompt-templates/ggml-gpt4all-j.tmpl models/

# (optional) Edit the .env file to set things like context size and threads
# vim .env

# start with docker-compose
# docker-compose up -d --pull always
# or you can build the images with:
docker-compose up -d --build
# The API is now accessible at localhost:8080
curl http://localhost:8080/v1/models
# {"object":"list","data":[{"id":"ggml-gpt4all-j","object":"model"}]}

curl http://localhost:8080/v1/chat/completions -H "Content-Type: application/json" -d '{
     "model": "ggml-gpt4all-j",
     "messages": [{"role": "user", "content": "How are you?"}],
     "temperature": 0.9 
   }'

# {"model":"ggml-gpt4all-j","choices":[{"message":{"role":"assistant","content":"I'm doing well, thanks. How about you?"}}]}
  • Python implementation:
import openai

# point the client at the local LocalAI instance
openai.api_base = "http://localhost:8080/v1"
openai.api_key = "sk-dummy"  # LocalAI does not check the key, but the client requires one to be set

# create a chat completion with the model downloaded above
chat_completion = openai.ChatCompletion.create(
    model="ggml-gpt4all-j",
    messages=[{"role": "user", "content": "Hello world"}],
)

# print the completion
print(chat_completion.choices[0].message.content)
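
The same endpoint can also stream the reply as it is generated. Below is a minimal sketch, assuming the pre-1.0 openai Python client and the ggml-gpt4all-j model set up above (streaming behaviour may depend on the backend in use):

import openai

openai.api_base = "http://localhost:8080/v1"
openai.api_key = "sk-dummy"  # not checked by LocalAI, but required by the client

# request a streamed chat completion and print each chunk as it arrives
for chunk in openai.ChatCompletion.create(
    model="ggml-gpt4all-j",
    messages=[{"role": "user", "content": "Tell me a short joke"}],
    stream=True,
):
    delta = chunk.choices[0].delta
    # each chunk carries an incremental piece of the assistant message
    print(delta.get("content", ""), end="", flush=True)
print()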

TO DO

  • Flask app frontend (see the sketch after this list)
  • Keycloak auth
  • Speech to text with OpenVINO
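
As a rough illustration of the planned Flask frontend, the sketch below simply forwards a prompt to the local chat completions endpoint. This is hypothetical code, not part of this repository yet: the /ask route, port 5000 and the requests dependency are assumptions, and the Keycloak integration is not shown.

import requests
from flask import Flask, request, jsonify

app = Flask(__name__)

# assumed LocalAI endpoint from the steps above
LOCALAI_URL = "http://localhost:8080/v1/chat/completions"

@app.route("/ask", methods=["POST"])
def ask():
    # read the user prompt from the JSON body of the request
    prompt = request.get_json(force=True).get("prompt", "")
    resp = requests.post(
        LOCALAI_URL,
        json={
            "model": "ggml-gpt4all-j",
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.9,
        },
        timeout=300,
    )
    resp.raise_for_status()
    # return only the assistant's reply to the caller
    answer = resp.json()["choices"][0]["message"]["content"]
    return jsonify({"answer": answer})

if __name__ == "__main__":
    app.run(port=5000)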