diff --git a/README.md b/README.md
index d5cc93a..cc0599d 100644
--- a/README.md
+++ b/README.md
@@ -20,10 +20,10 @@ In a nutshell:
 - Local, OpenAI drop-in alternative REST API. You own your data.
 - NO GPU required. NO Internet access is required either
 - Optional, GPU Acceleration is available in `llama.cpp`-compatible LLMs. See also the [build section](https://localai.io/basics/build/index.html).
-- Supports multiple models: 
+- Supports multiple models:
+  - πŸ“– Text generation with GPTs (`llama.cpp`, `gpt4all.cpp`, ... and more)
   - πŸ—£ Text to Audio πŸŽΊπŸ†•
   - πŸ”ˆ Audio to Text (Audio transcription with `whisper.cpp`)
-  - πŸ“– Text generation with GPTs (`llama.cpp`, `gpt4all.cpp`, ... and more)
   - 🎨 Image generation with stable diffusion
 - πŸƒ Once loaded the first time, it keep models loaded in memory for faster inference
 - ⚑ Doesn't shell-out, but uses C++ bindings for a faster inference and better performance.
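
Since the feature list touched by this patch advertises a local, OpenAI drop-in REST API, a short usage sketch may help readers reviewing the diff. The snippet below is illustrative only and not part of the patch: it assumes a LocalAI instance listening on the default port 8080 and a model named `ggml-gpt4all-j` (both assumptions; substitute your own host and model), and posts a request to the OpenAI-style `/v1/chat/completions` endpoint.

```python
# Illustrative sketch, not part of the patch above.
# Assumes a LocalAI instance on localhost:8080 serving a model named
# "ggml-gpt4all-j" (both are assumptions; adjust to your deployment).
import requests

response = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "ggml-gpt4all-j",
        "messages": [{"role": "user", "content": "How are you?"}],
        "temperature": 0.9,
    },
    timeout=120,
)

# The response follows the OpenAI chat completions schema.
print(response.json()["choices"][0]["message"]["content"])
```

Because the API is OpenAI-compatible, existing OpenAI client code can typically be pointed at the local endpoint by overriding the base URL, with no other changes.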