From d69048e0b080d07541a0708498871151fcf69d5f Mon Sep 17 00:00:00 2001
From: Ettore Di Giacinto
Date: Wed, 5 Apr 2023 00:41:02 +0200
Subject: [PATCH] Update README.md

---
 README.md | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/README.md b/README.md
index abdedeb..e65e8ce 100644
--- a/README.md
+++ b/README.md
@@ -163,6 +163,10 @@ func main() {
 }
 ```
 
+### Windows compatibility
+
+It should work; however, you need to make sure you give enough resources to the container. See https://github.com/go-skynet/llama-cli/issues/2
+
 ### Kubernetes
 
 You can run the API directly in Kubernetes:
@@ -202,4 +206,4 @@ MIT
 - [llama.cpp](https://github.com/ggerganov/llama.cpp)
 - https://github.com/tatsu-lab/stanford_alpaca
 - https://github.com/cornelk/llama-go for the initial ideas
-- https://github.com/antimatter15/alpaca.cpp for the light model version (this is compatible and tested only with that checkpoint model!)
\ No newline at end of file
+- https://github.com/antimatter15/alpaca.cpp for the light model version (this is compatible and tested only with that checkpoint model!)