@@ -163,6 +163,10 @@ func main() {
}
```

### Windows compatibility

It should work; however, you need to make sure you give the container enough resources. See https://github.com/go-skynet/llama-cli/issues/2
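
For example, you can set explicit memory and CPU limits when starting the container. A minimal sketch, assuming a hypothetical image tag (substitute whatever image you built or pulled for llama-cli):

```bash
# Sketch only: the image name below is an assumption for illustration.
# --memory and --cpus are standard Docker flags that cap the container's
# resources; loading a ggml model typically needs several GB of RAM.
docker run -ti --rm \
  --memory 8g \
  --cpus 4 \
  quay.io/go-skynet/llama-cli:latest
```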

### Kubernetes

You can run the API directly in Kubernetes:
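
For instance, here is a minimal sketch with plain kubectl; the deployment name, image, resource values, and port are assumptions for illustration, not a published manifest:

```bash
# Sketch only: image, resources, and port are assumptions; adjust to your cluster.
kubectl create deployment llama-cli-api \
  --image=quay.io/go-skynet/llama-cli:latest

# Request enough resources for the model to load.
kubectl set resources deployment llama-cli-api \
  --requests=cpu=4,memory=8Gi

# Expose the API inside the cluster.
kubectl expose deployment llama-cli-api --port=8080
```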

@@ -202,4 +206,4 @@ MIT

- [llama.cpp](https://github.com/ggerganov/llama.cpp)
- https://github.com/tatsu-lab/stanford_alpaca
- https://github.com/cornelk/llama-go for the initial ideas
- https://github.com/antimatter15/alpaca.cpp for the light model version (this is compatible and tested only with that checkpoint model!)