diff --git a/README.md b/README.md
index 07150ed..e626118 100644
--- a/README.md
+++ b/README.md
@@ -48,6 +48,8 @@ Below is an instruction that describes a task. Write a response that appropriate
 ### Response:
 ```
 
+See the [prompt-templates](https://github.com/go-skynet/llama-cli/tree/master/prompt-templates) directory in this repository for templates for most popular models.
+
 ## Container images
 
 `llama-cli` comes by default as a container image. You can check out all the available images with corresponding tags [here](https://quay.io/repository/go-skynet/llama-cli?tab=tags&tag=latest)
diff --git a/prompt-templates/alpaca.tmpl b/prompt-templates/alpaca.tmpl
new file mode 100644
index 0000000..518071f
--- /dev/null
+++ b/prompt-templates/alpaca.tmpl
@@ -0,0 +1,6 @@
+Below is an instruction that describes a task. Write a response that appropriately completes the request.
+
+### Instruction:
+{{.Input}}
+
+### Response:
\ No newline at end of file
diff --git a/prompt-templates/koala.tmpl b/prompt-templates/koala.tmpl
new file mode 100644
index 0000000..ffbd017
--- /dev/null
+++ b/prompt-templates/koala.tmpl
@@ -0,0 +1 @@
+BEGINNING OF CONVERSATION: USER: {{.Input}} GPT:
\ No newline at end of file
diff --git a/prompt-templates/vicuna.tmpl b/prompt-templates/vicuna.tmpl
new file mode 100644
index 0000000..518071f
--- /dev/null
+++ b/prompt-templates/vicuna.tmpl
@@ -0,0 +1,6 @@
+Below is an instruction that describes a task. Write a response that appropriately completes the request.
+
+### Instruction:
+{{.Input}}
+
+### Response:
\ No newline at end of file
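
The new `.tmpl` files use Go `text/template` syntax, with `{{.Input}}` standing in for the user's instruction. As a rough illustration only (a minimal sketch, not `llama-cli`'s actual loading code; the `PromptData` struct and file path are assumptions for the example), rendering one of these templates could look like this:

```go
// Minimal sketch: render a prompt template such as
// prompt-templates/alpaca.tmpl with Go's text/template package.
// This is NOT llama-cli's real implementation; it only shows how the
// {{.Input}} placeholder in the templates above would be filled in.
package main

import (
	"os"
	"text/template"
)

// PromptData carries the values substituted into the template.
// The field name must match the {{.Input}} placeholder.
type PromptData struct {
	Input string
}

func main() {
	// Parse one of the template files added by this change.
	tmpl, err := template.ParseFiles("prompt-templates/alpaca.tmpl")
	if err != nil {
		panic(err)
	}

	// Fill in the instruction and write the resulting prompt to stdout.
	if err := tmpl.Execute(os.Stdout, PromptData{Input: "What is an alpaca?"}); err != nil {
		panic(err)
	}
}
```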