readme : add docs for chat-persistent.sh (#1568)
* readme : add docs for chat-persistent.sh
* Update README.md
parent 1359b6aba5
commit c31bbe934b
README.md | 19 additions, 0 deletions
@@ -391,6 +391,25 @@ Note the use of `--color` to distinguish between user input and generated text.
![image](https://user-images.githubusercontent.com/1991296/224575029-2af3c7dc-5a65-4f64-a6bb-517a532aea38.png)

### Persistent Interaction

The prompt, user inputs, and model generations can be saved and resumed across calls to `./main` by leveraging `--prompt-cache` and `--prompt-cache-all`. The `./examples/chat-persistent.sh` script demonstrates this with support for long-running, resumable chat sessions. To use this example, you must provide a file to cache the initial chat prompt and a directory to save the chat session, and may optionally provide the same variables as `chat-13B.sh`. The same prompt cache can be reused for new chat sessions. Note that both prompt cache and chat directory are tied to the initial prompt (`PROMPT_TEMPLATE`) and the model file.

```bash
# Start a new chat
PROMPT_CACHE_FILE=chat.prompt.bin CHAT_SAVE_DIR=./chat/default ./examples/chat-persistent.sh

# Resume that chat
PROMPT_CACHE_FILE=chat.prompt.bin CHAT_SAVE_DIR=./chat/default ./examples/chat-persistent.sh

# Start a different chat with the same prompt/model
PROMPT_CACHE_FILE=chat.prompt.bin CHAT_SAVE_DIR=./chat/another ./examples/chat-persistent.sh

# Different prompt cache for different prompt/model
PROMPT_TEMPLATE=./prompts/chat-with-bob.txt PROMPT_CACHE_FILE=bob.prompt.bin \
    CHAT_SAVE_DIR=./chat/bob ./examples/chat-persistent.sh
```
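The two flags mentioned above can also be passed to `./main` directly, outside the script. The following is a minimal sketch of that usage, not taken from the repository's docs; the model path, prompt file, and the `-n 64` generation length are placeholder choices:

```bash
# First run: evaluate the prompt, generate, and save the session state
# (prompt plus generated output, because of --prompt-cache-all) to chat.prompt.bin.
# Model path and -n length are illustrative placeholders.
./main -m ./models/7B/ggml-model-q4_0.bin -f ./prompts/chat-with-bob.txt \
    --prompt-cache chat.prompt.bin --prompt-cache-all -n 64

# Second run: the cached state is loaded, so the shared prompt prefix
# does not need to be re-evaluated before generation continues.
./main -m ./models/7B/ggml-model-q4_0.bin -f ./prompts/chat-with-bob.txt \
    --prompt-cache chat.prompt.bin -n 64
```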
### Instruction mode with Alpaca
1. First, download the `ggml` Alpaca model into the `./models` folder