Fix inference example lacks required parameters (#9035)

Signed-off-by: Aisuko <urakiny@gmail.com>
Aisuko 2024-08-16 19:08:59 +10:00 committed by GitHub
parent 23fd453544
commit c8ddce8560


@@ -34,7 +34,7 @@ Run the quantized model:

```bash
# start inference on a gguf model
-./llama-cli -m ./models/mymodel/ggml-model-Q4_K_M.gguf -n 128
+./llama-cli -m ./models/mymodel/ggml-model-Q4_K_M.gguf -cnv -p "You are a helpful assistant"
```

When running the larger models, make sure you have enough disk space to store all the intermediate files.
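For context, the updated command starts llama-cli in conversation mode: `-cnv` switches to interactive chat-style input/output and `-p` supplies the system prompt, whereas the replaced `-n 128` only capped the number of tokens generated. A minimal sketch of running the new example locally, reusing the README's placeholder model path:

```bash
# interactive chat: -cnv enables conversation mode,
# -p sets the system prompt that steers the assistant
./llama-cli -m ./models/mymodel/ggml-model-Q4_K_M.gguf -cnv -p "You are a helpful assistant"

# -n can still be combined with conversation mode to limit the length of each reply
./llama-cli -m ./models/mymodel/ggml-model-Q4_K_M.gguf -cnv -p "You are a helpful assistant" -n 128
```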