Stephan Walter
b391579db9
Update README and comments for standalone perplexity tool (#525)
2023-03-26 16:14:01 +03:00
anzz1
7a87d31f4f
[main] fix infinite generation (-n == -1) (#523)
2023-03-26 16:06:10 +03:00
Harald Fernengel
33e35b8fe8
Exit from interactive mode if input stream is bad (#491)
Allow exiting the interactive prompt also with CTRL-D on Unix and CTRL-Z
on Windows.
2023-03-26 08:25:46 +03:00
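The fix above amounts to checking the input stream's state after each read. A minimal sketch, assuming a hypothetical helper (not the actual llama.cpp code): `std::getline` converts to `false` once the bad, fail, or eof bit is set, which is exactly what CTRL-D (Unix) or CTRL-Z (Windows) produces, so the interactive loop can exit cleanly instead of spinning on a dead stream.

```cpp
#include <iostream>
#include <sstream>
#include <string>

// Returns false once the stream is exhausted or in a failed state,
// signalling the caller to leave the interactive prompt loop.
bool read_user_line(std::istream &in, std::string &line) {
    return static_cast<bool>(std::getline(in, line));
}
```

The interactive loop then becomes `while (read_user_line(std::cin, line)) { ... }`, terminating on EOF or a bad stream.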
anzz1
34ab526843
(Windows) Set console to UTF-8 on init (#420)
Sets the console codepage to 65001 (CP_UTF8) on start for both input and output, which should fix problems with UTF-8 characters.
2023-03-25 22:29:22 +02:00
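A hedged sketch of what this commit describes: on startup, switch both the console input and output codepages to UTF-8 via the Win32 console API. Codepage 65001 is `CP_UTF8`; the calls are guarded so the sketch still compiles on other platforms, where it is a no-op.

```cpp
#ifdef _WIN32
#include <windows.h>
#endif

// 65001 is the UTF-8 codepage (== CP_UTF8 in <windows.h>).
constexpr unsigned int kUtf8CodePage = 65001;

// Set both console codepages to UTF-8 on Windows; no-op elsewhere.
void console_init_utf8() {
#ifdef _WIN32
    SetConsoleCP(kUtf8CodePage);       // codepage for console input
    SetConsoleOutputCP(kUtf8CodePage); // codepage for console output
#endif
}
```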
Georgi Gerganov
c2b25b6912
Fix colors enabling on WIN32
2023-03-25 21:53:39 +02:00
Georgi Gerganov
79b2b266db
If n_predict == -1, generate forever
2023-03-25 21:51:41 +02:00
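A minimal sketch of the convention this commit (and the `-n == -1` fix above) refers to: treating a negative `n_predict` as "no upper bound" in the generation loop condition.

```cpp
// n_predict == -1 means "generate forever": a negative bound never
// terminates the loop; otherwise stop after n_predict iterations.
bool keep_generating(int i, int n_predict) {
    return n_predict < 0 || i < n_predict;
}
```

The loop header then reads `for (int i = 0; keep_generating(i, n_predict); ++i)`.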
Georgi Gerganov
e2d490dafd
Infinite generation via context swapping (#71)
2023-03-25 21:36:22 +02:00
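An illustrative sketch of the context-swap idea (the names here are hypothetical, not the actual llama.cpp API): once the token buffer fills the context window `n_ctx`, keep the first `n_keep` tokens (typically the prompt) plus the most recent half of the remainder, and discard the oldest middle so generation can continue indefinitely.

```cpp
#include <vector>

// When the buffer reaches n_ctx tokens, keep the first n_keep tokens
// and the newest half of the rest, dropping the oldest middle chunk.
std::vector<int> swap_context(const std::vector<int> &tokens,
                              int n_ctx, int n_keep) {
    if ((int) tokens.size() < n_ctx) {
        return tokens; // still fits in the context window
    }
    const int n_left    = n_ctx - n_keep;
    const int n_discard = n_left / 2; // oldest half of the non-kept part
    std::vector<int> out(tokens.begin(), tokens.begin() + n_keep);
    out.insert(out.end(), tokens.begin() + n_keep + n_discard, tokens.end());
    return out;
}
```

Keeping `n_keep` fixed preserves the original prompt across swaps, so the model does not lose its instructions as old conversation history is evicted.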
Georgi Gerganov
03f7e33560
Cleanup STL headers + fix embedding examples + minor stuff
2023-03-25 20:51:14 +02:00
Georgi Gerganov
55ad42af84
Move chat scripts into "./examples"
2023-03-25 20:37:09 +02:00
Georgi Gerganov
a316a425d0
Overhaul the examples structure
- main -> examples
- utils -> examples (renamed to "common")
- quantize -> examples
- separate tools for "perplexity" and "embedding"
Hope I didn't break something!
2023-03-25 20:26:40 +02:00
Georgi Gerganov
04c6f5ed6f
Immediately start processing the prompt before user input has been provided (#476)
2023-03-24 23:17:58 +02:00
Mathieu Nayrolles
3f9c6135e4
Fix typo in chatLLaMa (#368)
The prompt contains a typo where 'alound' is used instead of 'aloud'.
2023-03-21 22:52:27 +02:00
Jean-Christophe Hoelt
3ab3e6582f
Add chatLLaMa script (#198)
* Add chatLLaMa script
* Fix shellcheck errors and do some cleanup
* Move chatLLaMa script to `examples` directory
* Reduce chatLLaMa context size to 2048
Ref d7def1a752
* Set n_predict to 2048 in examples/chatLLaMa
2023-03-21 18:23:15 +02:00