llama.cpp/models
goerch 8d177eddeb
llama : improve token type support (#2668)
* Merge tokenizer fixes into the gguf branch.

* Add test vocabularies

* Adapt convert-new.py (and fix a clang-cl compiler error on Windows)

* Improved tokenizer test

But does it work on macOS?

* Improve token type support

- Added @klosax's code to convert.py
- Improved token type support in vocabulary

* Exclude platform dependent tests

* More sentencepiece compatibility by eliminating magic numbers

* Restored accidentally removed comment
2023-08-21 18:56:02 +03:00
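
The commit above is about carrying sentencepiece-style token types through the vocabulary stored in ggml-vocab-llama.gguf, and "eliminating magic numbers" means comparing against named constants rather than bare integers. Below is a minimal, self-contained sketch of that idea, not the actual convert.py or llama.cpp code: the `TokenType` names and the `classify_piece` helper are illustrative, and the numeric values are assumed to mirror sentencepiece's ModelProto.SentencePiece.Type enum.

```python
# Sketch only: named token types instead of magic numbers when reading
# sentencepiece pieces during vocabulary conversion.
from enum import IntEnum

class TokenType(IntEnum):
    NORMAL       = 1  # regular piece
    UNKNOWN      = 2  # <unk>
    CONTROL      = 3  # e.g. <s>, </s>
    USER_DEFINED = 4  # user-supplied symbols
    UNUSED       = 5
    BYTE         = 6  # byte-fallback pieces such as <0x0A>

def classify_piece(piece: str, piece_type: int) -> TokenType:
    """Map a raw sentencepiece type id to a named token type
    instead of comparing against bare integers."""
    try:
        return TokenType(piece_type)
    except ValueError:
        # Fall back to NORMAL for ids this sketch does not know about.
        return TokenType.NORMAL

# Example: the byte-fallback token "<0x0A>" carries type 6 in the model proto.
assert classify_piece("<0x0A>", 6) is TokenType.BYTE
assert classify_piece("<unk>", 2) is TokenType.UNKNOWN
```

In the GGUF vocabulary such a type is stored per token alongside the token text and score; the exact field layout is not shown here.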
.editorconfig          editorconfig : ignore models folder           2023-08-17 19:17:25 +03:00
ggml-vocab-llama.gguf  llama : improve token type support (#2668)    2023-08-21 18:56:02 +03:00