
gguf

This is a Python package for writing binary files in the GGUF (GGML Universal File) format.

See convert-llama-hf-to-gguf.py for an example of its usage.
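
For a quick feel for the writing API, here is a minimal sketch in the spirit of examples/writer.py; the file name, architecture string and key/tensor names below are arbitrary placeholders:

import numpy as np
from gguf import GGUFWriter

# Create a writer for a new GGUF file; "llama" is the model architecture string.
writer = GGUFWriter("example.gguf", "llama")

# Add a few key/value metadata entries and a small placeholder tensor.
writer.add_block_count(12)
writer.add_uint32("answer", 42)
writer.add_tensor("tensor0", np.ones((32,), dtype=np.float32))

# Write the header, the key/value data and the tensor data, then close the file.
writer.write_header_to_file()
writer.write_kv_data_to_file()
writer.write_tensors_to_file()
writer.close()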

Installation

pip install gguf

API Examples/Simple Tools

examples/writer.py — Generates example.gguf in the current directory to demonstrate generating a GGUF file. Note that this file cannot be used as a model.

scripts/gguf-dump.py — Dumps a GGUF file's metadata to the console.

scripts/gguf-set-metadata.py — Allows changing simple metadata values in a GGUF file by key.

scripts/gguf-convert-endian.py — Allows converting the endianness of GGUF files.

scripts/gguf-new-metadata.py — Copies a GGUF file with added/modified/removed metadata values.
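
To inspect an existing file programmatically (similar to what scripts/gguf-dump.py does), the package also provides a GGUFReader; the following is a minimal sketch, assuming a local file named example.gguf:

from gguf import GGUFReader

# Open an existing GGUF file and walk its metadata and tensor information.
reader = GGUFReader("example.gguf")

# Each metadata key maps to a field describing its value type(s).
for key, field in reader.fields.items():
    print(key, field.types)

# Tensor entries expose name, type and shape without loading the whole model.
for tensor in reader.tensors:
    print(tensor.name, tensor.tensor_type, tensor.shape)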

Development

Maintainers who participate in development of this package are advised to install it in editable mode:

cd /path/to/llama.cpp/gguf-py

pip install --editable .

Note: This may require upgrading your pip installation, indicated by a message saying that editable installation currently requires setup.py. In this case, upgrade pip to the latest version:

pip install --upgrade pip

Automatic publishing with CI

There's a GitHub workflow to make a release automatically upon creation of tags in a specified format.

  1. Bump the version in pyproject.toml.
  2. Create a tag named gguf-vx.x.x where x.x.x is the semantic version number, for example:
     git tag -a gguf-v1.0.0 -m "Version 1.0 release"
  3. Push the tags:
     git push origin --tags

Manual publishing

If you want to publish the package manually for any reason, you need to have twine and build installed:

pip install build twine

Then, follow these steps to release a new version:

  1. Bump the version in pyproject.toml.
  2. Build the package:
     python -m build
  3. Upload the generated distribution archives:
     python -m twine upload dist/*

TODO

  • Add tests
  • Include conversion scripts as command line entry points in this package.