Mirror of https://github.com/ggerganov/llama.cpp.git (synced 2024-12-29 04:44:34 +00:00)
04eec58112
* llama : remove the separate scale tensors of BitNet b1.58
  They won't be needed, since the remaining ternary quant types have built-in scales.
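The reasoning in the commit message is that each block of the remaining ternary quantization types already carries its own scale, so a separate scale tensor no longer has to be written to the GGUF file. The sketch below is purely illustrative of that contrast; it does not reproduce ggml's actual ternary block layout, and the function and parameter names are hypothetical.

```python
import numpy as np

# Illustration only: NOT the real ternary block layout used by ggml.

def dequantize_with_separate_scale(ternary: np.ndarray, scale: np.ndarray) -> np.ndarray:
    # Old BitNet b1.58 style: ternary values in {-1, 0, +1} in one tensor,
    # with a separate scale tensor stored alongside it in the model file.
    return ternary.astype(np.float32) * scale

def dequantize_with_builtin_scale(blocks: list[tuple[float, np.ndarray]]) -> np.ndarray:
    # Ternary quant types with built-in scales: each block carries its own
    # scale next to its ternary values, so no extra scale tensor is needed.
    return np.concatenate([s * q.astype(np.float32) for s, q in blocks])
```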
Files in this directory:

__init__.py
constants.py
gguf_reader.py
gguf_writer.py
gguf.py
lazy.py
metadata.py
py.typed
quants.py
tensor_mapping.py
utility.py
vocab.py
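These modules make up the gguf Python package used by the repository's conversion scripts. As a quick orientation, here is a minimal sketch of inspecting a GGUF file with GGUFReader from gguf_reader.py; the file path is a placeholder, and the attribute names assume the package's public API as shipped in recent versions.

```python
from gguf import GGUFReader  # exported via __init__.py, implemented in gguf_reader.py

reader = GGUFReader("model.gguf")  # placeholder path

# Metadata key/value fields; key names and value types are defined in constants.py.
for name, field in reader.fields.items():
    print(name, [t.name for t in field.types])

# Tensor listing; the quantization type enums come from constants.py, and their
# NumPy (de)quantization helpers live in quants.py.
for tensor in reader.tensors:
    print(tensor.name, tensor.tensor_type.name, tensor.shape, tensor.n_bytes)
```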