llama.cpp/requirements
requirements-convert_hf_to_gguf_update.txt py : use cpu-only torch in requirements.txt (#8335) 2024-07-07 14:23:38 +03:00
requirements-convert_hf_to_gguf.txt py : use cpu-only torch in requirements.txt (#8335) 2024-07-07 14:23:38 +03:00
requirements-convert_legacy_llama.txt py : switch to snake_case (#8305) 2024-07-05 07:53:33 +03:00
requirements-convert_llama_ggml_to_gguf.txt py : switch to snake_case (#8305) 2024-07-05 07:53:33 +03:00