llama.cpp/.devops
| Name                        | Latest commit                                                          | Last commit date           |
|-----------------------------|------------------------------------------------------------------------|----------------------------|
| nix                         | Revert LLAMA_NATIVE to OFF in flake.nix (#5066)                        | 2024-01-21 21:37:13 +00:00 |
| cloud-v-pipeline            | ci : Cloud-V for RISC-V builds (#3160)                                 | 2023-09-15 11:06:56 +03:00 |
| full-cuda.Dockerfile        | python : add check-requirements.sh and GitHub workflow (#4585)         | 2023-12-29 16:50:29 +02:00 |
| full-rocm.Dockerfile        | python : add check-requirements.sh and GitHub workflow (#4585)         | 2023-12-29 16:50:29 +02:00 |
| full.Dockerfile             | python : add check-requirements.sh and GitHub workflow (#4585)         | 2023-12-29 16:50:29 +02:00 |
| llama-cpp-clblast.srpm.spec | devops : added systemd units and set versioning to use date. (#2835)   | 2023-08-28 09:31:24 +03:00 |
| llama-cpp-cublas.srpm.spec  | devops : added systemd units and set versioning to use date. (#2835)   | 2023-08-28 09:31:24 +03:00 |
| llama-cpp.srpm.spec         | devops : added systemd units and set versioning to use date. (#2835)   | 2023-08-28 09:31:24 +03:00 |
| main-cuda.Dockerfile        | docker : add git to full-cuda.Dockerfile main-cuda.Dockerfile (#3044)  | 2023-09-08 13:57:55 +03:00 |
| main-rocm.Dockerfile        | python : add check-requirements.sh and GitHub workflow (#4585)         | 2023-12-29 16:50:29 +02:00 |
| main.Dockerfile             | Add llama.cpp docker support for non-latin languages (#1673)           | 2023-06-08 00:58:53 -07:00 |
| tools.sh                    | docker : add finetune option (#4211)                                   | 2023-11-30 23:46:01 +02:00 |
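
As a rough sketch of how the Dockerfiles in this directory are typically consumed, they are built with `docker build` from the repository root, selecting the variant via `-f`. The image tags below are illustrative names, not taken from this listing.

```sh
# Build the full-featured CPU image from the repository root;
# the tag "local/llama.cpp:full" is an arbitrary example name.
docker build -t local/llama.cpp:full -f .devops/full.Dockerfile .

# CUDA variant; running the resulting image with GPU access
# additionally requires the NVIDIA container toolkit on the host.
docker build -t local/llama.cpp:full-cuda -f .devops/full-cuda.Dockerfile .
```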