llama.cpp/.devops
Latest commit: dcf752707d by Meng, Hengyu (2024-06-12 19:05:35 +10:00)
update intel docker oneapi-basekit to 2024.1.1-devel-ubuntu22.04 (#7894)
In addition, this reverts the workaround that was previously needed for the upstream issue with expired Intel GPG package keys in 2024.0.1-devel-ubuntu22.04.
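For context, a minimal sketch of what the bumped base image looks like in the Intel Dockerfiles (main-intel.Dockerfile, server-intel.Dockerfile); the ARG name and build stage are illustrative assumptions, and only the intel/oneapi-basekit image and the 2024.1.1-devel-ubuntu22.04 tag come from the commit message above:

```Dockerfile
# Sketch only: base-image line as it would appear after #7894.
# The ARG name and the "build" stage name are assumptions.
ARG ONEAPI_VERSION=2024.1.1-devel-ubuntu22.04
FROM intel/oneapi-basekit:$ONEAPI_VERSION AS build
```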
| Name | Last commit message | Last commit date |
|------|---------------------|------------------|
| nix | llama : remove MPI backend (#7395) | 2024-05-20 01:17:03 +02:00 |
| cloud-v-pipeline | ci : Cloud-V for RISC-V builds (#3160) | 2023-09-15 11:06:56 +03:00 |
| full-cuda.Dockerfile | docker : add openmp lib (#7780) | 2024-06-06 08:17:21 +03:00 |
| full-rocm.Dockerfile | Fixed painfully slow single process builds. (#7326) | 2024-05-30 22:32:38 +02:00 |
| full.Dockerfile | docker : add openmp lib (#7780) | 2024-06-06 08:17:21 +03:00 |
| llama-cpp-clblast.srpm.spec | Fedora build update (#6388) | 2024-03-29 22:59:56 +01:00 |
| llama-cpp-cuda.srpm.spec | Fedora build update (#6388) | 2024-03-29 22:59:56 +01:00 |
| llama-cpp.srpm.spec | Fedora build update (#6388) | 2024-03-29 22:59:56 +01:00 |
| main-cuda.Dockerfile | docker : build only main and server in their images (#7782) | 2024-06-06 08:19:49 +03:00 |
| main-intel.Dockerfile | update intel docker oneapi-basekit to 2024.1.1-devel-ubuntu22.04 (#7894) | 2024-06-12 19:05:35 +10:00 |
| main-rocm.Dockerfile | docker : build only main and server in their images (#7782) | 2024-06-06 08:19:49 +03:00 |
| main-vulkan.Dockerfile | docker : add openmp lib (#7780) | 2024-06-06 08:17:21 +03:00 |
| main.Dockerfile | docker : build only main and server in their images (#7782) | 2024-06-06 08:19:49 +03:00 |
| server-cuda.Dockerfile | docker : build only main and server in their images (#7782) | 2024-06-06 08:19:49 +03:00 |
| server-intel.Dockerfile | update intel docker oneapi-basekit to 2024.1.1-devel-ubuntu22.04 (#7894) | 2024-06-12 19:05:35 +10:00 |
| server-rocm.Dockerfile | Fixed painfully slow single process builds. (#7326) | 2024-05-30 22:32:38 +02:00 |
| server-vulkan.Dockerfile | build(cmake): simplify instructions (cmake -B build && cmake --build build ...) (#6964) | 2024-04-29 17:02:45 +01:00 |
| server.Dockerfile | docker : build only main and server in their images (#7782) | 2024-06-06 08:19:49 +03:00 |
| tools.sh | Move convert.py to examples/convert-legacy-llama.py (#7430) | 2024-05-30 21:40:00 +10:00 |
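The Dockerfiles above are typically built from the repository root (so the source tree is the build context) by pointing docker at the file with -f. A minimal usage sketch, where the image tag and model path are illustrative placeholders rather than project defaults:

```sh
# Sketch: build the CPU-only "main" image and run inference with it.
# local/llama.cpp:main and /path/to/models are placeholders.
docker build -t local/llama.cpp:main -f .devops/main.Dockerfile .
docker run --rm -v /path/to/models:/models local/llama.cpp:main \
    -m /models/model.gguf -p "Building a website can be done in 10 simple steps:" -n 128
```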