llama.cpp/.devops

Latest commit: b8a7a5a90f by Olivier Chafik
build(cmake): simplify instructions (cmake -B build && cmake --build build ...) (#6964)
* readme: cmake . -B build && cmake --build build

* build: fix typo

Co-authored-by: Jared Van Bortel <cebtenzzre@gmail.com>

* build: drop implicit . from cmake config command

* build: remove another superfluous .

* build: update MinGW cmake commands

* Update README-sycl.md

Co-authored-by: Neo Zhang Jianyu <jianyu.zhang@intel.com>

* build: reinstate --config Release as not the default w/ some generators + document how to build Debug

* build: revert more --config Release

* build: nit / remove -H from cmake example

* build: reword debug instructions around single/multi config split

---------

Co-authored-by: Jared Van Bortel <cebtenzzre@gmail.com>
Co-authored-by: Neo Zhang Jianyu <jianyu.zhang@intel.com>
Committed: 2024-04-29 17:02:45 +01:00
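The commit message above walks through the simplified two-step CMake workflow and the Release/Debug split between single- and multi-config generators. A minimal sketch of those commands, assuming they are run from the root of a llama.cpp checkout:

```shell
# Configure: -B names the build tree; the source directory defaults to
# the current directory, so no explicit "." (or -H) is needed.
cmake -B build

# Build: --config Release matters only for multi-config generators
# (e.g. Visual Studio, Xcode), where the configuration is chosen at
# build time rather than at configure time.
cmake --build build --config Release

# Debug builds, per the single/multi-config split the commit describes:
#   single-config (Makefiles, Ninja): pick the type when configuring
#     cmake -B build -DCMAKE_BUILD_TYPE=Debug && cmake --build build
#   multi-config (Visual Studio, Xcode): pick the type when building
#     cmake -B build && cmake --build build --config Debug
```

With single-config generators, `--config Release` is silently ignored and `CMAKE_BUILD_TYPE` governs instead, which is why the commit reinstated `--config Release` in the docs rather than dropping it entirely.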
Name | Last commit message | Last commit date
nix | nix: removed unnessesary indentation | 2024-03-28 07:48:27 +00:00
cloud-v-pipeline | ci : Cloud-V for RISC-V builds (#3160) | 2023-09-15 11:06:56 +03:00
full-cuda.Dockerfile | server: add cURL support to server Dockerfiles (#6474) | 2024-04-04 18:31:22 +02:00
full-rocm.Dockerfile | server: add cURL support to server Dockerfiles (#6474) | 2024-04-04 18:31:22 +02:00
full.Dockerfile | server: add cURL support to server Dockerfiles (#6474) | 2024-04-04 18:31:22 +02:00
llama-cpp-clblast.srpm.spec | Fedora build update (#6388) | 2024-03-29 22:59:56 +01:00
llama-cpp-cuda.srpm.spec | Fedora build update (#6388) | 2024-03-29 22:59:56 +01:00
llama-cpp.srpm.spec | Fedora build update (#6388) | 2024-03-29 22:59:56 +01:00
main-cuda.Dockerfile | cuda : rename build flag to LLAMA_CUDA (#6299) | 2024-03-26 01:16:01 +01:00
main-intel.Dockerfile | build(cmake): simplify instructions (cmake -B build && cmake --build build ...) (#6964) | 2024-04-29 17:02:45 +01:00
main-rocm.Dockerfile | python : add check-requirements.sh and GitHub workflow (#4585) | 2023-12-29 16:50:29 +02:00
main-vulkan.Dockerfile | build(cmake): simplify instructions (cmake -B build && cmake --build build ...) (#6964) | 2024-04-29 17:02:45 +01:00
main.Dockerfile | Add llama.cpp docker support for non-latin languages (#1673) | 2023-06-08 00:58:53 -07:00
server-cuda.Dockerfile | server: add cURL support to server Dockerfiles (#6474) | 2024-04-04 18:31:22 +02:00
server-intel.Dockerfile | build(cmake): simplify instructions (cmake -B build && cmake --build build ...) (#6964) | 2024-04-29 17:02:45 +01:00
server-rocm.Dockerfile | server: add cURL support to server Dockerfiles (#6474) | 2024-04-04 18:31:22 +02:00
server-vulkan.Dockerfile | build(cmake): simplify instructions (cmake -B build && cmake --build build ...) (#6964) | 2024-04-29 17:02:45 +01:00
server.Dockerfile | server: add cURL support to server.Dockerfile (#6461) | 2024-04-03 19:56:37 +02:00
tools.sh | docker : add finetune option (#4211) | 2023-11-30 23:46:01 +02:00