llama.cpp/.devops
hutli f6a0f5c642 nix: .#widnows: init (2024-03-28 07:48:27 +00:00)

Commit description:

- initial nix build for windows using zig
- mingwW64 build
- removes nix zig windows build
- removed unnecessary glibc.static
- removed unnecessary import of pkgs in nix
- fixed missing trailing newline on non-windows nix builds
- overriding stdenv when building for cross-compiling to windows in nix
- better variables when cross-compiling windows in nix
- cross compile windows on macos
- removed trailing whitespace
- remove unnecessary overwrite of "CMAKE_SYSTEM_NAME" in nix windows build
- nix: keep file extension when copying result files during cross compile for windows
- nix: better checking for file extensions when using MinGW
- nix: using hostPlatform instead of targetPlatform when cross compiling for Windows
- using hostPlatform.extensions.executable to extract executable format
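The final commit message refers to nixpkgs' `stdenv.hostPlatform.extensions.executable`, which evaluates to ".exe" when the host platform is Windows (MinGW) and to "" elsewhere, so result files can be copied without hard-coding the extension. A minimal sketch of the idea (hypothetical, not the repository's actual .devops/nix expression):

```nix
# Hypothetical sketch, not the actual .devops/nix code.
{ stdenv }:

let
  # ".exe" when building for a MinGW/Windows host, "" on Linux and macOS
  exeExt = stdenv.hostPlatform.extensions.executable;
in
stdenv.mkDerivation {
  pname = "llama-cpp";
  version = "unstable";
  # src, buildPhase, etc. omitted from this sketch

  installPhase = ''
    mkdir -p $out/bin
    # keep the platform-specific extension when copying the result
    cp "main${exeExt}" "$out/bin/main${exeExt}"
  '';
}
```

Consulting hostPlatform (the platform the produced binaries run on) rather than targetPlatform is the nixpkgs convention: for ordinary cross-compiled programs, targetPlatform is only meaningful when the package being built is itself a compiler.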
Name                          Last commit                                                              Date
nix                           nix: .#widnows: init                                                     2024-03-28 07:48:27 +00:00
cloud-v-pipeline              ci : Cloud-V for RISC-V builds (#3160)                                   2023-09-15 11:06:56 +03:00
full-cuda.Dockerfile          cuda : rename build flag to LLAMA_CUDA (#6299)                           2024-03-26 01:16:01 +01:00
full-rocm.Dockerfile          python : add check-requirements.sh and GitHub workflow (#4585)           2023-12-29 16:50:29 +02:00
full.Dockerfile               python : add check-requirements.sh and GitHub workflow (#4585)           2023-12-29 16:50:29 +02:00
llama-cpp-clblast.srpm.spec   devops : added systemd units and set versioning to use date. (#2835)     2023-08-28 09:31:24 +03:00
llama-cpp-cuda.srpm.spec      cuda : rename build flag to LLAMA_CUDA (#6299)                           2024-03-26 01:16:01 +01:00
llama-cpp.srpm.spec           devops : added systemd units and set versioning to use date. (#2835)     2023-08-28 09:31:24 +03:00
main-cuda.Dockerfile          cuda : rename build flag to LLAMA_CUDA (#6299)                           2024-03-26 01:16:01 +01:00
main-intel.Dockerfile         docker : add build for SYCL, Vulkan + update readme (#5228)              2024-02-02 09:56:31 +02:00
main-rocm.Dockerfile          python : add check-requirements.sh and GitHub workflow (#4585)           2023-12-29 16:50:29 +02:00
main-vulkan.Dockerfile        docker : add build for SYCL, Vulkan + update readme (#5228)              2024-02-02 09:56:31 +02:00
main.Dockerfile               Add llama.cpp docker support for non-latin languages (#1673)             2023-06-08 00:58:53 -07:00
server-cuda.Dockerfile        cuda : rename build flag to LLAMA_CUDA (#6299)                           2024-03-26 01:16:01 +01:00
server-intel.Dockerfile       docker : add build for SYCL, Vulkan + update readme (#5228)              2024-02-02 09:56:31 +02:00
server-rocm.Dockerfile        docker : add server-first container images (#5157)                       2024-01-28 09:55:31 +02:00
server-vulkan.Dockerfile      docker : add build for SYCL, Vulkan + update readme (#5228)              2024-02-02 09:56:31 +02:00
server.Dockerfile             docker : add server-first container images (#5157)                       2024-01-28 09:55:31 +02:00
tools.sh                      docker : add finetune option (#4211)                                     2023-11-30 23:46:01 +02:00