llama.cpp/.devops/nix
Latest commit f3f65429c4 by Georgi Gerganov:
llama : reorganize source code + improve CMake (#8006)
* scripts : update sync [no ci]

* files : relocate [no ci]

* ci : disable kompute build [no ci]

* cmake : fixes [no ci]

* server : fix mingw build

ggml-ci

* cmake : minor [no ci]

* cmake : link math library [no ci]

* cmake : build normal ggml library (not object library) [no ci]

* cmake : fix kompute build

ggml-ci

* make,cmake : fix LLAMA_CUDA + replace GGML_CDEF_PRIVATE

ggml-ci

* move public backend headers to the public include directory (#8122)

* move public backend headers to the public include directory

* nix test

* spm : fix metal header

---------

Co-authored-by: Georgi Gerganov <ggerganov@gmail.com>

* scripts : fix sync paths [no ci]

* scripts : sync ggml-blas.h [no ci]

---------

Co-authored-by: slaren <slarengh@gmail.com>
Committed 2024-06-26 18:33:02 +03:00
| File | Last commit | Date |
|------|-------------|------|
| apps.nix | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00 |
| devshells.nix | flake.nix : rewrite (#4605) | 2023-12-29 16:42:26 +02:00 |
| docker.nix | nix: init singularity and docker images (#5056) | 2024-02-22 11:44:10 -08:00 |
| jetson-support.nix | flake.nix: expose full scope in legacyPackages | 2023-12-31 13:14:58 -08:00 |
| nixpkgs-instances.nix | nix: add a comment on the many nixpkgs-with-cuda instances | 2024-01-22 12:19:30 +00:00 |
| package.nix | llama : reorganize source code + improve CMake (#8006) | 2024-06-26 18:33:02 +03:00 |
| scope.nix | nix: init singularity and docker images (#5056) | 2024-02-22 11:44:10 -08:00 |
| sif.nix | build(nix): Introduce flake.formatter for nix fmt (#5687) | 2024-03-01 15:18:26 -08:00 |