llama.cpp/ggml/src
Yuri Khrustalev 822b6322de
ggml : ggml_type_name return "NONE" for invalid values (#9458)
When running on Windows, the quantization utility attempts to print types that are not set, which leads to a crash.
2024-09-14 12:54:37 +03:00
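The fix named in this commit is a bounds-checked lookup: any type value outside the known ggml_type range maps to the literal string "NONE" instead of indexing past the name table. The sketch below is a minimal, self-contained illustration of that technique; the enum values and the type_name_table array are simplified stand-ins, not the actual ggml.c type-traits table or its field names.

```c
#include <stdio.h>

/* Simplified stand-in for ggml's type enum; the real enum has many more
   entries and ends with a *_COUNT sentinel (GGML_TYPE_COUNT in ggml.h). */
enum ggml_type_sketch {
    TYPE_F32 = 0,
    TYPE_F16,
    TYPE_Q4_0,
    TYPE_COUNT, /* sentinel: number of valid types */
};

/* Hypothetical name table, one entry per valid type. */
static const char * const type_name_table[TYPE_COUNT] = {
    [TYPE_F32]  = "f32",
    [TYPE_F16]  = "f16",
    [TYPE_Q4_0] = "q4_0",
};

/* Bounds-checked lookup: an out-of-range value (e.g. an unset field read
   by the quantization utility) yields "NONE" instead of reading past the
   table and crashing. */
static const char * type_name(enum ggml_type_sketch type) {
    return (type >= 0 && type < TYPE_COUNT) ? type_name_table[type] : "NONE";
}

int main(void) {
    printf("%s\n", type_name(TYPE_Q4_0));                  /* prints "q4_0" */
    printf("%s\n", type_name((enum ggml_type_sketch) 42)); /* prints "NONE" */
    return 0;
}
```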
ggml-cann cann : fix doxy (ggml/0) 2024-09-08 11:05:55 +03:00
ggml-cuda CUDA: fix --split-mode row race condition (#9413) 2024-09-11 10:22:40 +02:00
ggml-sycl [SYCL] Fix DMMV dequantization (#9279) 2024-09-04 16:26:33 +01:00
kompute@4565194ed7 llama : reorganize source code + improve CMake (#8006) 2024-06-26 18:33:02 +03:00
kompute-shaders ggml : move rope type enum to ggml.h (#8949) 2024-08-13 21:13:15 +02:00
llamafile llamafile : disable sgemm for batch-size 1 (#9330) 2024-09-07 22:02:26 +03:00
vulkan-shaders Improve Vulkan shader build system (#9239) 2024-09-06 08:56:17 +02:00
CMakeLists.txt cmake : use list(APPEND ...) instead of set() + dedup linker (#9463) 2024-09-14 10:55:05 +03:00
ggml-aarch64.c ggml : AVX2 support for Q4_0_8_8 (#8713) 2024-09-04 19:51:22 +03:00
ggml-aarch64.h ggml : minor naming changes (#8433) 2024-07-12 10:46:02 +03:00
ggml-alloc.c ggml : reduce hash table reset cost (#8698) 2024-07-27 04:41:55 +02:00
ggml-backend-impl.h llama : reorganize source code + improve CMake (#8006) 2024-06-26 18:33:02 +03:00
ggml-backend.c tests: add gradient tests for all backends (ggml/932) 2024-09-08 11:05:55 +03:00
ggml-blas.cpp ggml : hide ggml_object, ggml_cgraph, ggml_hash_set (#9408) 2024-09-12 14:23:49 +03:00
ggml-cann.cpp cann: Add host buffer type for Ascend NPU (#9406) 2024-09-12 19:46:43 +08:00
ggml-common.h ggml-quants : ternary packing for TriLMs and BitNet b1.58 (#8151) 2024-09-05 21:48:47 -04:00
ggml-cuda.cu ggml : hide ggml_object, ggml_cgraph, ggml_hash_set (#9408) 2024-09-12 14:23:49 +03:00
ggml-impl.h ggml : hide ggml_object, ggml_cgraph, ggml_hash_set (#9408) 2024-09-12 14:23:49 +03:00
ggml-kompute.cpp ggml : hide ggml_object, ggml_cgraph, ggml_hash_set (#9408) 2024-09-12 14:23:49 +03:00
ggml-metal.m ggml : hide ggml_object, ggml_cgraph, ggml_hash_set (#9408) 2024-09-12 14:23:49 +03:00
ggml-metal.metal sync : ggml 2024-08-27 22:41:27 +03:00
ggml-quants.c ggml : vector length agnostic SVE support (#9290) 2024-09-09 18:37:18 +03:00
ggml-quants.h ggml-quants : ternary packing for TriLMs and BitNet b1.58 (#8151) 2024-09-05 21:48:47 -04:00
ggml-rpc.cpp ggml : hide ggml_object, ggml_cgraph, ggml_hash_set (#9408) 2024-09-12 14:23:49 +03:00
ggml-sycl.cpp ggml : hide ggml_object, ggml_cgraph, ggml_hash_set (#9408) 2024-09-12 14:23:49 +03:00
ggml-vulkan.cpp ggml : hide ggml_object, ggml_cgraph, ggml_hash_set (#9408) 2024-09-12 14:23:49 +03:00
ggml.c ggml : ggml_type_name return "NONE" for invalid values (#9458) 2024-09-14 12:54:37 +03:00