llama.cpp/ggml
Junil Kim 0eb4e12bee
vulkan: Fix a vulkan-shaders-gen argument parsing error (#10484)
vulkan-shaders-gen was not parsing the --no-clean argument correctly. The previous code only handled arguments that take a value, so --no-clean, which takes no value, was mis-parsed. This commit makes the parser correctly handle arguments that don't have values.
2024-11-26 01:47:20 +00:00
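Below is a minimal, hypothetical C++ sketch of the kind of parsing the fix describes: flags such as --no-clean are recognized as valueless, while other options consume the following token as their value. This is not the actual vulkan-shaders-gen code; the flag set and function name are illustrative only.

// Hypothetical sketch, not the actual vulkan-shaders-gen code: a parser that
// treats known flags (e.g. --no-clean) as valueless and every other option as
// a key followed by its value. A parser that only consumes key/value pairs
// would mis-handle a lone flag, which is the bug the commit describes.
#include <cstdio>
#include <map>
#include <set>
#include <string>

static std::map<std::string, std::string> parse_args(int argc, char ** argv) {
    static const std::set<std::string> flags = { "--no-clean" };  // assumed flag set for illustration

    std::map<std::string, std::string> args;
    for (int i = 1; i < argc; ++i) {
        std::string key = argv[i];
        if (flags.count(key)) {
            args[key] = "";             // presence alone is the value
        } else if (i + 1 < argc) {
            args[key] = argv[++i];      // option followed by its value
        }
    }
    return args;
}

int main(int argc, char ** argv) {
    auto args = parse_args(argc, argv);
    std::printf("--no-clean is %s\n", args.count("--no-clean") ? "set" : "not set");
    return 0;
}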
..
include ggml : add support for dynamic loading of backends (#10469) 2024-11-25 15:13:39 +01:00
src vulkan: Fix a vulkan-shaders-gen argument parsing error (#10484) 2024-11-26 01:47:20 +00:00
.gitignore vulkan : cmake integration (#8119) 2024-07-13 18:12:39 +02:00
CMakeLists.txt ggml : add support for dynamic loading of backends (#10469) 2024-11-25 15:13:39 +01:00