llama.cpp/ggml/src/ggml-cann
Latest commit: 904837e0cb by Dou Xinpeng
cann: fix crash when llama-bench is running on multiple cann devices (#9627)
2024-09-25 11:30:38 +08:00
Name            Last commit message                                                            Last commit date
..
kernels         cann: fix buffer_num and runtime speed slowly error (#8865)                    2024-08-05 21:10:37 +08:00
.clang-format   [CANN] Add Ascend NPU backend (#6035)                                          2024-07-17 14:23:50 +03:00
acl_tensor.cpp  cann: support q4_0 model (#8822)                                               2024-08-05 12:22:30 +08:00
acl_tensor.h    cann: support q4_0 model (#8822)                                               2024-08-05 12:22:30 +08:00
aclnn_ops.cpp   ggml : move rope type enum to ggml.h (#8949)                                   2024-08-13 21:13:15 +02:00
aclnn_ops.h     [CANN] Add Ascend NPU backend (#6035)                                          2024-07-17 14:23:50 +03:00
common.h        cann: fix crash when llama-bench is running on multiple cann devices (#9627)   2024-09-25 11:30:38 +08:00
Doxyfile        cann : fix doxy (ggml/0)                                                       2024-09-08 11:05:55 +03:00