llama.cpp/ggml/include
Latest commit: c83ad6d01e (Diego Devesa)
ggml-backend : add device and backend reg interfaces (#9707)
Co-authored-by: Johannes Gäßler <johannesg@5d6.de>
2024-10-03 01:49:47 +02:00
File              Last commit                                                    Date
ggml-alloc.h      Threadpool: take 2 (#8672)                                     2024-08-30 01:20:53 +02:00
ggml-backend.h    ggml-backend : add device and backend reg interfaces (#9707)   2024-10-03 01:49:47 +02:00
ggml-blas.h       ggml-backend : add device and backend reg interfaces (#9707)   2024-10-03 01:49:47 +02:00
ggml-cann.h       ggml-backend : add device and backend reg interfaces (#9707)   2024-10-03 01:49:47 +02:00
ggml-cuda.h       ggml-backend : add device and backend reg interfaces (#9707)   2024-10-03 01:49:47 +02:00
ggml-kompute.h    llama : reorganize source code + improve CMake (#8006)         2024-06-26 18:33:02 +03:00
ggml-metal.h      ggml-backend : add device and backend reg interfaces (#9707)   2024-10-03 01:49:47 +02:00
ggml-rpc.h        ggml-backend : add device and backend reg interfaces (#9707)   2024-10-03 01:49:47 +02:00
ggml-sycl.h       ggml-backend : add device and backend reg interfaces (#9707)   2024-10-03 01:49:47 +02:00
ggml-vulkan.h     ggml-backend : add device and backend reg interfaces (#9707)   2024-10-03 01:49:47 +02:00
ggml.h            ggml-backend : add device and backend reg interfaces (#9707)   2024-10-03 01:49:47 +02:00