Add Nix and Flox install instructions (#7899)
README.md
@@ -387,6 +387,30 @@ brew install llama.cpp
```
The formula is automatically updated with new `llama.cpp` releases. More info: https://github.com/ggerganov/llama.cpp/discussions/7668
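Since the formula tracks upstream releases, picking up a new version is an ordinary Homebrew upgrade. A minimal example, assuming the `llama.cpp` formula name used above:

```
# Refresh Homebrew metadata, then upgrade just the llama.cpp formula
brew update
brew upgrade llama.cpp
```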
### Nix
On Mac and Linux, the Nix package manager can be used via
```
nix profile install nixpkgs#llama-cpp
```
for flake-enabled installs.

Or
```
nix-env --file '<nixpkgs>' --install --attr llama-cpp
```
for non-flake-enabled installs.
This expression is automatically updated within the [nixpkgs repo](https://github.com/NixOS/nixpkgs/blob/nixos-24.05/pkgs/by-name/ll/llama-cpp/package.nix#L164).
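Whichever method is used, the package's binaries end up on `PATH` via the Nix profile. A quick smoke test, as a sketch only: the `llama-cli` binary name matches current upstream but may differ across nixpkgs versions, and the model path is a placeholder for a GGUF file you already have:

```
# Run a short completion with the Nix-installed build (binary name and model path are assumptions)
llama-cli -m ./models/llama-2-7b.Q4_K_M.gguf -p "The capital of France is" -n 32
```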
#### Flox
On Mac and Linux, Flox can be used to install llama.cpp within a Flox environment via
```
flox install llama-cpp
```
Flox follows the nixpkgs build of llama.cpp.
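Since `flox install` puts the package into a Flox environment rather than a global profile, the environment needs to be active before the binaries are on `PATH`. A rough sketch of a typical session; the per-project environment here is an assumption, not the only layout Flox supports:

```
# Create a Flox environment in the current directory, add llama.cpp, then enter it
flox init
flox install llama-cpp
flox activate
```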
### Metal Build
On macOS, Metal is enabled by default. Using Metal makes the computation run on the GPU.
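How much of the model actually runs on the GPU is still chosen per invocation with the `--n-gpu-layers` (`-ngl`) option. A small sketch, assuming the `llama-cli` binary and a placeholder model path:

```
# Offload up to 99 layers to Metal; -ngl 0 keeps inference on the CPU
llama-cli -m ./models/llama-2-7b.Q4_K_M.gguf -p "Hello" -ngl 99
```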