
llama.swiftui

Local inference of llama.cpp on an iPhone. So far I have only tested this with the StarCoder 1B model, but it can most likely handle 7B models as well.
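One of the example's features is loading a custom model chosen by the user. As a rough sketch of how a SwiftUI view can present such a file picker and obtain a usable path (the view name, state properties, and the trailing comment about handing the path to llama.cpp are illustrative assumptions, not the example's actual code):

```swift
import SwiftUI
import UniformTypeIdentifiers

// Hypothetical view: lets the user pick a local model file via the
// system document picker (SwiftUI's `.fileImporter` modifier).
struct ModelPickerView: View {
    @State private var showingPicker = false
    @State private var modelPath: URL?

    var body: some View {
        VStack {
            Button("Load Model") { showingPicker = true }
            if let path = modelPath {
                Text("Selected: \(path.lastPathComponent)")
            }
        }
        .fileImporter(isPresented: $showingPicker,
                      allowedContentTypes: [.data]) { result in
            if case .success(let url) = result {
                // Files outside the app sandbox require security-scoped access.
                if url.startAccessingSecurityScopedResource() {
                    defer { url.stopAccessingSecurityScopedResource() }
                    modelPath = url
                    // A real app would pass this path to llama.cpp here,
                    // while the security scope is still active.
                }
            }
        }
    }
}
```

A design note: because the picked URL lives outside the app sandbox, the path is only readable while security-scoped access is held, so an app would typically either load (or copy) the model inside that scope rather than stash the bare URL for later.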

https://github.com/bachittle/llama.cpp/assets/39804642/e290827a-4edb-4093-9642-2a5e399ec545