llama.cpp/examples
Przemysław Pawełczyk cb6c44c5e0
build : do not use _GNU_SOURCE gratuitously (#2035)
* Do not use _GNU_SOURCE gratuitously.

Building llama.cpp and the examples only requires interfaces defined in
The Open Group Base Specifications Issue 6
(https://pubs.opengroup.org/onlinepubs/009695399/), also known as the
Single UNIX Specification v3 (SUSv3) or POSIX.1-2001 plus XSI extensions,
together with a few BSD interfaces that are not specified in POSIX.1.

Well, that was true until NUMA support was added recently,
so enable GNU libc extensions for Linux builds to cover that.

Keeping feature test macros (FTMs) out of the source code gives greater
flexibility to anyone wanting to reuse it in a 3rd-party app: they can build it
with the FTMs set by the Makefile here, or with other FTMs, depending on their
needs (see the sketch after the list of changes below).

It builds without issues on Alpine (musl libc), Ubuntu (glibc), and MSYS2.

* make : enable Darwin extensions for macOS to expose RLIMIT_MEMLOCK

* make : enable BSD extensions for DragonFlyBSD to expose RLIMIT_MEMLOCK

* make : use BSD-specific FTMs to enable alloca on BSDs

* make : fix OpenBSD build by exposing newer POSIX definitions

* cmake : follow recent FTM improvements from Makefile
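
For illustration, here is a minimal sketch of the per-OS FTM approach described
above, with the macros set from the build files rather than the sources. It is
only a sketch under assumptions: the CPPFLAGS variable name, the uname-based
dispatch, and the BSD-specific flags shown are illustrative and may differ from
what the Makefile and CMakeLists.txt in this tree actually use.

# Baseline everywhere: POSIX.1-2001 / SUSv3 with XSI extensions.
CPPFLAGS += -D_XOPEN_SOURCE=600

UNAME_S := $(shell uname -s)

ifeq ($(UNAME_S),Linux)
    # glibc extensions, needed since NUMA support was added.
    CPPFLAGS += -D_GNU_SOURCE
endif
ifeq ($(UNAME_S),Darwin)
    # Darwin extensions, e.g. to expose RLIMIT_MEMLOCK.
    CPPFLAGS += -D_DARWIN_C_SOURCE
endif
ifeq ($(UNAME_S),FreeBSD)
    # BSD extensions hidden by _XOPEN_SOURCE, e.g. alloca (illustrative flag).
    CPPFLAGS += -D__BSD_VISIBLE
endif
ifeq ($(UNAME_S),DragonFly)
    # BSD extensions, e.g. to expose RLIMIT_MEMLOCK (illustrative flag).
    CPPFLAGS += -D__BSD_VISIBLE
endif
# NetBSD and OpenBSD use their own macros (_NETBSD_SOURCE, _BSD_SOURCE);
# omitted here for brevity.

CMake can follow the same pattern with add_compile_definitions() guarded by
checks on CMAKE_SYSTEM_NAME.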
2023-09-08 15:09:21 +03:00
Name | Last commit message | Last commit date
baby-llama | build : fix most gcc and clang warnings (#2861) | 2023-09-01 16:34:50 +03:00
beam-search | build : do not use _GNU_SOURCE gratuitously (#2035) | 2023-09-08 15:09:21 +03:00
benchmark | cmake : install targets (#2256) | 2023-07-19 10:01:11 +03:00
convert-llama2c-to-ggml | fix some warnings from gcc and clang-tidy (#3038) | 2023-09-07 13:22:29 -04:00
embd-input | build : do not use _GNU_SOURCE gratuitously (#2035) | 2023-09-08 15:09:21 +03:00
embedding | fix some warnings from gcc and clang-tidy (#3038) | 2023-09-07 13:22:29 -04:00
gguf | examples : replace fprintf to stdout with printf (#3017) | 2023-09-05 15:10:27 -04:00
gptneox-wip | fix some warnings from gcc and clang-tidy (#3038) | 2023-09-07 13:22:29 -04:00
jeopardy | chmod : make scripts executable (#2675) | 2023-08-23 17:29:09 +03:00
llama-bench | llama-bench : use two tokens in the warmup run for prompt evals (#3059) | 2023-09-07 15:52:34 +02:00
main | build : do not use _GNU_SOURCE gratuitously (#2035) | 2023-09-08 15:09:21 +03:00
metal | gguf : new file format with flexible meta data (beta) (#2398) | 2023-08-21 23:07:43 +03:00
perplexity | fix some warnings from gcc and clang-tidy (#3038) | 2023-09-07 13:22:29 -04:00
quantize | fix some warnings from gcc and clang-tidy (#3038) | 2023-09-07 13:22:29 -04:00
quantize-stats | fix some warnings from gcc and clang-tidy (#3038) | 2023-09-07 13:22:29 -04:00
save-load-state | fix some warnings from gcc and clang-tidy (#3038) | 2023-09-07 13:22:29 -04:00
server | fix some warnings from gcc and clang-tidy (#3038) | 2023-09-07 13:22:29 -04:00
simple | build : do not use _GNU_SOURCE gratuitously (#2035) | 2023-09-08 15:09:21 +03:00
speculative | build : do not use _GNU_SOURCE gratuitously (#2035) | 2023-09-08 15:09:21 +03:00
train-text-from-scratch | fix some warnings from gcc and clang-tidy (#3038) | 2023-09-07 13:22:29 -04:00
alpaca.sh | alpaca.sh : update model file name (#2074) | 2023-07-06 19:17:50 +03:00
chat-13B.bat | Create chat-13B.bat (#592) | 2023-03-29 20:21:09 +03:00
chat-13B.sh | examples : read chat prompts from a template file (#1196) | 2023-05-03 20:58:11 +03:00
chat-persistent.sh | chat-persistent.sh : use bracket expressions in grep (#1564) | 2023-05-24 09:16:22 +03:00
chat-vicuna.sh | examples : add chat-vicuna.sh (#1854) | 2023-06-15 21:05:53 +03:00
chat.sh | main : log file (#2748) | 2023-08-30 09:29:32 +03:00
CMakeLists.txt | speculative : PoC for speeding-up inference via speculative sampling (#2926) | 2023-09-03 15:12:08 +03:00
gpt4all.sh | examples : add -n to alpaca and gpt4all scripts (#706) | 2023-04-13 16:03:39 +03:00
json-schema-to-grammar.py | chmod : make scripts executable (#2675) | 2023-08-23 17:29:09 +03:00
llama2-13b.sh | gitignore : changes for Poetry users + chat examples (#2284) | 2023-07-21 13:53:27 +03:00
llama2.sh | gitignore : changes for Poetry users + chat examples (#2284) | 2023-07-21 13:53:27 +03:00
llama.vim | vim : streaming and more (#2495) | 2023-08-08 14:44:48 +03:00
llm.vim | llm.vim : stop generation at multiple linebreaks, bind to <F2> (#2879) | 2023-08-30 09:50:55 +03:00
make-ggml.py | chmod : make scripts executable (#2675) | 2023-08-23 17:29:09 +03:00
Miku.sh | MIKU MAYHEM: Upgrading the Default Model for Maximum Fun 🎉 (#2287) | 2023-07-21 11:13:18 +03:00
reason-act.sh | chmod : make scripts executable (#2675) | 2023-08-23 17:29:09 +03:00
server-llama2-13B.sh | chmod : make scripts executable (#2675) | 2023-08-23 17:29:09 +03:00