llama.cpp/examples/server/tests/features
Georgi Gerganov f4d2b8846a
llama : add reranking support (#9510)
* py : add XLMRobertaForSequenceClassification [no ci]

* py : fix scalar-tensor conversion [no ci]

* py : fix position embeddings chop [no ci]

* llama : read new cls tensors [no ci]

* llama : add classification head (wip) [no ci]

* llama : add "rank" pooling type

ggml-ci
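
The log gives no details of the new pooling type. As a minimal sketch, assuming "rank" pooling scores a concatenated query+document sequence by passing a pooled first-token embedding through the new classification-head tensors to produce a single relevance logit (the exact head layout in llama.cpp, including any extra dense/activation layers, is an assumption here):

```python
def rank_pool(hidden_states, w_cls, b_cls):
    """Sketch of "rank" pooling: one scalar relevance score per sequence.

    hidden_states: per-token final-layer embeddings for the concatenated
    query+document sequence (list of lists, n_tokens x n_embd).
    w_cls / b_cls: hypothetical classification-head weights; the real
    head in llama.cpp may differ (e.g. an extra dense + tanh layer).
    """
    pooled = hidden_states[0]  # pool on the first (CLS-style) token
    # Linear head: dot product with the head weights plus bias -> one logit.
    return sum(h * w for h, w in zip(pooled, w_cls)) + b_cls
```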

* server : add rerank endpoint

ggml-ci

* llama : avoid ggml_repeat during classification

* rerank : cleanup + comments

* server : accept /rerank endpoint in addition to /v1/rerank [no ci]
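
The request/response shape of the endpoint is not spelled out in this log. As a client sketch, assuming the conventional rerank API shape (a query plus a list of candidate documents in, per-document relevance scores out; the field names `results`, `index`, and `relevance_score`, the base URL, and the port are assumptions here):

```python
import json
import urllib.request

def build_rerank_payload(query, documents):
    """Body for POST /v1/rerank (field names assumed, not confirmed here)."""
    return {"query": query, "documents": documents}

def top_documents(results, documents, top_n=3):
    """Sort rerank results by score and map indices back to the documents."""
    ranked = sorted(results, key=lambda r: r["relevance_score"], reverse=True)
    return [documents[r["index"]] for r in ranked[:top_n]]

def rerank(query, documents, base_url="http://localhost:8080"):
    """Call the server; per this log, /rerank is accepted alongside /v1/rerank."""
    req = urllib.request.Request(
        base_url + "/v1/rerank",
        data=json.dumps(build_rerank_payload(query, documents)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"]
```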

* embedding : parse special tokens

* jina : support v1 reranker

* vocab : minor style

ggml-ci

* server : initiate tests for later

ggml-ci

* server : add docs

* llama : add comment [no ci]

* llama : fix uninitialized tensors

* ci : add rerank tests

ggml-ci

* add reranking test

* change test data

* Update examples/server/server.cpp

Co-authored-by: Xuan Son Nguyen <thichthat@gmail.com>

* add `--reranking` argument
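
How the flag is used end to end can be sketched as a CLI fragment (the model filename and port are placeholders, and the request-body field names are assumptions following common rerank-API conventions):

```shell
# Start llama-server with a reranker model and the new --reranking flag;
# the GGUF filename here is only a placeholder.
./llama-server -m reranker-model.gguf --reranking --port 8080

# Score candidate documents against a query; per this log, /rerank is
# accepted as an alias of /v1/rerank.
curl http://localhost:8080/v1/rerank \
  -H "Content-Type: application/json" \
  -d '{"query": "What is a panda?", "documents": ["hi there", "a panda is a bear"]}'
```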

* update server docs

* llama : fix comment [no ci]

ggml-ci

---------

Co-authored-by: Xuan Son Nguyen <son@huggingface.co>
Co-authored-by: Xuan Son Nguyen <thichthat@gmail.com>
2024-09-28 17:42:03 +03:00
Name                  Last commit message                                                              Date
steps/                llama : add reranking support (#9510)                                            2024-09-28 17:42:03 +03:00
ctx_shift.feature     server : add --no-context-shift option (#9607)                                   2024-09-23 22:23:54 +02:00
embeddings.feature    llama : add reranking support (#9510)                                            2024-09-28 17:42:03 +03:00
environment.py        server tests : more pythonic process management; fix bare except: (#6146)        2024-03-20 06:33:49 +01:00
issues.feature        server: tests: passkey challenge / self-extend with context shift demo (#5832)   2024-03-02 22:00:14 +01:00
lora.feature          server : add lora hotswap endpoint (WIP) (#8857)                                 2024-08-06 17:33:39 +02:00
parallel.feature      server : simplify state machine for slot (#9283)                                 2024-09-06 23:21:29 +02:00
passkey.feature       server : simplify state machine for slot (#9283)                                 2024-09-06 23:21:29 +02:00
rerank.feature        llama : add reranking support (#9510)                                            2024-09-28 17:42:03 +03:00
results.feature       server : fix temperature + disable some tests (#7409)                            2024-05-20 22:10:03 +10:00
security.feature      json-schema-to-grammar improvements (+ added to server) (#5978)                  2024-03-21 11:50:43 +00:00
server.feature        server : Add option to return token pieces in /tokenize endpoint (#9108)         2024-09-12 22:30:11 +02:00
slotsave.feature      Tokenizer SPM fixes for phi-3 and llama-spm (bugfix) (#7425)                     2024-05-21 14:39:48 +02:00
wrong_usages.feature  server : refactor multitask handling (#9274)                                     2024-09-02 17:11:51 +02:00