llama.cpp/examples/server/tests/unit
Latest commit e52522b869 by Xuan Son Nguyen (2024-12-08 20:38:51 +01:00):
server : bring back info of final chunk in stream mode (#10722)

* server : bring back info to final chunk in stream mode

* clarify a bit

* trailing space
Name | Last commit message | Last commit date
test_basic.py | server : (refactor) no more json in server_task input (#10691) | 2024-12-07 20:21:09 +01:00
test_chat_completion.py | server : (refactor) no more json in server_task input (#10691) | 2024-12-07 20:21:09 +01:00
test_completion.py | server : bring back info of final chunk in stream mode (#10722) | 2024-12-08 20:38:51 +01:00
test_ctx_shift.py | server : replace behave with pytest (#10416) | 2024-11-26 16:20:18 +01:00
test_embedding.py | server : replace behave with pytest (#10416) | 2024-11-26 16:20:18 +01:00
test_infill.py | server : add more test cases (#10569) | 2024-11-29 21:48:56 +01:00
test_lora.py | server : replace behave with pytest (#10416) | 2024-11-26 16:20:18 +01:00
test_rerank.py | server : add more test cases (#10569) | 2024-11-29 21:48:56 +01:00
test_security.py | server : replace behave with pytest (#10416) | 2024-11-26 16:20:18 +01:00
test_slot_save.py | server : replace behave with pytest (#10416) | 2024-11-26 16:20:18 +01:00
test_speculative.py | server : fix speculative decoding with context shift (#10641) | 2024-12-04 22:38:20 +02:00
test_tokenize.py | server : replace behave with pytest (#10416) | 2024-11-26 16:20:18 +01:00
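
The latest commit (#10722) restores the generation info in the final chunk of a streamed completion, which is what the updated test_completion.py exercises through the pytest suite. The sketch below is a minimal, standalone illustration of that behavior and is not taken from the repo's test utilities: it assumes a llama-server instance already listening on 127.0.0.1:8080 and that the /completion endpoint streams SSE "data: {...}" chunks whose final chunk carries fields such as "stop" and "timings"; the port, payload, and field names are assumptions to adjust to the actual server.

# Minimal sketch (assumed endpoint and field names, see note above).
import json
import requests

def last_stream_chunk(base_url: str = "http://127.0.0.1:8080") -> dict:
    """Stream a short completion and return the final parsed SSE chunk."""
    resp = requests.post(
        f"{base_url}/completion",
        json={"prompt": "Hello", "n_predict": 8, "stream": True},
        stream=True,
    )
    resp.raise_for_status()
    last = {}
    for raw in resp.iter_lines():
        if not raw:
            continue
        line = raw.decode("utf-8")
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload.strip() == "[DONE]":  # OAI-compat endpoints terminate this way
            continue
        last = json.loads(payload)
    return last

if __name__ == "__main__":
    chunk = last_stream_chunk()
    # Per the commit title, the final streamed chunk should again include
    # generation info (e.g. timings) alongside the stop flag.
    print(chunk.get("stop"), chunk.get("timings"))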