tabby/.changes/unreleased
Wei Zhang 1632dd054e
feat(llama-cpp-server): reuse local server for completion and chat (#2812)
* support reusing the same server when the model id is equal (see the sketch below this commit log)

Signed-off-by: Wei Zhang <kweizh@gmail.com>

* 🎨 address review feedback

Signed-off-by: Wei Zhang <kweizh@gmail.com>

* 🔨 compare the model inside the llama-cpp server

Signed-off-by: Wei Zhang <kweizh@gmail.com>

* use a match expression for creating completion and chat

Signed-off-by: Wei Zhang <kweizh@gmail.com>

* use if let for creating completion and chat

Signed-off-by: Wei Zhang <kweizh@gmail.com>

* add changelog

* rebase and run make fix

---------

Signed-off-by: Wei Zhang <kweizh@gmail.com>
Co-authored-by: Meng Zhang <meng@tabbyml.com>
2024-08-10 22:51:13 -07:00
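
A minimal sketch of the reuse idea described in the commits above, assuming hypothetical names (`LlamaServer`, `build_servers`, `"my-model"`); this is not Tabby's actual llama-cpp-server API, only an illustration of spawning one local server and handing the same handle to both completion and chat when the configured model ids match, with `if let` handling the optional chat model.

```rust
// Illustrative sketch only: types and functions here are hypothetical and
// do not mirror Tabby's real llama-cpp-server crate.
use std::sync::Arc;

/// Minimal stand-in for a spawned local llama.cpp server.
struct LlamaServer {
    model_id: String,
}

impl LlamaServer {
    fn spawn(model_id: &str) -> Arc<Self> {
        // The real crate would start a llama.cpp process here; this sketch
        // only records which model the server hosts.
        Arc::new(Self {
            model_id: model_id.to_string(),
        })
    }
}

/// Build servers for completion and chat, reusing one process when both
/// features are configured with the same model id.
fn build_servers(
    completion_model: Option<&str>,
    chat_model: Option<&str>,
) -> (Option<Arc<LlamaServer>>, Option<Arc<LlamaServer>>) {
    let completion = completion_model.map(LlamaServer::spawn);

    // `if let` mirrors the "if let case for creating completion and chat"
    // commit: the model comparison only happens when chat is configured.
    let chat = if let Some(chat_model) = chat_model {
        match &completion {
            // Same model id: hand out a clone of the existing server handle.
            Some(server) if server.model_id == chat_model => Some(server.clone()),
            // Different model, or no completion server: spawn a new one.
            _ => Some(LlamaServer::spawn(chat_model)),
        }
    } else {
        None
    };

    (completion, chat)
}

fn main() {
    let (completion, chat) = build_servers(Some("my-model"), Some("my-model"));
    // Both handles point at the same underlying server instance.
    assert!(Arc::ptr_eq(
        completion.as_ref().unwrap(),
        chat.as_ref().unwrap()
    ));
}
```

When the ids match, the chat side only clones the completion server's `Arc` rather than spawning a second llama.cpp process, which is the reuse behavior the change description advertises.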
.gitkeep
Features-20240805-162901.yaml feat(webserver): support persisted thread in answer engine (#2793) 2024-08-09 15:29:19 -07:00
Fixed and Improvements-20240810-221045.yaml feat(llama-cpp-server): reuse local server for completion and chat (#2812) 2024-08-10 22:51:13 -07:00