tabby/crates/llama-cpp-server
Latest commit: 531062a153 by Meng Zhang (2024-09-16 19:09:21 +08:00)
fix(core): fix ggml path loading for windows. fix user_agent field (mark as optional) (#3152)

* [autofix.ci] apply automated fixes

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
llama.cpp@5ef07e25ac  chore(llama.cpp): bump version to b3571 (#2851)                                              2024-08-12 13:54:44 -07:00
src                   fix(core): fix ggml path loading for windows. fix user_agent field (mark as optional) (#3152)  2024-09-16 19:09:21 +08:00
build.rs              fix(build): disable GGML_NATIVE explicitly (#3118)                                             2024-09-10 14:27:22 -07:00
Cargo.toml            refactor(webserver): switch to openai chat interface (#2564)                                   2024-07-03 15:44:34 +09:00