tabby/crates
Latest commit 998cf18bb2 by Meng Zhang, 2024-08-14 18:29:46 -07:00:

fix(core): refactor chat_completions function to handle OpenAI stream errors properly (#2879)

* fix(core): refactor chat_completions function to handle OpenAI stream errors properly
* [autofix.ci] apply automated fixes

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
Directory | Latest commit | Date
aim-downloader | chore(download): enable sha256 checksum, remove partition urls support (#2514) | 2024-06-26 12:23:43 +00:00
http-api-bindings | fix(core): add suffix support for openai completion legacy interface (#2825) | 2024-08-10 04:43:56 +00:00
llama-cpp-server | chore(llama.cpp): bump version to b3571 (#2851) | 2024-08-12 13:54:44 -07:00
ollama-api-bindings | refactor(config): make HttpModelConfig.api_endpoint optional, so we can use default value for certain model kind (#2760) | 2024-07-31 18:22:29 +00:00
tabby | fix(core): refactor chat_completions function to handle OpenAI stream errors properly (#2879) | 2024-08-14 18:29:46 -07:00
tabby-common | fix(search): change source query to const score query (#2859) | 2024-08-12 23:17:36 -07:00
tabby-crawler | Release 0.15.0-dev.0 | 2024-07-23 11:35:47 +08:00
tabby-download | Release 0.16.0-dev.0 | 2024-08-07 20:12:48 -07:00
tabby-git | feat(git): support query with quoted string for repository_grep (#2784) | 2024-08-06 12:49:45 -07:00
tabby-index | chore(index): skip web documents if body is empty (#2831) | 2024-08-11 19:58:24 +00:00
tabby-inference | Release 0.15.0-dev.0 | 2024-07-23 11:35:47 +08:00