# v0.8.0 [Unreleased]
## Notice
* Due to format changes, re-executing `tabby scheduler --now` is required to ensure that `Code Browser` functions properly.
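
As a quick reference, the re-run is the single command quoted above (a minimal sketch; it assumes `tabby` is already configured and on your `PATH`):

```bash
# Re-run the scheduler immediately so the Code Browser picks up the new index format.
tabby scheduler --now
```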
## Features
* Introducing a preview release of the `Source Code Browser`, featuring visualization of code snippets utilized for code completion in RAG.
## Fixes and Improvements
* Added a Windows CPU binary distribution.
* Added a Linux ROCm (AMD GPU) binary distribution.
* Fixed an issue with cached permanent redirection in certain browsers (e.g., Chrome) when the `--webserver` flag is disabled.
* Introduced the `TABBY_MODEL_CACHE_ROOT` environment variable to individually override the model cache directory (see the example after this list).
* The `/v1beta/chat/completions` API endpoint is now compatible with OpenAI's chat completion API (example below).
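
The last two items above can be exercised from a shell. The sketch below is illustrative only: the cache path and request payload are placeholder examples, and the server address assumes the default `localhost:8080` used elsewhere in this changelog.

```bash
# Override the model cache directory for this shell (path is an example).
export TABBY_MODEL_CACHE_ROOT=/data/tabby-models

# Call the OpenAI-compatible chat endpoint with an OpenAI-style messages payload.
curl -X POST http://localhost:8080/v1beta/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{"messages": [{"role": "user", "content": "How do I reverse a list in Python?"}]}'
```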
# v0.7.0 (12/15/2023)
## Features
* Tabby now includes built-in user management and secure access, ensuring that it is only accessible to your team.
* The `--webserver` flag is a new addition to `tabby serve` that enables secure access to the tabby server. When this flag is on, IDE extensions will need to provide an authorization token to access the instance (see the example below).
- Some functionalities that are bound to the webserver (e.g. playground) will also require the `--webserver` flag.
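
A minimal invocation sketch: the `--webserver` flag is the new piece, while the device and model flags are reused from examples elsewhere in this changelog and may differ in your setup.

```bash
# Serve with secure access enabled; IDE extensions must then supply an authorization token.
tabby serve --device metal --model TabbyML/StarCoder-1B --webserver
```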
## Fixes and Improvements
* Fix https://github.com/TabbyML/tabby/issues/1036: event logs are now written to dated JSON files.
# v0.6.0 (11/27/2023)
## Features
* Add distribution support (running the completion / chat model in a separate process or on a separate machine).
* Add conversation history in chat playground.
* Add `/metrics` endpoint for Prometheus metrics collection (example below).
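
The endpoint can be checked with a plain HTTP request (a sketch assuming a locally running server on the default port):

```bash
# Fetch Prometheus-format metrics from a running tabby server.
curl http://localhost:8080/metrics
```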
## Fixes and Improvements
* Fix the slow repository indexing caused by the constrained memory arena in the tantivy index writer.
* Make `--model` optional, so users can create a chat-only instance (see the example below).
* Add `--parallelism` to control the throughput and VRAM usage: https://github.com/TabbyML/tabby/pull/727
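
Combined, the two serve-related items above allow, for example, a chat-only instance with explicit parallelism. This is only a sketch: the device, chat model, and parallelism value are placeholders.

```bash
# Chat-only instance: omit --model, pass only --chat-model, and tune throughput with --parallelism.
tabby serve --device metal --chat-model TabbyML/Mistral-7B --parallelism 4
```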
# v0.5.5 (11/09/2023)
## Notice
* llama.cpp backend (CPU, Metal) now requires a redownload of gguf model due to upstream format changes: https://github.com/TabbyML/tabby/pull/645 https://github.com/ggerganov/llama.cpp/pull/3252
* Due to indexing format changes, the `~/.tabby/index` needs to be manually removed before any further runs of `tabby scheduler` (see the example after this list).
* `TABBY_REGISTRY` is replaced with `TABBY_DOWNLOAD_HOST` for the GitHub-based registry implementation.
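
A migration sketch covering the notices above. The scheduler command and index path come straight from the notes; the `TABBY_DOWNLOAD_HOST` value is only a placeholder for whichever host or mirror you actually use.

```bash
# Remove the stale index, then re-run the scheduler to rebuild it.
rm -rf ~/.tabby/index
tabby scheduler

# The registry override env var was renamed (host value below is a placeholder).
export TABBY_DOWNLOAD_HOST=github.com
```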
## Features
* Improved dashboard UI.
## Fixes and Improvements
* CPU backend is switched to llama.cpp: https://github.com/TabbyML/tabby/pull/638
* Add `server.completion_timeout` to control the code completion interface timeout (see the config sketch after this list): https://github.com/TabbyML/tabby/pull/637
* CUDA backend is switched to llama.cpp: https://github.com/TabbyML/tabby/pull/656
* Tokenizer implementation is switched to llama.cpp, so tabby no longer needs to download an additional tokenizer file: https://github.com/TabbyML/tabby/pull/683
* Fix deadlock issue reported in https://github.com/TabbyML/tabby/issues/718
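
The completion timeout is a config entry rather than a CLI flag. The snippet below is only an illustrative sketch of adding it to `~/.tabby/config.toml`; the value and its unit are placeholders, so consult the linked PR for the exact semantics.

```bash
# Illustrative only: the key name comes from the entry above; the value is a placeholder.
cat >> ~/.tabby/config.toml <<'EOF'
[server]
completion_timeout = 30
EOF
```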
# v0.4.0 (10/24/2023)
## Features
* Supports golang: https://github.com/TabbyML/tabby/issues/553
* Supports ruby: https://github.com/TabbyML/tabby/pull/597
* Supports using a local directory for `Repository.git_url`: use `file:///path/to/repo` to specify a local directory (see the sketch below).
* A new UI design for the webserver.
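
A configuration sketch for the local-directory form; it assumes the usual `[[repositories]]` section of `~/.tabby/config.toml`, and the path is the placeholder shown above.

```bash
# Illustrative: register a local checkout as a repository, then re-run the scheduler to index it.
cat >> ~/.tabby/config.toml <<'EOF'
[[repositories]]
git_url = "file:///path/to/repo"
EOF
tabby scheduler
```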
## Fixes and Improvements
* Improve snippet retrieval by deduplicating candidates against existing content and snippets: https://github.com/TabbyML/tabby/pull/582
# v0.3.1 (10/21/2023)
## Fixes and Improvements
* Fix GPU OOM issue caused by the parallelism: https://github.com/TabbyML/tabby/issues/541, https://github.com/TabbyML/tabby/issues/587
* Fix git safe directory check in docker: https://github.com/TabbyML/tabby/issues/569
# v0.3.0 (10/13/2023)
## Features
### Retrieval-Augmented Code Completion Enabled by Default
The currently supported languages are:
* Rust
* Python
* JavaScript / JSX
* TypeScript / TSX

A blog series detailing the technical aspects of Retrieval-Augmented Code Completion will be published soon. Stay tuned!
## Fixes and Improvements
* Fix [Issue #511](https://github.com/TabbyML/tabby/issues/511) by marking ggml models as optional.
* Improve stop words handling by combining RegexSet into Regex for efficiency.
# v0.2.2 (10/09/2023)
## Fixes and Improvements
* Fix a critical issue that might cause request deadlocking in the ctranslate2 backend (when the load is heavy).
# v0.2.1 (10/03/2023)
## Features
### Chat Model & Web Interface
We have introduced a new argument, `--chat-model`, which allows you to specify the model for the chat playground located at http://localhost:8080/playground.
To utilize this feature, use the following command in the terminal:
```bash
tabby serve --device metal --model TabbyML/StarCoder-1B --chat-model TabbyML/Mistral-7B
```
### ModelScope Model Registry
Mainland Chinese users have been facing challenges accessing Hugging Face for various reasons. The Tabby team is actively working to address this issue by mirroring models to a hosting provider in mainland China called modelscope.cn.
```bash
# Download from the Modelscope registry
TABBY_REGISTRY=modelscope tabby download --model TabbyML/WizardCoder-1B
```
## Fixes and Improvements
* Implemented more accurate UTF-8 incremental decoding in the [GitHub pull request](https://github.com/TabbyML/tabby/pull/491).
* Fixed the stop words implementation by utilizing RegexSet to isolate the stop word group.
* Improved model downloading logic; now Tabby will attempt to fetch the latest model version if there's a remote change, and the local cache key becomes stale.
* Set the default `num_replicas_per_device` for the ctranslate2 backend to increase parallelism.