🐾 Tabby

Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features:

  • Self-contained, with no need for a DBMS or cloud service.
  • OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE).
  • Supports consumer-grade GPUs.
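As a sketch of what the OpenAPI interface looks like from a client's side, the snippet below builds and sends a code-completion request. The payload shape (`language` plus `segments.prefix`) follows Tabby's completion schema, but treat the exact endpoint path and fields as assumptions — the server's Swagger UI is the authoritative reference.

```python
import json
import urllib.request


def completion_payload(prefix: str, language: str = "python") -> dict:
    """Build the JSON body for a Tabby completion request.

    The shape here (language + segments.prefix) is based on Tabby's
    OpenAPI schema; check your server's Swagger UI for the exact spec.
    """
    return {"language": language, "segments": {"prefix": prefix}}


def request_completion(prefix: str, host: str = "http://localhost:8080") -> dict:
    # POST the payload to the completion endpoint exposed by `tabby serve`.
    req = urllib.request.Request(
        f"{host}/v1/completions",
        data=json.dumps(completion_payload(prefix)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because the interface is plain HTTP + JSON, the same request works from any language or IDE plugin that can issue a POST.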


🔥 What's New

  • 09/21/2023 We've hit 10K stars 🌟 on GitHub! 🚀🎉👏
  • 09/18/2023 Apple's M1/M2 Metal inference support has landed in v0.1.1!
  • 08/31/2023 Tabby's first stable release v0.0.1 🥳.
  • 08/28/2023 Experimental support for the CodeLlama 7B.
  • 08/24/2023 Tabby is now on JetBrains Marketplace!

👋 Getting Started

The easiest way to start a Tabby server is by using the following Docker command:

docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model TabbyML/SantaCoder-1B --device cuda

For additional options (e.g., inference type, parallelism), please refer to the documentation at https://tabbyml.github.io/tabby.
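After the container starts, the model can take a while to load, so it helps to poll the server before wiring up an editor. A minimal readiness sketch follows; the `/v1/health` endpoint name is an assumption based on Tabby's health API, so verify it against your server's docs.

```python
import time
import urllib.error
import urllib.request


def wait_until_ready(probe, timeout: float = 120.0, interval: float = 2.0) -> bool:
    """Poll `probe()` until it returns True or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if probe():
            return True
        time.sleep(interval)
    return False


def tabby_is_up(host: str = "http://localhost:8080") -> bool:
    # Hypothetical health check against the running `tabby serve` instance.
    try:
        with urllib.request.urlopen(f"{host}/v1/health", timeout=5) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Usage: `wait_until_ready(tabby_is_up)` blocks until the server answers, which is handy in CI or provisioning scripts that start the container and then run a smoke test.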

🤝 Contributing

Get the Code

git clone --recurse-submodules https://github.com/TabbyML/tabby
cd tabby

If you have already cloned the repository, you can fetch all submodules by running git submodule update --init --recursive.

Build

  1. Set up the Rust environment by following the official rustup instructions at https://rustup.rs.

  2. Install the required dependencies:

# For macOS
brew install protobuf

# For Ubuntu / Debian
apt-get install protobuf-compiler libopenblas-dev
  3. Now you can build Tabby by running the command cargo build.

Start Hacking!

... and don't forget to submit a Pull Request

🌟 Star History

Star History Chart