Mirror of https://github.com/TabbyML/tabby (synced 2024-11-25 14:31:08 +00:00)
tabby / crates / llama-cpp-server @ c8b1d2f49d

Latest commit: 7c49ac51ed by Meng Zhang, chore(llama-cpp-server): add llama-server invoking commands to error message (#2925), 2024-08-20 21:45:11 +00:00
llama.cpp @ 5ef07e25ac    chore(llama.cpp): bump version to b3571 (#2851)    2024-08-12 13:54:44 -07:00
src                       chore(llama-cpp-server): add llama-server invoking commands to error message (#2925)    2024-08-20 21:45:11 +00:00
build.rs                  fix: renamed LLAMA_HIPBLAS flag not handled by llama_option_depr (#2835)    2024-08-11 18:32:17 -07:00
Cargo.toml                refactor(webserver): switch to openai chat interface (#2564)    2024-07-03 15:44:34 +09:00