# Releases: av/harbor
## v0.0.15

### Plandex integration
```bash
# Run the Plandex service
harbor up plandex

# Run a healthcheck against the Plandex service
harbor plandex health # should print "OK"
```
- Service overview
- Setup/configuration with Ollama guide in the Wiki
- `harbor pdx` CLI reference
### Local env

The `.env` file can now be used for any necessary permanent overrides without risking conflicts with upstream updates.
To accompany the change:

```bash
# Brings the configuration back to the current defaults
harbor config reset
```
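For illustration, a permanent override might look like the `.env` fragment below. The variable name `HARBOR_CONTAINER_PREFIX` is an assumption mirroring the `container.prefix` config key - check your generated `.env` for the actual keys:

```env
# Hypothetical permanent override - survives upstream updates
# (variable name is an assumption based on the container.prefix config key)
HARBOR_CONTAINER_PREFIX="friendly"
```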
### Misc

- HuggingFace CLI is now a service - its config management is unified with the other services
- Poor man's `harbor update`
Full Changelog: v0.0.14...v0.0.15
## v0.0.14

### PAR LLAMA

PAR LLAMA is an amazing terminal UI (TUI) for interacting with Ollama: everything you'd expect from a modern chat interface, but in the terminal.
```bash
# Ollama should be one of the running services
# in order to be reachable by parllama
harbor up ollama

# 1. Shortcut
harbor parllama

# 2. Via harbor run - the underlying command is different
harbor run parllama

# 3. Via interactive service shell
harbor shell parllama
$ parllama
```
See the service wiki for more details.
Full Changelog: v0.0.13...v0.0.14
## v0.0.13

### What's Changed

### TabbyAPI backend support

```bash
harbor tabbyapi model Annuvin/gemma-2-2b-it-abliterated-4.0bpw-exl2
harbor up tabbyapi
```
### New CLI Features

#### `harbor hf dl`

Integrating the awesome HuggingFaceModelDownloader CLI for easier HF/Llama.cpp cache management.
```bash
# See the original help
harbor hf dl --help

# EXL2 example
#
# -s ./hf - save the model to the global HuggingFace cache (mounted to ./hf)
# -c 10   - make the download go brr with 10 concurrent connections
# -m      - model specifier in user/repo format
# -b      - model revision/branch specifier (where applicable)
harbor hf dl -c 10 -m turboderp/TinyLlama-1B-exl2 -b 2.3bpw -s ./hf

# GGUF example
#
# -s ./llama.cpp - save the model to the global llama.cpp cache (mounted to ./llama.cpp)
# -c 10          - make the download go brr with 10 concurrent connections
# -m             - model specifier in user/repo format
# :Q2_K          - file filter postfix - only files matching this postfix will be downloaded
harbor hf dl -c 10 -m TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF:Q2_K -s ./llama.cpp
```
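To illustrate what the `:Q2_K` postfix filter selects, here is a plain-shell sketch - the filenames are hypothetical examples of a GGUF repo layout, not the repo's actual contents:

```bash
# Hypothetical file list from a GGUF repo - only names containing
# the Q2_K quant tag match the :Q2_K filter
files="tinyllama.Q2_K.gguf
tinyllama.Q4_K_M.gguf
tinyllama.Q8_0.gguf"
printf '%s\n' "$files" | grep 'Q2_K'
# → tinyllama.Q2_K.gguf
```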
#### `harbor hf find`

To accompany `hf dl` - a quick way to jump right to the HF Hub to find new models.
```bash
harbor hf find gguf
harbor hf find exl2 gemma-2
harbor hf find awq llama-3.1
harbor hf find tinyllama
```
### Misc

- docs: update README.md by @eltociear in #3
- `harbor shell` - launch an interactive shell in a service container (shortcut from previous `harbor exec` + `harbor cmd` combinations)
- `harbor build` - for services that'll have their `Dockerfile` within the Harbor repo (such as `hfdownloader`)
### New Contributors

- @eltociear made their first contribution in #3
Full Changelog: v0.0.12...v0.0.13
## v0.0.12

- Container name prefix: all containers from the toolkit are now prefixed with `harbor.` to avoid conflicts with other locally running setups

```bash
# Can be adjusted via .env or with the CLI
harbor config get container.prefix # harbor
harbor config set container.prefix friendly # friendly.ollama
```
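As a minimal sketch of how the prefix composes into a container name - the `.`-join is inferred from the `harbor.ollama` / `friendly.ollama` examples above, not from Harbor's source:

```bash
# The effective container name is "<prefix>.<service>"
prefix="harbor"
service="ollama"
echo "${prefix}.${service}" # → harbor.ollama
```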
- Fixing `harbor ln` to use the CLI name without an extension
Full Changelog: v0.0.11...v0.0.12
## v0.0.11

### Features

#### Open WebUI cross-file configuration

Setup for Open WebUI was refactored to only include configs for the services that were actually requested. In other words, Open WebUI won't try to reach out to LiteLLM or vLLM (or other) service endpoints unless those services have been explicitly launched.
#### OpenAI Key/URL management

```bash
harbor openai urls add https://api.openai.com/v1
harbor openai keys add <OpenAI API Key>
```
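Multiple URLs/keys presumably accumulate into list-style values (see the array-variable support in Misc below). A rough sketch of that idea in plain shell - the separator, variable name, and the second URL are assumptions for illustration, not Harbor's actual storage format:

```bash
# Hypothetical accumulation of OpenAI-compatible API URLs
OPENAI_URLS=""
add_url() {
  # Append with a ';' separator, skipping the separator for the first entry
  OPENAI_URLS="${OPENAI_URLS:+$OPENAI_URLS;}$1"
}
add_url "https://api.openai.com/v1"
add_url "http://localhost:11434/v1"
echo "$OPENAI_URLS"
# → https://api.openai.com/v1;http://localhost:11434/v1
```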
#### Default services management

```bash
harbor defaults ls
harbor defaults rm 0
harbor defaults rm ollama
harbor defaults add llamacpp
```
#### Autoopen

```bash
# Update the config
harbor config set ui.autoopen true

# Start harbor - the default UI opens automatically
harbor up
```
### Misc

- Compatibility Wiki
- env manager works with array variables
- `harbor webui` CLI helper
- `harbor dive` util
- `harbor info` util
Full Changelog: v0.0.10...v0.0.11
## v0.0.10

- `harbor up aphrodite` - Aphrodite Engine inference backend
Full Changelog: v0.0.9...v0.0.10
## v0.0.9

- `harbor up vllm` - vLLM backend support
- `harbor up bionicgpt` - BionicGPT frontend
- litellm config merger workflow
- `harbor cmd` - run arbitrary compose commands
- `harbor run` - run commands in a service's main container without starting it first
Full Changelog: v0.0.8...v0.0.9
v0.0.8
harbor up librechat
- LibreChat as an additional UI serviceharbor pull
to update service imagesharbor down
can now pass parameters to the docker
Full Changelog: v0.0.7...v0.0.8