Releases: av/harbor

v0.0.15 (03 Aug)

Plandex integration

[Screenshot: Plandex working with Ollama]

# Run the Plandex service
harbor up plandex
# Run a healthcheck against the Plandex service
harbor plandex health # should print "OK"

Local env

The .env file can now be used for any necessary permanent overrides without risking conflicts with upstream updates.
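
For example, a permanent override could look like this (the variable name is an assumption, derived from the container.prefix option shown in v0.0.12):

# .env - local overrides that survive upstream updates
# Variable name assumed from the container.prefix config option
HARBOR_CONTAINER_PREFIX="friendly"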

To accompany the change:

# Brings the configuration back to current defaults
harbor config reset

Misc

  • HuggingFace CLI is now a service, unifying its config management with the other services (see the sketch below)
  • Poor man's harbor update
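
Since hf is a regular service now, the generic service commands should apply to it as well - a sketch, assuming hf is the service handle:

# drop into the hf service container (harbor shell is from v0.0.13)
harbor shell hf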

Full Changelog: v0.0.14...v0.0.15

v0.0.14 (03 Aug)

PAR LLAMA

[Screenshot: PAR LLAMA UI]

TUI for Ollama.

PAR LLAMA is an amazing terminal UI for interacting with Ollama. Everything you'd expect from a modern chat interface, but in the terminal.

# Ollama must be one of the running services
# for parllama to be able to reach it
harbor up ollama

# 1. Shortcut
harbor parllama

# 2. Via harbor run - the underlying command is different
harbor run parllama

# 3. Via interactive service shell
harbor shell parllama
$ parllama

See the service wiki for more details.

Full Changelog: v0.0.13...v0.0.14

v0.0.13 (02 Aug)

What's Changed

TabbyAPI backend support

Wiki Docs

harbor tabbyapi model Annuvin/gemma-2-2b-it-abliterated-4.0bpw-exl2
harbor up tabbyapi

New CLI Features

harbor hf dl

Integrating the awesome HuggingFaceModelDownloader CLI for easier HF/llama.cpp cache management

# See the original help
harbor hf dl --help

# EXL2 example
#
# -s ./hf - Save the model to the global HuggingFace cache (mounted to ./hf)
# -c 10   - make download go brr with 10 concurrent connections
# -m      - model specifier in user/repo format
# -b      - model revision/branch specifier (where applicable)
harbor hf dl -c 10 -m turboderp/TinyLlama-1B-exl2 -b 2.3bpw -s ./hf

# GGUF example
#
# -s ./llama.cpp - Save the model to the global llama.cpp cache (mounted to ./llama.cpp)
# -c 10          - make download go brr with 10 concurrent connections
# -m             - model specifier in user/repo format
# :Q2_K          - file filter postfix - will only download files with this postfix
harbor hf dl -c 10 -m TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF:Q2_K -s ./llama.cpp

harbor hf find

To accompany hf dl - a quick way to jump right to the HF Hub to find new models.

harbor hf find gguf
harbor hf find exl2 gemma-2
harbor hf find awq llama-3.1
harbor hf find tinyllama

Misc

  • docs: update README.md by @eltociear in #3
  • harbor shell - launch an interactive shell in a service container (a shortcut for the previous harbor exec + harbor cmd combinations)
  • harbor build - for services that'll have their Dockerfile within the Harbor repo (such as hfdownloader)

New Contributors

  • @eltociear made their first contribution in #3

Full Changelog: v0.0.12...v0.0.13

v0.0.12 (02 Aug)

  • Container name prefix: all containers from the toolkit are now prefixed with "harbor." to avoid conflicts with other locally running setups
# Can be adjusted via .env or with CLI
harbor config get container.prefix # harbor
harbor config set container.prefix friendly # friendly.ollama
  • Fixed harbor ln to use the CLI name without an extension (see the sketch below)
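
A sketch of the expected result (the install path is an assumption):

harbor ln
which harbor # e.g. /usr/local/bin/harbor - no .sh extension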

Full Changelog: v0.0.11...v0.0.12

v0.0.11 (02 Aug)

Features

Open WebUI cross-file configuration

The setup for Open WebUI was refactored to actively intersect configs from the requested services. In other words, Open WebUI will no longer try to reach LiteLLM, vLLM, or other service endpoints unless that service has been explicitly launched.
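
For example (webui as the Open WebUI service handle is an assumption):

# Open WebUI only receives endpoints for the services that were requested
harbor up ollama webui         # no LiteLLM/vLLM endpoints are wired in
harbor up ollama litellm webui # now the LiteLLM endpoint is included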

OpenAI Key/URL management

harbor openai urls add https://api.openai.com/v1
harbor openai keys add <OpenAI API Key>

Default services management

harbor defaults ls
harbor defaults rm 0
harbor defaults rm ollama
harbor defaults add llamacpp

Autoopen

# Update the config
harbor config set ui.autoopen true

# Start harbor - default UI opens automatically
harbor up

Misc

  • Compatibility Wiki
  • env manager works with array variables (example below)
  • harbor webui CLI helper
  • harbor dive util
  • harbor info util
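
The openai helpers above are one example of array-backed values - repeated add calls append instead of overwriting (the second URL below is purely illustrative):

harbor openai urls add https://api.openai.com/v1
harbor openai urls add http://localhost:11434/v1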

Full Changelog: v0.0.10...v0.0.11

v0.0.10 (01 Aug)

  • harbor up aphrodite - Aphrodite Engine inference backend

Full Changelog: v0.0.9...v0.0.10

v0.0.9 (01 Aug)

  • harbor up vllm - vLLM backend support
  • harbor up bionicgpt - BionicGPT frontend
  • litellm config merger workflow
  • harbor cmd - run arbitrary compose commands
  • harbor run - run commands in a service's main container without starting it first (see the sketch below)
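
A quick sketch of the difference between harbor cmd and harbor run (the argument shapes are assumptions):

# cmd - pass arguments through to docker compose, e.g. list services
harbor cmd ps
# run - a one-off command inside a service container
harbor run ollama ollama --version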

Full Changelog: v0.0.8...v0.0.9

v0.0.8 (31 Jul)

  • harbor up librechat - LibreChat as an additional UI service
  • harbor pull to update service images
  • harbor down can now pass parameters through to docker (example below)
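
For example (the service argument to pull and the compose flag below are assumptions, not confirmed usage):

# refresh the images for a service
harbor pull librechat
# flags are passed through to docker compose down
harbor down --remove-orphans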

Full Changelog: v0.0.7...v0.0.8

v0.0.7 (31 Jul)

  • 📖 Wiki
  • langfuse service, sample integration with litellm (sketch below)
  • hollama frontend
  • cross-file support for compose
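
A sketch of trying the Langfuse + LiteLLM sample together (service handles assumed from the bullet above):

harbor up litellm langfuse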

Full Changelog: v0.0.6...v0.0.7

v0.0.6 (30 Jul)

  • Approaching v0.1.0 with only a few services remaining to scaffold 🎉
  • harbor up tts - TTS integrated with Open WebUI
  • harbor up tgi litellm - TGI as another LLM backend and LiteLLM as an OpenAI-compatible API proxy