Releases: av/harbor
v0.3.2
v0.3.1
This is a maintenance release with a few fixes, nothing exciting
- `harbor dev docs` - fixes relative URLs so that Boost README links now finally work
- README - revamp, supporters
- `boost` - fixed mismatch between docs and actual env vars
- `r0` - workflow for R1-like reasoning chains for any LLM (including older ones, like Llama 2)
- `markov` - Open WebUI-only, serves an artifact showing a token graph for the current completion
- `docs` - numerous tweaks and adjustments
- `n8n` - fixed missing EOF preventing `harbor env n8n` from working as expected
- `txtai` - restored functionality with a monkey patch until this PR is merged
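The `r0` workflow above boils down to eliciting an explicit reasoning pass before the final answer, which works with any chat model. A minimal sketch of the idea in Python (the `<think>` tag convention and helper names are illustrative, not Harbor's actual implementation):

```python
import re

THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)

def build_reasoning_prompt(question: str) -> list[dict]:
    """Ask any chat model to think out loud inside <think> tags first."""
    return [
        {"role": "system", "content": (
            "First reason step by step inside <think>...</think>, "
            "then give the final answer on its own line."
        )},
        {"role": "user", "content": question},
    ]

def split_reasoning(completion: str) -> tuple[str, str]:
    """Separate the reasoning chain from the final answer."""
    match = THINK_RE.search(completion)
    reasoning = match.group(1).strip() if match else ""
    answer = THINK_RE.sub("", completion).strip()
    return reasoning, answer
```

For example, `split_reasoning("<think>2 + 2 = 4</think>4")` returns `("2 + 2 = 4", "4")`.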
Full Changelog: v0.3.0...v0.3.1
v0.3.0 - Routines, Traefik, Latent Scope
Routines
We now have more than 200 compose files, and `docker compose` itself is slow to merge them (upwards of ~5s even on my powerful dev machine). To overcome this, we moved the core logic that powers the (now legacy) version of the CLI into dedicated routines. v0.3.0 is also a step towards having a native Harbor CLI in the future.
The new routines setup is based on the distroless flavor of Deno. In typical Harbor fashion, you don't need to install anything, only to pay the disk space tax (~150Mb in this instance). Harbor will cache the dependencies and everything else it needs after the first cold start. The PyPI, native, and NPM installation paths will continue to function as before. Deno was chosen over Bun, Node.js, and Python because it brings the most value within those 150Mb, with a path towards native binaries in the future and much more. Additionally, Harbor was already using Deno for some lightweight automation. I experimented with Rust and Go, and despite the better end binaries, the cost of creating and maintaining a CLI there is much higher, so choosing Deno means more time to update and improve the project.
You can return to the legacy behavior by setting a config option:
harbor config set legacy_cli true
Traefik
Harbor now includes `traefik` as its reverse HTTP proxy. Automatic configuration is limited to local deployments; however, it can be reconfigured manually as needed - see the service wiki for more details.
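For manual reconfiguration, the usual Traefik pattern is to attach routing labels to a compose service. This is an illustrative fragment with a made-up hostname and port, not Harbor's shipped configuration - see the service wiki for the real one:

```yaml
services:
  webui:
    labels:
      - traefik.enable=true
      # made-up hostname; replace with your own
      - traefik.http.routers.webui.rule=Host(`webui.example.internal`)
      - traefik.http.services.webui.loadbalancer.server.port=8080
```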
Latent Scope
One of those tools that leaves you with a "whoa". It allows exploring a representation of a given dataset in latent space.
Misc
- `perplexideez` - fixed usage without access to modern compose
- `release` script seeds programmatic files
- `seed-cdi` - EOF fixes
- `seed-traefik` - dev script to create `.x.traefik.` cross-files for related services
- `boost` - revised `r0` module
- `qrgen` - fixed build (affects `harbor qr`, `harbor tunnel`)
New Contributors
- @heronsouzamarques made their first contribution in #134 💪
- @Tien-Cheng helped restore the `qrgen` functionality in #137
Full Changelog: v0.2.28...v0.3.0
v0.2.28
This is mostly a maintenance release with a small new frontend and an exciting new feature for Harbor Boost.
Mikupad
LLM frontend in a single HTML file
Misc
`boost`
- support for interactive artifacts (specifically for Open WebUI)
- dedicated README
- multiple new experimental modules (undocumented, see source)
- relaxed CORS policy
- cosmetic fixes to dev scripts
`librechat`
- fixed after the MeiliSearch update to v1.12.3
`promptfoo`
- access to Harbor's configured API keys
- example eval based on Misguided Attention (unfinished)
`webui`
- now sees local time correctly on supported systems
`harbor logs`, `harbor down` and other commands relying on `"*"`
- fixed incorrect detection of CDI capability
New Contributors
- @ColumbusAI made their first contribution in #131 🙌
- @kianmeng made their first contribution in #124 💪
Full Changelog: v0.2.27...v0.2.28
v0.2.27 - Morphic, SQL Chat, gptme, Kokoro v1
Three new services are here to add even more value to your local LLM setup!
Morphic
An AI-powered search engine with a generative UI.
SQL Chat
SQL Chat is a chat-based SQL client that uses natural language to query, modify, add to, and delete from your database.
gptme
A terminal assistant with tools, so it can use the shell, run code, edit files, and much more.
Speaches - now supports Kokoro v1
Harbor patched its installation of `speaches` to support Kokoro v1 models (ahead of official support from the project itself).
Misc
- CDI - Harbor detects and enables the CDI Nvidia driver on compatible systems (kudos to @FrantaNautilus)
- `harbor dev` - alias to run dev-related scripts from `.scripts`
- `harbor dev scaffold` - fixed unwanted prefix newline
- `boost`
  - extras for chat/chat node APIs for custom modules
  - `cex` - experiment on automatic context expansion by paraphrasing
  - `stcl` - continued experiments on "side" reasoning
- Plenty of clarifications and extra examples for Ollama/WebUI docs and more
- Experimental `requirements.sh` to install Harbor's dependencies automatically on Linux (undocumented, untested)
New Contributors
- @FrantaNautilus made their first contribution in #119 🎉
Full Changelog: v0.2.26...v0.2.27
v0.2.26
v0.2.26
This is a maintenance release with bugfixes for specific services and general cross-platform compatibility improvements.
`cmdh`
- stopped using `pkgx`, patched in Structured Outputs from Ollama and a larger default context (should be usable with llama3.1 8b)
- `harbor how` will actively check if Ollama has the model configured for `cmdh` and will ask to pull it otherwise
`ollama`
- extra docs on cache location, external instance and troubleshooting
`boost`
- continuing experiments with the `stcl` workflow
- @bjj fixed handling of Docker Desktop versions for `harbor doctor` and capability detection
- fixed incorrect application of capabilities when using a wildcard `*` for service match - this was the reason some services were failing to launch on Mac OS
`aider`
- fixes to support non-root user in newer versions
`openhands`
- fixes to support custom registry, local state volume in newer versions
`librechat`
- fixed entrypoint lacking exec permissions
`shared`
- Node.js config merger no longer requires lodash in the base container
- fixes `chatui` being unable to start in recent versions
`bolt`
- switched to the official bolt.diy image, notes on Mac OS compatibility in the docs
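Structured Outputs in Ollama (as patched into `cmdh` above) work by passing a JSON schema in the `format` field of a `/api/chat` request, with the larger context set via `options.num_ctx`. A minimal sketch of such a request body; the schema and helper are illustrative, not cmdh's actual code:

```python
def cmdh_request(prompt: str, model: str = "llama3.1:8b", num_ctx: int = 8192) -> dict:
    """Build an Ollama /api/chat payload that forces a JSON reply."""
    schema = {
        "type": "object",
        "properties": {
            "command": {"type": "string"},
            "explanation": {"type": "string"},
        },
        "required": ["command"],
    }
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "format": schema,          # Ollama validates the reply against this schema
        "options": {"num_ctx": num_ctx},
        "stream": False,
    }
```

POST the resulting dict to your Ollama instance's `/api/chat` endpoint.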
Full Changelog: v0.2.25...v0.2.26
v0.2.25 - Cleanup/maintenance/bugfixes
- Bundled documentation paths fixed for Windows (for an upcoming feature)
`boost`
- new `r0` module emulating DeepSeek R1 reasoning chains
`llamacpp`
- normalised service config, docs
- fixed capability detection
- example of `harbor llamacpp gguf`
`ollama`
- normalised service config, docs
- `ollama.default_models` config - allows specifying models to pre-pull, fixes #105
`comfyui`
- normalised service config, docs
- fixed bug where a missing `HF_TOKEN` made the default workflow unusable, #106
- default workflow moved to the Harbor repo
`harbor size`
- now includes workspaces
`harbor find`
- added `comfyui` workspace
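A pre-pull option like `ollama.default_models` above ultimately boils down to splitting a config string into model names and pulling each one. A tiny sketch of that parsing; the semicolon separator is an assumption here, not necessarily Harbor's actual format:

```python
def models_to_pull(config_value: str) -> list[str]:
    """Split a delimited model list, dropping empty entries and whitespace."""
    return [m.strip() for m in config_value.split(";") if m.strip()]
```

For example, `models_to_pull("llama3.1:8b; qwen2.5:7b")` yields `["llama3.1:8b", "qwen2.5:7b"]`.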
v0.2.24 - Harbor App is now officially supported on Windows
I'm happy to report that all major issues preventing the App from being usable on Windows were resolved (not to say it's bug-free though). The issues were mostly related to the WSL environment and differences in how specific commands should be executed via Tauri shell bridge. Luckily most of the problems had reasonable workarounds, so the Harbor App on Windows reached parity with its Linux/MacOS versions.
Harbor.App.on.Windows.mp4
Misc
- Preparing docs for a future standalone site (not 100% sure it'll happen, but still)
- It's now possible to turn off automatic capability detection (for `nvidia`) and manage the list of enabled capabilities manually
- Multiple small tweaks for the App and Harbor Boost
- `klmbr` is now more careful around articles
v0.2.23 - Speaches + future + fixes
Speaches
Calling your LLM is now easier than ever.
# Start the service
harbor up speaches
`faster-whisper-server` is now a more abstract project called `speaches`, which is now supported by Harbor. Unlike the previous iteration, this one supports both TTS and STT at the same time, so you only need one service to call your LLMs now.
Out of the box, it'll use `Systran/faster-distil-whisper-large-v3` for STT and the cool new `hexgrad/Kokoro-82M` for TTS; both will be pre-configured for use with Open WebUI Audio settings.
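Since `speaches` exposes OpenAI-compatible endpoints, any OpenAI-style client can talk to it. A sketch of a TTS request body for the Kokoro model; the field names follow the OpenAI audio API shape, and the voice name is a placeholder:

```python
def kokoro_speech_request(text: str, voice: str = "af_sky") -> dict:
    """Body for an OpenAI-style POST /v1/audio/speech call; voice is a placeholder."""
    return {
        "model": "hexgrad/Kokoro-82M",
        "input": text,
        "voice": voice,
        "response_format": "mp3",
    }
```

POST this to the base URL Harbor exposes for the service, at the `/v1/audio/speech` path.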
Misc
- Harbor agents scaffold
- Harbor Boost:
  - new experimental `recpl` module to test recursive GUI planning
  - Boost now detects more tasks from new Open WebUI versions (to avoid running expensive workflows for them)
- `kobold` - more reasonable defaults
- General - adding a generic `DO_NOT_TRACK` env var for multiple services to disable tracking out of the box
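Under the hood, disabling tracking like this is typically just a shared environment variable injected into each affected service's compose definition. An illustrative fragment (the service name is a placeholder):

```yaml
services:
  some-service:        # placeholder; applied per affected service
    environment:
      - DO_NOT_TRACK=1 # generic opt-out honored by many tools
```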
v0.2.22 - KoboldCpp
KoboldCpp
# [Optional] pre-pull the image
harbor pull kobold
# Will take a while on the first run
harbor up kobold
KoboldCpp is an easy-to-use AI text-generation software for GGML and GGUF models, inspired by the original KoboldAI. Out of the box, Harbor will pre-connect `kobold` to Open WebUI.
Misc
- Fix Webtop container's connection to Harbor and Docker Socket by @SimonBlancoE in #98
- More portable shebang for the CLI
- `harbor doctor` - tests all requirements before exiting, more granular requirements for Docker, Nvidia
- MCTS was updated to become compatible with OWUI v0.5.4 (most recent as of today)
- We now have a ko-fi page
New Contributors 🎉
- @SimonBlancoE made their first contribution in #98
Full Changelog: v0.2.21...v0.2.22