
Releases: av/harbor

v0.3.2

09 Mar 22:19
@av

This is a very minor bugfix release

  • boost - now correctly handles incomplete chunks from downstream APIs by buffering them before parsing (tested with the Groq API)

Full Changelog: v0.3.1...v0.3.2

v0.3.1

09 Mar 21:01
@av

This is a maintenance release with a few fixes, nothing exciting

  • harbor dev docs - fixes relative URLs so that Boost README links now finally work
  • README - revamp, supporters
  • boost
    • fixed mismatch between docs and actual env vars
    • r0 - workflow for R1-like reasoning chains for any LLM (including older ones, like Llama 2)
  • markov - Open WebUI-only, serves an artifact showing a token graph for the current completion
  • docs - numerous tweaks and adjustments
  • n8n - fixed missing EOF preventing harbor env n8n from working as expected
  • txtai - restored functionality with a monkey patch until this PR is merged

Full Changelog: v0.3.0...v0.3.1

v0.3.0 - Routines, Traefik, Latent Scope

01 Mar 23:02
@av

Routines


We now have more than 200 compose files, and docker compose itself is slow to merge them (upwards of ~5s even on my powerful dev machine). To overcome this, we moved the core logic that powered the (now legacy) version of the CLI into dedicated routines. v0.3.0 is also a step towards a native Harbor CLI in the future.

The new routines setup is based on the distroless flavor of Deno. In typical Harbor fashion, you don't need to install anything, only to pay the disk space tax (~150 MB in this instance). Harbor caches the dependencies and everything else it needs after the first cold start. The PyPI, native, and NPM installation paths will continue to function as before. Deno was chosen over Bun, Node.js, and Python because it brings the most value within those 150 MB, with a path towards native binaries in the future and much more; additionally, Harbor was already using Deno for some lightweight automation. I experimented with Rust and Go, and despite the better end binaries, the cost of creating and maintaining a CLI there is much higher, so choosing Deno means more time to update and improve the project.

You can return to the legacy behavior by setting a config option:

harbor config set legacy_cli true

Traefik

Harbor now includes traefik as its reverse HTTP proxy. Automatic configuration is limited to local deployments for now, but it can be reconfigured manually as needed; see the service wiki for more details.
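As a quick sanity check after the proxy change, Harbor's own URL helpers can show where a service is exposed. A sketch, assuming a default local setup with the service running (`webui` is just an example service handle):

```shell
# Print the address Harbor resolves for a service,
# useful to verify the proxy-facing URL
harbor url webui

# Open the same URL in the default browser
harbor open webui
```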

Latent Scope


One of those tools that leaves you with a "woah". It allows exploring a given dataset's representation in latent space.

Misc

  • perplexideez - fixed usage without access to modern compose
  • release script seeds programmatic files
  • seed-cdi - EOF fixes
  • seed-traefik - dev script to create .x.traefik. cross-files for related services
  • boost - revised r0 module
  • qrgen - fixed build (affects harbor qr, harbor tunnel)

Full Changelog: v0.2.28...v0.3.0

v0.2.28

22 Feb 18:45
@av

This is mostly a maintenance release, with a small new frontend and an exciting new feature for Harbor Boost.

Mikupad


LLM frontend in a single HTML file

Misc

  • boost
    • support for interactive artifacts (specifically for Open WebUI)
    • dedicated README
    • multiple new experimental modules (undocumented, see source)
    • relaxed CORS policy
  • cosmetic fixes to dev scripts
  • librechat - fixed after the MeiliSearch update to v1.12.3
  • promptfoo
    • access to Harbor's configured API keys
    • example eval based on Misguided Attention (unfinished)
  • webui - now sees local time correctly on supported systems
  • harbor logs, harbor down and other commands relying on "*" - fixed incorrect detection of CDI capability

Full Changelog: v0.2.27...v0.2.28

v0.2.27 - Morphic, SQL Chat, gptme, Kokoro v1

08 Feb 16:42
@av

Three new services are here to add even more value to your local LLM setup!

Morphic

harbor-morphic

An AI-powered search engine with a generative UI.

SQL Chat

sqlchat

SQL Chat is a chat-based SQL client that uses natural language to run database operations such as querying, modifying, adding, and deleting data.

gptme

gptme

A terminal assistant with tools, so it can use the shell, run code, edit files, and much more.

Speaches - now supports Kokoro v1

Harbor patched its installation of speaches to support Kokoro v1 models (ahead of official support from the project itself)

Misc

  • CDI - Harbor detects and enables the Nvidia CDI driver on compatible systems (kudos to @FrantaNautilus)
  • harbor dev - alias to run dev-related scripts from .scripts
  • harbor dev scaffold - fixed an unwanted leading newline
  • boost - extras for chat/chat node APIs for custom modules
    • cex - experiment on automatic context expansion by paraphrasing
    • stcl - continued experiments on "side" reasoning
  • Plenty of clarifications and extra examples for Ollama/WebUI docs and more
  • Experimental requirements.sh to install Harbor's dependencies automatically on Linux (undocumented, untested)

Full Changelog: v0.2.26...v0.2.27

v0.2.26

01 Feb 18:53
@av

This is a maintenance release with bugfixes for specific services and for general cross-platform compatibility

  • cmdh
    • stopped using pkgx, patched in structured outputs from Ollama and a larger default context (should be usable with llama3.1 8b)
    • harbor how now actively checks if Ollama has the model configured for cmdh and offers to pull it otherwise
  • ollama - extra docs on cache location, external instances, and troubleshooting
  • boost
    • continuing experiments with stcl workflow
  • @bjj fixed handling of Docker Desktop versions for harbor doctor and capability detection
  • fixed incorrect application of capabilities when using a wildcard * for service match
    • This was the reason some services were failing to launch on Mac OS
  • aider - fixes to support non-root user in newer versions
  • openhands - fixes to support custom registry, local state volume in newer versions
  • librechat - fixed entrypoint lacking exec permissions
  • shared - Node.js config merger no longer requires lodash in the base container
    • Fixes chatui being unable to start in recent versions
  • bolt - switch to official bolt.diy image, notes on Mac OS compatibility in the docs
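The cmdh changes above surface through the harbor how alias. A sketch of trying it, assuming Ollama is running under Harbor and the configured model is available (the prompt is purely illustrative):

```shell
# Ask cmdh to suggest a shell command; Harbor checks that Ollama
# has the model configured for cmdh and offers to pull it if missing
harbor how "list the 10 largest files in the current directory"
```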

New Contributors

  • @bjj made their first contribution in #112 🎉

Full Changelog: v0.2.25...v0.2.26

v0.2.25

25 Jan 12:36
@av

v0.2.25 - Cleanup/maintenance/bugfixes

  • Bundled documentation paths fixed for Windows (for an upcoming feature)
  • boost - new r0 module emulating DeepSeek R1 reasoning chains
  • llamacpp
    • normalise service config, docs
    • fixed capability detection
    • example of harbor llamacpp gguf
  • ollama
    • normalise service config, docs
    • ollama.default_models config - lets you specify models to pre-pull, fixes #105
  • comfyui
    • fixed bug where missing HF_TOKEN made default workflow unusable, #106
    • default workflow moved to Harbor repo
    • normalise service config, docs
  • harbor size - now includes workspaces
  • harbor find - added comfyui workspace
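The new pre-pull option from the ollama section above can be set via the regular config CLI. A sketch; the model name is illustrative, and the exact format for listing multiple models may differ (see the ollama service docs):

```shell
# Models listed here are pulled automatically when the ollama service starts
harbor config set ollama.default_models "llama3.1:8b"
```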

v0.2.24 - Harbor App on Windows

19 Jan 21:59
@av

v0.2.24 - Harbor App is now officially supported on Windows

I'm happy to report that all major issues preventing the App from being usable on Windows have been resolved (not to say it's bug-free, though). The issues were mostly related to the WSL environment and to differences in how specific commands must be executed via the Tauri shell bridge. Luckily, most of the problems had reasonable workarounds, so the Harbor App on Windows has reached parity with its Linux/macOS versions.

(demo video: Harbor.App.on.Windows.mp4)

Misc

  • Preparing docs for a future standalone site (not 100% sure it'll happen, but still)
  • It's now possible to turn off automatic capability detection (for Nvidia) and manage the list of enabled capabilities manually
  • Multiple small tweaks for the App and Harbor Boost
  • klmbr is now more careful around articles

v0.2.23 - Speaches + future + fixes

18 Jan 15:40
@av

Speaches

Calling your LLM is now easier than ever.

# Start the service
harbor up speaches

faster-whisper-server has grown into a more abstract project called speaches, which is now supported by Harbor. Unlike the previous iteration, this one supports both TTS and STT at the same time, so you only need one service to call your LLMs now.

Out of the box, it'll use Systran/faster-distil-whisper-large-v3 for STT and the cool new hexgrad/Kokoro-82M for TTS; both come pre-configured for use with Open WebUI's Audio settings.
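Since speaches exposes OpenAI-compatible endpoints, the TTS side can be exercised directly from the shell. A sketch, assuming the service is up; the voice name and output format are illustrative and may need adjusting for your Kokoro version:

```shell
# Synthesize speech via the OpenAI-compatible /v1/audio/speech endpoint
curl -s "$(harbor url speaches)/v1/audio/speech" \
  -H "Content-Type: application/json" \
  -d '{"model": "hexgrad/Kokoro-82M", "input": "Harbor speaks!", "voice": "af"}' \
  -o harbor.mp3
```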

Misc

  • Harbor agents scaffold
  • Harbor Boost:
    • New experimental recpl module to test recursive GUI planning
    • Boost now detects more tasks from new Open WebUI versions (to avoid running expensive workflows for them)
  • kobold - more reasonable defaults
  • General - adding generic DO_NOT_TRACK env vars for multiple services to disable tracking out of the box
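The tracking opt-out above follows the generic DO_NOT_TRACK convention, so it can be inspected or overridden per service like any other Harbor-managed env var. A sketch, assuming harbor env get/set semantics and that the variable is wired for the chosen service (`webui` is just an example):

```shell
# Check the value Harbor passes to the service
harbor env webui DO_NOT_TRACK

# Override it explicitly
harbor env webui DO_NOT_TRACK true
```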

v0.2.22 - KoboldCpp

11 Jan 16:58
@av

KoboldCpp

# [Optional] pre-pull the image
harbor pull kobold

# Will take a while on the first run
harbor up kobold

KoboldCpp is an easy-to-use AI text-generation software for GGML and GGUF models, inspired by the original KoboldAI. Out of the box, Harbor will pre-connect kobold to Open WebUI.
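Besides the Open WebUI wiring, KoboldCpp also exposes an OpenAI-compatible API, so it can be queried directly once the service is up. A sketch; the payload is illustrative:

```shell
# Chat completion against kobold's OpenAI-compatible endpoint
curl -s "$(harbor url kobold)/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello!"}]}'
```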

Misc

  • Fix Webtop container's connection to Harbor and Docker Socket by @SimonBlancoE in #98
  • More portable shebang for the CLI
  • harbor doctor - checks all requirements before exiting; more granular checks for Docker and Nvidia
  • MCTS was updated to become compatible with OWUI v0.5.4 (most recent as of today)
  • We now have a ko-fi page

Full Changelog: v0.2.21...v0.2.22