fix: new llama 3.3 model
Signed-off-by: Bastian Fredriksson <bastian.fredriksson@keyfactor.com>
Realiserad committed Jan 7, 2025
1 parent 678b099 commit 6fbcf97
Showing 3 changed files with 5 additions and 5 deletions.
README.md (6 changes: 3 additions & 3 deletions)
@@ -69,7 +69,7 @@ api_key = <your API key>
 
 If you are self-hosting, my recommendation is to use
 [Ollama](https://github.com/ollama/ollama) with
-[Llama 3.1 70B](https://ollama.com/library/llama3.1). An out of the box
+[Llama 3.3 70B](https://ollama.com/library/llama3.3). An out of the box
 configuration running on `localhost` could then look something
 like this:
 
@@ -79,7 +79,7 @@ configuration = local-llama
 
 [local-llama]
 provider = self-hosted
-model = llama3.1
+model = llama3.3
 server = http://localhost:11434/v1
 ```
 
@@ -119,7 +119,7 @@ configuration = huggingface
 provider = huggingface
 email = <your email>
 password = <your password>
-model = meta-llama/Meta-Llama-3.1-70B-Instruct
+model = meta-llama/Llama-3.3-70B-Instruct
 ```
 
 Available models are listed [here](https://huggingface.co/chat/models).
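For the self-hosted setup above, the `llama3.3` tag has to be present in the local Ollama instance before fish-ai can use it. A minimal sketch, assuming a stock Ollama install serving on the default port:

```sh
# Fetch Llama 3.3 70B from the Ollama library
# (tag taken from https://ollama.com/library/llama3.3).
ollama pull llama3.3

# Serve the OpenAI-compatible API on http://localhost:11434/v1,
# matching the `server` value in the configuration above.
ollama serve
```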
pyproject.toml (2 changes: 1 addition & 1 deletion)
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
 
 [project]
 name = "fish_ai"
-version = "1.0.0"
+version = "1.0.1"
 authors = [{ name = "Bastian Fredriksson", email = "realiserad@gmail.com" }]
 description = "Provides core functionality for fish-ai, an AI plugin for the fish shell."
 readme = "README.md"
src/fish_ai/engine.py (2 changes: 1 addition & 1 deletion)
Expand Up @@ -215,7 +215,7 @@ def get_response(messages):
cookies=cookies.get_dict(),
system_prompt=create_system_prompt(messages),
default_llm=get_config('model') or
'meta-llama/Meta-Llama-3.1-70B-Instruct')
'meta-llama/Llama-3.3-70B-Instruct')

response = bot.chat(
messages[-1].get('content')).wait_until_done()
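The changed line is the tail of a Python `or` fallback: when the `model` option is unset, `get_config('model')` returns a falsy value and the expression evaluates to the new default. A minimal sketch of the pattern, with a hypothetical stand-in for fish-ai's `get_config`:

```python
def get_config(key):
    # Hypothetical stand-in for fish-ai's config lookup;
    # returns None when the key is missing from the config file.
    options = {}  # imagine this was parsed from fish-ai's config
    return options.get(key)

# None (unset) and '' (empty) are both falsy, so the expression
# short-circuits to the new default model name.
default_llm = get_config('model') or 'meta-llama/Llama-3.3-70B-Instruct'
print(default_llm)  # meta-llama/Llama-3.3-70B-Instruct
```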
