
all-minilm-L6-v2-q5_k_m.gguf corrupted? #850

lukehinds opened this issue Jan 30, 2025 · 0 comments
Describe the issue

I am unsure what has caused this, but whenever I switch back to main, the model fails to load and the server logs errors. I wonder if this might be git-lfs related?

2025-01-30T12:58:20.037723Z [error    ] Error during search: Failed to load model from file: ./codegate_volume/models/all-minilm-L6-v2-q5_k_m.gguf lineno=259 module=storage_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/storage/storage_engine.py
2025-01-30T12:58:20.037773Z [debug    ] Generating embedding           content=['\n\n'] content_length=2 lineno=80 model=all-minilm-L6-v2-q5_k_m.gguf module=inference_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/inference/inference_engine.py
2025-01-30T12:58:20.037818Z [info     ] Loading model from ./codegate_volume/models/all-minilm-L6-v2-q5_k_m.gguf with parameters n_gpu_layers=0 and n_ctx=512 lineno=44 module=inference_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/inference/inference_engine.py
2025-01-30T12:58:20.038381Z [error    ] Error during search: Failed to load model from file: ./codegate_volume/models/all-minilm-L6-v2-q5_k_m.gguf lineno=259 module=storage_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/storage/storage_engine.py
2025-01-30T12:58:20.039424Z [debug    ] Generating embedding           content=['\n\n'] content_length=2 lineno=80 model=all-minilm-L6-v2-q5_k_m.gguf module=inference_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/inference/inference_engine.py
2025-01-30T12:58:20.039875Z [info     ] Loading model from ./codegate_volume/models/all-minilm-L6-v2-q5_k_m.gguf with parameters n_gpu_layers=0 and n_ctx=512 lineno=44 module=inference_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/inference/inference_engine.py
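If the git-lfs hunch is right, the .gguf on disk may still be an un-fetched LFS pointer rather than the model itself. A minimal check (my own sketch, not codegate code; the path is taken from the log above): a valid GGUF file begins with the ASCII magic "GGUF", while a git-lfs pointer is a small text file beginning with "version https://git-lfs.github.com/spec/v1".

```python
def check_gguf(path: str) -> str:
    """Return a short diagnosis based on the file's first four bytes."""
    with open(path, "rb") as f:
        head = f.read(4)
    if head == b"GGUF":
        return "real GGUF model"
    if head == b"vers":
        # LFS pointer files start with "version https://git-lfs.github.com/spec/v1"
        return "git-lfs pointer (run `git lfs pull`)"
    return "unknown header; file may be truncated or corrupted"

if __name__ == "__main__":
    # Path from the log above; adjust to your checkout.
    print(check_gguf("./codegate_volume/models/all-minilm-L6-v2-q5_k_m.gguf"))
```

If it reports a pointer, `git lfs pull --include "*.gguf"` from the repo root should fetch the real object.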

Steps to Reproduce

git checkout main, then start the server

Operating System

macOS (Arm)

IDE and Version

n/a

Extension and Version

n/a

Provider

Anthropic

Model

n/a

Codegate version

n/a

Logs

No response

Additional Context

No response
