### Describe the issue

I am unsure what has caused this, but whenever I switch back to `main`, the embedding model fails to load and every search fails with an error. I do wonder if this might be git-lfs related?

```text
2025-01-30T12:58:20.037723Z [error ] Error during search: Failed to load model from file: ./codegate_volume/models/all-minilm-L6-v2-q5_k_m.gguf lineno=259 module=storage_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/storage/storage_engine.py
2025-01-30T12:58:20.037773Z [debug ] Generating embedding content=['\n\n'] content_length=2 lineno=80 model=all-minilm-L6-v2-q5_k_m.gguf module=inference_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/inference/inference_engine.py
2025-01-30T12:58:20.037818Z [info ] Loading model from ./codegate_volume/models/all-minilm-L6-v2-q5_k_m.gguf with parameters n_gpu_layers=0 and n_ctx=512 lineno=44 module=inference_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/inference/inference_engine.py
2025-01-30T12:58:20.038381Z [error ] Error during search: Failed to load model from file: ./codegate_volume/models/all-minilm-L6-v2-q5_k_m.gguf lineno=259 module=storage_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/storage/storage_engine.py
2025-01-30T12:58:20.039424Z [debug ] Generating embedding content=['\n\n'] content_length=2 lineno=80 model=all-minilm-L6-v2-q5_k_m.gguf module=inference_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/inference/inference_engine.py
2025-01-30T12:58:20.039875Z [info ] Loading model from ./codegate_volume/models/all-minilm-L6-v2-q5_k_m.gguf with parameters n_gpu_layers=0 and n_ctx=512 lineno=44 module=inference_engine pathname=/Users/lhinds/repos/stacklok/codegate-repos/codegate/src/codegate/inference/inference_engine.py
```
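A model-load failure right after a branch switch is consistent with git-lfs leaving a pointer stub where the `.gguf` file should be. Here is a minimal sketch to tell the two apart (a hypothetical helper, not part of codegate; the path is taken from the log above; the `GGUF` magic bytes and the LFS pointer prefix are the standard ones):

```python
# Hypothetical diagnostic, not part of codegate: distinguish a real GGUF
# model file from an un-smudged git-lfs pointer stub.
from pathlib import Path

MODEL = Path("./codegate_volume/models/all-minilm-L6-v2-q5_k_m.gguf")

with MODEL.open("rb") as f:
    head = f.read(64)

if head[:4] == b"GGUF":
    # Real GGUF files start with the 4-byte magic "GGUF".
    print(f"Real GGUF file ({MODEL.stat().st_size} bytes); the problem lies elsewhere.")
elif head.startswith(b"version https://git-lfs.github.com/spec/v1"):
    # git-lfs pointer files are tiny text stubs beginning with this line.
    print("LFS pointer stub; run `git lfs pull` to fetch the actual model.")
else:
    print("Neither GGUF magic nor an LFS pointer; the file may be truncated or corrupt.")
```

If this reports a pointer stub, running `git lfs pull` from the repository root should restore the real weights.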
### Steps to Reproduce

`git checkout main`, then start the server.

### Operating System

macOS (Arm)

### IDE and Version

n/a

### Extension and Version

n/a

### Provider

Anthropic

### Model

n/a

### Codegate version

n/a

### Logs

No response

### Additional Context

No response