
[BUG] tfs_z param seems to be already removed from ollama #2143

Open

Arvin2focus opened this issue Jan 23, 2025 · 1 comment
Labels
bug Something isn't working

Comments

Arvin2focus commented Jan 23, 2025

Pre-check

  • I have searched the existing issues and none cover this bug.

Description

Logs from Ollama 0.5.7 with the LLM Llama3.2:7b:

level=WARN source=types.go:512 msg="invalid option provided" option=tfs_z

Ref: ollama/ollama#8252
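
For reference, the warning is reproducible directly against the Ollama HTTP API. A minimal sketch, assuming a local Ollama server on the default port and a pulled model (the model name here is a placeholder):

```sh
# POST a generate request that includes the removed tfs_z option.
# On Ollama 0.5.7 the server log should then show:
#   level=WARN source=types.go:512 msg="invalid option provided" option=tfs_z
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Say hello.",
  "options": { "tfs_z": 1.0 }
}'
```

Note that at WARN level Ollama ignores the unknown option and the request itself still completes, which is consistent with the comment below suggesting the failure probably has another cause.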

Steps to Reproduce

  1. Upload files from the UI.
  2. Ask a summarization question.

Expected Behavior

A response with a correct summarization result.

Actual Behavior

The response comes back as "Failed".

Environment

MacBook Pro with M3 Max

Additional Information

No response

Version

No response

Setup Checklist

  • Confirm that you have followed the installation instructions in the project’s documentation.
  • Check that you are using the latest version of the project.
  • Verify disk space availability for model storage and data processing.
  • Ensure that you have the necessary permissions to run the project.

NVIDIA GPU Setup Checklist

  • Check that all CUDA dependencies are installed and compatible with your GPU (refer to CUDA's documentation).
  • Ensure an NVIDIA GPU is installed and recognized by the system (run nvidia-smi to verify).
  • Ensure proper permissions are set for accessing GPU resources.
  • Docker users - Verify that the NVIDIA Container Toolkit is configured correctly (e.g. run sudo docker run --rm --gpus all nvidia/cuda:11.0.3-base-ubuntu20.04 nvidia-smi)
Arvin2focus added the bug (Something isn't working) label Jan 23, 2025
BobMerkus commented Jan 27, 2025

If you have access to the original Ollama Modelfile, remove the tfs_z parameter and then update the remote using ollama push. I don't think this parameter should affect token generation during inference, so if you are encountering failures they are probably caused by something else.
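
A sketch of that workaround, assuming the model is available locally; the model name and remote namespace below are placeholders:

```sh
# Dump the current Modelfile (model name is a placeholder).
ollama show llama3.2:7b --modelfile > Modelfile

# In Modelfile, delete any line like:
#   PARAMETER tfs_z 1.0

# Rebuild the model from the edited Modelfile.
ollama create llama3.2:7b -f Modelfile

# Update the remote copy (namespace is a placeholder).
ollama push <your-namespace>/llama3.2:7b
```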
