
Bump TGI version to v3.0.0 #135

Merged · 6 commits · Jan 6, 2025
Conversation

tengomucho
Collaborator

What does this PR do?

This bumps the TGI router version to v3.0.0, applying the fixes for the issues investigated in huggingface/optimum-neuron#748.

- The Dockerfile has a default value; it is easier to maintain only that.
- Update to TGI 3.0.0, using a simplified Cargo.toml. This is based on the work done in optimum-neuron: huggingface/optimum-neuron#748
- Starting from TGI 2.4.1, the evaluation of the default value for max_batch_prefill_tokens in the TGI launcher has changed: on TPU it now defaults to 4096, whereas it was previously set to max_batch_size * max_input_tokens. This is now fixed in the entrypoint, pending a fix in the launcher.
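The entrypoint workaround described above can be sketched as follows. This is an illustrative sketch, not the actual entrypoint code; the default values here are assumptions:

```shell
#!/bin/bash
# Restore the pre-2.4.1 behavior: if MAX_BATCH_PREFILL_TOKENS is not set,
# derive it from the batch size and input length instead of letting the
# launcher fall back to its fixed default of 4096.
MAX_BATCH_SIZE="${MAX_BATCH_SIZE:-4}"
MAX_INPUT_TOKENS="${MAX_INPUT_TOKENS:-1024}"
if [[ -z "${MAX_BATCH_PREFILL_TOKENS}" ]]; then
  MAX_BATCH_PREFILL_TOKENS=$(( MAX_BATCH_SIZE * MAX_INPUT_TOKENS ))
fi
export MAX_BATCH_PREFILL_TOKENS
echo "MAX_BATCH_PREFILL_TOKENS=${MAX_BATCH_PREFILL_TOKENS}"
```

With the example defaults above, the computed value (4 * 1024 = 4096) happens to match the launcher default, but the derivation scales correctly when the batch size or input length changes.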
@tengomucho tengomucho marked this pull request as ready for review January 5, 2025 20:54
baptistecolle (Collaborator) left a comment


LGTM except for a small nit

Comment on lines 16 to 19
if [[ -z "${MAX_INPUT_TOKENS}" && -n ${MAX_INPUT_LENGTH} ]]; then
MAX_INPUT_TOKENS=${MAX_INPUT_LENGTH}
unset MAX_INPUT_LENGTH
fi

Suggested change
-if [[ -z "${MAX_INPUT_TOKENS}" && -n ${MAX_INPUT_LENGTH} ]]; then
-    MAX_INPUT_TOKENS=${MAX_INPUT_LENGTH}
-    unset MAX_INPUT_LENGTH
-fi
+if [[ -z "${MAX_INPUT_TOKENS}" && -n ${MAX_INPUT_LENGTH} ]]; then
+    MAX_INPUT_TOKENS=${MAX_INPUT_LENGTH}
+fi
+unset MAX_INPUT_LENGTH

Maybe we should unset MAX_INPUT_LENGTH anyway, to prevent any downstream use of it, since it is deprecated.
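A quick run of the suggested logic shows why the unconditional unset matters: the fallback still happens when only the deprecated variable is set, and MAX_INPUT_LENGTH is gone afterwards either way (the value 512 below is a hypothetical example, not from the PR):

```shell
#!/bin/bash
# Hypothetical starting state: only the deprecated variable is set.
unset MAX_INPUT_TOKENS
MAX_INPUT_LENGTH=512

# The suggested change: fall back if needed, then unset unconditionally.
if [[ -z "${MAX_INPUT_TOKENS}" && -n "${MAX_INPUT_LENGTH}" ]]; then
  MAX_INPUT_TOKENS=${MAX_INPUT_LENGTH}
fi
unset MAX_INPUT_LENGTH

echo "MAX_INPUT_TOKENS=${MAX_INPUT_TOKENS}"
echo "MAX_INPUT_LENGTH=${MAX_INPUT_LENGTH:-unset}"
```

If MAX_INPUT_TOKENS had already been set, the fallback branch would be skipped, but the deprecated variable would still be removed from the environment.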

@tengomucho tengomucho merged commit 20772b8 into main Jan 6, 2025
3 checks passed
2 participants