Commit
WIP test ci
tengomucho committed Sep 11, 2024
1 parent 10f857c commit 51ec0e3
Showing 1 changed file with 4 additions and 11 deletions.
.github/workflows/test-pytorch-xla-tpu-tgi.yml (15 changes: 4 additions & 11 deletions)
@@ -2,7 +2,7 @@ name: Optimum TPU / Test TGI on TPU

on:
  push:
    branches: [ main ]
    branches: [ quick-ci-test ]
    paths:
      - "text-generation-inference/**"
  pull_request:
@@ -30,14 +30,7 @@ jobs:

      - name: Build and test TGI server
        run: |
          HF_TOKEN=${{ secrets.HF_TOKEN_OPTIMUM_TPU_CI }} make tgi_test
          make test_installs tgi_server
          find text-generation-inference/ -name "text_generation_server-*whl" -exec python -m pip install {} \;
          HF_TOKEN=${{ secrets.HF_TOKEN_OPTIMUM_TPU_CI }} python -m pytest -sv text-generation-inference/tests -k gemma-2b
      # Use a different step to test the Jetstream Pytorch version, to avoid conflicts with torch-xla[tpu]
      - name: Install and test TGI server (Jetstream Pytorch)
        run: |
          pip install -U .[jetstream-pt] \
            -f https://storage.googleapis.com/jax-releases/jax_nightly_releases.html \
            -f https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html \
            -f https://storage.googleapis.com/libtpu-releases/index.html
          JETSTREAM_PT=1 HF_TOKEN=${{ secrets.HF_TOKEN_OPTIMUM_TPU_CI }} python -m \
            pytest -sv text-generation-inference/tests -k jetstream
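
The in-file comment in the second hunk records the design choice behind the workflow layout: the Jetstream Pytorch flavour of the TGI tests gets its own step so that its jetstream-pt install does not conflict with the torch-xla[tpu] dependencies used by the regular tests. Below is a minimal sketch of that two-step layout, assembled only from commands visible in the diff above; the job id, runner label, and checkout step are assumptions, and this is not the exact content of the committed file.

# Sketch only: reconstructed from commands shown in the diff, not the committed file.
jobs:
  tgi-tests:                          # placeholder job id
    runs-on: self-hosted              # placeholder; the real workflow targets a TPU runner
    steps:
      - uses: actions/checkout@v4     # assumed; the repo must be checked out for make/pip
      - name: Build and test TGI server
        run: |
          # Build the TGI server and run the torch-xla[tpu]-backed tests via the Makefile target.
          HF_TOKEN=${{ secrets.HF_TOKEN_OPTIMUM_TPU_CI }} make tgi_test
      # Use a different step to test the Jetstream Pytorch version, to avoid conflicts with torch-xla[tpu]
      - name: Install and test TGI server (Jetstream Pytorch)
        run: |
          pip install -U .[jetstream-pt] \
            -f https://storage.googleapis.com/jax-releases/jax_nightly_releases.html \
            -f https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html \
            -f https://storage.googleapis.com/libtpu-releases/index.html
          JETSTREAM_PT=1 HF_TOKEN=${{ secrets.HF_TOKEN_OPTIMUM_TPU_CI }} python -m \
            pytest -sv text-generation-inference/tests -k jetstream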
