tt-inference-server/vllm-llama3-src-dev-ubuntu-22.04-amd64 v0.0.1-47fb1a2fb6e0-2f33504bad49 (Public · Latest)

Install from the command line
$ docker pull ghcr.io/tenstorrent/tt-inference-server/vllm-llama3-src-dev-ubuntu-22.04-amd64:v0.0.1-47fb1a2fb6e0-2f33504bad49
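
A minimal sketch of starting the pulled image, assuming the container serves vLLM's OpenAI-compatible API on port 8000 and needs access to a Tenstorrent device node; the device path, port mapping, and interactive flags are illustrative assumptions, not documented settings for this package.

$ # Assumed invocation; adjust the device path and port to the actual image configuration
$ docker run --rm -it \
    --device /dev/tenstorrent \
    -p 8000:8000 \
    ghcr.io/tenstorrent/tt-inference-server/vllm-llama3-src-dev-ubuntu-22.04-amd64:v0.0.1-47fb1a2fb6e0-2f33504bad49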

Recent tagged image versions

  • v0.0.1-47fb1a2fb6e0-2f33504bad49 · published about 1 month ago
    Digest: sha256:4febe250874c6a16ab925ef3995c489f9a03d510be768826f5eb47944f62af27
    5 version downloads


Details

Last published: 1 month ago
Issues: 34
Total downloads: 7