Add documentation to the repository (#13)
* Initial commit for the documentation

* Initial doc-build workflow

* Initial how to deploy instance

* Add the gcloud alpha requirement

* Address general comments

* Add PR doc workflows

* Fix workflow

* Add _toctree

* Fix docs/source/howto/overview

---------

Co-authored-by: regisss <15324346+regisss@users.noreply.github.com>
mfuntowicz and regisss authored Apr 8, 2024
1 parent 7b48145 commit f92066c
Showing 10 changed files with 320 additions and 1 deletion.
57 changes: 57 additions & 0 deletions .github/workflows/doc-build.yml
@@ -0,0 +1,57 @@
name: Build documentation

on:
push:
branches:
- main
tags:
- 'v[0-9]+.[0-9]+.[0-9]+'

paths:
- 'docs/source/**'
- 'docs/assets/**'
- 'optimum/**'
- '.github/workflows/doc-build.yml'
workflow_dispatch:

jobs:
build_documentation:
runs-on: ubuntu-latest
env:
COMMIT_SHA: ${{ github.event.pull_request.head.sha }}
PR_NUMBER: ${{ github.event.number }}
EVENT_CONTEXT: ${{ toJSON(github.event) }}
PR_CLONE_URL: ${{ github.event.pull_request.head.repo.clone_url }}

steps:
- uses: actions/checkout@v4
- uses: actions/setup-node@v4
with:
node-version: '18'
cache-dependency-path: "kit/package-lock.json"

- name: Set environment variables
run: |
cd optimum
version=$(grep '^__version__ =' tpu/version.py | cut -d '=' -f 2- | xargs)
if [[ $version == *.dev0 ]]
then
echo "VERSION=main" >> $GITHUB_ENV
else
echo "VERSION=v$version" >> $GITHUB_ENV
fi
cd ..
- name: Setup environment
run: |
pip install ".[quality]"
- name: Make documentation
shell: bash
run: |
doc-builder build optimum.tpu docs/source/ --repo_name optimum-tpu --build_dir tpu-doc-build/ --version ${{ env.VERSION }} --version_tag_suffix "" --html --clean
cd tpu-doc-build/
mv optimum.tpu optimum-tpu
doc-builder push optimum-tpu --doc_build_repo_id "hf-doc-build/doc-build" --token "${{ secrets.HF_DOC_BUILD_PUSH }}" --commit_msg "Updated with commit $COMMIT_SHA See: https://github.com/huggingface/optimum-tpu/commit/$COMMIT_SHA" --n_retries 5
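The "Set environment variables" step above maps a `.dev0` development version to the `main` documentation channel and any release version to a `v<version>` tag. A minimal Python sketch of the same mapping (the helper name is hypothetical, for illustration only):

```python
def docs_version_channel(version: str) -> str:
    """Mirror the workflow's shell logic: dev builds publish to 'main',
    release builds publish under a 'v<version>' tag."""
    if version.endswith(".dev0"):
        return "main"
    return f"v{version}"
```

For example, `0.1.0.dev0` resolves to `main` while `0.1.0` resolves to `v0.1.0`, matching the `VERSION` value the workflow writes to `$GITHUB_ENV`.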
52 changes: 52 additions & 0 deletions .github/workflows/doc-pr-build.yml
@@ -0,0 +1,52 @@
name: Build PR Documentation

on:
pull_request:
branches: [ main ]
paths:
- 'docs/source/**'
- 'docs/assets/**'
- 'optimum/**'
- '.github/workflows/doc-pr-build.yml'

concurrency:
group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}
cancel-in-progress: true

jobs:
build_documentation:
runs-on: ubuntu-latest
env:
COMMIT_SHA: ${{ github.event.pull_request.head.sha }}
PR_NUMBER: ${{ github.event.number }}
EVENT_CONTEXT: ${{ toJSON(github.event) }}
PR_CLONE_URL: ${{ github.event.pull_request.head.repo.clone_url }}

steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: '18'
cache-dependency-path: "kit/package-lock.json"

- name: Setup environment
run: |
pip install -U pip
pip install ".[quality]"
- name: Make documentation
shell: bash
run: |
doc-builder build optimum.tpu docs/source/ --repo_name optimum-tpu --build_dir tpu-doc-build/ --version pr_${{ env.PR_NUMBER }} --version_tag_suffix "" --html --clean
- name: Save commit_sha & pr_number
run: |
cd tpu-doc-build/
mv optimum.tpu optimum-tpu
echo ${{ env.COMMIT_SHA }} > ./commit_sha
echo ${{ env.PR_NUMBER }} > ./pr_number
- uses: actions/upload-artifact@v3
with:
name: doc-build-artifact
path: tpu-doc-build/
16 changes: 16 additions & 0 deletions .github/workflows/upload_pr_documentation.yml
@@ -0,0 +1,16 @@
name: Upload PR Documentation

on:
workflow_run:
workflows: ["Build PR Documentation"]
types:
- completed

jobs:
build:
uses: huggingface/doc-builder/.github/workflows/upload_pr_documentation.yml@main
with:
package_name: optimum-tpu
secrets:
hf_token: ${{ secrets.HF_DOC_BUILD_PUSH }}
comment_bot_token: ${{ secrets.COMMENT_BOT_TOKEN }}
17 changes: 17 additions & 0 deletions docs/source/_toctree.yml
@@ -0,0 +1,17 @@
- sections:
- local: index
title: 🤗 Optimum-TPU
- sections:
- local: tutorials/overview
title: Overview
title: Tutorials
- sections:
- local: howto/overview
title: Overview
- local: howto/deploy
title: Deploying a Google Cloud TPU instance
- local: howto/serving
title: Deploying a TGI server on a Google Cloud TPU instance
title: How-To Guides
title: Optimum-TPU
isExpanded: true
82 changes: 82 additions & 0 deletions docs/source/howto/deploy.mdx
@@ -0,0 +1,82 @@
# Deploying a Google TPU instance on Google Cloud Platform (GCP)


## Context

We assume the reader has already created a Google Cloud Platform (GCP) user or organisation account and an
associated project.

We also assume the Google Cloud CLI to be installed. If it is not, please follow these links to
[install](https://cloud.google.com/sdk/docs/install) and [set it up](https://cloud.google.com/sdk/docs/initializing).

## Creating the initial TPUVM on GCP

In order to create your initial TPU instance, you will need to provide some information:

- The GCP zone in which you would like the instance to be deployed (close to you for development, close to your end users for production)
- The kind of TPU you would like to target
- The version of the TPU runtime to run on the instance
- A custom instance name to quickly identify and refer back to the instance

Overall, the final command looks like this:

```bash
gcloud compute tpus tpu-vm create <ref_instance_name> \
--zone=<deployment_zone> \
--accelerator-type=<target_tpu_generation> \
--version=<runtime_version>
```
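For scripted or repeatable deployments, the same command can be assembled from the four parameters listed above. A hedged Python sketch (the helper name is hypothetical, not part of any SDK):

```python
def build_tpu_create_command(name: str, zone: str,
                             accelerator_type: str,
                             runtime_version: str) -> list:
    """Assemble the gcloud argument list for creating a TPU VM,
    e.g. to pass to subprocess.run()."""
    return [
        "gcloud", "compute", "tpus", "tpu-vm", "create", name,
        f"--zone={zone}",
        f"--accelerator-type={accelerator_type}",
        f"--version={runtime_version}",
    ]
```

Building the command as an argument list rather than a single string avoids shell-quoting issues when invoking it programmatically.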

### Deploying a TPU v5litepod-8 instance

In our case, we will deploy a `v5litepod-8` instance named `optimum-tpu-get-started`
in the GCP zone `us-west4-a`, using the latest `v2-alpha-tpuv5-lite` runtime version.

Of course, feel free to adjust all these parameters to match your usage and quotas.

Before creating the instance, please make sure to install the `gcloud` alpha component, as it is required to
target TPU v5 VMs: `gcloud components install alpha`

```bash
gcloud alpha compute tpus tpu-vm create optimum-tpu-get-started \
--zone=us-west4-a \
--accelerator-type=v5litepod-8 \
--version=v2-alpha-tpuv5-lite
```

## Connecting to the instance

```bash
gcloud compute tpus tpu-vm ssh <ref_instance_name> --zone=<deployment_zone>
$ >
```

For the `v5litepod-8` instance deployed above, this would look like:

```bash
gcloud compute tpus tpu-vm ssh optimum-tpu-get-started --zone=us-west4-a
$ >
```

## Setting up the instance to run AI workloads on TPUs

### Optimum-TPU with PyTorch/XLA

If you want to leverage PyTorch/XLA through Optimum-TPU, it is as simple as:

```bash
$ python3 -m pip install optimum-tpu
$ export PJRT_DEVICE=TPU
```
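PyTorch/XLA reads the `PJRT_DEVICE` variable to select its runtime backend. A small sketch (the helper is hypothetical, not part of optimum-tpu) of how such a selection could be validated before launching a workload:

```python
import os

# Common PJRT backends; an assumption for this sketch.
VALID_PJRT_DEVICES = {"TPU", "GPU", "CPU"}

def resolve_pjrt_device(default: str = "CPU") -> str:
    """Return the PJRT device requested via the environment,
    falling back to `default` when PJRT_DEVICE is unset."""
    device = os.environ.get("PJRT_DEVICE", default).upper()
    if device not in VALID_PJRT_DEVICES:
        raise ValueError(f"Unsupported PJRT_DEVICE: {device!r}")
    return device
```

Failing fast on an unrecognized value gives a clearer error than letting the runtime silently fall back to CPU.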

Now you can validate the installation with the following command, which should print `xla:0`, the default
TPU device bound to this instance.

```bash
$ python -c "import torch_xla.core.xla_model as xm; print(xm.xla_device())"
xla:0
```

### Optimum-TPU with JAX

JAX is coming very soon - stay tuned!
26 changes: 26 additions & 0 deletions docs/source/howto/overview.mdx
@@ -0,0 +1,26 @@
# Optimum-TPU How To

This page gives you access to a handful of walkthrough scenarios for leveraging Google TPUs for your use case.

## What are you looking for?

<div class="mt-10">
<div class="w-full flex flex-col space-y-4 md:space-y-0 md:grid md:grid-cols-2 md:gap-y-4 md:gap-x-5">
<a class="!no-underline border dark:border-gray-700 p-5 rounded-lg shadow hover:shadow-lg" href="./deploy">
<div class="w-full text-center bg-gradient-to-r from-slate-900 to-slate-700 rounded-lg py-1.5 font-semibold mb-5 text-white text-lg leading-relaxed">
Deploying a Google Cloud TPU instance
</div>
<p class="text-gray-700">

</p>
</a>
<a class="!no-underline border dark:border-gray-700 p-5 rounded-lg shadow hover:shadow-lg" href="./serving">
<div class="w-full text-center bg-gradient-to-r from-slate-900 to-slate-700 rounded-lg py-1.5 font-semibold mb-5 text-white text-lg leading-relaxed">
Deploying a Text-Generation-Inference server on a Google Cloud TPU instance
</div>
<p class="text-gray-700">

</p>
</a>
</div>
</div>
Empty file added docs/source/howto/serving.mdx
69 changes: 69 additions & 0 deletions docs/source/index.mdx
@@ -0,0 +1,69 @@
<!---
Copyright 2024 The HuggingFace Team. All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->

# 🤗 Optimum TPU

Optimum TPU provides all the necessary machinery to leverage and optimize AI workloads running on [Google Cloud TPU devices](https://cloud.google.com/tpu/docs).

The API provides the same overall user experience as Hugging Face Transformers, with the minimal set of changes required to reach target performance for inference.

Training support is underway, stay tuned! 🚀


## Installation

Optimum TPU is meant to reduce as much as possible the friction of leveraging Google Cloud TPU accelerators.
As such, we provide a pip-installable package to make sure everyone can get started easily.

### Run Cloud TPU with pip
```bash
pip install optimum-tpu
```

### Run Cloud TPU within Docker container

### PyTorch
```bash
export TPUVM_IMAGE_URL=us-central1-docker.pkg.dev/tpu-pytorch-releases/docker/xla
export TPUVM_IMAGE_VERSION=8f1dcd5b03f993e4da5c20d17c77aff6a5f22d5455f8eb042d2e4b16ac460526
docker pull ${TPUVM_IMAGE_URL}@sha256:${TPUVM_IMAGE_VERSION}
docker run -ti --rm --privileged --network=host ${TPUVM_IMAGE_URL}@sha256:${TPUVM_IMAGE_VERSION} bash
```

From there you can install optimum-tpu through the pip instructions above.


<div class="mt-10">
<div class="w-full flex flex-col space-y-4 md:space-y-0 md:grid md:grid-cols-2 md:gap-y-4 md:gap-x-5">
<a class="!no-underline border dark:border-gray-700 p-5 rounded-lg shadow hover:shadow-lg" href="./howto/overview">
<div class="w-full text-center bg-gradient-to-br from-red-500 to-red-800 rounded-lg py-1.5 font-semibold mb-5 text-white text-lg leading-relaxed">
How-to guides
</div>
<p class="text-gray-700">

</p>
</a>
<a
class="!no-underline border dark:border-gray-700 p-5 rounded-lg shadow hover:shadow-lg"
href="./package_reference/trainer"
>
<div class="w-full text-center bg-gradient-to-br from-green-500 to-green-800 rounded-lg py-1.5 font-semibold mb-5 text-white text-lg leading-relaxed">
Reference
</div>
<p class="text-gray-700">Technical descriptions of how the classes and methods of Optimum TPU</p>
</a>
</div>
</div>
Empty file added docs/source/tutorials/overview.mdx
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -50,7 +50,7 @@ dependencies = [

[project.optional-dependencies]
tests = ["pytest", "safetensors"]
quality = ["black", "ruff", "isort",]
quality = ["black", "ruff", "isort", "hf_doc_builder @ git+https://github.com/huggingface/doc-builder.git"]

[project.urls]
Homepage = "https://hf.co/hardware"
