Actions: NVIDIA/TransformerEngine

Blossom-CI

217 workflow run results

[JAX] Custom Op Workspace Tensors from XLA Buffers
Blossom-CI #1912: Issue comment #532 (comment) created by denera
December 19, 2023 23:08 4s
[PYTORCH::FP8] FP8 significantly slow down when scaling up to 1000+ GPUs
Blossom-CI #1911: Issue comment #556 (comment) created by Ageliss
December 19, 2023 12:07 5s
Provide pre-computed max sequence to remove unnecessary kernels and D2H copies
Blossom-CI #1910: Issue comment #555 (comment) created by timmoon10
December 19, 2023 01:58 5s
[PyTorch] Reduce size of sanity tests
Blossom-CI #1909: Issue comment #510 (comment) created by timmoon10
December 18, 2023 22:29 4s
Use unoptimized layernorm kernel if pointers are not aligned
Blossom-CI #1908: Issue comment #490 (comment) created by timmoon10
December 18, 2023 22:26 6s
NVRTC kernels for cast-transpose
Blossom-CI #1907: Issue comment #258 (comment) created by timmoon10
December 18, 2023 22:25 4s
Avoid redundant computation for cu_seqlens
Blossom-CI #1906: Issue comment #535 (comment) created by timmoon10
December 18, 2023 22:24 4s
[PYTORCH::FP8] FP8 significantly slow down when scaling up to 1000+ GPUs
Blossom-CI #1905: Issue comment #556 (comment) created by ptrendx
December 18, 2023 19:09 5s
Assert error in JAX layernorm_bwd when using dtype=jnp.bfloat16
Blossom-CI #1904: Issue comment #569 (comment) created by crclark
December 18, 2023 18:15 5s
Export to ONNX fails
Blossom-CI #1903: Issue comment #528 (comment) created by jbcdnr
December 18, 2023 13:39 5s
[PyTorch] Linear and LayerNormLinear bug fix for excess weight and bias buffers
Blossom-CI #1902: Issue comment #570 (comment) created by denera
December 17, 2023 17:34 3s
Assert error in JAX layernorm_bwd when using dtype=jnp.bfloat16
Blossom-CI #1901: Issue comment #569 (comment) created by zlsh80826
December 16, 2023 07:05 5s
[PyTorch] Add sliding window support to FlashAttention
Blossom-CI #1900: Issue comment #551 (comment) created by cyanguwa
December 15, 2023 23:55 4s
[PyTorch] Linear and LayerNormLinear bug fix for excess weight and bias buffers
Blossom-CI #1899: Issue comment #570 (comment) created by denera
December 15, 2023 23:46 4s
[PyTorch] Add sliding window support to FlashAttention
Blossom-CI #1898: Issue comment #551 (comment) created by cyanguwa
December 15, 2023 23:14 4s
Update fp8_meta amax when copying into Float8Tensor
Blossom-CI #1897: Issue comment #567 (comment) created by timmoon10
December 15, 2023 22:53 3s
[PyTorch] Add sliding window support to FlashAttention
Blossom-CI #1896: Issue comment #551 (comment) created by cyanguwa
December 15, 2023 22:43 4s
[JAX] Custom Op Workspace Tensors from XLA Buffers
Blossom-CI #1895: Issue comment #532 (comment) created by denera
December 15, 2023 21:57 5s
Question about building the wheel for transformer-engine
Blossom-CI #1894: Issue comment #516 (comment) created by osainz59
December 15, 2023 21:28 5s
[PyTorch] Add sliding window support to FlashAttention
Blossom-CI #1893: Issue comment #551 (comment) created by cyanguwa
December 15, 2023 20:32 3s
Difficult to understand why "no fused attention kernel is available"
Blossom-CI #1892: Issue comment #568 (comment) created by ptrendx
December 15, 2023 19:48 4s
Assert error in JAX layernorm_bwd when using dtype=jnp.bfloat16
Blossom-CI #1891: Issue comment #569 (comment) created by ptrendx
December 15, 2023 19:45 4s
Update fp8_meta amax when copying into Float8Tensor
Blossom-CI #1890: Issue comment #567 (comment) created by timmoon10
December 15, 2023 03:04 4s
Disable dynamo for Fused Attention
Blossom-CI #1889: Issue comment #558 (comment) created by timmoon10
December 15, 2023 03:02 5s
[JAX] Fix failure on pattern matching of FP8 GEMM when enabling FSDP.
Blossom-CI #1888: Issue comment #547 (comment) created by mingxu1067
December 15, 2023 01:40 4s