Commit

WIP dont use flash attention
VeraChristina committed Feb 21, 2025
1 parent 924b1fa commit c0b206b
Showing 1 changed file with 3 additions and 1 deletion.
4 changes (3 additions, 1 deletion) in tests/basic_config.yaml:

```diff
@@ -8,7 +8,7 @@ defaults:
   - training: default
   - _self_
 
-
+no_validation: True
 # diagnostics:
 # plot:
 # callbacks: []
@@ -30,6 +30,8 @@ hardware:
 
 model:
   num_channels: 16
+processor:
+  attention_implementation: scaled_dot_product_attention
 
 dataloader:
   limit_batches:
```
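The added `attention_implementation: scaled_dot_product_attention` key switches the processor away from flash attention to plain scaled dot-product attention. As a rough illustration of what that computation is (a hypothetical pure-Python sketch, not the repository's implementation, which would typically call an optimized kernel such as PyTorch's `torch.nn.functional.scaled_dot_product_attention`):

```python
import math

def scaled_dot_product_attention(q, k, v):
    """softmax(q @ k^T / sqrt(d)) @ v on nested lists (seq_len x d)."""
    d = len(q[0])
    # attention scores: q @ k^T, scaled by 1/sqrt(d)
    scores = [[sum(qi * ki for qi, ki in zip(qrow, krow)) / math.sqrt(d)
               for krow in k] for qrow in q]
    # numerically stable row-wise softmax over the key dimension
    weights = []
    for row in scores:
        m = max(row)
        exps = [math.exp(s - m) for s in row]
        z = sum(exps)
        weights.append([e / z for e in exps])
    # output: attention-weighted sum of the value rows
    return [[sum(w * vrow[j] for w, vrow in zip(wrow, v))
             for j in range(len(v[0]))] for wrow in weights]

q = k = v = [[1.0, 0.0], [0.0, 1.0]]
print(scaled_dot_product_attention(q, k, v))
```

Flash attention computes the same function, just tiled to avoid materializing the full score matrix; opting out of it, as this WIP commit does, changes only the kernel, not the result.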
