Hard attention gets very low accuracy on the validation data, even after 10 epochs.
Is it just used for drawing the heat map?
Why is accuracy affected in this case compared to my soft attention?
From the papers I learned that soft attention attends to multiple objects, whereas hard attention focuses on one. To cross-check this, I gave my model an image containing two digits.
Soft attention works well, attending to both digits.
With hard attention I expected it to attend to just one of the digits, but the result is the same as with soft attention, and the prediction is affected.
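For context, the mechanical difference between the two can be sketched as follows. This is a minimal NumPy illustration, not code from this repo's model: the feature map, scores, and dimensions are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature map: 4 spatial locations, 3-dim features each.
features = rng.normal(size=(4, 3))
scores = rng.normal(size=4)  # unnormalized attention scores

# Softmax over locations -> attention weights.
weights = np.exp(scores - scores.max())
weights /= weights.sum()

# Soft attention: deterministic weighted average over ALL locations,
# so both digits can contribute to the context vector.
soft_context = weights @ features

# Hard attention: sample ONE location from the weight distribution,
# so only a single location's features reach the classifier.
idx = rng.choice(len(weights), p=weights)
hard_context = features[idx]

print(soft_context.shape, hard_context.shape)  # both are (3,)
```

Because hard attention samples a discrete location, it is not differentiable and is usually trained with REINFORCE-style estimators, which are high-variance; that alone can explain slower convergence and lower validation accuracy after only 10 epochs compared to the fully differentiable soft version.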