This repository was archived by the owner on Feb 15, 2025. It is now read-only.

How do the outputs change? #23

Open
yarinbar opened this issue Dec 6, 2023 · 1 comment

Comments


yarinbar commented Dec 6, 2023

According to the code (and assuming that STEPS = 1), I don't understand how the outputs change after the adaptation:

def forward(self, x):
    if self.episodic:
        self.reset()

    for _ in range(self.steps):
        outputs = forward_and_adapt(x, self.model, self.optimizer)
    return outputs

@torch.enable_grad()  # ensure grads in possible no grad context for testing
def forward_and_adapt(x, model, optimizer):
    """Forward and adapt model on batch of data.

    Measure entropy of the model prediction, take gradients, and update params.
    """
    # forward
    outputs = model(x)
    # adapt
    loss = softmax_entropy(outputs).mean(0)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return outputs

Judging by the code, you return the original (pre-update) outputs, yet they do change somehow. How?

@NneurotransmitterR

I think in online TTA, a batch of samples won't be fed into the network again for another forward pass, so the updated model only affects the next batch. But in some other scenarios the current batch will get a new output.
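This can be seen with a small sketch (hypothetical toy setup, not the repo's code: a linear layer stands in for the adapted network, and `softmax_entropy` is re-implemented from its usual definition). With `steps = 1`, the returned `outputs` come from the forward pass done *before* `optimizer.step()`, so they are pre-update predictions; only a later forward pass, whether the next batch or a second loop iteration, sees the adapted weights:

```python
import torch
import torch.nn as nn

def softmax_entropy(x: torch.Tensor) -> torch.Tensor:
    # Per-sample entropy of the softmax prediction.
    return -(x.softmax(1) * x.log_softmax(1)).sum(1)

torch.manual_seed(0)
model = nn.Linear(4, 3)  # toy stand-in for the real network
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
x = torch.randn(8, 4)

# One adaptation step, as in forward_and_adapt:
outputs = model(x)                       # forward BEFORE the update
loss = softmax_entropy(outputs).mean(0)
loss.backward()
optimizer.step()
optimizer.zero_grad()

# The returned `outputs` are unchanged by the step; only a new
# forward pass (next batch, or steps > 1) uses the updated weights.
outputs_after = model(x)
print(torch.allclose(outputs, outputs_after))  # False: weights changed
```

So with `steps > 1`, the second iteration of the loop calls `forward_and_adapt` on the already-updated model, and the final `outputs` returned to the caller do reflect adaptation for the current batch.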
