Batch Normalization support #10
Vries et al. 2017 condition a network by hypernetizing only the BatchNorm layers. Presently, specifying a BatchNorm2d causes

ValueError: Fan in and fan out can not be computed for tensor with fewer than 2 dimensions

at hyperlight/hyperlight/hypernet/initialization.py, line 45 in a3e2108.

Then, if I set init_independent_weights=False, I get

AttributeError: Uninitialized External Parameter, please set the value first. Did you mean: '_data'?
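As a side note, the ValueError can be reproduced with PyTorch's fan-based initializers alone, independent of hyperlight: BatchNorm weights and biases are one-dimensional (one value per channel), and fan-in/fan-out cannot be computed for 1-D tensors. A minimal repro:

```python
import torch
import torch.nn as nn

w = torch.empty(16)          # same shape as nn.BatchNorm2d(16).weight: (16,)
nn.init.kaiming_uniform_(w)  # raises the ValueError quoted above
```

For context on the requested feature, here is a plain-PyTorch sketch of the Vries et al. 2017 scheme (conditional batch normalization), in which a tiny hypernetwork predicts only the per-channel scale and shift of a BatchNorm layer. The class and names are illustrative, not hyperlight's API:

```python
import torch
import torch.nn as nn

class ConditionalBatchNorm2d(nn.Module):
    """Sketch: a small hypernetwork predicts the per-channel scale and shift
    of a BatchNorm layer from a conditioning vector (Vries et al. 2017)."""

    def __init__(self, num_features: int, cond_dim: int):
        super().__init__()
        # affine=False: gamma/beta come from the hypernetwork instead.
        self.bn = nn.BatchNorm2d(num_features, affine=False)
        self.delta_gamma = nn.Linear(cond_dim, num_features)
        self.delta_beta = nn.Linear(cond_dim, num_features)

    def forward(self, x: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W), cond: (B, cond_dim)
        out = self.bn(x)
        dg = self.delta_gamma(cond).unsqueeze(-1).unsqueeze(-1)  # (B, C, 1, 1)
        db = self.delta_beta(cond).unsqueeze(-1).unsqueeze(-1)
        return (1.0 + dg) * out + db

cbn = ConditionalBatchNorm2d(num_features=16, cond_dim=8)
y = cbn(torch.randn(4, 16, 32, 32), torch.randn(4, 8))  # -> (4, 16, 32, 32)
```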
I did manage to get it working after fixing a mistake I made where I was passing a whole batch of inputs to the hypernetwork, then directly passing the output to the …
I think I would like a comment in the readme about needing to set …
@Richienb Thank you for your response. Sorry to bother you again; I did not fully understand your response. Here is what I am trying: I use a hypernet over a CNN that uses batch norm, and even after setting init_independent_weights=False, I get a shape-mismatch error when doing a forward pass with batch size > 1 on HyperConvNet. I am still not sure what is going wrong here. Can you help me out?

class HyperConvNet(nn.Module):
    …
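For illustration only (this is not the user's actual HyperConvNet; the sizes and the single-linear-layer hypernetwork are made up), a plain-PyTorch sketch of how feeding a whole batch to a hypernetwork yields weight tensors with an extra leading batch dimension, which no longer match what a single layer expects:

```python
import torch
import torch.nn as nn

hypernet = nn.Linear(8, 16 * 3 * 3 * 3)  # predicts weights for one Conv2d(3, 16, 3)

cond = torch.randn(4, 8)     # a BATCH of 4 conditioning vectors
w = hypernet(cond)           # shape (4, 432): one weight set PER sample
w = w.view(-1, 16, 3, 3, 3)  # (4, 16, 3, 3, 3); Conv2d weights must be (16, 3, 3, 3)
# Loading this into a single conv layer fails as soon as the batch size is > 1,
# which matches the shape-mismatch error described above.
```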
@paramrajpura Are you sure that in the input to …
It's a batch. I wanted to do a batch forward pass. Is there a way to do that?
I have been iterating over each item in the batch separately and combining the results at the end. I'm not sure if there is a way to do it in parallel, but the approach I just described does work.
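A minimal sketch of that per-sample loop, assuming two hypothetical helpers that stand in for the user's setup: predict_weights(cond) runs the hypernetwork for one conditioning vector, and forward_with_weights(x, weights) runs the main network with a given weight set (neither is hyperlight API):

```python
import torch

def batched_forward(x, cond, predict_weights, forward_with_weights):
    # x: (B, ...) main-network inputs; cond: (B, ...) hypernetwork inputs.
    outputs = []
    for xi, ci in zip(x, cond):
        weights = predict_weights(ci.unsqueeze(0))   # weights for ONE sample
        outputs.append(forward_with_weights(xi.unsqueeze(0), weights))
    return torch.cat(outputs, dim=0)                 # recombine into one batch
```

In recent PyTorch versions, torch.func.functional_call combined with torch.vmap can evaluate per-sample weight sets in parallel, which may remove the loop, though I have not checked how it interacts with BatchNorm running statistics or with this library.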
Yes, I also do that with a loop, but it gets too time-consuming for training and testing on a large dataset. Thank you so much @Richienb for the discussion!