
use case: transfer learning / finetuning #157

Open
gfrogat opened this issue May 17, 2018 · 3 comments


gfrogat commented May 17, 2018

Most people don't train models from scratch but fine-tune a pretrained model instead.

The resulting training dynamics of the model might therefore be different. On a separate note: it is not completely clear whether the observed oscillations are due to changes in the convnet filters or in the fully connected parts of the network.
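To make the distinction concrete: in a typical fine-tuning setup the pretrained convolutional filters are frozen and only the fully connected head is updated, which would isolate where the oscillations can originate. A minimal PyTorch sketch (the tiny stand-in model here is hypothetical, not the one discussed in this issue):

```python
import torch
import torch.nn as nn

# Tiny stand-in for a pretrained convnet: conv "backbone" + fully connected "head".
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),  # conv filters (to be frozen)
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 10),                           # fully connected head (fine-tuned)
)

# Freeze the convolutional part so only the head receives gradient updates.
for module in model[:3]:
    for p in module.parameters():
        p.requires_grad = False

trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=1e-3)

# One dummy fine-tuning step on a random batch.
x = torch.randn(4, 3, 32, 32)
loss = nn.functional.cross_entropy(model(x), torch.randint(0, 10, (4,)))
loss.backward()
optimizer.step()
```

Under this setup, any oscillation in the loss can only come from the head's parameters, since the frozen filters never accumulate gradients.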

@NMO13 NMO13 added the paper label May 22, 2018

gfrogat commented May 22, 2018

In this context it might be interesting to compare learning / fine-tuning an image classifier on summer vs. winter images.


thinkh commented Aug 29, 2018

We would need a dataset with summer and winter images for this use case.


gfrogat commented Sep 5, 2018

We can illustrate similar effects by experimenting on different random folds, i.e. via cross-validation.
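The random-fold idea can be sketched with scikit-learn's `KFold` (the toy index array is hypothetical, just to show the split):

```python
import numpy as np
from sklearn.model_selection import KFold

# Hypothetical dataset of 20 sample indices; the actual images don't matter here.
X = np.arange(20).reshape(-1, 1)

# Shuffled 5-fold split: each fold's training set plays the role of a
# "different dataset", analogous to the summer-vs.-winter comparison.
kf = KFold(n_splits=5, shuffle=True, random_state=0)
for i, (train_idx, val_idx) in enumerate(kf.split(X)):
    print(f"fold {i}: train on {len(train_idx)} samples, validate on {len(val_idx)}")
```

Training (or fine-tuning) one model per fold and comparing the resulting loss curves would then show how much of the observed dynamics is data-dependent.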
