About experiment setting #13
Comments
Hi. How many …
Same problem. Hope the author can answer and help us.
Same problem; it would be helpful if the authors could provide more details on reproducing the results from the paper.
Lots of version issues. How did you guys handle them?
Is the code complete? It can't be run directly using the training command given by the author.
Thx for your great work!
I have some questions about your code.
```bash
ARROW_ROOT=./datasets/mmimdb
NUM_GPUS=2
NUM_NODES=1
BS_FITS_YOUR_GPU=2
PRETRAINED_MODEL_PATH=./pretrained_weight/vilt_200k_mlm_itm.ckpt
EXP_NAME=mmimdb

python run.py with data_root=${ARROW_ROOT} \
    num_gpus=${NUM_GPUS} \
    num_nodes=${NUM_NODES} \
    per_gpu_batchsize=${BS_FITS_YOUR_GPU} \
    task_finetune_mmimdb \
    load_path=${PRETRAINED_MODEL_PATH} \
    exp_name=${EXP_NAME}
```
With this setting I got 40.65 on the test set (paper: 42.66). Can I reproduce the paper's results without changing parameters such as the learning rate, or are there tuned hyperparameters for each dataset?
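For anyone experimenting with this, the sketch below shows one way individual hyperparameters might be overridden directly on the command line, assuming the repo follows ViLT's Sacred-style configuration where any config key can be appended after `with`. The keys `learning_rate`, `max_epoch`, and `warmup_steps` are assumed from ViLT's config.py and may differ in this repository; the values are placeholders for experimentation, not the paper's settings.

```bash
# Illustrative only: Sacred-style overrides appended to the same fine-tuning command.
# Keys (learning_rate, max_epoch, warmup_steps) are assumed from ViLT's config.py;
# the values are placeholders, not the hyperparameters used in the paper.
python run.py with data_root=${ARROW_ROOT} \
    num_gpus=${NUM_GPUS} \
    num_nodes=${NUM_NODES} \
    per_gpu_batchsize=${BS_FITS_YOUR_GPU} \
    task_finetune_mmimdb \
    load_path=${PRETRAINED_MODEL_PATH} \
    exp_name=${EXP_NAME} \
    learning_rate=5e-5 \
    max_epoch=20 \
    warmup_steps=0.1
```

If overrides like these are not picked up, the defaults in config.py (including the `task_finetune_mmimdb` named config) would be the place to check what the repository actually uses.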