Hi,
Thanks for sharing your code and also the nice YouTube video presentation of the paper. I have a question about a specific scenario in the TTT method. In my understanding, applying TTT to an existing pre-trained model requires repeating the training on the source domain data with the self-supervised head (ssh) for rotation prediction. After this training, the standard or online version of TTT can be applied to the test domain data by initializing the ssh weights with the ones obtained during training. This means "training alteration" is necessary for TTT.
My question is: what would happen if I did not alter the training and did not retrain with an ssh, and instead, at test time, simply initialized the ssh weights randomly? How would that affect the results? Have you tried it? I assume the results would be worse than with standard TTT.
Thanks,
Sorour
SorourMo changed the title from "No training alteration case" to "No 'training alteration' case" on Jan 5, 2022.
Thank you for your interest. I have not tried TTT without "training alteration", because I expect the results to be worse (as you said), and I don't see any reason why someone has to train without the self-supervised task. You are free to try it yourself though, the modification should be a one-liner.
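For anyone wanting to try the variant discussed above, here is a minimal, self-contained sketch of test-time training with a *randomly initialized* ssh (i.e., no "training alteration"). All module and function names here are illustrative placeholders, not the actual API of this repository: the adaptation step updates the shared feature extractor (and the ssh) by minimizing the rotation-prediction loss on the test batch alone, exactly as in standard TTT, except that the ssh weights start from scratch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy stand-ins for the real networks: a shared feature extractor,
# the (assumed pre-trained) main classifier, and a 4-way
# rotation-prediction head. Here the ssh is randomly initialized,
# i.e. it was NOT trained jointly on the source domain.
feature_extractor = nn.Sequential(nn.Flatten(), nn.Linear(3 * 8 * 8, 32), nn.ReLU())
classifier = nn.Linear(32, 10)  # main task head (10 classes, toy setup)
ssh = nn.Linear(32, 4)          # self-supervised head, random weights


def rotate_batch(x):
    """Return the batch rotated by 0/90/180/270 degrees with rotation labels."""
    rotations = [torch.rot90(x, k, dims=(2, 3)) for k in range(4)]
    labels = torch.arange(4).repeat_interleave(x.size(0))
    return torch.cat(rotations), labels


def rotation_loss(x):
    """Cross-entropy loss of the ssh on the rotation-prediction task."""
    rx, ry = rotate_batch(x)
    return F.cross_entropy(ssh(feature_extractor(rx)), ry)


def ttt_adapt(x, steps=10, lr=0.05):
    """One round of test-time training: minimize the self-supervised
    loss on the test batch, updating the extractor and the ssh."""
    params = list(feature_extractor.parameters()) + list(ssh.parameters())
    opt = torch.optim.SGD(params, lr=lr)
    for _ in range(steps):
        loss = rotation_loss(x)
        opt.zero_grad()
        loss.backward()
        opt.step()


x_test = torch.randn(2, 3, 8, 8)       # a distribution-shifted test batch
before = rotation_loss(x_test).item()
ttt_adapt(x_test)
after = rotation_loss(x_test).item()    # self-supervised loss decreases
logits = classifier(feature_extractor(x_test))  # main-task prediction
```

The self-supervised loss will still decrease under adaptation, but since the randomly initialized ssh never learned features correlated with the main task on the source domain, there is no reason to expect its gradients to improve the classifier's accuracy, which matches the intuition above that results should be worse than standard TTT.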