
No "training alteration" case #4

Open
SorourMo opened this issue Jan 5, 2022 · 1 comment
SorourMo commented Jan 5, 2022

Hi,
Thanks for sharing your code and also the nice YouTube video presentation of the paper. I have a question about a specific scenario in the TTT method. In my understanding, applying TTT to an existing pre-trained model requires repeating the training on the source-domain data jointly with the self-supervised head (ssh), which does rotation prediction. After training, the standard or online version of TTT can be applied to the test-domain data by initializing the ssh weights with those obtained during the training phase. This means "training alteration" is necessary for TTT.
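
For concreteness, here is how I picture the pipeline, as a minimal PyTorch-style sketch (the module sizes, names, and the rotation helper below are my own placeholders, not the actual code from this repo):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Shared feature extractor, main classification head, and
# self-supervised head (ssh) for 4-way rotation prediction.
ext = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten())
head = nn.Linear(16, 10)  # main task (e.g., 10-way classification)
ssh = nn.Linear(16, 4)    # rotation prediction: 0/90/180/270 degrees

def rotate_batch(x):
    """Return four rotated copies of x plus the rotation labels."""
    xs, ys = [], []
    for k in range(4):
        xs.append(torch.rot90(x, k, dims=(2, 3)))
        ys.append(torch.full((x.size(0),), k, dtype=torch.long))
    return torch.cat(xs), torch.cat(ys)

# --- "training alteration": joint training on source-domain data ---
opt = torch.optim.SGD(list(ext.parameters()) + list(head.parameters())
                      + list(ssh.parameters()), lr=0.1)
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
xr, yr = rotate_batch(x)
loss = F.cross_entropy(head(ext(x)), y) + F.cross_entropy(ssh(ext(xr)), yr)
opt.zero_grad(); loss.backward(); opt.step()

# --- test time: adapt the extractor on a test batch using only the ssh loss ---
opt_tt = torch.optim.SGD(list(ext.parameters()) + list(ssh.parameters()), lr=0.001)
x_test = torch.randn(8, 3, 32, 32)  # stand-in for a (shifted) test batch
xr, yr = rotate_batch(x_test)
loss_tt = F.cross_entropy(ssh(ext(xr)), yr)
opt_tt.zero_grad(); loss_tt.backward(); opt_tt.step()
pred = head(ext(x_test)).argmax(1)  # main-task prediction after adaptation
```

My question below amounts to skipping the joint-training step and letting ssh start from its random initialization at test time.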

My question is: what would happen if I did not alter the training, i.e., did not retrain on the source data with an ssh, and instead simply initialized the ssh weights randomly at test time? How would that affect the results? Have you tried it? I assume the results would be worse than with the standard TTT.

Thanks,
Sorour

SorourMo changed the title from No training alteration case to No "training alteration" case on Jan 5, 2022
@yueatsprograms
Owner

Hi Sorour,

Thank you for your interest. I have not tried TTT without "training alteration", because I expect the results to be worse (as you said), and I don't see any reason why someone would have to train without the self-supervised task. You are free to try it yourself, though; the modification should be a one-liner.
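
Concretely: at test time, just skip loading the trained ssh weights so the head keeps its random initialization. A rough sketch (the checkpoint keys and variable names are placeholders, not the exact code in this repo):

```python
ckpt = torch.load('source_checkpoint.pth')  # produced by the (altered) joint training
net.load_state_dict(ckpt['net'])            # shared extractor + main classification head
# ssh.load_state_dict(ckpt['ssh'])          # <-- the one line to drop: ssh stays randomly initialized
```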

Yu
