
[Feature] Support TimeSformer #839

Merged
merged 27 commits into open-mmlab:master on Jul 1, 2021

Conversation

congee524
Contributor

@congee524 congee524 commented Apr 28, 2021

  • Attention Code
  • TimeSformer Backbone Code (see the config sketch after this list)
  • TimeSformer Head
  • Inference
  • Train
  • CheckPoint
  • Refine Code
  • Add Docstring
  • Unittest
  • README
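
For orientation, here is a minimal sketch of how the backbone and head added in this PR might be wired into an mmaction2-style model config. The field names and values below (num_frames, img_size, attention_type, the 400-class head, etc.) are illustrative assumptions, not the merged defaults.

```python
# Hypothetical config sketch; names and values are assumptions for
# illustration and may differ from the configs merged in this PR.
model = dict(
    type='Recognizer3D',
    backbone=dict(
        type='TimeSformer',
        num_frames=8,                         # frames sampled per clip
        img_size=224,
        patch_size=16,
        embed_dims=768,
        attention_type='divided_space_time',  # divided vs. joint space-time attention
        norm_cfg=dict(type='LN', eps=1e-6)),
    cls_head=dict(
        type='TimeSformerHead',
        num_classes=400,                      # e.g. Kinetics-400
        in_channels=768))
```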

@congee524 congee524 changed the title from [Feature] Support TimeSformer to [Feature] Support TimeSformer (BC) on Apr 28, 2021
@congee524 congee524 added the BC-breaking and WIP (work in progress) labels on May 13, 2021
@congee524 congee524 changed the title from [Feature] Support TimeSformer (BC) to [Feature] Support TimeSformer on May 13, 2021
@congee524 congee524 marked this pull request as ready for review on May 21, 2021 09:00
@congee524 congee524 removed the WIP (work in progress) label on May 24, 2021
Member

@kennymckormick kennymckormick left a comment

The current minimum version of mmcv is 1.3.1; is it sufficient for this PR?

@kennymckormick
Member

> The current minimum version of mmcv is 1.3.1; is it sufficient for this PR?

Oh, it seems not. You need to update the version requirement as well as the CI. Besides, you can add some unit tests to improve the CodeCov coverage.

@congee524
Contributor Author

> The current minimum version of mmcv is 1.3.1; is it sufficient for this PR?

> Oh, it seems not. You need to update the version requirement as well as the CI. Besides, you can add some unit tests to improve the CodeCov coverage.

OK. This work may need to wait until the mmcv 1.3.5 release.

@kennymckormick
Member

> The current minimum version of mmcv is 1.3.1; is it sufficient for this PR?

> Oh, it seems not. You need to update the version requirement as well as the CI. Besides, you can add some unit tests to improve the CodeCov coverage.

> OK. This work may need to wait until the mmcv 1.3.5 release.

You can ping me after this PR is ready.

self.num_heads = num_heads
self.num_frames = num_frames
self.norm = build_norm_layer(norm_cfg, self.embed_dims)[1]
self.attn = nn.MultiheadAttention(embed_dims, num_heads, attn_drop,
Member

Perhaps you need to use MultiheadAttention from mmcv, since it takes the batch_first argument.

Contributor

Yes, it should be switched to mmcv's MultiheadAttention; PyTorch 1.9 includes the batch_first argument, but earlier versions do not.
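
A minimal sketch of the suggested swap, assuming mmcv's MultiheadAttention (from mmcv.cnn.bricks.transformer) in a version recent enough to expose batch_first; the surrounding module and parameter names are simplified placeholders, not the PR's actual code.

```python
import torch.nn as nn
from mmcv.cnn.bricks.transformer import MultiheadAttention


class DividedAttentionSketch(nn.Module):
    """Illustration only: uses mmcv's MultiheadAttention, which accepts
    ``batch_first``, instead of torch.nn.MultiheadAttention, which only
    gained that argument in PyTorch 1.9."""

    def __init__(self, embed_dims, num_heads, num_frames, attn_drop=0.):
        super().__init__()
        self.num_heads = num_heads
        self.num_frames = num_frames
        self.norm = nn.LayerNorm(embed_dims)  # simplified; the PR builds this via build_norm_layer
        self.attn = MultiheadAttention(
            embed_dims,
            num_heads,
            attn_drop=attn_drop,
            batch_first=True)  # inputs shaped (batch, seq_len, embed_dims)
```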

self.num_heads = num_heads
self.num_frames = num_frames
self.norm = build_norm_layer(norm_cfg, self.embed_dims)[1]
self.attn = nn.MultiheadAttention(embed_dims, num_heads, attn_drop,
Member

Same here

@CLAassistant

CLAassistant commented Jun 17, 2021

CLA assistant check
All committers have signed the CLA.

@shvdiwnkozbw
Contributor

It seems a minimum MMCV version of 1.3.6 is required.
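
For reference, a rough sketch of the kind of minimum-version guard this implies (mmaction2 keeps a check of this sort in mmaction/__init__.py); the exact variable names and error message below are assumptions, with 1.3.6 taken from the comment above.

```python
# Sketch of a minimum-version guard; 1.3.6 comes from the discussion above,
# everything else is illustrative.
import mmcv
from mmcv.utils import digit_version

mmcv_minimum_version = '1.3.6'
mmcv_version = digit_version(mmcv.__version__)

assert mmcv_version >= digit_version(mmcv_minimum_version), (
    f'MMCV=={mmcv.__version__} is incompatible; '
    f'please install mmcv>={mmcv_minimum_version}.')
```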

@kennymckormick
Member

@shvdiwnkozbw The old checkpoints also need to be updated. You can send me the converted checkpoint.

@codecov

codecov bot commented Jun 24, 2021

Codecov Report

Merging #839 (2e445ba) into master (79fba89) will decrease coverage by 0.38%.
The diff coverage is 88.17%.


@@            Coverage Diff             @@
##           master     #839      +/-   ##
==========================================
- Coverage   83.73%   83.34%   -0.39%     
==========================================
  Files         132      135       +3     
  Lines        9982    10311     +329     
  Branches     1718     1764      +46     
==========================================
+ Hits         8358     8594     +236     
- Misses       1210     1285      +75     
- Partials      414      432      +18     
Flag Coverage Δ
unittests 83.34% <88.17%> (-0.39%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
mmaction/models/backbones/timesformer.py 78.00% <78.00%> (ø)
mmaction/models/common/transformer.py 97.56% <97.56%> (ø)
mmaction/__init__.py 100.00% <100.00%> (ø)
mmaction/models/__init__.py 100.00% <100.00%> (ø)
mmaction/models/backbones/__init__.py 100.00% <100.00%> (ø)
mmaction/models/common/__init__.py 100.00% <100.00%> (ø)
mmaction/models/heads/__init__.py 100.00% <100.00%> (ø)
mmaction/models/heads/timesformer_head.py 100.00% <100.00%> (ø)
mmaction/models/localizers/base.py 45.88% <0.00%> (-10.99%) ⬇️
mmaction/datasets/pipelines/compose.py 90.32% <0.00%> (-9.68%) ⬇️
... and 12 more


Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 79fba89...2e445ba.

@shvdiwnkozbw
Contributor

> @shvdiwnkozbw The old checkpoints also need to be updated. You can send me the converted checkpoint.

Here is the link to the converted checkpoint:
https://drive.google.com/drive/folders/1E3uwhfDZlArzI8wsvGGD26EoNbrxsUp7?usp=sharing
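
For context, converting a checkpoint from the original TimeSformer release mainly comes down to renaming state-dict keys to match the mmaction2 module names. The sketch below is generic; the replacements mapping is a placeholder, not the actual mapping used to produce the shared checkpoint.

```python
import torch


def convert_checkpoint(src_path, dst_path, replacements):
    """Rename state-dict keys with a prefix map; ``replacements`` is a
    placeholder, e.g. {'model.': 'backbone.'}, not the real mapping."""
    ckpt = torch.load(src_path, map_location='cpu')
    state_dict = ckpt.get('state_dict', ckpt)  # some checkpoints nest the weights
    converted = {}
    for old_key, value in state_dict.items():
        new_key = old_key
        for old_prefix, new_prefix in replacements.items():
            new_key = new_key.replace(old_prefix, new_prefix)
        converted[new_key] = value
    torch.save({'state_dict': converted}, dst_path)
```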

@kennymckormick
Member

kennymckormick commented Jun 30, 2021

> @shvdiwnkozbw The old checkpoints also need to be updated. You can send me the converted checkpoint.

> Here is the link to the converted checkpoint:
> https://drive.google.com/drive/folders/1E3uwhfDZlArzI8wsvGGD26EoNbrxsUp7?usp=sharing

Have you tested the converted checkpoints? The testing accuracy doesn't seem to align with the reported numbers; e.g., I got 61.8% Top-1 when testing the divST model (jointST failed).

@shvdiwnkozbw
Contributor

> Have you tested the converted checkpoints? The testing accuracy doesn't seem to align with the reported numbers; e.g., I got 61.8% Top-1 when testing the divST model (jointST failed).

I have not tested the accuracy since my account has been cancelled. Are there any missing parameters when loading the converted checkpoint?

@kennymckormick
Member

> Have you tested the converted checkpoints? The testing accuracy doesn't seem to align with the reported numbers; e.g., I got 61.8% Top-1 when testing the divST model (jointST failed).

> I have not tested the accuracy since my account has been cancelled. Are there any missing parameters when loading the converted checkpoint?

For divST, the parameter names aligned perfectly, but the accuracy is about 10% lower. For jointST, the checkpoint cannot be loaded properly.
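
A quick way to separate loading problems from accuracy problems is to build the model from the config and load the converted weights non-strictly, then inspect the reported mismatches. A minimal sketch, assuming mmaction2's build_model helper and using placeholder file paths:

```python
import torch
from mmcv import Config
from mmaction.models import build_model

# Placeholder paths; substitute the actual TimeSformer config and checkpoint.
cfg = Config.fromfile('configs/recognition/timesformer/timesformer_divST.py')
model = build_model(cfg.model)

ckpt = torch.load('converted_timesformer.pth', map_location='cpu')
state_dict = ckpt.get('state_dict', ckpt)

# strict=False reports, rather than raises on, any key mismatch between the
# converted checkpoint and the model definition.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print('missing keys:', missing)
print('unexpected keys:', unexpected)
```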

@kennymckormick kennymckormick merged commit 1c2d9c6 into open-mmlab:master Jul 1, 2021