Add streaming zipformer #787
Conversation
@yaozengwei Thank you for this PR!
Could you first try to check the batch that starts to cause the NaN? You could refer to https://github.com/k2-fsa/icefall/blob/master/egs/librispeech/ASR/lstm_transducer_stateless3/train.py#L630. If there is no problem with the data, you could run the training code with "--inf-check 1" shortly before the batch (~2500) that gets NaN, to check which module produces infinite values.
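The first step suggested above (scanning the data for the batch that already contains non-finite values) can be sketched as follows. This is a simplified, framework-free illustration, not the actual icefall code: the function name is made up, and each batch is a flat list of floats standing in for the real feature tensors from the dataloader.

```python
import math


def first_nonfinite_batch(batches):
    """Return the index of the first batch containing a NaN/inf value,
    or None if every value is finite.

    Illustrative sketch: `batches` is an iterable of flat float lists,
    a stand-in for the feature tensors yielded by the real dataloader.
    """
    for i, batch in enumerate(batches):
        if any(not math.isfinite(x) for x in batch):
            return i
    return None


# Batch 2 is the first one containing a NaN.
batches = [[0.1, 0.2], [0.3, 0.4], [float("nan"), 0.5]]
print(first_nonfinite_batch(batches))  # → 2
```

If the scan finds nothing wrong with the data, the problem is in the model itself, which is what the `--inf-check 1` pass is for.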
This PR could be merged first. I will create a new PR to add the documentation at https://k2-fsa.github.io/icefall/recipes/Streaming-ASR/index.html.
When I use the
What are the possible reasons for this output?
What's your LM scale? @JiawangLiu
Decoding results given by
@JiawangLiu Could you check whether the BPE model is the same? It looks to me like a mismatch between the BPE models of the RNNLM and the Zipformer.
I can't find the BPE model in the RNNLM; I got my bpe_500 model by running the
@JiawangLiu Then it is very likely not the same as the RNN-LM one. To be honest, I don't recall which one I used for training the RNN-LM.
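One quick way to test whether two BPE model files are the same is a byte-level comparison. This is a minimal sketch under the assumption that identical models are byte-identical files (models trained separately, even on the same data, will almost surely differ); the function name and the example paths in the comment are hypothetical:

```python
import hashlib
from pathlib import Path


def same_bpe_model(path_a: str, path_b: str) -> bool:
    """Return True if the two model files are byte-identical.

    A SHA-256 digest comparison is a quick sanity check for a BPE
    mismatch between an ASR model and an external LM.
    """
    digest = lambda p: hashlib.sha256(Path(p).read_bytes()).hexdigest()
    return digest(path_a) == digest(path_b)


# Hypothetical paths; substitute the bpe.model used for the ASR
# training and the one used for the RNN-LM training.
# same_bpe_model("data/lang_bpe_500/bpe.model", "rnnlm/bpe.model")
```

If the files differ, a follow-up check is to tokenize the same sample sentence with both models and compare the resulting piece IDs; different segmentations confirm the mismatch.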
This PR adds a recipe for the streaming zipformer.
The following table compares the performance of our current streaming models trained on full LibriSpeech.