Commit 331e5eb

[doc] Fix typos. (#31)

1 parent 5baa6a9
File tree: 1 file changed, +5 −5

docs/source/recipes/librispeech/conformer_ctc.rst (+5 −5)
@@ -303,7 +303,7 @@ The commonly used options are:

 - ``--lattice-score-scale``

-  It is used to scaled down lattice scores so that we can more unique
+  It is used to scale down lattice scores so that there are more unique
   paths for rescoring.

 - ``--max-duration``

@@ -314,7 +314,7 @@ The commonly used options are:
 Pre-trained Model
 -----------------

-We have uploaded the pre-trained model to
+We have uploaded a pre-trained model to
 `<https://huggingface.co/pkufool/icefall_asr_librispeech_conformer_ctc>`_.

 We describe how to use the pre-trained model to transcribe a sound file or

@@ -324,7 +324,7 @@ Install kaldifeat
 ~~~~~~~~~~~~~~~~~

 `kaldifeat <https://github.com/csukuangfj/kaldifeat>`_ is used to
-extract features for a single sound file or multiple soundfiles
+extract features for a single sound file or multiple sound files
 at the same time.

 Please refer to `<https://github.com/csukuangfj/kaldifeat>`_ for installation.

@@ -397,7 +397,7 @@ After downloading, you will have the following files:

 - ``data/lm/G_4_gram.pt``

-  It is a 4-gram LM, useful for LM rescoring.
+  It is a 4-gram LM, used for n-gram LM rescoring.

 - ``exp/pretrained.pt``

@@ -556,7 +556,7 @@ Its output is:
 HLG decoding + LM rescoring + attention decoder rescoring
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

-It uses an n-gram LM to rescore the decoding lattice, extracts
+It uses an n-gram LM to rescore the decoding lattice, extracts
 n paths from the rescored lattice, recores the extracted paths with
 an attention decoder. The path with the highest score is the decoding result.
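The corrected sentence in the first hunk explains why lattice scores are scaled down: dividing the log-scores by a scale flattens the score distribution, so sampling from the lattice yields more unique paths to rescore. The following is a minimal self-contained Python sketch of that effect with toy scores, not the icefall implementation:

```python
import math
import random


def count_unique_paths(scores, scale, n=1000, seed=0):
    """Sample n path indices from softmax(scores * scale) and count how
    many distinct paths were drawn.  A smaller scale flattens the
    distribution, so sampling visits more unique paths."""
    rng = random.Random(seed)
    scaled = [s * scale for s in scores]
    m = max(scaled)  # subtract max for numerical stability
    weights = [math.exp(s - m) for s in scaled]
    picks = rng.choices(range(len(scores)), weights=weights, k=n)
    return len(set(picks))


# Toy log-scores of five lattice paths (invented for illustration).
scores = [10.0, 9.0, 5.0, 4.0, 1.0]

unique_full = count_unique_paths(scores, scale=1.0)    # peaked distribution
unique_scaled = count_unique_paths(scores, scale=0.1)  # flattened distribution
# With the scaled-down scores, sampling reaches every path.
```

The same idea motivates the ``--lattice-score-scale`` option: without scaling, the top-scoring path dominates and the sampled n-best list contains many duplicates.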
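The last hunk describes a two-stage rescoring: an n-gram LM rescores the decoding lattice, n paths are extracted, an attention decoder rescores them, and the highest-scoring path wins. A toy sketch of that selection step follows; the path texts, scores, and the simple additive combination are invented for illustration and need not match how icefall combines scores:

```python
# Each candidate path extracted from the lattice carries a lattice score,
# an n-gram LM score, and an attention-decoder score (all toy numbers).
paths = {
    "HELLO WORLD":  {"lattice": -12.0, "ngram_lm": -3.0, "attention": -2.0},
    "HELLO WORLDS": {"lattice": -11.5, "ngram_lm": -6.0, "attention": -5.0},
    "YELLOW WORLD": {"lattice": -13.0, "ngram_lm": -4.0, "attention": -6.0},
}


def combined_score(s, lm_scale=1.0, attn_scale=1.0):
    """Combine log-scores; larger (less negative) is better."""
    return (s["lattice"]
            + lm_scale * s["ngram_lm"]
            + attn_scale * s["attention"])


# The path with the highest combined score is the decoding result.
best_path = max(paths, key=lambda p: combined_score(paths[p]))
```

Here ``"HELLO WORLD"`` wins (−12 − 3 − 2 = −17) even though ``"HELLO WORLDS"`` has the best lattice score, showing how the rescoring stages can overturn the first-pass ranking.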