add a warning in SpmConverter for sentencepiece's model using the byte fallback feature #16629
```diff
@@ -19,6 +19,7 @@
 allow to make our dependency on SentencePiece optional.
 """
 
+import warnings
 from typing import Dict, List, Tuple
 
 from tokenizers import Regex, Tokenizer, decoders, normalizers, pre_tokenizers, processors
```
```diff
@@ -429,6 +430,12 @@ def __init__(self, *args):
         m.ParseFromString(f.read())
         self.proto = m
 
+        if self.proto.trainer_spec.byte_fallback:
+            warnings.warn(
+                "The sentencepiece tokenizer that you are converting to a fast tokenizer uses the byte fallback option"
+                " which is not implemented in the fast tokenizers."
+            )
+
     def vocab(self, proto):
         return [(piece.piece, piece.score) for piece in proto.pieces]
```

Review thread on the added warning:

Comment: Could the warning explain a bit more to the user what to expect? If I read that message with no context, I have no idea what it means 😬

Comment: Also, should we maybe use the library logger instead?

```python
from .utils import logging
...
logger = logging.get_logger(__name__)
...
logger.warn(...)
```

Reply: @sgugger (@patrickvonplaten) That's a good point! I just added an explanation in my last commit. Does this seem sufficient? 🙂

Reply: @patrickvonplaten That's actually how I did it initially, but then I saw that the python […]. I have no preference between the two, but I'm interested to know what makes you prefer the second option! 😄

Comment: From the official docs: https://docs.python.org/3.8/howto/logging.html#when-to-use-logging — I think this is more of a `warnings` case.

Comment: Yes, it's definitely not an instance where we want to use the logger.
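For context on what the warning is about, here is a minimal sketch (not part of the PR; the model path and example string are hypothetical) of the byte fallback behaviour that the fast tokenizers do not reproduce:

```python
# Hedged sketch: assumes a sentencepiece model trained with byte fallback
# enabled (--byte_fallback=true), saved at the hypothetical path "spm.model".
import sentencepiece as spm

sp = spm.SentencePieceProcessor(model_file="spm.model")

# With byte fallback, a character absent from the vocabulary is decomposed
# into its UTF-8 bytes, e.g. ['<0xE2>', '<0x9C>', '<0x93>'] for "✓",
# rather than collapsing to a single '<unk>' piece.
print(sp.encode("✓", out_type=str))

# A fast tokenizer converted from this model does not implement byte fallback,
# so the same character would come back as the unknown token instead.
```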
|
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Without the
src/transformers/utils/sentencepiece_model_pb2.py
file update, theself.proto.trainer_spec.byte_fallback
call would have returned aAttributeError
with the message "byte_fallback"
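To illustrate that point, a small sketch (assuming the regenerated protobuf module; the model path is hypothetical) of the attribute access in question:

```python
# Sketch of the attribute access discussed above. With the regenerated
# sentencepiece_model_pb2.py, trainer_spec exposes byte_fallback (False when
# the field is unset); with the old generated file the field is unknown to
# protobuf and the access raises an AttributeError naming "byte_fallback".
from transformers.utils import sentencepiece_model_pb2 as model_pb2

m = model_pb2.ModelProto()
with open("spm.model", "rb") as f:  # hypothetical sentencepiece model file
    m.ParseFromString(f.read())

print(m.trainer_spec.byte_fallback)
```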