Replace deprecated logger.warn with warning #16876

Merged (1 commit, Apr 25, 2022)
@@ -269,7 +269,7 @@ def set_quantizer(name, mod, quantizer, k, v):
         assert hasattr(quantizer_mod, k)
         setattr(quantizer_mod, k, v)
     else:
-        logger.warn(f"{name} has no {quantizer}")
+        logger.warning(f"{name} has no {quantizer}")
Contributor
Note that we don't actively maintain those ;-)

Contributor Author
Sure! Just covering all grounds!



def set_quantizers(name, mod, which="both", **kwargs):
src/transformers/configuration_utils.py (1 addition & 1 deletion)
@@ -306,7 +306,7 @@ def __init__(self, **kwargs):
 if self.id2label is not None:
     num_labels = kwargs.pop("num_labels", None)
     if num_labels is not None and len(self.id2label) != num_labels:
-        logger.warn(
+        logger.warning(
Contributor
I think that's our internal logger here, no? Not Python's logger. So not sure about this change @LysandreJik

Contributor Author
The logging module imported in this script is indeed our internal logging module that lives under the utils directory. We see here that the logger is instantiated by invoking the get_logger function from this module:

logger = logging.get_logger(__name__)

If we inspect the internal logging module in utils, we see that it first imports the Python logging module:

import logging
The get_logger function simply returns a logger produced by this Python logging module's getLogger:
def get_logger(name: Optional[str] = None) -> logging.Logger:
    """
    Return a logger with the specified name.

    This function is not supposed to be directly accessed unless you are writing a custom transformers module.
    """
    if name is None:
        name = _get_library_name()

    _configure_library_root_logger()
    return logging.getLogger(name)

Hence, our internal logger is implicitly derived from its parent Python logger, and is simply a wrapper for this module. Consequently, the advice regarding the use of warning in place of warn should still hold.
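
(For illustration only, not part of this PR: a minimal standalone sketch showing that Python's Logger.warn is just a deprecated alias for Logger.warning, which is why the same advice applies to the wrapped logger returned by get_logger. The logger name "demo" and the messages are made up for the example.)

import logging
import warnings

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger("demo")  # arbitrary name for this sketch

# The documented API: emits a WARNING-level record.
logger.warning("num_labels does not match id2label")

# The deprecated alias: on Python versions that still ship Logger.warn
# (it was eventually removed entirely in later releases), calling it logs
# the same record but also triggers a DeprecationWarning.
if hasattr(logger, "warn"):
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        logger.warn("same message via the deprecated alias")
    print([str(w.message) for w in caught])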

Contributor
Referring to @LysandreJik here :-)

Member
Yes, it's a good change!

             f"You passed along `num_labels={num_labels}` with an incompatible id to label map: "
             f"{self.id2label}. The number of labels wil be overwritten to {self.num_labels}."
         )
src/transformers/modeling_flax_utils.py (1 addition & 1 deletion)
@@ -641,7 +641,7 @@ def from_pretrained(
 unexpected_keys = set(state.keys()) - model.required_params

 if missing_keys and not _do_init:
-    logger.warn(
+    logger.warning(
         f"The checkpoint {pretrained_model_name_or_path} is missing required keys: {missing_keys}. "
         f"Make sure to call model.init_weights to initialize the missing weights."
     )