
Add UL2017 JECs and JERs #248

Merged (6 commits merged into cms-nanoAOD:master on Sep 16, 2020)

Conversation

@IzaakWN (Contributor) commented Aug 15, 2020

Add the latest JECs (Summer19UL17_V5, TWiki) and JERs (Summer19UL17_JRV2, TWiki) and implement them in jetmetHelperRun2.py as 'UL2017'. @danbarto @camclean

(This PR would be used for measurements in the TauPOG based on nanoAOD.)
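For orientation, a minimal sketch of the kind of dictionary entries this PR adds: the jerTagsMC name appears later in this thread, while jecTagsMC and the exact tag strings (in particular the '_MC' suffixes) are assumptions for illustration, not copied from the diff.

# Sketch only: entries for other eras are elided.
jecTagsMC = {
    # ... existing eras ...
    'UL2017' : 'Summer19UL17_V5_MC',     # new UL2017 JEC tag (Summer19UL17_V5)
}
jerTagsMC = {
    # ... existing eras ...
    'UL2017' : 'Summer19UL17_JRV2_MC',   # new UL2017 JER tag (Summer19UL17_JRV2)
}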

@mariadalfonso (Contributor) commented:
@fgolf
Loukas and I do not see anything controversial in this PR.
Shall we go ahead with the merge?

Review comment on the jmrValues dictionary in jetmetHelperRun2.py:

    jmrValues = { '2016'   : [1.0,  1.2,  0.8],
                  '2017'   : [1.09, 1.14, 1.04],
                  'UL2017' : [1.00, 1.00, 1.00], # placeholder
                  '2018'   : [1.24, 1.20, 1.28]

Note for later:
the jet-mass scale and resolution use placeholder values, i.e. basically no smearing and no scaling.
Given the large set of changes (tagger algorithms, PUPPI tune), I think it's better to reset these for the UL and update them later, once proper values have been checked as a function of pT.
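To make the note above concrete, here is a minimal sketch (not the module's actual code) of how a jet-mass-resolution factor is typically applied; with the UL2017 placeholder of 1.00 the mass is left untouched.

def smear_jet_mass(reco_mass, gen_mass, jmr):
    """Stretch the reco-gen mass difference by the JMR factor (illustration only)."""
    return gen_mass + jmr * (reco_mass - gen_mass)

jmr_nom, jmr_up, jmr_down = [1.00, 1.00, 1.00]   # UL2017 placeholder triplet
print(smear_jet_mass(85.3, 82.1, jmr_nom))       # -> 85.3, i.e. no smearing applied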

@IzaakWN (Contributor, Author) commented Sep 8, 2020

By the way, after applying these Summer19UL17_V5 corrections I still see some disagreements, see slide 10 of these slides, especially for the leading jet with |eta| > 2 and pT < 50 GeV, and a trend in MET. (Note these are mutau events for the Tau ID SF measurement, and the Z → 𝜏𝜏 contribution (orange) still needs a SF of ~0.95.)
Is this indeed as expected?

@gouskos (Contributor) commented Sep 8, 2020

Pinging the JME conveners: @ahinzmann, @lathomas

@lathomas commented Sep 8, 2020

@IzaakWN you mean that you still see data/MC discrepancies? Yes, the agreement is not perfect, but it is much better than EOY and now in principle within the JES uncertainties. See slides 9 and 30 of https://indico.cern.ch/event/928283/contributions/3912643/attachments/2061682/3469831/ul17jecsvalidation_v2.pdf

@IzaakWN (Contributor, Author) commented Sep 8, 2020

Yes, that we still see data/MC discrepancies. Thanks for the links! It indeed looks consistent.

Regarding this PR, do you think it's worth including these JECs in this repository already, or should we wait for the next set of corrections?

@lathomas commented Sep 8, 2020

Yes, I wouldn't wait. The next set of UL17 JECs will not come for another few months.

@mariadalfonso (Contributor) commented:
@IzaakWN
now we have conflicts; can you please rebase?

@IzaakWN (Contributor, Author) commented Sep 16, 2020

Hi, as far as I could tell, ignoring whitespace, the only conflicting change was Autumn18_V7_MC → Autumn18_V7b_MC in the jerTagsMC dictionary.
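For illustration, the resolved dictionary would keep the tag bump from master alongside the new UL2017 entry (a sketch; the exact strings beyond the two tags named above are assumptions):

jerTagsMC = {
    # ... other eras ...
    'UL2017' : 'Summer19UL17_JRV2_MC',   # added by this PR ('_MC' suffix assumed)
    '2018'   : 'Autumn18_V7b_MC',        # bumped from 'Autumn18_V7_MC' on master
}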

@gouskos (Contributor) commented Sep 16, 2020

Hi, as far as I could tell, ignoring whitespace, the only conflicting change was Autumn18_V7_MC → Autumn18_V7b_MC in the jerTagsMC dictionary.

Looks good to me, thanks.

@gouskos merged commit 87890b6 into cms-nanoAOD:master on Sep 16, 2020

Post-merge review comment on the jmrValues dictionary. The merged code reads

    jmrValues = { '2016' : [1.0,  1.2,  0.8],
                  '2017' : [1.09, 1.14, 1.04],
                  '2018' : [1.09, 1.14, 1.04], # Use 2017 values for 2018 until 2018 are released

where the '2018' entry was previously [1.24, 1.20, 1.28].

Why were these reverted to the 2017 values?

@mariadalfonso (Contributor) commented Nov 4, 2020

@camclean
indeed, they were changed accidentally.

Questions:

  1. Where do the 2018 values come from? They are not on the wiki page
    https://twiki.cern.ch/twiki/bin/view/CMS/JetWtagging#2018_scale_factors_and_correctio
  2. The original "nominal, up, down" numbers do not seem correct; they look more like "nominal, down, up".

@camclean (Contributor) commented Nov 5, 2020

The numbers are working-point dependent and are shown in the tables in the section you linked. You can see the JMS and JMR values in the m and sigma columns of the data/MC row.

--> Sorry, I cannot find the table of the 2018 JMR/JMS in the twiki (rev 77). I can see the 2016-2017 ones very clearly. Neither the number "1.24" (your new JMR nominal) nor "0.997" (your new JMS nominal) can be found on the wiki.

----> The numbers are here (copied from the twiki):

[image: table of 2018 JMS/JMR values copied from the twiki]

Yes, it should be nominal, down, up.

--> It cannot be that for 2016-2017 we have nominal, up, down and then for 2018 nominal, down, up. If this is the way the code handles it, some cleanup is required there.

----> I apologize, that was incorrect. The JMS values below are nominal, down, up, while you are right that the JMR values are nominal, up, down.
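A short illustrative sketch of why a single ordering convention matters (the triplets are the ones quoted in this thread; the helper function is hypothetical, not part of the module):

jmrValues = {
    '2016' : [1.0,  1.2,  0.8],    # nominal, up, down
    '2017' : [1.09, 1.14, 1.04],   # nominal, up, down
    '2018' : [1.24, 1.20, 1.28],   # stored as nominal, down, up per the discussion above
}

def jmr(era, variation='nom'):
    # Unpacking with one fixed convention silently swaps the 2018 systematics.
    nom, up, down = jmrValues[era]
    return {'nom': nom, 'up': up, 'down': down}[variation]

print(jmr('2018', 'up'))   # returns 1.20, which is actually the down variation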

Contributor commented:

So in the end it was good that this was removed, because it was buggy. But the values should now be updated to the 2018 values.

Since there are actually multiple working points for W-tagging (and they should be optional), I can make a PR that defaults the JMS/JMR values to 1.0 but allows other values as optional inputs. I think there are a couple of other outstanding updates/fixes from the JME side, so maybe I should prepare a presentation for the next XPOG meeting?
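A hypothetical sketch of the proposed interface (the function and argument names are illustrative, not the module's actual API): JMS/JMR default to unity, i.e. no scaling and no smearing, unless a working-point-specific triplet is passed in.

def build_jme_config(era, jmsValues=None, jmrValues=None):
    """Return a config dict with JMS/JMR triplets, defaulting to unity (sketch only)."""
    return {
        'era'       : era,
        'jmsValues' : jmsValues if jmsValues is not None else [1.0, 1.0, 1.0],
        'jmrValues' : jmrValues if jmrValues is not None else [1.0, 1.0, 1.0],
    }

# Default: no W-tagging mass corrections applied.
cfg_default = build_jme_config('UL2017')
# Explicit 2017 W-tagging values for a chosen working point.
cfg_wtag    = build_jme_config('2017', jmrValues=[1.09, 1.14, 1.04])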
