
Hermetic Timing SLHC27 (fix crashes in RECO so we can use the usual workflows) #12962

Merged

Conversation

lgray
Contributor

lgray commented Jan 15, 2016

The complete set of changes needed to make the vertexing aware of timing. It still needs tuning, but it gives an answer and does not crash.

We will need to cut a new release for this so that the timing-sample DR campaign can march on.

@boudoul
@bendavid
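
For context, a minimal sketch of how the timing-aware reconstruction might be enabled in a step3 (RECO) configuration is shown below. Only the customization function name comes from this thread (see the local check further down); the import path and the process setup are assumptions for illustration, not part of this PR.

    # Sketch only: apply the fast-timing customization to a RECO configuration.
    # The import path is assumed; only the function name appears in this thread.
    import FWCore.ParameterSet.Config as cms

    process = cms.Process("RECO")
    # ... standard CMSSW_6_2_X_SLHC RECO configuration would be loaded here ...

    from SLHCUpgradeSimulations.Configuration.customizeHGCalPandora_cff import cust_2023HGCalPandoraMuonFastTime
    process = cust_2023HGCalPandoraMuonFastTime(process)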

@lgray
Contributor Author

lgray commented Jan 15, 2016

@cmsbuild please test

cmsbuild added this to the Next CMSSW_6_2_X_SLHC milestone Jan 15, 2016
@cmsbuild
Contributor

The tests are being triggered in jenkins.
https://cmssdt.cern.ch/jenkins/job/ib-any-integration/10541/console

@cmsbuild
Contributor

A new Pull Request was created by @lgray (Lindsey Gray) for CMSSW_6_2_X_SLHC.

It involves the following packages:

DataFormats/PatCandidates
RecoParticleFlow/FastTiming
RecoParticleFlow/PandoraTranslator
RecoVertex/AdaptiveVertexFit
RecoVertex/PrimaryVertexProducer
RecoVertex/VertexPrimitives
TrackingTools/TransientTrack

The following packages do not have a category yet:

RecoParticleFlow/FastTiming
RecoParticleFlow/PandoraTranslator

@cvuosalo, @monttj, @cmsbuild, @slava77, @vadler, @davidlange6 can you please review it and eventually sign? Thanks.
@mmarionncern, @imarches, @makortel, @pvmulder, @acaudron, @abbiendi, @GiacomoSguazzoni, @jhgoh, @VinInn, @bellan, @istaslis, @rovere, @cerati, @ferencek, @gpetruc, @trocino, @dgulhan, @bachtis, @battibass this is something you requested to watch as well.
@slava77, @smuzaffar, @fratnikov, @mark-grimes you are the release managers for this.

The following commands are recognized when placed in the first line of a comment:

  • +1|approve[d]|sign[ed]: L1/L2's to approve it
  • -1|reject[ed]: L1/L2's to reject it
  • assign <category>[,<category>[,...]]: L1/L2's to request signatures from other categories
  • unassign <category>[,<category>[,...]]: L1/L2's to remove signatures from other categories
  • hold: L1/all L2's/release manager to mark it as on hold
  • unhold: L1/user who put this PR on hold
  • merge: L1/release managers to merge this request
  • [@cmsbuild,] please test: L1/L2 and selected users to start jenkins tests
  • [@cmsbuild,] please test with cms-sw/cmsdist#<PR>: L1/L2 and selected users to start jenkins tests using externals from cmsdist

@cmsbuild
Contributor

-1

Tested at: 1383ada
I found errors in the following unit tests:

---> test runtestPhysicsToolsPatAlgos had ERRORS

you can see the results of the tests here:
https://cmssdt.cern.ch/SDT/jenkins-artifacts/pull-request-integration/PR-12962/10541/summary.html

@slava77
Contributor

slava77 commented Jan 15, 2016

The test fails in the IB as well.

for ( /* ... */ i = globalVTracks.begin(); i != globalVTracks.end(); ++i )
{
  //std::cout << " wgt=" << (**i).weight() << std::endl;
}
Contributor

It's unclear whether this no-op debug loop is needed.

Contributor Author

Will remove.

@lgray
Contributor Author

lgray commented Jan 28, 2016

@slava77 Please give this a look and review it. I found the issue causing the crash and fixed it.
Once this is integrated into 620_SLHCX, we need to cut a new release and continue the campaign.

@boudoul

@cmsbuild
Contributor

-1

Tested at: 5cd9e78
I found errors in the following unit tests:

---> test runtestPhysicsToolsPatAlgos had ERRORS

you can see the results of the tests here:
https://cmssdt.cern.ch/SDT/jenkins-artifacts/pull-request-integration/PR-12962/10816/summary.html

@lgray
Contributor Author

lgray commented Jan 28, 2016

As usual, the unit test failures are unrelated.
@slava77 I would appreciate it if you could expedite this.

@slava77
Contributor

slava77 commented Jan 29, 2016

+1

for 5cd9e78

  • the updates look roughly OK based on a somewhat superficial review from the RECO perspective; more work will be needed to get this into an 8X release
  • jenkins tests pass
  • a local check of workflow 12600 with customizeHGCalPandora_cff.cust_2023HGCalPandoraMuonFastTime shows a running workflow before and after the changes, and the new products show up:
-----------------------------------------------------------------
   or, B         new, B      delta, B   delta, %   deltaJ, %    branch 
-----------------------------------------------------------------
      0.0 ->       354.0        354     NEWO   0.09     recoVertexs_offlinePrimaryVertices4D_WithBS_RECO.
      0.0 ->       318.3        318     NEWO   0.08     floatedmValueMap_ecalBarrelClusterFastTimer_PerfectResolutionModelResolution_RECO.
      0.0 ->       337.9        338     NEWO   0.08     floatedmValueMap_trackTimeValueMapProducer_generalTracksConfigurableFlatResolutionModel_RECO.
      0.0 ->       330.6        331     NEWO   0.08     floatedmValueMap_trackTimeValueMapProducer_gsfTracksConfigurableFlatResolutionModel_RECO.
      0.0 ->       573.2        573     NEWO   0.14     floatss_ecalBarrelClusterFastTimer_PerfectResolutionModel_RECO.
      0.0 ->       330.8        331     NEWO   0.08     floatedmValueMap_trackTimeValueMapProducer_gsfTracksConfigurableFlatResolutionModelResolution_RECO.
      0.0 ->       341.8        342     NEWO   0.08     recoVertexs_offlinePrimaryVertices1D__RECO.
      0.0 ->       346.1        346     NEWO   0.09     recoVertexs_offlinePrimaryVertices4D__RECO.
      0.0 ->       349.8        350     NEWO   0.09     recoVertexs_offlinePrimaryVertices1D_WithBS_RECO.
      0.0 ->       334.4        334     NEWO   0.08     floatedmValueMap_trackTimeValueMapProducer_generalTracksConfigurableFlatResolutionModelResolution_RECO.
-------------------------------------------------------------
   403744 ->      407366       3621             0.9     ALL BRANCHES
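
As a quick cross-check of the new products listed above, an FWLite loop along these lines could confirm that the 4D vertex collection is present and filled in the step3 output. This is only an illustrative sketch; the input file name is a placeholder.

    # Sketch only: check that offlinePrimaryVertices4D (one of the new products
    # listed above) is present and filled. The file name is a placeholder.
    import ROOT
    from DataFormats.FWLite import Events, Handle

    events = Events("step3_RECO.root")
    vtx4d = Handle("std::vector<reco::Vertex>")

    for i, event in enumerate(events):
        event.getByLabel("offlinePrimaryVertices4D", vtx4d)
        print("event %d: %d 4D vertices" % (i, vtx4d.product().size()))
        if i >= 4:
            break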

@lgray
Contributor Author

lgray commented Jan 29, 2016

@slava77 You can merge this one into SLHCX as well, since you are the release manager. I'm not sure if @davidlange6 is responsible for merges in the SLHC releases.

@davidlange6
Contributor

SLHCX is past its routine-merging shelf life. Let's discuss the basic validation for the relval and/or production at the next TP, and then we'll build a release.


@lgray
Contributor Author

lgray commented Jan 29, 2016

When is the next TP?

@lgray
Contributor Author

lgray commented Jan 29, 2016

@davidlange6 Just FYI, this PR fixes an issue with the fast-timing customization that was holding back further processing of the samples requested last year, which are already processed through GEN-SIM.

@lgray
Contributor Author

lgray commented Feb 1, 2016

@davidlange6 Here is some validation. I would appreciate it if we could move forward here.

Space-time event display showing 4D and 3D vertex reconstruction (200PU):
vertex_reco_display_200PU_hgg_evt0.pdf

Time resolution at 50PU (chosen for statistics):
dt_to_sim_50PU.pdf

z resolution at 50PU; this is the same as the ~30 µm expected in 3D RECO:
dz_to_sim_50PU.pdf
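
For reference, a dz-to-sim comparison like the one attached could be approximated with a short FWLite loop along these lines. This is a sketch, not the script behind the plots: it assumes the sim vertex position can be approximated by the production vertex of the first genParticle, and the file name is a placeholder.

    # Sketch only (not the script behind the attached plots): fill a dz(reco, sim)
    # histogram for the leading 4D vertex, approximating the sim vertex z with the
    # production vertex of the first genParticle.
    import ROOT
    from DataFormats.FWLite import Events, Handle

    events = Events("step3_RECO_50PU.root")
    vtxs = Handle("std::vector<reco::Vertex>")
    genp = Handle("std::vector<reco::GenParticle>")

    h_dz = ROOT.TH1F("h_dz", ";dz(reco, sim) [cm];vertices", 100, -0.05, 0.05)

    for event in events:
        event.getByLabel("offlinePrimaryVertices4D", vtxs)
        event.getByLabel("genParticles", genp)
        if vtxs.product().size() == 0 or genp.product().size() == 0:
            continue
        h_dz.Fill(vtxs.product()[0].z() - genp.product()[0].vz())

    h_dz.Draw()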

@lgray
Contributor Author

lgray commented Feb 1, 2016

@davidlange6 Since there is no upgrade studies group meeting this week, where else should we discuss this PR and the release?

@lgray
Contributor Author

lgray commented Feb 3, 2016

@davidlange6 Ping, what shall we do here?

@davidlange6
Contributor

I finally finished an email on the topic. Sorry for the delay. @lgray

@lgray
Contributor Author

lgray commented Feb 4, 2016

@makortel @VinInn Could you take a look and give any suggestions or some sort of sign-off? It would help push this PR along. (An email is coming to you separately.)

@makortel
Contributor

makortel commented Feb 4, 2016

@lgray Looks fine to me, but you probably want to wait for @VinInn.

@VinInn
Contributor

VinInn commented Feb 4, 2016

OK; for 80X (or 81X) it needs a bit of streamlining (too many ifs).

@lgray
Contributor Author

lgray commented Feb 4, 2016

@VinInn thanks. Sure, for 80X I can make it less branchy.
By the way, how long until Phase 2 tracking is in 80X (i.e. a running workflow)?
If the tracking is already there, I could start forward-porting this.

@lgray
Contributor Author

lgray commented Feb 8, 2016

@davidlange6 Please have a look here. Vincenzo was OK with alterations.

davidlange6 added a commit that referenced this pull request Feb 9, 2016
Hermetic Timing SLHC27 (fix crashes in RECO so we can use the usual workflows)
davidlange6 merged commit 2c7bd6f into cms-sw:CMSSW_6_2_X_SLHC Feb 9, 2016
@lgray
Contributor Author

lgray commented Feb 11, 2016

@boudoul Since SLHC28 is now out, who do I need to talk to in order to start the various DR steps for the pileup levels we discussed?

@boudoul
Contributor

boudoul commented Feb 11, 2016

Hi @lgray, I'll send a mail to the relevant parties this afternoon.

@lgray
Contributor Author

lgray commented Feb 11, 2016

@boudoul thanks a lot!


@lgray
Contributor Author

lgray commented Feb 11, 2016

@boudoul Just a reminder of the PU values we agreed upon: 20, 50, 80, 110, 140, 200.
