Automate creating issues for failing tests on main #36761
Comments
This could get pretty spammy. If there's a good way to prevent that, I would be for something like that.
Is the spamminess about the total number of issues (i.e. the total number of tests failing)? Or about each issue getting a lot of comments? Instead of comments we could just edit the issue body and add a new line for each commit in which a test fails (that would avoid some notifications). One thing I like is that the spam is proportional to the number of things that are broken, so if somebody wants less spam, they know what to do 😅 but I agree we don't want too much spam
That's one way to look at it, yeah 😄 No, I was just expressing my concern. There's definitely a good way of solving this without being too spammy.
Is this issue still open?
@pratik-mahalle Yes! @ArthurSens was working on this (and, indirectly, @ankitpatel96 as well). You are welcome to talk to them if you want to help :)
Yeah, I had been working on this before the end-of-year holidays, but I still haven't caught up since coming back. A WIP branch is here, but I haven't tested it yet.
@ArthurSens Has this issue been resolved, or should I start working on it?
It's not resolved yet! But I'm not sure how one could help here. Do you want to continue the WIP branch? Were you thinking about starting with another strategy?
Mimicking open-telemetry/opentelemetry-collector#11963. This effort is related to #36761. We'll need JUnit test results so that issuegenerator (being worked on in open-telemetry/opentelemetry-go-build-tools#672) can generate issues based on failing tests on main. cc @mx-psi Signed-off-by: Arthur Silva Sens <arthursens2005@gmail.com>
#### Description
This PR extends the current unit test workflow to download the artifacts we started uploading in #37941. Further docs can be found here: https://github.com/actions/download-artifact
I'm also moving files around so that we start small, generating issues only for the `hostmetricsreceiver` component. Once we see that this is working well, I plan to raise another PR to cover all components of contrib.
#### Link to tracking issue
Related to #36761
---------
Signed-off-by: Arthur Silva Sens <arthursens2005@gmail.com>
Co-authored-by: Pablo Baeyens <pbaeyens31+github@gmail.com>
#### Description
After the reverts done in #38230 and #38231, I'm looking again at what went wrong there. I'm solving this in small parts, and this PR focuses on ensuring JUnit artifacts are uploaded correctly.
#### Link to tracking issue
Still related to #36761
#### Testing
I plan to test this by looking at the artifacts produced in this PR; the JUnit files need to be there. See https://github.com/actions/upload-artifact?tab=readme-ov-file#where-does-the-upload-go
---------
Signed-off-by: Arthur Silva Sens <arthursens2005@gmail.com>
Component(s)
No response
Describe the issue you're reporting
We have a bunch of tests that fail on main and are not getting enough attention. We should automate creating an issue (see e.g. here) for failing tests on main. Ideally it should contain:
The workflow should check whether an issue already exists (and, if so, leave a comment on the existing issue instead of opening a new one?)
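As a rough illustration of that check-before-create step, here is a minimal sketch using the google/go-github client. The owner/repo, title format, and comment/body text are placeholders I made up for illustration; the real logic would live in issuegenerator (linked in the PRs above):

```go
package main

import (
	"context"
	"fmt"
	"os"

	"github.com/google/go-github/v60/github"
)

func main() {
	ctx := context.Background()
	client := github.NewClient(nil).WithAuthToken(os.Getenv("GITHUB_TOKEN"))

	// Placeholder title and repo; the real title would be derived from the failing test.
	owner, repo := "open-telemetry", "opentelemetry-collector-contrib"
	title := "[failing test] TestScrape on main"

	// Look for an existing open issue with the same title before creating a new one.
	query := fmt.Sprintf("repo:%s/%s is:issue is:open in:title %q", owner, repo, title)
	result, _, err := client.Search.Issues(ctx, query, nil)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}

	if result.GetTotal() > 0 {
		// An issue already exists: leave a comment instead of opening a duplicate.
		existing := result.Issues[0]
		_, _, err = client.Issues.CreateComment(ctx, owner, repo, existing.GetNumber(),
			&github.IssueComment{Body: github.String("Still failing on main (link to the run would go here).")})
	} else {
		// No matching issue yet: open a new one.
		_, _, err = client.Issues.Create(ctx, owner, repo, &github.IssueRequest{
			Title: github.String(title),
			Body:  github.String("Link to the failing run and test output would go here."),
		})
	}
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```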
Information about failing unit tests can be retrieved via the JSON output of gotestsum.
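For context, gotestsum can save the raw `go test -json` (test2json) event stream with `--jsonfile`. A minimal sketch of collecting failing test names from such a file (the program name and CLI argument are just illustrative):

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

// event mirrors the fields of a `go test -json` (test2json) record that matter here.
type event struct {
	Action  string `json:"Action"`
	Package string `json:"Package"`
	Test    string `json:"Test"`
}

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: failingtests <gotestsum-jsonfile>")
		os.Exit(1)
	}
	f, err := os.Open(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()

	failed := map[string][]string{} // package -> failing test names
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // allow long output lines
	for sc.Scan() {
		var e event
		if err := json.Unmarshal(sc.Bytes(), &e); err != nil {
			continue // skip anything that isn't a JSON event line
		}
		// "fail" events with an empty Test field refer to the package as a whole.
		if e.Action == "fail" && e.Test != "" {
			failed[e.Package] = append(failed[e.Package], e.Test)
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for pkg, tests := range failed {
		for _, t := range tests {
			fmt.Printf("%s: %s\n", pkg, t)
		}
	}
}
```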