
Add extra job to openstack-operator for testing 4.18 #62633

Open
wants to merge 1 commit into
base: master

Conversation

lewisdenny
Contributor

No description provided.


openshift-ci bot commented Mar 11, 2025

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: lewisdenny

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@openshift-ci openshift-ci bot added the approved Indicates a PR has been approved by an approver from all required OWNERS files. label Mar 11, 2025
@openshift-ci openshift-ci bot requested review from bshephar and frenzyfriday March 11, 2025 06:25
@openshift-ci-robot

[REHEARSALNOTIFIER]
@lewisdenny: the pj-rehearse plugin accommodates running rehearsal tests for the changes in this PR. Expand 'Interacting with pj-rehearse' for usage details. The following rehearsable tests have been affected by this change:

Test name Repo Type Reason
pull-ci-openstack-k8s-operators-openstack-operator-18.0-fr2-openstack-operator-build-deploy-kuttl-4-18 openstack-k8s-operators/openstack-operator presubmit Presubmit changed
pull-ci-openstack-k8s-operators-openstack-operator-main-openstack-operator-build-deploy-kuttl-4-18 openstack-k8s-operators/openstack-operator presubmit Presubmit changed

Prior to this PR being merged, you will need to either run and acknowledge or opt to skip these rehearsals.

Interacting with pj-rehearse

Comment: /pj-rehearse to run up to 5 rehearsals
Comment: /pj-rehearse skip to opt-out of rehearsals
Comment: /pj-rehearse {test-name}, with each test separated by a space, to run one or more specific rehearsals
Comment: /pj-rehearse more to run up to 10 rehearsals
Comment: /pj-rehearse max to run up to 25 rehearsals
Comment: /pj-rehearse auto-ack to run up to 5 rehearsals, and add the rehearsals-ack label on success
Comment: /pj-rehearse list to get an up-to-date list of affected jobs
Comment: /pj-rehearse abort to abort all active rehearsals
Comment: /pj-rehearse network-access-allowed to allow rehearsals of tests that have the restrict_network_access field set to false. This must be executed by an openshift org member who is not the PR author

Once you are satisfied with the results of the rehearsals, comment: /pj-rehearse ack to unblock merge. When the rehearsals-ack label is present on your PR, merge will no longer be blocked by rehearsals.
If you would like the rehearsals-ack label removed, comment: /pj-rehearse reject to re-block merging.

@lewisdenny
Contributor Author

/pj-rehearse pull-ci-openstack-k8s-operators-openstack-operator-18.0-fr2-openstack-operator-build-deploy-kuttl-4-18 pull-ci-openstack-k8s-operators-openstack-operator-main-openstack-operator-build-deploy-kuttl-4-18

@openshift-ci-robot

@lewisdenny: now processing your pj-rehearse request. Please allow up to 10 minutes for jobs to trigger or cancel.

@stuggi

stuggi commented Mar 11, 2025

Seems we lost the leader-election lock during the kuttl tests, and the openstack-operator manager got restarted.
[1]

2025-03-11T08:25:39.714497412Z E0311 08:25:39.714357       1 leaderelection.go:369] Failed to update lock: etcdserver: request timed out
2025-03-11T08:25:53.758628157Z E0311 08:25:53.758528       1 leaderelection.go:332] error retrieving resource lock openstack-operators/40ba705e.openstack.org: Get "https://172.30.0.1:443/apis/coordination.k8s.io/v1/namespaces/openstack-operators/leases/40ba705e.openstack.org": context deadline exceeded
2025-03-11T08:25:53.758628157Z I0311 08:25:53.758597       1 leaderelection.go:285] failed to renew lease openstack-operators/40ba705e.openstack.org: timed out waiting for the condition
2025-03-11T08:25:53.758778881Z 2025-03-11T08:25:53.758Z	ERROR	setup	problem running manager	{"error": "leader election lost"}
2025-03-11T08:25:53.758778881Z main.main
2025-03-11T08:25:53.758778881Z 	/remote-source/main.go:312
2025-03-11T08:25:53.758778881Z runtime.main
2025-03-11T08:25:53.758778881Z 	/usr/lib/golang/src/runtime/proc.go:267
2025-03-11T08:25:53.758801971Z 2025-03-11T08:25:53.758Z	INFO	Stopping and waiting for non leader election runnables
2025-03-11T08:25:53.758809521Z 2025-03-11T08:25:53.758Z	INFO	Stopping and waiting for leader election runnables

As a result, the test failed to connect to the webhook:

logger.go:42: 08:25:57 | ctlplane-basic-deployment-with-nicMappings/2-deploy-openstack | Error from server (InternalError): error when creating "STDIN": Internal error occurred: failed calling webhook "mopenstackcontrolplane.kb.io": failed to call webhook: Post "https://openstack-operator-webhook-service.openstack-operators.svc:443/mutate-core-openstack-org-v1beta1-openstackcontrolplane?timeout=10s": no endpoints available for service "openstack-operator-webhook-service"
    case.go:378: failed in step 2-deploy-openstack

Let's rerun to see if that happens again.

[1] https://gcsweb-ci.apps.ci.l2s4.p1.openshiftapps.com/gcs/test-platform-results/pr-logs/pull/openshift_release/62633/rehearse-62633-pull-ci-openstack-k8s-operators-openstack-operator-18.0-fr2-openstack-operator-build-deploy-kuttl-4-18/1899347299306311680/artifacts/openstack-operator-build-deploy-kuttl-4-18/openstack-k8s-operators-gather/artifacts/must-gather/quay-io-openshift-release-dev-ocp-v4-0-art-dev-sha256-0cc6d999e5e52bfe425b80493657ffd973e8d8729faf520d2217e6ccef6c08ca/namespaces/openstack-operators/pods/openstack-operator-controller-manager-6bd5988c79-rkn2t/manager/manager/logs/previous.log

@stuggi

stuggi commented Mar 11, 2025

/pj-rehearse pull-ci-openstack-k8s-operators-openstack-operator-18.0-fr2-openstack-operator-build-deploy-kuttl-4-18 pull-ci-openstack-k8s-operators-openstack-operator-main-openstack-operator-build-deploy-kuttl-4-18

@openshift-ci-robot

@stuggi: now processing your pj-rehearse request. Please allow up to 10 minutes for jobs to trigger or cancel.

@stuggi

stuggi commented Mar 11, 2025

/pj-rehearse pull-ci-openstack-k8s-operators-openstack-operator-18.0-fr2-openstack-operator-build-deploy-kuttl-4-18 pull-ci-openstack-k8s-operators-openstack-operator-main-openstack-operator-build-deploy-kuttl-4-18

@openshift-ci-robot

@stuggi: now processing your pj-rehearse request. Please allow up to 10 minutes for jobs to trigger or cancel.

@stuggi

stuggi commented Mar 11, 2025

/pj-rehearse pull-ci-openstack-k8s-operators-openstack-operator-main-openstack-operator-build-deploy-kuttl-4-18

@openshift-ci-robot

@stuggi: now processing your pj-rehearse request. Please allow up to 10 minutes for jobs to trigger or cancel.

@stuggi

stuggi commented Mar 12, 2025

For https://prow.ci.openshift.org/view/gs/test-platform-results/pr-logs/pull/openshift_release/62633/rehearse-62633-pull-ci-openstack-k8s-operators-openstack-operator-18.0-fr2-openstack-operator-build-deploy-kuttl-4-18/1899488144177238016, I think it's not a real issue. It seems that with 4.18 a succeeded Job now reports an additional condition, which makes the validation fail, since the assert only expects a single entry in the conditions list: https://github.com/openstack-k8s-operators/openstack-operator/blob/main/tests/kuttl/tests/dataplane-deploy-global-service-test/01-assert.yaml#L172-L174

        +  - lastProbeTime: "2025-03-11T19:03:12Z"
        +    lastTransitionTime: "2025-03-11T19:03:12Z"
        +    message: Reached expected number of succeeded pods
        +    reason: CompletionsReached
        +    status: "True"
        +    type: SuccessCriteriaMet
        +  - lastProbeTime: "2025-03-11T19:03:12Z"
        +    lastTransitionTime: "2025-03-11T19:03:12Z"
        +    message: Reached expected number of succeeded pods
        +    reason: CompletionsReached
        +    status: "True"
             type: Complete
        +  ready: 0
        +  startTime: "2025-03-11T19:03:06Z"
           succeeded: 1
        +  terminating: 0
           uncountedTerminatedPods: {}
         
        
    case.go:380: resource Job:openstack-kuttl-tests/custom-global-service-edpm-compute-global: .status.conditions: slice length mismatch: 1 != 2
    case.go:380: --- Job:openstack-kuttl-tests/download-cache-edpm-compute-global-edpm-compute-global
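A possible fix (a sketch only, not verified against the repo) is to extend the assert so the conditions list has both entries the 4.18 Job controller now reports:

```yaml
# Hypothetical edit to 01-assert.yaml: with 4.18 a succeeded Job carries
# both SuccessCriteriaMet and Complete, so the assert needs two entries
# to avoid the "slice length mismatch: 1 != 2" failure.
status:
  conditions:
  - reason: CompletionsReached
    status: "True"
    type: SuccessCriteriaMet
  - reason: CompletionsReached
    status: "True"
    type: Complete
  succeeded: 1
```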

@stuggi

stuggi commented Mar 12, 2025

@stuggi

stuggi commented Mar 13, 2025

/pj-rehearse pull-ci-openstack-k8s-operators-openstack-operator-main-openstack-operator-build-deploy-kuttl-4-18

@openshift-ci-robot

@stuggi: now processing your pj-rehearse request. Please allow up to 10 minutes for jobs to trigger or cancel.

@stuggi

stuggi commented Mar 13, 2025

/pj-rehearse pull-ci-openstack-k8s-operators-openstack-operator-18.0-fr2-openstack-operator-build-deploy-kuttl-4-18

@openshift-ci-robot

@stuggi: now processing your pj-rehearse request. Please allow up to 10 minutes for jobs to trigger or cancel.

@stuggi

stuggi commented Mar 13, 2025

A crc_storage issue, and it failed to save the artifacts ...

ERRO[2025-03-13T09:35:21Z] Failed to save artifacts before releasing the claimed cluster  clusterClaim.Name=eca7ed5a-8b39-45aa-a3fd-2d3303c0b967 clusterClaim.Namespace=openstack-k8s-operators-cluster-pool error=failed to get cluster claim eca7ed5a-8b39-45aa-a3fd-2d3303c0b967 in namespace openstack-k8s-operators-cluster-pool: clusterclaims.hive.openshift.io "eca7ed5a-8b39-45aa-a3fd-2d3303c0b967" not found

@stuggi

stuggi commented Mar 13, 2025

/pj-rehearse pull-ci-openstack-k8s-operators-openstack-operator-18.0-fr2-openstack-operator-build-deploy-kuttl-4-18 pull-ci-openstack-k8s-operators-openstack-operator-main-openstack-operator-build-deploy-kuttl-4-18

@openshift-ci-robot

@stuggi: now processing your pj-rehearse request. Please allow up to 10 minutes for jobs to trigger or cancel.


openshift-ci bot commented Mar 13, 2025

@lewisdenny: The following test failed, say /retest to rerun all failed tests or /retest-required to rerun all mandatory failed tests:

Test name Commit Details Required Rerun command
ci/rehearse/openstack-k8s-operators/openstack-operator/18.0-fr2/openstack-operator-build-deploy-kuttl-4-18 4b64756 link unknown /pj-rehearse pull-ci-openstack-k8s-operators-openstack-operator-18.0-fr2-openstack-operator-build-deploy-kuttl-4-18

Full PR test history. Your PR dashboard.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository. I understand the commands that are listed here.

@stuggi

stuggi commented Mar 13, 2025

/retest

@stuggi

stuggi commented Mar 14, 2025

/pj-rehearse pull-ci-openstack-k8s-operators-openstack-operator-18.0-fr2-openstack-operator-build-deploy-kuttl-4-18

@openshift-ci-robot

@stuggi: now processing your pj-rehearse request. Please allow up to 10 minutes for jobs to trigger or cancel.

Labels
approved Indicates a PR has been approved by an approver from all required OWNERS files.

3 participants