
Publish command: call docker hub api to check the tag was registered #11654

Merged

Conversation

alafanechere
Contributor

@alafanechere alafanechere commented Apr 1, 2022

What

In some rare situations, a docker tag can be successfully pushed while DockerHub does not register it, making the tag unavailable both in the UI and in the API we call to check whether the image exists.

This problem happened for source-bing-ads:0.1.4:

(related Slack conversation for fellow Airbyters)

How

I want to detect this error early in our release process by making the /publish command fail if this problem occurs.
This will allow us to catch it before merging a connector PR and avoid breaking the Airbyte release.

Contributor

@davinchia davinchia left a comment


Looks like there is a name typo; otherwise it looks good.

Co-authored-by: Davin Chia <davinchia@gmail.com>
@alafanechere alafanechere temporarily deployed to more-secrets April 1, 2022 09:27 Inactive
# Checking if the image was successfully registered on DockerHub
TAG_URL="https://hub.docker.com/v2/repositories/${image_name}/tags/${image_version}"
DOCKERHUB_RESPONSE_CODE=$(curl --silent --output /dev/null --write-out "%{http_code}" "${TAG_URL}")
if [[ "${DOCKERHUB_RESPONSE_CODE}" == "404" ]]; then
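The excerpt above stops at the `if`. A fuller sketch of the failure branch, written here as a small helper so it can be tested in isolation, might look like this; the message wording and failure behavior are assumptions inferred from the review discussion, not the merged code:

```shell
#!/usr/bin/env bash
# Hypothetical helper: decide whether the publish step should fail given
# the HTTP status code returned by the DockerHub tags endpoint.
# (The actual script inlines this logic; message text is an assumption.)
check_tag_registered() {
  local http_code=$1 image=$2 tag=$3
  if [[ "${http_code}" == "404" ]]; then
    echo "${image}:${tag} is not registered on DockerHub: bump the connector version and publish again."
    return 1
  fi
  return 0
}
```

In the snippet above this would be invoked as something like `check_tag_registered "${DOCKERHUB_RESPONSE_CODE}" "${image_name}" "${image_version}" || exit 1`.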
Contributor


some questions:
1. Is this API immediately consistent, or eventually consistent? If the latter, we may start failing a lot of builds for transient reasons.
2. What is the expectation for an engineer once they run into this error? Are they expected to republish? Bump the version again? If their build fails, what is their actionable next step?

Contributor Author


  1. I don't know. As of today, the connector I published that led to a 404 on DockerHub still returns a 404. I could add a 5-second sleep in case there's a small delay before the tag is available on the API after the push.
  2. Bumping again is the suggested action. Republishing won't work because the image is registered but not discoverable via the API.

Contributor


Could you test #1, please? I just want to make sure this doesn't suddenly start introducing a ton of failures for a case that occurs 1% of the time.

If we want to move forward, then for #2 could you add that to the log message (the action the dev needs to take)?

Contributor Author


I tested running a publish workflow on destination-e2e, whose latest version was not published. It worked. I also tried with the 404 URL to test the behavior in case of a bad API response; it also worked as expected: the job failed and the message I set was echoed.

@alafanechere alafanechere requested a review from sherifnada April 4, 2022 17:49
@alafanechere alafanechere temporarily deployed to more-secrets April 4, 2022 17:54 Inactive
@github-actions github-actions bot added the area/connectors Connector related issues label Apr 4, 2022
@alafanechere alafanechere temporarily deployed to more-secrets April 4, 2022 22:05 Inactive
@alafanechere alafanechere temporarily deployed to more-secrets April 4, 2022 22:05 Inactive
@alafanechere alafanechere temporarily deployed to more-secrets April 4, 2022 22:17 Inactive
@alafanechere alafanechere temporarily deployed to more-secrets April 4, 2022 22:22 Inactive
@alafanechere alafanechere temporarily deployed to more-secrets April 4, 2022 22:29 Inactive
@alafanechere alafanechere temporarily deployed to more-secrets April 4, 2022 22:48 Inactive
@alafanechere alafanechere temporarily deployed to more-secrets April 4, 2022 23:10 Inactive
@alafanechere alafanechere temporarily deployed to more-secrets April 4, 2022 23:13 Inactive
@alafanechere alafanechere merged commit 5e676fe into master Apr 4, 2022
@alafanechere alafanechere deleted the augustin/publish-command/check-dockerhub-registered branch April 4, 2022 23:14
# see the description of this PR to understand why this is needed https://github.com/airbytehq/airbyte/pull/11654/
sleep 5
TAG_URL="https://hub.docker.com/v2/repositories/${image_name}/tags/${image_version}"
DOCKERHUB_RESPONSE_CODE=$(curl --silent --output /dev/null --write-out "%{http_code}" "${TAG_URL}")
Contributor


I wonder if this step would benefit from some retries? Does DockerHub fail in a way that a retry shortly after would be useful? The man page for curl has a fairly deep set of retry options if that is interesting to you.

cool use of pulling the HTTP code here ☝️
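One caveat on curl's built-in retries: `--retry` only treats certain responses as transient (timeouts and HTTP 408, 429, and most 5xx codes), so it would not re-check a 404. A small wrapper loop is one alternative; this is a sketch of the idea, not the merged code, and `retry_check` is a hypothetical helper name:

```shell
#!/usr/bin/env bash
# retry_check: run a command up to max_attempts times, sleeping `delay`
# seconds between failed attempts. Returns 0 on the first success,
# 1 if all attempts fail. (Hypothetical helper, not merged code.)
retry_check() {
  local max_attempts=$1 delay=$2
  shift 2
  local attempt
  for (( attempt = 1; attempt <= max_attempts; attempt++ )); do
    if "$@"; then
      return 0
    fi
    if (( attempt < max_attempts )); then
      sleep "${delay}"
    fi
  done
  return 1
}
```

Applied to the tag check, something like `retry_check 3 5 curl --silent --fail --output /dev/null "${TAG_URL}"` would work, since `curl --fail` exits non-zero on a 404 and so triggers the retry.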

@tuliren
Contributor

tuliren commented May 4, 2022

@alafanechere, this check does not work for private repositories such as destination-databricks. I created #12584 to track this issue.
