
🎉 Destination Redshift: Add "Loading Method" option to Redshift Destination spec and UI #13415

Merged: 9 commits into master on Jun 10, 2022

Conversation

@alexandertsukanov (Contributor) commented on Jun 2, 2022

What

This PR improves the UX/UI by adding a dropdown for selecting the Redshift data uploading method:
[image: screenshot of the new Uploading Method dropdown]

How

Modified the connector's specification.

Recommended reading order

  1. x.java

🚨 User Impact 🚨

After this update, the uploading method will default to Standard; customers who use S3 Staging will need to reconfigure the method and fill in the appropriate parameters again.

Pre-merge Checklist

Updating a connector

Community member or Airbyter

  • Grant edit access to maintainers (instructions)
  • Secrets in the connector's spec are annotated with airbyte_secret
  • Unit & integration tests added and passing. Community members, please provide proof of success locally, e.g. a screenshot or a copy-paste of the unit, integration, and acceptance test output. To run acceptance tests for a Python connector, follow the instructions in the README. For Java connectors, run ./gradlew :airbyte-integrations:connectors:<name>:integrationTest.
  • Code reviews completed
  • Documentation updated
    • Connector's README.md
    • Connector's bootstrap.md. See description and examples
    • Changelog updated in docs/integrations/<source or destination>/<name>.md. See changelog example
  • PR name follows PR naming conventions

Airbyter

If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.

  • Create a non-forked branch based on this PR and test the below items on it
  • Build is successful
  • If new credentials are required for use in CI, add them to GSM. Instructions.
  • /test connector=connectors/<name> command is passing
  • New Connector version released on Dockerhub and connector version bumped by running the /publish command described here

Tests

Unit

Put your unit tests output here.

Integration

Put your integration tests output here.

Acceptance

Put your acceptance tests output here.

@github-actions bot added the area/connectors (Connector related issues) and area/documentation (Improvements or additions to documentation) labels on Jun 2, 2022
@alexandertsukanov alexandertsukanov linked an issue Jun 2, 2022 that may be closed by this pull request
@alexandertsukanov (Contributor, Author) commented on Jun 2, 2022

/test connector=connectors/destination-redshift

🕑 connectors/destination-redshift https://github.com/airbytehq/airbyte/actions/runs/2428502158
✅ connectors/destination-redshift https://github.com/airbytehq/airbyte/actions/runs/2428502158
Python tests coverage:

Name                                                              Stmts   Miss  Cover
-------------------------------------------------------------------------------------
normalization/transform_config/__init__.py                            2      0   100%
normalization/transform_catalog/reserved_keywords.py                 13      0   100%
normalization/transform_catalog/__init__.py                           2      0   100%
normalization/destination_type.py                                    13      0   100%
normalization/__init__.py                                             4      0   100%
normalization/transform_catalog/destination_name_transformer.py     155      8    95%
normalization/transform_config/transform.py                         159     31    81%
normalization/transform_catalog/table_name_registry.py              174     34    80%
normalization/transform_catalog/utils.py                             38      9    76%
normalization/transform_catalog/dbt_macro.py                         22      7    68%
normalization/transform_catalog/catalog_processor.py                147     80    46%
normalization/transform_catalog/transform.py                         61     38    38%
normalization/transform_catalog/stream_processor.py                 543    352    35%
-------------------------------------------------------------------------------------
TOTAL                                                              1333    559    58%

Build Passed

Test summary info:

All Passed

@alexandr-shegeda alexandr-shegeda marked this pull request as ready for review June 2, 2022 16:50
"eu-west-3",
"sa-east-1",
"me-south-1"
"uploading_method": {
@alexandr-shegeda (Contributor) commented:
I'm a bit concerned about breaking backward compatibility. @alexandertsukanov, is there a way to pick S3 staging by default and read the existing config instead of making users update their connector settings?
cc @grishick

@alexandertsukanov (Contributor, Author) replied:

@alexandr-shegeda, this was fixed and covered with unit tests. Great comment, thanks.

Sema summary: 👌 This code looks good

@alexandertsukanov (Contributor, Author) commented on Jun 3, 2022

/test connector=connectors/destination-redshift

🕑 connectors/destination-redshift https://github.com/airbytehq/airbyte/actions/runs/2433523841
✅ connectors/destination-redshift https://github.com/airbytehq/airbyte/actions/runs/2433523841
Python tests coverage:

Name                                                              Stmts   Miss  Cover
-------------------------------------------------------------------------------------
normalization/transform_config/__init__.py                            2      0   100%
normalization/transform_catalog/reserved_keywords.py                 13      0   100%
normalization/transform_catalog/__init__.py                           2      0   100%
normalization/destination_type.py                                    13      0   100%
normalization/__init__.py                                             4      0   100%
normalization/transform_catalog/destination_name_transformer.py     155      8    95%
normalization/transform_config/transform.py                         159     31    81%
normalization/transform_catalog/table_name_registry.py              174     34    80%
normalization/transform_catalog/utils.py                             38      9    76%
normalization/transform_catalog/dbt_macro.py                         22      7    68%
normalization/transform_catalog/catalog_processor.py                147     80    46%
normalization/transform_catalog/transform.py                         61     38    38%
normalization/transform_catalog/stream_processor.py                 543    352    35%
-------------------------------------------------------------------------------------
TOTAL                                                              1333    559    58%

Build Passed

Test summary info:

All Passed

@alexandr-shegeda (Contributor) left a comment:
looks great

@edgao (Contributor) left a comment:

checking my understanding: Existing configs (including S3 and standard inserts) will continue to work, because validateIfAllRequiredS3fieldsAreNullOrEmpty will try to find the upload method entry, fail to find it, and just return the top-level config node?

LOGGER.warn("The \"standard\" upload mode is not performant, and is not recommended for production. " +
"Please use the Amazon S3 upload mode if you are syncing a large amount of data.");
return DestinationType.STANDARD;
}

if (isNullOrEmpty(bucketNode) && isNullOrEmpty(regionNode) && isNullOrEmpty(accessKeyIdNode)
&& isNullOrEmpty(secretAccessKeyNode)) {
if (validateIfAllRequiredS3fieldsAreNullOrEmpty(jsonNode)) {
Contributor:

hm, I think this might have been bugged previously? It seems weird that this condition is identical to the previous one. Maybe it was supposed to be using || instead of &&?

@alexandertsukanov (Contributor, Author) replied on Jun 8, 2022:

Sorry, the statement itself is correct; the method name was just a bit confusing. Fixed, please take a look.

@edgao (Contributor):

I think I'm confused about when this condition is ever true: the if-statement right above has the exact same condition, so wouldn't we always hit the return DestinationType.STANDARD?

@alexandertsukanov (Contributor, Author) replied:

@edgao, if none of the following parameters:

   return isNullOrEmpty(jsonNode.get("s3_bucket_name"))
        && isNullOrEmpty(jsonNode.get("s3_bucket_region"))
        && isNullOrEmpty(jsonNode.get("access_key_id"))
        && isNullOrEmpty(jsonNode.get("secret_access_key"));

is null or empty, the method will return DestinationType.COPY_S3.

However, I strongly agree with you that we don't need to check these parameters on the backend, since if any of them is null or empty we will hit DestinationType.STANDARD anyway.
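
For readers following along, here is a minimal, self-contained sketch of the decision logic discussed in this thread. The S3 field names, the warning text, and the DestinationType values are taken from the diff quoted above; the class name UploadModeSketch, the method name determineUploadMode, and the isNullOrEmpty helper shown here are illustrative assumptions, not the connector's actual code.

    import com.fasterxml.jackson.databind.JsonNode;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public final class UploadModeSketch {

      private static final Logger LOGGER = LoggerFactory.getLogger(UploadModeSketch.class);

      enum DestinationType { STANDARD, COPY_S3 }

      // Sketch only: field names and the warning text come from the quoted diff;
      // the method name and the helper below are assumptions for illustration.
      public static DestinationType determineUploadMode(final JsonNode config) {
        // New-style configs nest the S3 fields under "uploading_method"; legacy configs keep them top-level.
        final JsonNode jsonNode = config.has("uploading_method") ? config.get("uploading_method") : config;

        if (isNullOrEmpty(jsonNode.get("s3_bucket_name"))
            && isNullOrEmpty(jsonNode.get("s3_bucket_region"))
            && isNullOrEmpty(jsonNode.get("access_key_id"))
            && isNullOrEmpty(jsonNode.get("secret_access_key"))) {
          LOGGER.warn("The \"standard\" upload mode is not performant, and is not recommended for production. "
              + "Please use the Amazon S3 upload mode if you are syncing a large amount of data.");
          return DestinationType.STANDARD;
        }
        return DestinationType.COPY_S3;
      }

      private static boolean isNullOrEmpty(final JsonNode node) {
        return node == null || node.isNull() || node.asText().isEmpty();
      }
    }

With a single check like this, an old config with top-level S3 fields and a new config with the nested uploading_method object both resolve to COPY_S3, and only configs with no S3 fields at all fall back to STANDARD.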

return config.has(UPLOADING_METHOD) ? config.get(UPLOADING_METHOD) : config;
}

public static boolean validateIfAllRequiredS3fieldsAreNullOrEmpty(final JsonNode jsonNode) {
Contributor:

nitpick: rename to hasNoS3Fields

@alexandertsukanov (Contributor, Author) replied:

Please take a look at the comment above.

@alexandertsukanov (Contributor, Author) commented on Jun 9, 2022

> checking my understanding: Existing configs (including S3 and standard inserts) will continue to work, because validateIfAllRequiredS3fieldsAreNullOrEmpty will try to find the upload method entry, fail to find it, and just return the top-level config node?

@edgao This is correct.
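
To make the confirmed behavior concrete, here is a small usage sketch. It reuses the hypothetical UploadModeSketch from the earlier comment together with Jackson's ObjectMapper; it is not the connector's actual test suite, and the bucket and credential values are placeholders.

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public final class BackwardCompatibilityDemo {

      public static void main(final String[] args) throws Exception {
        final ObjectMapper mapper = new ObjectMapper();

        // Legacy config: S3 fields at the top level, no "uploading_method" key.
        final JsonNode legacy = mapper.readTree(
            "{\"s3_bucket_name\":\"my-bucket\",\"s3_bucket_region\":\"eu-west-3\","
                + "\"access_key_id\":\"key\",\"secret_access_key\":\"secret\"}");

        // New config: the same fields nested under "uploading_method".
        final JsonNode nested = mapper.readTree(
            "{\"uploading_method\":{\"s3_bucket_name\":\"my-bucket\",\"s3_bucket_region\":\"eu-west-3\","
                + "\"access_key_id\":\"key\",\"secret_access_key\":\"secret\"}}");

        // A standard-inserts config with no S3 fields at all.
        final JsonNode standard = mapper.readTree("{}");

        System.out.println(UploadModeSketch.determineUploadMode(legacy));   // COPY_S3
        System.out.println(UploadModeSketch.determineUploadMode(nested));   // COPY_S3
        System.out.println(UploadModeSketch.determineUploadMode(standard)); // STANDARD
      }
    }

Both the legacy and the nested shape resolve to COPY_S3, which is the backward-compatibility guarantee confirmed above; only a config with no S3 fields falls back to STANDARD.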

@edgao (Contributor) left a comment:

:shipit:

@alexandertsukanov (Contributor, Author) commented on Jun 10, 2022

/publish connector=connectors/destination-redshift

🕑 connectors/destination-redshift https://github.com/airbytehq/airbyte/actions/runs/2474422800
❌ Failed to publish connectors/destination-redshift
❌ Couldn't auto-bump version for connectors/destination-redshift

@alexandertsukanov (Contributor, Author) commented on Jun 10, 2022

/publish connector=connectors/destination-redshift

🕑 connectors/destination-redshift https://github.com/airbytehq/airbyte/actions/runs/2474511057
🚀 Successfully published connectors/destination-redshift
🚀 Auto-bumped version for connectors/destination-redshift
✅ connectors/destination-redshift https://github.com/airbytehq/airbyte/actions/runs/2474511057

@octavia-squidington-iii temporarily deployed to more-secrets on June 10, 2022 11:42 (Inactive)
@alexandertsukanov merged commit b45014d into master on Jun 10, 2022
@alexandertsukanov deleted the otsukanov/airbyte-12709_add_loading_method branch on June 10, 2022 11:47
Labels: area/connectors (Connector related issues), area/documentation (Improvements or additions to documentation), connectors/destination/redshift
Projects: None yet
Development: Successfully merging this pull request may close: Add "Loading Method" option to Redshift Destination spec
6 participants