🎉 Destination Redshift: Add "Loading Method" option to Redshift Destination spec and UI #13415
Conversation
...io/airbyte/integrations/destination/redshift/RedshiftStagingS3DestinationAcceptanceTest.java
/test connector=connectors/destination-redshift
Build Passed. Test summary info:
"eu-west-3", | ||
"sa-east-1", | ||
"me-south-1" | ||
"uploading_method": { |
I'm a bit concerned about breaking backward compatibility. @alexandertsukanov is there a way to pick S3 staging by default, and perhaps read the existing config, instead of making users update their connector settings?
cc @grishick
@alexandr-shegeda this was fixed and covered with unit tests. Great comment, thanks.
/test connector=connectors/destination-redshift
Build Passed. Test summary info:
looks great
Checking my understanding: existing configs (including both S3 and standard inserts) will continue to work, because validateIfAllRequiredS3fieldsAreNullOrEmpty will try to find the upload method entry, fail to find it, and just return the top-level config node?
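To make the behavior being asked about concrete, here is a minimal, self-contained sketch, not code from this PR (the class and method names are invented), of how the `config.has(UPLOADING_METHOD) ? config.get(UPLOADING_METHOD) : config` fallback quoted further down in RedshiftUtil resolves a legacy flat config versus a new nested one:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class UploadingMethodFallbackSketch {

  private static final String UPLOADING_METHOD = "uploading_method";

  // Mirrors the ternary quoted below: prefer the nested node when present,
  // otherwise fall back to the top-level (legacy) config.
  static JsonNode resolveUploadingMethodNode(final JsonNode config) {
    return config.has(UPLOADING_METHOD) ? config.get(UPLOADING_METHOD) : config;
  }

  public static void main(final String[] args) throws Exception {
    final ObjectMapper mapper = new ObjectMapper();

    // Legacy config without "uploading_method": the top-level node is returned,
    // so existing S3 (or standard-insert) settings are still visible to the connector.
    final JsonNode legacy = mapper.readTree("{\"s3_bucket_name\":\"my-bucket\"}");
    System.out.println(resolveUploadingMethodNode(legacy).has("s3_bucket_name")); // true

    // New config: the nested "uploading_method" node is returned instead.
    final JsonNode updated = mapper.readTree(
        "{\"uploading_method\":{\"s3_bucket_name\":\"my-bucket\"}}");
    System.out.println(resolveUploadingMethodNode(updated).has("s3_bucket_name")); // true
  }
}
```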
LOGGER.warn("The \"standard\" upload mode is not performant, and is not recommended for production. " + | ||
"Please use the Amazon S3 upload mode if you are syncing a large amount of data."); | ||
return DestinationType.STANDARD; | ||
} | ||
|
||
if (isNullOrEmpty(bucketNode) && isNullOrEmpty(regionNode) && isNullOrEmpty(accessKeyIdNode) | ||
&& isNullOrEmpty(secretAccessKeyNode)) { | ||
if (validateIfAllRequiredS3fieldsAreNullOrEmpty(jsonNode)) { |
Hm, I think this might have been bugged previously? It seems weird that this condition is identical to the previous one. Maybe it was supposed to be using || instead of &&?
Sorry, the statement itself is correct; it was just a little confusing because of the name. Fixed, please take a look.
I think I'm confused about when this condition is ever true: the if-statement right above has the exact same condition, so wouldn't we always hit the return DestinationType.STANDARD?
@edgao if none of the following parameters is null or empty:

```java
return isNullOrEmpty(jsonNode.get("s3_bucket_name"))
    && isNullOrEmpty(jsonNode.get("s3_bucket_region"))
    && isNullOrEmpty(jsonNode.get("access_key_id"))
    && isNullOrEmpty(jsonNode.get("secret_access_key"));
```

the method will return DestinationType.COPY_S3. However, I strongly agree with you that we don't need this check on the backend, because if any of them is null or empty we will hit DestinationType.STANDARD anyway.
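Condensing the snippets quoted above into one place, here is a hypothetical sketch of the decision being discussed. The warning text, the S3 field names, and the DestinationType values come from this thread; the method name determineUploadMode, the isNullOrEmpty helper, and the class wrapper are assumptions for illustration, not the PR's actual code.

```java
import com.fasterxml.jackson.databind.JsonNode;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class UploadModeDecisionSketch {

  private static final Logger LOGGER = LoggerFactory.getLogger(UploadModeDecisionSketch.class);

  enum DestinationType { STANDARD, COPY_S3 }

  // Simplified helper (assumed): treats a missing node or blank text as "empty".
  static boolean isNullOrEmpty(final JsonNode node) {
    return node == null || node.isNull() || node.asText().isEmpty();
  }

  // Check quoted in the thread: true only when every S3 field is null or empty.
  static boolean validateIfAllRequiredS3fieldsAreNullOrEmpty(final JsonNode jsonNode) {
    return isNullOrEmpty(jsonNode.get("s3_bucket_name"))
        && isNullOrEmpty(jsonNode.get("s3_bucket_region"))
        && isNullOrEmpty(jsonNode.get("access_key_id"))
        && isNullOrEmpty(jsonNode.get("secret_access_key"));
  }

  // All S3 fields absent -> warn and fall back to standard inserts;
  // otherwise use S3 staging (COPY).
  static DestinationType determineUploadMode(final JsonNode jsonNode) {
    if (validateIfAllRequiredS3fieldsAreNullOrEmpty(jsonNode)) {
      LOGGER.warn("The \"standard\" upload mode is not performant, and is not recommended for production. "
          + "Please use the Amazon S3 upload mode if you are syncing a large amount of data.");
      return DestinationType.STANDARD;
    }
    return DestinationType.COPY_S3;
  }
}
```

This is only a condensed reading of the snippets quoted in this thread; the connector's real decision logic may differ.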
...shift/src/main/java/io/airbyte/integrations/destination/redshift/validator/RedshiftUtil.java
```java
    return config.has(UPLOADING_METHOD) ? config.get(UPLOADING_METHOD) : config;
  }

  public static boolean validateIfAllRequiredS3fieldsAreNullOrEmpty(final JsonNode jsonNode) {
```
nitpick: rename to hasNoS3Fields
Please take a look at the comment above.
...redshift/src/main/java/io/airbyte/integrations/destination/redshift/RedshiftDestination.java
@edgao This is correct.
/publish connector=connectors/destination-redshift
/publish connector=connectors/destination-redshift
What
This PR improves the UX/UI by adding a dropdown for selecting the Redshift data uploading method:

How
Modified connector's specification
Recommended reading order
x.java
🚨 User Impact 🚨
After this update, the uploading method defaults to Standard; customers who were using S3 Staging will need to reconfigure the connector and fill in the S3 parameters again.
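For readers with existing S3 Staging connections, here is a hypothetical sketch, not part of this PR and not something the connector runs automatically, of what the reconfiguration amounts to: the top-level S3 fields move under the new uploading_method object. The "method": "S3 Staging" discriminator value, the helper name, and the wrapping logic are assumptions for illustration.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import java.util.List;

public class ConfigMigrationSketch {

  private static final ObjectMapper MAPPER = new ObjectMapper();

  private static final List<String> S3_FIELDS =
      List.of("s3_bucket_name", "s3_bucket_region", "access_key_id", "secret_access_key");

  // Moves the legacy top-level S3 fields under a nested "uploading_method" object,
  // mirroring what a user would re-enter in the new dropdown.
  static JsonNode wrapLegacyS3Config(final ObjectNode legacyConfig) {
    final ObjectNode migrated = legacyConfig.deepCopy();
    final ObjectNode uploadingMethod = MAPPER.createObjectNode();
    uploadingMethod.put("method", "S3 Staging"); // assumed discriminator value
    for (final String field : S3_FIELDS) {
      if (migrated.hasNonNull(field)) {
        uploadingMethod.set(field, migrated.remove(field));
      }
    }
    migrated.set("uploading_method", uploadingMethod);
    return migrated;
  }
}
```

In practice the same result is achieved by re-selecting S3 Staging in the UI and re-entering the credentials, as noted above.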
Pre-merge Checklist
Expand the relevant checklist and delete the others.

New Connector
Community member or Airbyter
- Secrets in the connector's spec are annotated with airbyte_secret
- Unit & integration tests added and passing (./gradlew :airbyte-integrations:connectors:<name>:integrationTest)
- Documentation updated: the connector's README.md and bootstrap.md (see description and examples), docs/SUMMARY.md, docs/integrations/<source or destination>/<name>.md including changelog (see changelog example), docs/integrations/README.md, airbyte-integrations/builds.md

Airbyter
If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.
- /test connector=connectors/<name> command is passing
- New connector version released on Dockerhub by running the /publish command described here

Updating a connector
Community member or Airbyter
- Secrets in the connector's spec are annotated with airbyte_secret
- Unit & integration tests added and passing (./gradlew :airbyte-integrations:connectors:<name>:integrationTest)
- Documentation updated: the connector's README.md and bootstrap.md (see description and examples), docs/integrations/<source or destination>/<name>.md including changelog (see changelog example)

Airbyter
If this is a community PR, the Airbyte engineer reviewing this PR is responsible for the below items.
- /test connector=connectors/<name> command is passing
- New connector version released on Dockerhub by running the /publish command described here

Connector Generator
- All generator templates (those with -scaffold in their name) have been updated with the latest scaffold by running ./gradlew :airbyte-integrations:connector-templates:generator:testScaffoldTemplates, then checking in your changes

Tests
Unit
Put your unit tests output here.
Integration
Put your integration tests output here.
Acceptance
Put your acceptance tests output here.