# clarify staging setup guide for bq destination #9255

Merged 5 commits on Jan 6, 2022
**docs/integrations/destinations/bigquery.md** (13 additions, 6 deletions)

This is the recommended configuration for uploading data to BigQuery.
* **GCS Bucket Path**
* **Block Size (MB) for GCS multipart upload**
* **GCS Bucket Keep files after migration**
* See [this guide](https://cloud.google.com/storage/docs/creating-buckets) for instructions on how to create a GCS bucket. The bucket cannot have a retention policy; set Protection Tools to none or Object versioning.
* **HMAC Key Access ID**
* See [this guide](https://cloud.google.com/storage/docs/authentication/managing-hmackeys) on how to generate an access key. For more information on HMAC keys, see the [GCP docs](https://cloud.google.com/storage/docs/authentication/hmackeys).
* We recommend creating an Airbyte-specific user or service account. This user or account will require the following permissions for the bucket:
```
storage.multipartUploads.abort
storage.multipartUploads.create
storage.objects.create
storage.objects.delete
storage.objects.get
storage.objects.list
```
You can set these by going to the Permissions tab of the GCS bucket, adding the email address of the service account or user, and granting it the permissions listed above.
* **Secret Access Key**
* Corresponding key to the above access ID.
* Make sure your GCS bucket is accessible from the machine running Airbyte. This depends on your networking setup. The easiest way to verify that Airbyte can connect to your GCS bucket is via the check connection tool in the UI.
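The bucket, permission, and HMAC key setup described above can be sketched with the Cloud SDK. This is a minimal sketch, assuming an authenticated `gcloud`/`gsutil` install; the bucket name, project ID, and service-account address are placeholders, and `roles/storage.objectAdmin` is one predefined role that covers the listed permissions (verify it against your organization's policies):

```shell
# Assumption: gcloud/gsutil are installed and authenticated.
# All names below (bucket, project, service account) are placeholders.

# 1. Create the staging bucket (created without a retention policy by default).
gsutil mb -l us-east1 gs://my-airbyte-staging-bucket

# 2. Grant the Airbyte service account object read/write access on the bucket.
#    roles/storage.objectAdmin includes the storage.objects.* and
#    storage.multipartUploads.* permissions listed above.
gsutil iam ch \
  serviceAccount:airbyte@my-project.iam.gserviceaccount.com:roles/storage.objectAdmin \
  gs://my-airbyte-staging-bucket

# 3. Create an HMAC key for that service account; the command prints the
#    Access ID and Secret to paste into the Airbyte destination settings.
gsutil hmac create airbyte@my-project.iam.gserviceaccount.com
```

These commands require project-level permission to manage bucket IAM and HMAC keys, so they are typically run by a project admin rather than the Airbyte service account itself.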

### `Standard` uploads
This uploads data directly from your source to BigQuery. While this is faster to set up initially, **we strongly recommend that you do not use this option for anything other than a quick demo**. It is more than 10x slower than the GCS uploading option and will fail for many datasets. Please be aware that you may see failures for big datasets and slow sources, e.g. if reading from the source takes more than 10-12 hours. This is caused by Google BigQuery SDK client limitations. For more details, please check [https://github.com/airbytehq/airbyte/issues/3549](https://github.com/airbytehq/airbyte/issues/3549)
**docs/integrations/destinations/gcs.md** (13 additions, 6 deletions)

Under the hood, an Airbyte data stream in JSON schema is first converted to an Avro schema.

* Fill in the GCS info
* **GCS Bucket Name**
* See [this guide](https://cloud.google.com/storage/docs/creating-buckets) for instructions on how to create a GCS bucket. The bucket cannot have a retention policy; set Protection Tools to none or Object versioning.
* **GCS Bucket Region**
* **HMAC Key Access ID**
* See [this guide](https://cloud.google.com/storage/docs/authentication/managing-hmackeys) on how to generate an access key. For more information on HMAC keys, see the [GCP docs](https://cloud.google.com/storage/docs/authentication/hmackeys).
* We recommend creating an Airbyte-specific user or service account. This user or account will require the following permissions for the bucket:
```
storage.multipartUploads.abort
storage.multipartUploads.create
storage.objects.create
storage.objects.delete
storage.objects.get
storage.objects.list
```
You can set these by going to the Permissions tab of the GCS bucket, adding the email address of the service account or user, and granting it the permissions listed above.
* **Secret Access Key**
* Corresponding key to the above access ID.
* Make sure your GCS bucket is accessible from the machine running Airbyte. This depends on your networking setup. The easiest way to verify that Airbyte can connect to your GCS bucket is via the check connection tool in the UI.
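If you prefer to grant exactly the permissions listed above rather than a broader predefined role, you can sketch that with a custom IAM role, then round-trip a test object through the bucket from the machine that will run Airbyte. This assumes an authenticated `gcloud`/`gsutil` install; the role ID, project, service account, and bucket names are placeholders:

```shell
# Assumption: gcloud/gsutil are installed and authenticated.
# All names below are placeholders.

# Create a custom role carrying exactly the permissions listed above.
gcloud iam roles create airbyteGcsWriter \
  --project=my-project \
  --title="Airbyte GCS writer" \
  --permissions=storage.multipartUploads.abort,storage.multipartUploads.create,storage.objects.create,storage.objects.delete,storage.objects.get,storage.objects.list

# Bind the custom role to the Airbyte service account on the bucket.
gsutil iam ch \
  serviceAccount:airbyte@my-project.iam.gserviceaccount.com:projects/my-project/roles/airbyteGcsWriter \
  gs://my-airbyte-bucket

# Round-trip a test object to confirm write, read, and delete access.
echo "airbyte connectivity check" | gsutil cp - gs://my-airbyte-bucket/_airbyte_check.txt
gsutil cat gs://my-airbyte-bucket/_airbyte_check.txt
gsutil rm gs://my-airbyte-bucket/_airbyte_check.txt
```

If the round-trip succeeds from the Airbyte host, the check connection tool in the UI should pass with the same HMAC credentials.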

## CHANGELOG
