diff --git a/docs/operator-guides/browsing-output-logs.md b/docs/operator-guides/browsing-output-logs.md
index 5ccba3965771e..9dc7acb58b3c4 100644
--- a/docs/operator-guides/browsing-output-logs.md
+++ b/docs/operator-guides/browsing-output-logs.md
@@ -93,6 +93,28 @@
 docker cp airbyte-server:/tmp/workspace/9/2/catalog.json .
 cat catalog.json
 ```
+### Browsing on Kubernetes
+
+If you are running on Kubernetes, use the following commands instead to browse the files and copy them to your local machine.
+
+To browse, identify the pod you are interested in (for example, with `kubectl get pods -n <namespace>`) and exec into it. You will be presented with a shell that accepts normal Linux commands, e.g. `ls`.
+```bash
+kubectl exec -it <pod-name> -n <namespace> -c main bash
+# e.g.
+kubectl exec -it destination-bigquery-worker-3607-0-chlle -n jobs -c main bash
+root@destination-bigquery-worker-3607-0-chlle:/config# ls
+FINISHED_UPLOADING destination_catalog.json destination_config.json
+```
+
+To copy a file to your local machine in order to preserve its contents:
+```bash
+kubectl cp <namespace>/<pod-name>:/config/destination_catalog.json ./catalog.json
+# e.g.
+kubectl cp jobs/normalization-worker-3605-0-sxtox:/config/destination_catalog.json ./catalog.json
+cat ./catalog.json
+```
+
+
 ## CSV or JSON local Destinations: Check local data folder
 
 If you setup a pipeline using one of the local File based destinations \(CSV or JSON\), Airbyte is writing the resulting files containing the data in the special `/local/` directory in the container. By default, this volume is mounted from `/tmp/airbyte_local` on the host machine. So you need to navigate to this [local folder](file:///tmp/airbyte_local/) on the filesystem of the machine running the Airbyte deployment to retrieve the local data files.