🎉 Destination databricks: rename to databricks lakehouse (#13722)
* Update Databricks naming

* Update destination_spec

* Update BOOTSTRAP.md

* Update Dockerfile

* Update README.md

* Update spec.json

* Update databricks.md

* Update databricks.md

* Update airbyte-integrations/connectors/destination-databricks/BOOTSTRAP.md

Co-authored-by: LiRen Tu <tuliren.git@outlook.com>

Andy and tuliren authored Jun 14, 2022
1 parent 15fe51b commit fe6eda5
Showing 7 changed files with 12 additions and 11 deletions.
```diff
@@ -70,10 +70,10 @@
     dockerImageTag: 0.1.6
     documentationUrl: https://docs.airbyte.io/integrations/destinations/clickhouse
     releaseStage: alpha
-- name: Databricks Delta Lake
+- name: Databricks Lakehouse
   destinationDefinitionId: 072d5540-f236-4294-ba7c-ade8fd918496
   dockerRepository: airbyte/destination-databricks
-  dockerImageTag: 0.2.1
+  dockerImageTag: 0.2.2
   documentationUrl: https://docs.airbyte.io/integrations/destinations/databricks
   icon: databricks.svg
   releaseStage: alpha
```
```diff
@@ -996,12 +996,12 @@
       - "overwrite"
       - "append"
       - "append_dedup"
-- dockerImage: "airbyte/destination-databricks:0.2.1"
+- dockerImage: "airbyte/destination-databricks:0.2.2"
   spec:
     documentationUrl: "https://docs.airbyte.io/integrations/destinations/databricks"
     connectionSpecification:
       $schema: "http://json-schema.org/draft-07/schema#"
-      title: "Databricks Delta Lake Destination Spec"
+      title: "Databricks Lakehouse Destination Spec"
       type: "object"
       required:
       - "accept_terms"
```
```diff
@@ -1,6 +1,6 @@
-# Databricks Delta Lake Destination Connector Bootstrap
+# Databricks Lakehouse Destination Connector Bootstrap
 
-The Databricks Delta Lake Connector enables a developer to sync data into a Databricks cluster. It does so in two steps:
+This destination syncs data to Delta Lake on Databricks Lakehouse. It does so in two steps:
 
 1. Persist source data in S3 staging files in the Parquet format.
 2. Create delta table based on the Parquet staging files.
```
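The two-step load described in the bootstrap doc above (stage the records first, then build the table from the staged files) can be illustrated with a stdlib-only Python sketch. This is purely conceptual: the real connector writes Parquet files to S3 and issues Databricks SQL to create the delta table, so the JSON file formats, stream name, and helper names here are illustrative assumptions, not the connector's actual code.

```python
import json
import tempfile
from pathlib import Path


def stage_records(staging_dir: Path, stream: str, records: list) -> Path:
    """Step 1: persist source records to a staging file.

    (The real connector writes Parquet staging files to S3.)
    """
    staging_file = staging_dir / f"{stream}.staging.jsonl"
    staging_file.write_text("\n".join(json.dumps(r) for r in records))
    return staging_file


def build_table(table_dir: Path, stream: str, staging_file: Path) -> Path:
    """Step 2: create the final table from the staged data.

    (The real connector creates a delta table from the Parquet staging files.)
    """
    table_file = table_dir / f"{stream}.table.json"
    rows = [json.loads(line) for line in staging_file.read_text().splitlines()]
    table_file.write_text(json.dumps(rows))
    return table_file


if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as tmp:
        root = Path(tmp)
        staged = stage_records(root, "users", [{"id": 1}, {"id": 2}])
        table = build_table(root, "users", staged)
        print(json.loads(table.read_text()))
```

Separating the staging write from the table build is what lets the connector retry or clean up a failed sync without leaving a half-written table behind.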
```diff
@@ -16,5 +16,5 @@ ENV APPLICATION destination-databricks
 
 COPY --from=build /airbyte /airbyte
 
-LABEL io.airbyte.version=0.2.1
+LABEL io.airbyte.version=0.2.2
 LABEL io.airbyte.name=airbyte/destination-databricks
```
```diff
@@ -1,4 +1,4 @@
-# Destination Databricks Delta Lake
+# Destination Databricks Lakehouse
 
 This is the repository for the Databricks destination connector in Java.
 For information about how to use this connector within Airbyte, see [the User Documentation](https://docs.airbyte.io/integrations/destinations/databricks).
```
```diff
@@ -6,7 +6,7 @@
   "supported_destination_sync_modes": ["overwrite", "append"],
   "connectionSpecification": {
     "$schema": "http://json-schema.org/draft-07/schema#",
-    "title": "Databricks Delta Lake Destination Spec",
+    "title": "Databricks Lakehouse Destination Spec",
     "type": "object",
     "required": [
       "accept_terms",
```
5 changes: 3 additions & 2 deletions docs/integrations/destinations/databricks.md

```diff
@@ -1,8 +1,8 @@
-# Databricks Delta Lake
+# Databricks Lakehouse
 
 ## Overview
 
-This destination syncs data to Databricks Delta Lake. Each stream is written to its own [delta-table](https://delta.io/).
+This destination syncs data to Delta Lake on Databricks Lakehouse. Each stream is written to its own [delta-table](https://delta.io/).
 
 This connector requires a JDBC driver to connect to the Databricks cluster. By using the driver and the connector, you must agree to the [JDBC ODBC driver license](https://databricks.com/jdbc-odbc-driver-license). This means that you can only use this connector to connect third party applications to Apache Spark SQL within a Databricks offering using the ODBC and/or JDBC protocols.
 
@@ -104,6 +104,7 @@ Under the hood, an Airbyte data stream in Json schema is first converted to an A
 
 | Version | Date | Pull Request | Subject |
 | :--- | :--- | :--- | :--- |
+| 0.2.2 | 2022-06-13 | [\#13722](https://github.com/airbytehq/airbyte/pull/13722) | Rename to "Databricks Lakehouse". |
 | 0.2.1 | 2022-06-08 | [\#13630](https://github.com/airbytehq/airbyte/pull/13630) | Rename to "Databricks Delta Lake" and add field orders in the spec. |
 | 0.2.0 | 2022-05-15 | [\#12861](https://github.com/airbytehq/airbyte/pull/12861) | Use new public Databricks JDBC driver, and open source the connector. |
 | 0.1.5 | 2022-05-04 | [\#12578](https://github.com/airbytehq/airbyte/pull/12578) | In JSON to Avro conversion, log JSON field values that do not follow Avro schema for debugging. |
```
