feat: Ability to import HBase Snapshot data into Cloud Bigtable using Dataflow #2755

Merged Feb 1, 2021 · 36 commits (changes shown from 26 commits)

Commits
32c1e6d
Support import from HBase snapshot
lichng Oct 20, 2020
7d63490
Update document
lichng Dec 14, 2020
6c0ed5e
Change the conf type for HBaseSnapshotInputConfiguration for
lichng Dec 14, 2020
fc728a8
Rename HBASE_ROOT_PATH to HBASE_EXPORT_ROOT_PATH in example doc for
lichng Dec 15, 2020
316c0aa
Addressing the review comments:
lichng Dec 21, 2020
6188efa
Add the original Main.java under sequencefiles back
lichng Dec 22, 2020
f621dc0
gcs connector still requires non-android guava version
lichng Dec 28, 2020
68d88f4
Support import from HBase snapshot
lichng Oct 20, 2020
53f73bc
Addressing the review comments:
lichng Dec 21, 2020
cfe86e2
switch HBasesnapshotConfiguration to a builder class
lichng Dec 22, 2020
f6cdabf
use DataflowRunner instead of DirectRunner for integration tests
lichng Dec 30, 2020
bf7409e
revert pom file override
lichng Dec 30, 2020
fa0d8a8
Remove all ValueProvider for now
lichng Dec 30, 2020
1ca1fd8
Add gcsProject parameter and remove template related document
lichng Dec 30, 2020
f5b086f
recover the dependency missed in the rebase
lichng Dec 30, 2020
d0389c0
Update new files using latest header comment format and update year to
lichng Jan 5, 2021
27e3657
Remove workaround for BIGTABLE_BULK_AUTOFLUSH_MS_KEY
lichng Jan 8, 2021
0a5ced1
Exclude hbase-shaded-client
lichng Jan 11, 2021
19b2a5c
Clean up all transitive dependencies on hbase-shaded-client
lichng Jan 12, 2021
61987c3
Add document for integration test generation instructions
lichng Jan 14, 2021
77f528b
Update document
lichng Jan 15, 2021
703fe6a
Fail the pipeline building when there is an exception configuring input
lichng Jan 19, 2021
049cc43
renaming according to review comments
lichng Jan 19, 2021
2f44f4d
update comments
lichng Jan 19, 2021
39b003e
More document about hbase snapshot file structure
lichng Jan 19, 2021
d96cffb
System.out -> LOG
lichng Jan 19, 2021
ce4e2cf
throw out exception instead of terminating JVM
lichng Jan 21, 2021
fc6e1e4
Remove outside visible parameter restoreDir, use a default dir instead
lichng Jan 22, 2021
e871418
use pattern without ending '/'
lichng Jan 24, 2021
6f15d81
use listObject instead of match since GcsUtil expand intentionally
lichng Jan 26, 2021
5db7ce6
Using a unique suffix for restore dir to avoid conflict
lichng Jan 26, 2021
87b0eff
Add dependency to pom.xml
lichng Jan 26, 2021
c967d17
minimize accessibility for class
lichng Jan 28, 2021
59e7a55
Fix typo and Add header comment for CleanupHBaseSnapshotRestoreFilesFn
lichng Jan 28, 2021
2596d88
Adding more error messages for HBaseSnapshotInputConfigBuilder
lichng Feb 1, 2021
94270af
Add document about how to handle temp files during job failures
lichng Feb 1, 2021
2 changes: 1 addition & 1 deletion bigtable-dataflow-parent/bigtable-beam-import/README.md
@@ -42,4 +42,4 @@ java -jar bigtable-beam-import-1.14.1-shaded.jar import \
--maxNumWorkers=[3x number of nodes] \
--zone=[zone of your cluster]
```
[//]: # ({x-version-update-end})
[//]: # ({x-version-update-end})
76 changes: 73 additions & 3 deletions bigtable-dataflow-parent/bigtable-beam-import/pom.xml
@@ -25,7 +25,8 @@ limitations under the License.
<artifactId>bigtable-beam-import</artifactId>

<properties>
<mainClass>com.google.cloud.bigtable.beam.sequencefiles.Main</mainClass>
<mainClass>com.google.cloud.bigtable.beam.Main</mainClass>
<skipITs>false</skipITs>
</properties>

<!-- Adding this to resolve version conflict within beam sdk-->
@@ -49,6 +50,12 @@ limitations under the License.
<groupId>${project.groupId}</groupId>
<artifactId>bigtable-hbase-beam</artifactId>
<version>${project.version}</version>
<exclusions>
<exclusion>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-shaded-client</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>com.google.cloud.bigtable</groupId>
@@ -67,6 +74,10 @@ limitations under the License.
<groupId>io.opencensus</groupId>
<artifactId>*</artifactId>
</exclusion>
<exclusion>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-shaded-client</artifactId>
</exclusion>
</exclusions>
</dependency>

@@ -87,10 +98,18 @@ limitations under the License.
<artifactId>beam-sdks-java-io-hadoop-common</artifactId>
<version>${beam.version}</version>
</dependency>
<dependency>
<groupId>org.apache.beam</groupId>
<artifactId>beam-sdks-java-io-hadoop-format</artifactId>
<version>${beam.version}</version>
</dependency>

<!-- For HBase 2.x, this should be hbase-mapreduce
https://hbase.apache.org/2.1/book.html#export
-->
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-shaded-client</artifactId>
<artifactId>hbase-shaded-server</artifactId>
<version>${hbase.version}</version>
</dependency>

@@ -104,7 +123,7 @@ limitations under the License.
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>${guava.version}</version>
<version>${gcs-guava.version}</version>
</dependency>

<!-- TODO: check if commons-codec was transitively updated to 1.13 and okhttp was updated to 2.7.5 when upgrading-->
@@ -149,6 +168,13 @@ limitations under the License.
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.google.cloud.bigdataoss/gcs-connector -->
<dependency>
<groupId>com.google.cloud.bigdataoss</groupId>
<artifactId>gcs-connector</artifactId>
<version>hadoop2-2.1.4</version>
<classifier>shaded</classifier>
</dependency>

<!-- Test -->
<dependency>
@@ -181,6 +207,12 @@ limitations under the License.
<version>${junit.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-shaded-testing-util</artifactId>
<version>${hbase.version}</version>
<scope>test</scope>
</dependency>
</dependencies>

<build>
@@ -265,6 +297,16 @@ limitations under the License.
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer" />
</transformers>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
</configuration>
</plugin>

@@ -376,5 +418,33 @@ limitations under the License.
</plugins>
</build>
</profile>

<profile>
<id>hbasesnapshotsIntegrationTest</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<executions>
<execution>
<id>hbasesnapshots-integration-test</id>
<goals>
<goal>integration-test</goal>
</goals>
<phase>integration-test</phase>
<configuration>
<forkCount>1</forkCount>
<includes>
<include>**/hbasesnapshots/*IT.java</include>
</includes>
<useSystemClassLoader>false</useSystemClassLoader>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
</project>
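
The new `hbasesnapshotsIntegrationTest` profile binds `maven-failsafe-plugin` to the `integration-test` phase and picks up only tests matching `**/hbasesnapshots/*IT.java`. A plausible invocation (an assumption, not shown in this PR) would be `mvn verify -P hbasesnapshotsIntegrationTest`; the detailed setup is covered by the integration-test instructions added in commit 61987c3.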
79 changes: 79 additions & 0 deletions bigtable-dataflow-parent/bigtable-beam-import/src/main/java/com/google/cloud/bigtable/beam/Main.java
@@ -0,0 +1,79 @@
/*
* Copyright 2021 Google LLC
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.cloud.bigtable.beam;

import com.google.bigtable.repackaged.com.google.api.core.InternalApi;
import com.google.bigtable.repackaged.com.google.api.core.InternalExtensionOnly;
import com.google.cloud.bigtable.beam.hbasesnapshots.ImportJobFromHbaseSnapshot;
import com.google.cloud.bigtable.beam.sequencefiles.CreateTableHelper;
import com.google.cloud.bigtable.beam.sequencefiles.ExportJob;
import com.google.cloud.bigtable.beam.sequencefiles.ImportJob;
import java.io.File;
import java.net.URISyntaxException;
import java.util.Arrays;

/** Entry point for create-table/import/export job submission. */
@InternalExtensionOnly
final class Main {
/** For internal use only - public for technical reasons. */
@InternalApi("For internal usage only")
public Main() {}

public static void main(String[] args) throws Exception {
if (args.length < 1) {
usage();
System.exit(1);
}

String[] subArgs = Arrays.copyOfRange(args, 1, args.length);

switch (args[0]) {
case "export":
ExportJob.main(subArgs);
break;
case "import":
ImportJob.main(subArgs);
break;
case "importsnapshot":
ImportJobFromHbaseSnapshot.main(subArgs);
break;
case "create-table":
CreateTableHelper.main(subArgs);
break;
default:
usage();
System.exit(1);
}
}

private static void usage() {
String jarName;

try {
jarName =
new File(Main.class.getProtectionDomain().getCodeSource().getLocation().toURI().getPath())
.getName();
} catch (URISyntaxException e) {
jarName = "<jar>";
}

    System.out.printf(
        "java -jar %s <action> <action_params>\n"
            + "Where <action> can be 'export', 'import', 'importsnapshot' or 'create-table'. To get further help, run:\n"
            + "java -jar %s <action> --help\n",
        jarName, jarName);
}
}
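
This dispatcher extends the existing shaded-jar entry point with an `importsnapshot` action alongside `export`, `import`, and `create-table`. Following the README's existing invocation style, a snapshot import would start with something like `java -jar bigtable-beam-import-1.14.1-shaded.jar importsnapshot --help` to list the available flags (the flags themselves are defined in `ImportJobFromHbaseSnapshot`, which is not shown in this excerpt).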
130 changes: 130 additions & 0 deletions bigtable-dataflow-parent/bigtable-beam-import/src/main/java/com/google/cloud/bigtable/beam/hbasesnapshots/HBaseSnapshotInputConfigBuilder.java
@@ -0,0 +1,130 @@
/*
* Copyright 2021 Google LLC
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.google.cloud.bigtable.beam.hbasesnapshots;

import com.google.common.base.Preconditions;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.hadoop.hbase.mapreduce.TableSnapshotInputFormat;
import org.apache.hadoop.hbase.protobuf.ProtobufUtil;
import org.apache.hadoop.hbase.protobuf.generated.ClientProtos;
import org.apache.hadoop.hbase.util.Base64;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.InputFormat;
import org.apache.hadoop.mapreduce.Job;

/**
 * Builds a {@link Configuration} that can be used with {@link HadoopFormatIO} for reading an
 * HBase snapshot hosted in a Google Cloud Storage (GCS) bucket via the GCS connector. It uses
 * {@link TableSnapshotInputFormat} for reading HBase snapshots.
 */
class HBaseSnapshotInputConfigBuilder {

private static final Log LOG = LogFactory.getLog(HBaseSnapshotInputConfigBuilder.class);
// Batch size used for HBase snapshot scans
private static final int BATCH_SIZE = 1000;

private String projectId;
private String hbaseSnapshotSourceDir;
private String snapshotName;
private String restoreDir;

public HBaseSnapshotInputConfigBuilder() {}

  /*
   * Set the project id used to access the GCS bucket with the HBase snapshot data to be imported.
   */
public HBaseSnapshotInputConfigBuilder setProjectId(String projectId) {
this.projectId = projectId;
return this;
}

/*
* Set the GCS path where the HBase snapshot data is located
*/
public HBaseSnapshotInputConfigBuilder setHbaseSnapshotSourceDir(String hbaseSnapshotSourceDir) {
this.hbaseSnapshotSourceDir = hbaseSnapshotSourceDir;
return this;
}

  /*
   * Set the name of the snapshot to be imported.
   * e.g. when importing snapshot 'gs://<your-gcs-path>/hbase-export/table_snapshot',
   * pass 'table_snapshot' as the {@code snapshotName}
   * and 'gs://<your-gcs-path>/hbase-export' as the {@code hbaseSnapshotSourceDir}.
   */
public HBaseSnapshotInputConfigBuilder setSnapshotName(String snapshotName) {
this.snapshotName = snapshotName;
return this;
}

  /*
   * Set the temporary restore GCS path used by TableSnapshotInputFormat while reading the HBase
   * snapshot. This path should not be under {@code hbaseSnapshotSourceDir}.
   */
public HBaseSnapshotInputConfigBuilder setRestoreDir(String restoreDir) {
this.restoreDir = restoreDir;
return this;
}

public Configuration build() throws Exception {
Preconditions.checkNotNull(projectId);
Preconditions.checkNotNull(hbaseSnapshotSourceDir);
Preconditions.checkNotNull(snapshotName);
    Preconditions.checkState(
        hbaseSnapshotSourceDir.startsWith("gs://"),
        "snapshot folder must be hosted in a GCS bucket");

Configuration conf = createHBaseConfiguration();

    // Configure a MapReduce Job based on the HBaseConfiguration
    // and return the job's Configuration
ClientProtos.Scan proto = ProtobufUtil.toScan(new Scan().setBatch(BATCH_SIZE));
conf.set(TableInputFormat.SCAN, Base64.encodeBytes(proto.toByteArray()));
Job job = Job.getInstance(conf); // creates internal clone of hbaseConf
TableSnapshotInputFormat.setInput(job, snapshotName, new Path(restoreDir));
return job.getConfiguration(); // extract the modified clone
}

  // Separated into its own method to allow unit testing
public Configuration createHBaseConfiguration() {
Configuration conf = HBaseConfiguration.create();

// Setup the input data location for HBase snapshot import
// exportedSnapshotDir should be a GCS Bucket path.
conf.set("hbase.rootdir", hbaseSnapshotSourceDir);
conf.set("fs.defaultFS", hbaseSnapshotSourceDir);

// Setup GCS connector to use GCS as Hadoop filesystem
conf.set("fs.AbstractFileSystem.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS");
conf.set("fs.gs.project.id", projectId);
conf.set("google.cloud.auth.service.account.enable", "true");

// Setup MapReduce config for TableSnapshotInputFormat
conf.setClass(
"mapreduce.job.inputformat.class", TableSnapshotInputFormat.class, InputFormat.class);
conf.setClass("key.class", ImmutableBytesWritable.class, Writable.class);
conf.setClass("value.class", Result.class, Object.class);
return conf;
}
}
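
For context, here is a minimal sketch of how this builder might feed Beam's `HadoopFormatIO` read transform. The project, bucket, and restore paths are hypothetical, and the actual pipeline wiring in this PR lives in `ImportJobFromHbaseSnapshot`; this sketch only shows the shape of the read.

```java
// Sketch only: names and paths below are hypothetical, not taken from this PR.
// Lives in the same package because HBaseSnapshotInputConfigBuilder is package-private.
package com.google.cloud.bigtable.beam.hbasesnapshots;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.hadoop.format.HadoopFormatIO;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;

class SnapshotReadSketch {
  // Reads snapshot rows as KV<ImmutableBytesWritable, Result> pairs.
  static PCollection<KV<ImmutableBytesWritable, Result>> read(Pipeline p) throws Exception {
    Configuration conf =
        new HBaseSnapshotInputConfigBuilder()
            .setProjectId("my-project") // hypothetical
            .setHbaseSnapshotSourceDir("gs://my-bucket/hbase-export") // hypothetical
            .setSnapshotName("table_snapshot")
            .setRestoreDir("gs://my-bucket/restore-tmp") // hypothetical
            .build();
    // HadoopFormatIO picks up the input format, key class, and value class
    // that the builder placed in the Configuration.
    return p.apply(
        "ReadHBaseSnapshot",
        HadoopFormatIO.<ImmutableBytesWritable, Result>read().withConfiguration(conf));
  }
}
```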