Value only available at runtime, but accessed from a non-runtime context #2389

Closed
czka opened this issue Feb 24, 2020 · 2 comments · Fixed by #2400

czka commented Feb 24, 2020

bigtable-beam-import-1.13.0-shaded.jar logs a java.lang.IllegalStateException: Value only available at runtime, but accessed from a non-runtime context: RuntimeValueProvider{propertyName=bigtableStartRow, default=} error for each export operation that doesn't have --bigtableStartRow set:

(...)
2020-02-24 15:49:06 INFO  DataflowPipelineTranslator:522 - Adding Read table as step s1
2020-02-24 15:49:06 WARN  CustomSources:79 - Size estimation of the source failed: com.google.cloud.bigtable.beam.CloudBigtableIO$Source@5528a42c
java.lang.IllegalStateException: Value only available at runtime, but accessed from a non-runtime context: RuntimeValueProvider{propertyName=bigtableStartRow, default=}
	at org.apache.beam.sdk.options.ValueProvider$RuntimeValueProvider.get(ValueProvider.java:226)
	at com.google.cloud.bigtable.beam.TemplateUtils$RequestValueProvider.get(TemplateUtils.java:95)
	at com.google.cloud.bigtable.beam.TemplateUtils$RequestValueProvider.get(TemplateUtils.java:76)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration$RequestWithTableNameValueProvider.get(CloudBigtableScanConfiguration.java:275)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration$RequestWithTableNameValueProvider.get(CloudBigtableScanConfiguration.java:253)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration.getRequest(CloudBigtableScanConfiguration.java:330)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration.getRowRange(CloudBigtableScanConfiguration.java:362)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration.getStartRowByteString(CloudBigtableScanConfiguration.java:354)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration.getZeroCopyStartRow(CloudBigtableScanConfiguration.java:345)
	at com.google.cloud.bigtable.beam.CloudBigtableIO$Source.getEstimatedSizeBytes(CloudBigtableIO.java:461)
	at org.apache.beam.runners.dataflow.internal.CustomSources.serializeToCloudSource(CustomSources.java:77)
	at org.apache.beam.runners.dataflow.ReadTranslator.translateReadHelper(ReadTranslator.java:51)
	at org.apache.beam.runners.dataflow.ReadTranslator.translate(ReadTranslator.java:38)
	at org.apache.beam.runners.dataflow.ReadTranslator.translate(ReadTranslator.java:35)
	at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.visitPrimitiveTransform(DataflowPipelineTranslator.java:473)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:665)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:317)
	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:251)
	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:460)
	at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.translate(DataflowPipelineTranslator.java:412)
	at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.translate(DataflowPipelineTranslator.java:173)
	at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:748)
	at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:179)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:315)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:301)
	at com.google.cloud.bigtable.beam.sequencefiles.ExportJob.main(ExportJob.java:181)
	at com.google.cloud.bigtable.beam.sequencefiles.Main.main(Main.java:41)
2020-02-24 15:49:06 INFO  DataflowPipelineTranslator:522 - Adding Format results/Map as step s2
(...)
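
For context: options left unset on the command line are represented by Beam as RuntimeValueProvider instances, whose get() throws until the pipeline is actually executing. CloudBigtableIO$Source.getEstimatedSizeBytes dereferences the start row while the Dataflow runner is still translating the graph (a non-runtime context), hence the exception above. Below is a minimal sketch of the usual guard pattern, assuming only Beam's public ValueProvider API; the helper class and method names are hypothetical, not from this repository:

```java
import org.apache.beam.sdk.options.ValueProvider;

// A minimal sketch, not the library's actual fix: construction-time code
// can check isAccessible() before dereferencing a ValueProvider.
// isAccessible() is part of Beam's public ValueProvider interface and
// returns false for a RuntimeValueProvider until the pipeline executes,
// which is exactly when get() would throw the exception shown above.
class StartRowGuard {
  static byte[] startRowOrDefault(ValueProvider<byte[]> startRow) {
    if (startRow != null && startRow.isAccessible()) {
      return startRow.get(); // value already known at construction time
    }
    return new byte[0]; // not yet known; fall back to an unbounded scan
  }
}
```

With a guard like this, size estimation can skip the row-range refinement instead of failing when the value is runtime-only.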

To reproduce, try running:

java -jar bigtable-beam-import-1.13.0-shaded.jar export --runner=dataflow --project=<project-name> --destinationPath=gs://<bucket-name> --bigtableInstanceId=<instance-name> --bigtableTableId=<table-id> --tempLocation=gs://<bucket-name>/temp --maxNumWorkers=1 --diskSizeGb=30 --workerMachineType=n1-standard-1 --jobName=<job-name> --region=<region-name> --sdkWorkerParallelism=0 --bigtableMaxVersions=1

When --bigtableStartRow is set, it complains about bigtableStopRow for a change:

(...)
2020-02-24 16:03:57 INFO  DataflowPipelineTranslator:522 - Adding Read table as step s1
2020-02-24 16:03:57 WARN  CustomSources:79 - Size estimation of the source failed: com.google.cloud.bigtable.beam.CloudBigtableIO$Source@54ec8cc9
java.lang.IllegalStateException: Value only available at runtime, but accessed from a non-runtime context: RuntimeValueProvider{propertyName=bigtableStopRow, default=}
	at org.apache.beam.sdk.options.ValueProvider$RuntimeValueProvider.get(ValueProvider.java:226)
	at com.google.cloud.bigtable.beam.TemplateUtils$RequestValueProvider.get(TemplateUtils.java:98)
	at com.google.cloud.bigtable.beam.TemplateUtils$RequestValueProvider.get(TemplateUtils.java:76)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration$RequestWithTableNameValueProvider.get(CloudBigtableScanConfiguration.java:275)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration$RequestWithTableNameValueProvider.get(CloudBigtableScanConfiguration.java:253)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration.getRequest(CloudBigtableScanConfiguration.java:330)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration.getRowRange(CloudBigtableScanConfiguration.java:362)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration.getStartRowByteString(CloudBigtableScanConfiguration.java:354)
	at com.google.cloud.bigtable.beam.CloudBigtableScanConfiguration.getZeroCopyStartRow(CloudBigtableScanConfiguration.java:345)
	at com.google.cloud.bigtable.beam.CloudBigtableIO$Source.getEstimatedSizeBytes(CloudBigtableIO.java:461)
	at org.apache.beam.runners.dataflow.internal.CustomSources.serializeToCloudSource(CustomSources.java:77)
	at org.apache.beam.runners.dataflow.ReadTranslator.translateReadHelper(ReadTranslator.java:51)
	at org.apache.beam.runners.dataflow.ReadTranslator.translate(ReadTranslator.java:38)
	at org.apache.beam.runners.dataflow.ReadTranslator.translate(ReadTranslator.java:35)
	at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.visitPrimitiveTransform(DataflowPipelineTranslator.java:473)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:665)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:657)
	at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600(TransformHierarchy.java:317)
	at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:251)
	at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:460)
	at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.translate(DataflowPipelineTranslator.java:412)
	at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.translate(DataflowPipelineTranslator.java:173)
	at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:748)
	at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:179)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:315)
	at org.apache.beam.sdk.Pipeline.run(Pipeline.java:301)
	at com.google.cloud.bigtable.beam.sequencefiles.ExportJob.main(ExportJob.java:181)
	at com.google.cloud.bigtable.beam.sequencefiles.Main.main(Main.java:41)
2020-02-24 16:03:57 INFO  DataflowPipelineTranslator:522 - Adding Format results/Map as step s2
(...)
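
The failure simply moves to the next unset parameter: per the traces, the request-building code in TemplateUtils reads the start row (line 95) before the stop row (line 98), so whichever runtime-only value is dereferenced first is the one that throws. A small self-contained demo of the two ValueProvider flavors, assuming only Beam's public API (the class name is illustrative):

```java
import org.apache.beam.sdk.options.ValueProvider;
import org.apache.beam.sdk.options.ValueProvider.StaticValueProvider;

public class ValueProviderDemo {
  public static void main(String[] args) {
    // A value supplied on the command line is wrapped as a
    // StaticValueProvider, which is accessible before the pipeline runs:
    ValueProvider<String> startRow = StaticValueProvider.of("row-000");
    System.out.println(startRow.isAccessible()); // true
    System.out.println(startRow.get());          // row-000

    // An option left unset stays a RuntimeValueProvider (created by the
    // options framework; it has no public constructor). Its
    // isAccessible() returns false at this point, and get() throws the
    // IllegalStateException quoted in this issue.
  }
}
```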
igorbernstein2 (Collaborator) commented:

Thanks for reporting!

We do need to fix this, but please note that this is only a cosmetic warning and can be ignored.


czka commented Feb 25, 2020

@igorbernstein2 I know it's non-fatal. Otherwise my bigtable_export.py wouldn't work ;).

igorbernstein2 added a commit to igorbernstein2/cloud-bigtable-client that referenced this issue Mar 5, 2020