Related Template(s)

bigquery-to-bigtable

What happened?

In BigQuery there is a column of BIGNUMERIC type with the value 99999999999999999999999999999999999.

The BigQuery-to-Bigtable template writes this column to Bigtable without throwing any exception; however, when the value is read back from Bigtable, it comes out as the string java.nio.HeapByteBuffer[pos=0 lim=32 cap=32].

I read the value with both cbt read and Bigtable's Python client library, and they return the same string.
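For anyone reproducing this outside the template: a minimal Java sketch, under the assumption that the BigQuery read surfaces BIGNUMERIC as an Avro-style decimal, i.e. a ByteBuffer wrapping the unscaled two's-complement integer of a scale-38 decimal. Stringifying that buffer instead of copying its contents produces exactly this kind of value:

```java
import java.math.BigDecimal;
import java.nio.ByteBuffer;

public class BigNumericRepro {
  public static void main(String[] args) {
    // Assumption: BIGNUMERIC arrives as the unscaled two's-complement
    // integer of a scale-38 decimal, wrapped in a ByteBuffer.
    BigDecimal value = new BigDecimal("99999999999999999999999999999999999");
    byte[] unscaled = value.setScale(38).unscaledValue().toByteArray();
    ByteBuffer buf = ByteBuffer.wrap(unscaled);

    // Calling toString() on the buffer (rather than copying its contents)
    // yields the string observed in Bigtable; the exact lim/cap depend on
    // how the reader encodes or pads the decimal bytes.
    System.out.println(String.valueOf(buf));
    // e.g. java.nio.HeapByteBuffer[pos=0 lim=31 cap=31]
  }
}
```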
Beam Version
Newer than 2.35.0
Relevant log output
No response
Same issue when writing a BYTES BigQuery column into Bigtable (Apache Beam 2.57).
For context: I'm trying to write numbers into Bigtable in a form compatible with Bigtable's increments (big-endian). If the BQ field is an INT, this template writes it as "10" (the base-10 string representation); if the BQ field is BYTES, the template writes "java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]" as the Bigtable value.
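A minimal sketch of the difference (illustrative, not the template's actual code): Bigtable's increment operation interprets a cell as an 8-byte big-endian signed integer, so the cell must hold the buffer's raw contents, not its toString() rendering:

```java
import java.nio.ByteBuffer;

public class IncrementCompatibleWrite {
  public static void main(String[] args) {
    // 8-byte big-endian encoding of the long 10, as Bigtable increments
    // expect (ByteBuffer defaults to big-endian byte order).
    ByteBuffer buf = ByteBuffer.allocate(8).putLong(10L);
    buf.flip();

    // What the template appears to store today: the buffer's toString() form.
    String broken = String.valueOf(buf);
    System.out.println(broken); // java.nio.HeapByteBuffer[pos=0 lim=8 cap=8]

    // What an increment-compatible cell needs: the raw 8 bytes.
    byte[] raw = new byte[buf.remaining()];
    buf.duplicate().get(raw);
    System.out.println(raw.length); // 8
  }
}
```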
This issue has been marked as stale due to 180 days of inactivity. It will be closed in 1 week if no further activity occurs. If you think that's incorrect or this issue still needs attention, simply write any comment. If closed, you can revive the issue at any time. Thank you for your contributions.