Consuming 190 partitions of a table using change streams from Bigtable


Tarun K

Feb 11, 2025, 6:28:57 PM
to Google Cloud Bigtable Discuss
I am using the Bigtable change stream feature to listen for and process changes/mutations on a particular table, which has 190 partitions. I got the general implementation working on a test table, but when I tried it against the real table with 400 million rows (190 partitions), it did not work.
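
For reference, this is roughly what my consumer looks like, assuming the Java client (com.google.cloud.bigtable.data.v2); the project/instance/table IDs are placeholders:

```java
import com.google.cloud.bigtable.data.v2.BigtableDataClient;
import com.google.cloud.bigtable.data.v2.models.ChangeStreamRecord;
import com.google.cloud.bigtable.data.v2.models.Range.ByteStringRange;
import com.google.cloud.bigtable.data.v2.models.ReadChangeStreamQuery;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ChangeStreamConsumer {
  public static void main(String[] args) throws Exception {
    BigtableDataClient client = BigtableDataClient.create("my-project", "my-instance");
    ExecutorService pool = Executors.newCachedThreadPool();

    // Discover the current set of change stream partitions for the table.
    for (ByteStringRange partition :
        client.generateInitialChangeStreamPartitions("my-table")) {
      pool.submit(() -> {
        // Each partition is consumed over its own long-lived gRPC server
        // stream; with 190 partitions this opens 190 concurrent streams.
        ReadChangeStreamQuery query =
            ReadChangeStreamQuery.create("my-table").streamPartition(partition);
        for (ChangeStreamRecord record : client.readChangeStream(query)) {
          System.out.println(record); // Heartbeat / ChangeStreamMutation / CloseStream
        }
      });
    }
  }
}
```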

After debugging for a while, I found that we open a long-lived gRPC stream to consume each partition, and Bigtable has a limit of 100 concurrent streams per client. When I read only 100 partitions at a time it works, but as soon as I add any more than that, nothing works at all: I see no mutations, and even regular writes/reads start failing with timeouts.
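
If the bottleneck really is the ~100 concurrent streams allowed per gRPC connection (the HTTP/2 MAX_CONCURRENT_STREAMS setting), one thing I'm considering is enlarging the client's channel pool so the partition streams are spread over several connections. A minimal sketch, again assuming the Java client; I haven't verified that this actually fixes the timeouts:

```java
import com.google.api.gax.grpc.ChannelPoolSettings;
import com.google.cloud.bigtable.data.v2.BigtableDataClient;
import com.google.cloud.bigtable.data.v2.BigtableDataSettings;
import com.google.cloud.bigtable.data.v2.stub.EnhancedBigtableStubSettings;

public class PooledClient {
  public static BigtableDataClient create() throws Exception {
    BigtableDataSettings.Builder builder =
        BigtableDataSettings.newBuilder()
            .setProjectId("my-project")
            .setInstanceId("my-instance");

    // Each gRPC connection caps out at ~100 concurrent streams, so spread
    // 190 long-lived partition streams across a fixed pool of channels
    // (4 channels here, leaving headroom for normal reads/writes).
    builder
        .stubSettings()
        .setTransportChannelProvider(
            EnhancedBigtableStubSettings.defaultGrpcTransportProviderBuilder()
                .setChannelPoolSettings(ChannelPoolSettings.staticallySized(4))
                .build());

    return BigtableDataClient.create(builder.build());
  }
}
```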

I'm not sure how to address this. Implementing a solution/hack that reads only 100 partitions at a time seems too complicated, and the gRPC v2 API has no option to specify or limit the number of streams, so I'm not sure how to proceed. One alternative I'm considering is sketched below.
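
Rather than throttling to 100 partitions at a time, I could shard the partitions across a few independent clients, each with its own channel pool. A rough sketch; consumePartition here is a hypothetical helper wrapping the per-partition readChangeStream loop from the first snippet:

```java
import com.google.cloud.bigtable.data.v2.BigtableDataClient;
import com.google.cloud.bigtable.data.v2.models.Range.ByteStringRange;
import java.util.ArrayList;
import java.util.List;

public class ShardedConsumers {
  public static void main(String[] args) throws Exception {
    // Enough clients that no single one carries more than ~100 streams.
    int numClients = 3;
    List<BigtableDataClient> clients = new ArrayList<>();
    for (int i = 0; i < numClients; i++) {
      clients.add(BigtableDataClient.create("my-project", "my-instance"));
    }

    List<ByteStringRange> partitions = new ArrayList<>();
    clients.get(0).generateInitialChangeStreamPartitions("my-table")
        .forEach(partitions::add);

    // Assign partitions round-robin across the clients.
    for (int i = 0; i < partitions.size(); i++) {
      BigtableDataClient client = clients.get(i % numClients);
      ByteStringRange partition = partitions.get(i);
      // consumePartition(client, partition): the per-partition read loop
      // from the first sketch, run on its own thread.
    }
  }
}
```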

Has anyone faced a similar issue, or does anyone have thoughts on this? I created a support ticket, by the way, but in the meantime any opinions on the matter are welcome.