"Exceeded rate limits: too many table update operations for this table" in a Cloud Function uploading data from a Pub/Sub topic to BigQuery
Let me explain the use case I am trying to solve:
- I receive data on a Pub/Sub topic on a daily basis in Google Cloud.
- I have written a Cloud Function with Pub/Sub as the trigger type, so the script inside the function executes whenever the topic receives data.
- The function triggers and uploads the data to BigQuery tables.
- Right now I have set the function's memory to 256 MB and max instances to 100.
- On a daily basis I receive a very small amount of data, around 16 messages on the Pub/Sub topic.
- Even though the data is small, I observed the Cloud Function encountering the error "too many table update operations for this table", and I am missing some data.
- I have a retry option I could use, but I don't want the Cloud Function to return any errors. Is there any workaround for this?
- Sharing the sample code below.

Solution 1:[1]
It appears that your code is exceeding the maximum rate of table metadata update operations per table, which is 5 operations every 10 seconds; load jobs count against this limit. So it appears that all your Cloud Pub/Sub messages are arriving at (or near) the same time. You can follow either of the following solutions:
- As suggested in the comments, you can use streaming inserts, since this limit does not apply to them.
- You can insert a delay in your code to provide time between successive metadata updates. In this case I passed a timeout of 30 seconds to `job.result()`, which worked for me.
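The streaming-inserts alternative could look like the sketch below. It is an assumption-laden outline, not the poster's code: `TABLE_ID` is a placeholder, and note that streaming inserts require the destination table and its schema to already exist, since `insert_rows_json` does not autodetect a schema the way a load job can.

```python
import base64
import json

# Placeholder; replace with your own project, dataset, and table.
TABLE_ID = "ProjectID.DatasetID.TableName"


def decode_pubsub_event(event):
    """Decode the base64-encoded Pub/Sub payload into a Python dict."""
    payload = base64.b64decode(event["data"]).decode("utf-8")
    return json.loads(payload)


def stream_to_bigquery(event, context):
    """Cloud Function entry point: stream one message's row into BigQuery.

    Streaming inserts go through tabledata.insertAll, which is not
    subject to the 5-per-10-seconds table metadata update limit.
    """
    from google.cloud import bigquery  # imported lazily; needs google-cloud-bigquery

    client = bigquery.Client()
    row = decode_pubsub_event(event)
    errors = client.insert_rows_json(TABLE_ID, [row])
    if errors:
        # Surface per-row insert errors so the failure is visible in logs.
        raise RuntimeError(f"Streaming insert errors: {errors}")
```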
I tried the second method. I published 19 messages from my publisher code at the same time and got the same error message as yours when executing without the timeout.
After inserting the timeout, the execution was successful. (The timeout value can be reduced further as per your requirement.)
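A publisher script for reproducing the burst of 19 simultaneous messages might look like this. The payload field names (`id`, `value`) are illustrative assumptions, not taken from the post:

```python
import json


def build_messages(n=19):
    """Build n JSON payloads, mirroring the 19-message repro described above.

    The field names ("id", "value") are made up for illustration.
    """
    return [json.dumps({"id": i, "value": f"row-{i}"}).encode("utf-8")
            for i in range(n)]


def publish_all(project_id, topic_id):
    """Publish all payloads back to back so they arrive nearly simultaneously."""
    from google.cloud import pubsub_v1  # imported lazily; needs google-cloud-pubsub

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, topic_id)
    futures = [publisher.publish(topic_path, data) for data in build_messages()]
    for future in futures:
        future.result()  # block until every publish is acknowledged
```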
Complete Code:

```python
import base64
import io
import json

from google.cloud import bigquery


def hello_pubsub(event, context):
    """Triggered from a message on a Cloud Pub/Sub topic.

    Args:
        event (dict): Event payload.
        context (google.cloud.functions.Context): Metadata for the event.
    """
    client = bigquery.Client()
    pubsub_message = base64.b64decode(event["data"]).decode("utf-8")
    print(pubsub_message)
    message = json.loads(pubsub_message)
    result = json.dumps(message)  # one JSON object per line (NDJSON)
    source_file = io.StringIO(result)
    table_id = "ProjectID.DatasetID.TableName"
    job_config = bigquery.LoadJobConfig(
        autodetect=True,
        write_disposition="WRITE_APPEND",
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    )
    job = client.load_table_from_file(source_file, table_id, job_config=job_config)
    job.result(timeout=30)  # passing a timeout in the function call
```
Output: all 19 rows were inserted; I did not notice any loss of data.
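Since the question also asks how to avoid the function returning errors at all, one option is to retry the load inside the function with exponential backoff. This is a generic sketch, not from the post; in practice you would catch the specific `google.api_core` exception for a rate-limited load job rather than matching on the message text, as done here for illustration:

```python
import time


def backoff_delays(base=1.0, factor=2.0, retries=4):
    """Yield exponentially growing sleep intervals: 1, 2, 4, 8 seconds."""
    delay = base
    for _ in range(retries):
        yield delay
        delay *= factor


def load_with_retry(load_fn, delays=None):
    """Call load_fn, retrying on rate-limit errors with backoff.

    load_fn is any zero-argument callable that starts the load job and
    waits on its result (e.g. a wrapper around load_table_from_file).
    """
    for delay in (backoff_delays() if delays is None else delays):
        try:
            return load_fn()
        except Exception as exc:
            if "rate" not in str(exc).lower():
                raise  # not a rate-limit problem; re-raise immediately
            time.sleep(delay)
    return load_fn()  # final attempt; let any remaining error propagate
```

Because the backoff spreads retries over tens of seconds, the bursts of metadata updates fall back under the 5-per-10-seconds limit instead of failing the invocation.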
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow


