'insertAll failed' when streaming data into BigQuery using Cloud Functions
I'm using Cloud Functions to stream data into BigQuery.
This is what the data looks like:
{'odata.metadata': 'http://datamall2.mytransport.sg/ltaodataservice/$metadata#BusArrivalv2/@Element',
 'BusStopCode': '81201',
 'Services': [{'ServiceNo': '134',
   'Operator': 'SBST',
   'NextBus': {'OriginCode': '80289',
     'DestinationCode': '80289',
     'EstimatedArrival': '2022-04-11T15:17:59+08:00',
     'Latitude': '0',
     'Longitude': '0',
     'VisitNumber': '1',
     'Load': 'SEA',
     'Feature': 'WAB',
     'Type': 'DD'}}]}
I want to convert the 'Services' data into a dataframe, and I use this code:
import pandas as pd

# merge each service's 'NextBus' fields into the top-level dict
mylist = [{**x, **x.pop('NextBus')} for x in data["Services"]]
df = pd.DataFrame(mylist)
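For context, here is a minimal, self-contained sketch of that flattening step, using a trimmed-down version of the payload above (the field subset is illustrative, not the full API response). One detail worth noting: in `{**x, **x.pop('NextBus')}`, the `**x` unpacking is evaluated before `x.pop('NextBus')` runs, since Python evaluates dict-display items left to right, so each merged dict still contains the original `'NextBus'` key alongside the flattened fields:

```python
import pandas as pd

# Trimmed-down version of the payload shown above
data = {
    "BusStopCode": "81201",
    "Services": [
        {
            "ServiceNo": "134",
            "Operator": "SBST",
            "NextBus": {"OriginCode": "80289", "Load": "SEA"},
        }
    ],
}

# '**x' is unpacked before x.pop('NextBus') is evaluated, so the merged
# dict keeps the nested 'NextBus' key in addition to the flattened fields.
mylist = [{**x, **x.pop("NextBus")} for x in data["Services"]]
df = pd.DataFrame(mylist)

print("NextBus" in df.columns)  # True: the nested dict survives as a column
```

A nested dict column like this is exactly the kind of value a flat BigQuery schema would reject as "not a record", which may be relevant to the error shown below.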
But this error shows up:
Exception: insertAll failed: [{'index': 0, 'errors': [{'reason': 'invalid', 'location': 'nextbus', 'debugInfo': '', 'message': 'This field: nextbus is not a record.'}]}]
I'm able to convert the list to a dataframe when running the code in a Jupyter notebook, but it does not work on Cloud Functions. What is the problem, and how do I fix it?
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
