How to Delete Rows that Exceed the Max Row Limit in Google BigQuery

I have a dataset containing a few extremely large rows (a JSON data column over 50 MB each), which causes my query to fail with this error:

API limit exceeded: Message before conversion exceeds max row limit, limit: 20971520 actual: 57329961

Is there a way to ignore rows that are too large, or to drop them within the query itself? Alternatively, is there a way to extract metadata about the rows in a table (such as their size) that I could then use to drop those rows?
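As a sketch of the second idea: BigQuery Standard SQL has a BYTE_LENGTH function that returns the size of a STRING (or BYTES) value in bytes, so the oversized rows could be filtered or deleted by comparing the JSON column's size against the 20,971,520-byte limit quoted in the error. The project, dataset, table, and column names below are hypothetical placeholders.

```sql
-- Sketch, assuming a hypothetical table `my_project.my_dataset.my_table`
-- whose oversized payload lives in a STRING column named `json_data`.

-- Option 1: filter oversized rows out of a read query.
SELECT *
FROM `my_project.my_dataset.my_table`
WHERE BYTE_LENGTH(json_data) < 20971520;  -- 20 MiB, the limit from the error message

-- Option 2: permanently delete oversized rows from the table.
DELETE FROM `my_project.my_dataset.my_table`
WHERE BYTE_LENGTH(json_data) >= 20971520;
```

Note that this measures only the size of the one column; if other columns contribute meaningfully to the row size, the threshold may need to be lowered accordingly.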



Sources

Source: Stack Overflow, licensed under CC BY-SA 3.0 per its attribution requirements.