JSON to CSV using jq on a large JSON file

I have a very large JSON file containing sports data output from a sports simulation website. I am working on a script that outputs player game-log data into a CSV file.

The JSON is too large to post, but there are over 700 games in the games[] array, 2 teams in each teams[] array, and 10 players in each players[] array.

My current code is:

jq -r '.games[] | [.teams[] | .tid, .ovr, .won, .lost, (.players[] | .pid, .gs, .min, .fg, .fga)] | join(", ")' > file.csv

This code puts all of the player data (both teams) for each game on one line of the CSV. I would like one line per player instead.

I have used map and join("\n") in previous scripts, but the nested arrays are giving me problems here.
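For reference, one way to get one row per player is to bind the enclosing team object to a variable with `as` before descending into `players[]`, so the team fields can be repeated on every player row. This is a sketch, assuming the field names shown above (`tid`, `ovr`, `won`, `lost` on each team; `pid`, `gs`, `min`, `fg`, `fga` on each player) and an input file named games.json:

```shell
# Bind each team to $t, then iterate its players so each player
# produces its own CSV row, prefixed with the team's fields.
jq -r '
  .games[]
  | .teams[] as $t            # remember the enclosing team
  | $t.players[]              # one output per player
  | [$t.tid, $t.ovr, $t.won, $t.lost, .pid, .gs, .min, .fg, .fga]
  | @csv
' games.json > file.csv
```

`@csv` also handles quoting for you, which `join(", ")` does not.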

Thanks for the help.



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow