I have 20,000 ~1000-row dataframes, each of which has a name, currently stored in a single 170GB pickle file. I'd like to write these to a file so I can load them individually by name.
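One dependency-free way to get per-name loading is the stdlib `shelve` module, which pickles each value separately under its key, so reading one dataframe does not deserialize the other 19,999. The frame names, columns, and file path below are hypothetical stand-ins; a minimal sketch:

```python
import os
import shelve
import tempfile

import numpy as np
import pandas as pd

# Hypothetical stand-ins for the 20,000 named ~1000-row dataframes.
frames = {
    "frame_a": pd.DataFrame({"x": np.arange(5), "y": np.arange(5) * 2.0}),
    "frame_b": pd.DataFrame({"x": np.ones(5), "y": np.zeros(5)}),
}

path = os.path.join(tempfile.mkdtemp(), "frames_db")

# Write each dataframe under its name; shelve pickles values individually,
# so a later read deserializes only the requested frame.
with shelve.open(path) as db:
    for name, df in frames.items():
        db[name] = df

# Load a single dataframe by name without touching the others.
with shelve.open(path, flag="r") as db:
    df_a = db["frame_a"]

print(df_a.shape)  # (5, 2)
```

Another common option for this workload is `pandas.HDFStore` (one HDF5 file, one key per dataframe, random access via `pd.read_hdf(path, key=name)`), though it requires the PyTables package to be installed.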