cx_Oracle: Issues moving a large volume of data from one table to another
I am trying to move 20 million records from one Oracle database to another using Python and the cx_Oracle library. Unfortunately I cannot use an Oracle CLI utility, because a minor transformation is required on some rows.
Here is the code I am using:
import cx_Oracle

def execute(in_query, out_query, batch_size=10000):
    in_conn = create_src_db_connection()
    out_conn = create_dest_db_connection()
    out_conn.autocommit = True
    in_cursor = in_conn.cursor()
    out_cursor = out_conn.cursor()
    in_cursor.execute(in_query)
    while True:
        try:
            rows = in_cursor.fetchmany(batch_size)
            if not rows:
                break
            # transformation of rows done here
            out_cursor.executemany(out_query, rows, batcherrors=True)
            for error in out_cursor.getbatcherrors():
                print("Error", error.message, "at row offset", error.offset)
        except cx_Oracle.Error as err:
            print(err)
    in_conn.close()
    out_conn.close()
in_query = "select * from table"
out_query = "insert into table(a, b, c, d) values (:1, :2, :3, :4)"
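For reference, the fetch-in-batches loop above can be exercised without a database: the pattern is just "call `fetchmany` until it returns an empty batch". A minimal sketch of that loop using a plain in-memory iterator as a stand-in for the Oracle cursor (the `batches` helper is illustrative, not part of cx_Oracle):

```python
import itertools

def batches(fetchmany, batch_size):
    """Yield lists of rows from a cursor-style fetchmany callable
    until it returns an empty batch (mirrors the loop above)."""
    while True:
        rows = fetchmany(batch_size)
        if not rows:
            return
        yield rows

# Stand-in for an Oracle cursor: fetchmany over an in-memory iterator.
source = iter(range(10))
fetchmany = lambda n: list(itertools.islice(source, n))

for batch in batches(fetchmany, 4):
    print(batch)
# prints [0, 1, 2, 3] then [4, 5, 6, 7] then [8, 9]
```

Each `batch` here corresponds to one `fetchmany`/`executemany` round trip in the real code, so `batch_size` directly controls how many rows move per round trip.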
Unfortunately this either performs poorly (about 6,000 rows per minute) or sometimes fails with the error:
cx_Oracle.DatabaseError: DPI-1080: connection was closed by ORA-3135
Is there a better approach? Any idea why this keeps failing? It seems like one of the connections is idle for too long.
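Since the failure looks like a transient connection drop, one mitigation I am considering is retrying a failed batch a few times before giving up. A minimal retry sketch in plain Python (`with_retries` and `flaky` are hypothetical names, and a real version would have to reconnect and rebuild the cursor before retrying):

```python
import time

def with_retries(fn, attempts=3, delay=0.0, retryable=(ConnectionError,)):
    """Call fn(), retrying on the given exception types up to `attempts` times."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except retryable:
            if attempt == attempts:
                raise  # out of attempts: propagate the last error
            time.sleep(delay)

# Simulate a batch insert that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("simulated ORA-3135")
    return "ok"

print(with_retries(flaky))
# prints ok
```

In the real loop, `fn` would wrap the `executemany` call for a single batch, with the exception tuple swapped for the relevant `cx_Oracle.Error` subclass.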
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow