MySQL Python executemany taking hours to insert 1 million rows

I have a data load job that has been running for hours. The data set is more than 1 million rows, and I am using mysql.connector to connect to the target database. The data extraction finishes within a few minutes, so why is the insert taking so long, and how can I speed it up? Please don't tell me to use sqlalchemy, because I have the same problem there. I found a link claiming that executemany should insert 1 million records in a few minutes at most. What am I missing?

The insert statement is as follows:

try:
    cursor = dbconn.cursor()
    sql = (r"""INSERT IGNORE INTO TABLE
                   (FIELD1, FIELD2, FIELD3, FIELD4, FIELD5, FIELD6, FIELD7)
               VALUES (%s, %s, %s, %s, %s, %s,
                       DATE_FORMAT(%s, '%Y-%m-%d %T'))""")
    cursor.executemany(sql, data)
    dbconn.commit()
except Exception:
    raise  # re-raise, keeping the original traceback
finally:
    cursor.close()
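
For context on why this can be slow: Connector/Python's cursor.executemany() gets its speed from rewriting a batch of INSERTs into a single multi-row INSERT ... VALUES (...), (...), ... statement. If that rewrite does not happen (an expression such as DATE_FORMAT() inside the VALUES clause is a plausible blocker), the driver falls back to running the statement once per row, and a million round trips inside one giant transaction can take hours. Below is a minimal sketch, not the code from the question, of two common workarounds: convert the timestamp in Python so the VALUES clause contains only bare %s placeholders, and insert in chunks with a commit per chunk. It assumes data is a list of 7-item tuples whose last element is a datetime or a parsable timestamp string, that dbconn is the existing mysql.connector connection, and it reuses the placeholder TABLE/FIELD* names from the question; to_datetime and BATCH_SIZE are made up for the example.

import datetime

# Plain placeholders give the driver's multi-row INSERT rewrite its best chance.
sql = ("INSERT IGNORE INTO TABLE "
       "(FIELD1, FIELD2, FIELD3, FIELD4, FIELD5, FIELD6, FIELD7) "
       "VALUES (%s, %s, %s, %s, %s, %s, %s)")

BATCH_SIZE = 10000  # arbitrary; tune against row width and max_allowed_packet

def to_datetime(value):
    # Hypothetical helper: Connector/Python serialises datetime objects itself,
    # so the timestamp only needs parsing in Python instead of DATE_FORMAT() per row.
    if isinstance(value, datetime.datetime):
        return value
    return datetime.datetime.strptime(value, "%Y-%m-%d %H:%M:%S")

rows = [r[:-1] + (to_datetime(r[-1]),) for r in data]

cursor = dbconn.cursor()
try:
    for start in range(0, len(rows), BATCH_SIZE):
        cursor.executemany(sql, rows[start:start + BATCH_SIZE])
        dbconn.commit()  # a commit per chunk keeps each transaction small
finally:
    cursor.close()

If even a properly batched executemany() is not fast enough for a one-off load of this size, dumping the rows to a CSV and loading it with MySQL's LOAD DATA [LOCAL] INFILE is usually the fastest route.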


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
