Using a multi-character delimiter while exporting a SQL table as a text file using Python

I am trying to export a SQL table as a multi-character-delimited text file in Python. I tried pandas, but it only supports a single character as the delimiter when writing. I am now using the NumPy module to export the table data with my custom delimiter '|***|'. This function is very inefficient on large tables and also runs out of memory for them.

Main concern: being able to use a multi-character delimiter while exporting the table, with efficient memory usage.

NB: Since my numpy.savetxt() call does not export the whole table (header plus data) at once, the header and the data part are written separately and then combined into one text file.

The ultimate aim is to create an automated solution that exports the required tables from the database.

Any alternative solution to this, in Python or in another language, is highly appreciated!

Here is the code I currently use:

    import numpy
    import pandas as pd

    x = pd.read_sql('SELECT * FROM [' + database + '].[' + schema + '].[' + table_name + ']', connection)
    cols = x.columns
    header = "|***|".join(cols)
    # Build the row format from the actual column count instead of hard-coding five columns
    row_fmt = "|***|".join(["%s"] * len(cols))
    # savetxt can write the header itself (comments='' stops it prefixing '# '),
    # so the file no longer needs to be read back and rewritten just to prepend it
    numpy.savetxt(table_data, x.values, fmt=row_fmt, header=header, comments='')
    print("Exported", table_data)
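One way to keep memory usage flat is to skip NumPy entirely and stream the result set in chunks with pandas' `chunksize` option, joining each row with the multi-character delimiter yourself. Below is a minimal sketch of that idea; `export_query`, `DELIM`, and the `chunksize` default are names I am introducing for illustration, not part of the original code, and the connection is assumed to be any DB-API or SQLAlchemy connection that `pd.read_sql` accepts.

```python
import pandas as pd

DELIM = "|***|"  # the multi-character delimiter from the question

def export_query(connection, query, out_path, chunksize=50_000):
    """Stream a query result to a delimited text file.

    Only one chunk of rows is held in memory at a time, so large
    tables do not exhaust RAM the way a single read_sql call can.
    """
    with open(out_path, "w", encoding="utf-8") as out:
        wrote_header = False
        for chunk in pd.read_sql(query, connection, chunksize=chunksize):
            if not wrote_header:
                # Header line comes from the first chunk's columns
                out.write(DELIM.join(chunk.columns) + "\n")
                wrote_header = True
            for row in chunk.itertuples(index=False):
                # str() each value; render NULLs as empty fields
                out.write(DELIM.join("" if v is None else str(v) for v in row) + "\n")
```

Because the delimiter is applied with `str.join`, it can be any number of characters, and tuning `chunksize` trades memory for round trips to the database.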


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
