Python - insert data into a Postgres DB using a DataFrame
I am working with Python and trying to connect to Postgres. I created a table in my Postgres database, in the staging schema:
create table staging.data (Name varchar, Age bigint);
Then I try to connect and insert my DataFrame into this table:
import psycopg2
import pandas as pd
from sqlalchemy import create_engine

conn_string = 'postgresql://myuser:password@host/database_name'
db = create_engine(conn_string)
conn = db.connect()

# our dataframe
data = {'Name': ['Tom', 'dick', 'harry'],
        'Age': [22, 21, 24]}

# Create DataFrame
df = pd.DataFrame(data)
df.to_sql('staging.data', con=conn, if_exists='replace', index=False)

conn = psycopg2.connect(conn_string)
conn.autocommit = True
cursor = conn.cursor()

sql1 = '''select * from staging.data;'''
cursor.execute(sql1)
for i in cursor.fetchall():
    print(i)

conn.commit()
conn.close()
But the Python script finishes with no error message, and there is no data in my table in Postgres.
Any idea what is going wrong?
Regards
Solution 1:[1]
I think the issue is that you are trying to write to a schema other than public. When you pass 'staging.data' as the table name, pandas treats the whole string as a single table name rather than as schema.table, so nothing ends up in staging.data. Pass the schema name separately via the schema argument of to_sql(), like this:
df.to_sql('data', con=conn, if_exists='replace', schema='staging', index=False)
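For reference, here is a minimal end-to-end sketch of that approach, assuming the same connection string and staging.data table from the question, and a SQLAlchemy version recent enough that engine.begin() commits the transaction on exit:

import pandas as pd
from sqlalchemy import create_engine, text

# Placeholder credentials, same as in the question
conn_string = 'postgresql://myuser:password@host/database_name'
engine = create_engine(conn_string)

df = pd.DataFrame({'Name': ['Tom', 'dick', 'harry'],
                   'Age': [22, 21, 24]})

# Pass the schema separately; 'staging.data' as one string would create a
# table literally named "staging.data" in the default (public) schema.
with engine.begin() as conn:  # transaction is committed when the block exits
    df.to_sql('data', con=conn, schema='staging',
              if_exists='replace', index=False)

# Read the rows back to verify the insert landed in staging.data
with engine.connect() as conn:
    for row in conn.execute(text('select * from staging.data;')):
        print(row)

Note that if_exists='replace' drops and recreates staging.data with column types inferred by pandas, so the original varchar/bigint definition is not necessarily preserved; use if_exists='append' if you want to keep the table you created by hand.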
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Emrah Diril |
