You are bitten by the case (in)sensitivity issues with PostgreSQL. If you quote the table name in the query, it will work:

df = pd.read_sql_query('select * from "Stat_Table"',con=engine)

But personally, I would advise just always using lower case table names (and column names), also when writing the table to the database, to prevent such issues.


From the PostgreSQL docs (http://www.postgresql.org/docs/8.0/static/sql-syntax.html#SQL-SYNTAX-IDENTIFIERS):

Quoting an identifier also makes it case-sensitive, whereas unquoted names are always folded to lower case

To explain a bit more: you have written a table with the name Stat_Table to the database (and sqlalchemy quotes this name, so it is created as "Stat_Table" in the Postgres database). When you do the query 'select * from Stat_Table', the unquoted table name is folded to lower case (stat_table), and so you get the message that this table is not found.
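A minimal sketch of that advice, wrapped in a helper of my own (the connection string is a placeholder, not from the answer): lower-case the column names and the table name before writing, so later unquoted queries match.

```python
import pandas as pd
from sqlalchemy import create_engine

def to_postgres_lowercase(df, table_name, engine):
    """Write a DataFrame with all-lowercase identifiers so unquoted
    queries like 'select * from stat_table' find the table."""
    out = df.copy()
    out.columns = out.columns.str.lower()
    out.to_sql(table_name.lower(), engine, index=False, if_exists='replace')

# Usage (placeholder credentials):
# engine = create_engine('postgresql://username:password@localhost:5432/mydatabase')
# to_postgres_lowercase(pd.DataFrame({'A_Col': [1], 'B_Col': [2]}), 'Stat_Table', engine)
```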

See e.g. also Are PostgreSQL column names case-sensitive?

Answer from joris on Stack Overflow
Top answer (1 of 8) · 248 votes

Starting from pandas 0.14 (released end of May 2014), postgresql is supported. The sql module now uses sqlalchemy to support different database flavors. You can pass a sqlalchemy engine for a postgresql database (see docs). E.g.:

from sqlalchemy import create_engine
engine = create_engine('postgresql://username:password@localhost:5432/mydatabase')
df.to_sql('table_name', engine)

You are correct that in pandas up to version 0.13.1 postgresql was not supported. If you need to use an older version of pandas, here is a patched version of pandas.io.sql: https://gist.github.com/jorisvandenbossche/10841234. (I wrote this a while ago, so I cannot fully guarantee that it always works, but the basis should be there.) If you put that file in your working directory and import it, then you should be able to do (where con is a postgresql connection):

import sql  # the patched version (file is named sql.py)
sql.write_frame(df, 'table_name', con, flavor='postgresql')
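For a current pandas/SQLAlchemy stack the same round trip can be sketched as follows (the helper function and credentials are illustrative, not from the answer):

```python
import pandas as pd
from sqlalchemy import create_engine

def roundtrip(df, engine, table='table_name'):
    # Replace the table with the DataFrame's contents, then read it back.
    df.to_sql(table, engine, index=False, if_exists='replace')
    return pd.read_sql_query(f'select * from {table}', con=engine)

# Against PostgreSQL (placeholder credentials; requires psycopg2):
# engine = create_engine('postgresql+psycopg2://username:password@localhost:5432/mydatabase')
# print(roundtrip(pd.DataFrame({'a': [1, 2]}), engine))
```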
2 of 8 · 149 votes

Faster option:

The following code will copy your pandas DataFrame to a Postgres database much faster than the df.to_sql method, and you won't need any intermediate CSV file to store the df.

1. Create an engine based on your DB specifications.

2. Create a table in your Postgres DB with the same number of columns as the DataFrame (df).

3. The data in the df will then get inserted into your Postgres table.

from sqlalchemy import create_engine
import psycopg2 
import io

If you want to replace the table, we first create an empty table with the normal to_sql method, using the headers from our df, and then load the entire (big, time-consuming) df into the DB via COPY.

engine = create_engine(
    'postgresql+psycopg2://username:password@host:port/database')

# Drop old table and create new empty table
df.head(0).to_sql('table_name', engine, if_exists='replace',index=False)

conn = engine.raw_connection()
cur = conn.cursor()
output = io.StringIO()
df.to_csv(output, sep='\t', header=False, index=False)
output.seek(0)  # rewind the buffer so copy_from reads from the start
cur.copy_from(output, 'table_name', null="")  # empty fields are loaded as NULL
conn.commit()
cur.close()
conn.close()
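As a side note: psycopg2's copy_expert with an explicit COPY ... FROM STDIN statement does the same job and, unlike copy_from, also accepts schema-qualified table names (copy_from was additionally dropped in psycopg 3). A hedged sketch, with helper names of my own:

```python
import io
import pandas as pd

def df_to_copy_buffer(df):
    """Serialize df as tab-separated text suitable for COPY FROM STDIN;
    NaN becomes an empty field."""
    buf = io.StringIO()
    df.to_csv(buf, sep='\t', header=False, index=False, na_rep='')
    buf.seek(0)
    return buf

def copy_dataframe(df, table_name, engine):
    """Bulk-load df into an existing table via psycopg2's copy_expert.
    NULL '' makes empty fields load as NULL, matching null="" above."""
    conn = engine.raw_connection()
    try:
        with conn.cursor() as cur:
            cur.copy_expert(
                f"COPY {table_name} FROM STDIN WITH (FORMAT text, NULL '')",
                df_to_copy_buffer(df),
            )
        conn.commit()
    finally:
        conn.close()
```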