The read_sql docs say the params argument can be a list, tuple, or dict (see the docs).

To pass values into the SQL query, several placeholder syntaxes are possible: ?, :1, :name, %s, %(name)s (see PEP 249).
But not all of these are supported by every database driver; which syntax works depends on the driver you are using (psycopg2 in your case, I suppose).

In your second case, when using a dict, you are passing 'named arguments', and according to the psycopg2 documentation it supports the %(name)s style (and so not the :name style, I suppose), see http://initd.org/psycopg/docs/usage.html#query-parameters.
So using that style should work:

from datetime import datetime
import pandas.io.sql as psql

df = psql.read_sql(('select "Timestamp", "Value" from "MyTable" '
                    'where "Timestamp" BETWEEN %(dstart)s AND %(dfinish)s'),
                   db,
                   params={"dstart": datetime(2014, 6, 24, 16, 0),
                           "dfinish": datetime(2014, 6, 24, 17, 0)},
                   index_col=['Timestamp'])
Answer from joris on Stack Overflow
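As the answer notes, the supported placeholder style depends on the driver. A minimal runnable sketch of the same dict-of-params idea, using the stdlib sqlite3 driver (which implements the qmark ? and named :name styles rather than %(name)s); the table and column names follow the answer above, and the sample rows are hypothetical:

```python
import sqlite3

import pandas as pd

# Hypothetical sample data, loaded into an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
pd.DataFrame({
    "Timestamp": ["2014-06-24 15:00:00", "2014-06-24 16:00:00",
                  "2014-06-24 17:00:00", "2014-06-24 18:00:00"],
    "Value": [1.0, 2.0, 3.0, 4.0],
}).to_sql("MyTable", conn, index=False)

# sqlite3 accepts the named style, so the placeholders are :name
# instead of %(name)s, while params stays a plain dict.
out = pd.read_sql(
    'select "Timestamp", "Value" from "MyTable" '
    'where "Timestamp" BETWEEN :dstart AND :dfinish',
    conn,
    params={"dstart": "2014-06-24 16:00:00",
            "dfinish": "2014-06-24 17:00:00"},
)
print(out)
```

BETWEEN is inclusive, so this returns the 16:00 and 17:00 rows; with psycopg2 the query string would use %(dstart)s / %(dfinish)s and the same dict would work unchanged.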
Second answer (2 of 2, score -1) on Stack Overflow:

I was having trouble passing a large number of parameters when reading from a SQLite table. It turns out that since you pass a plain string to read_sql, you can just build it with an f-string. I tried the same with MSSQL via pyodbc and it works as well. (Be aware that interpolating values directly into the query string bypasses the driver's parameter escaping, so only do this with trusted inputs.)

For SQLite, it would look like this:

# write a sample table into an in-memory SQLite database
import pandas as pd
from sqlalchemy import create_engine

df = pd.DataFrame({'Timestamp': pd.date_range('2020-01-17', '2020-04-24', 10),
                   'Value1': range(10)})
engine = create_engine('sqlite://', echo=False)
df.to_sql('MyTable', engine)

# query the table, interpolating the tuple of values into the SQL string
tpl = (1, 3, 5, 8, 9)
query = f"""SELECT Timestamp, Value1 FROM MyTable WHERE Value1 IN {tpl}"""
df = pd.read_sql(query, engine)

If the parameters are datetimes, it's a bit more complicated, but calling the datetime conversion function of the SQL dialect you're using should do the job (DATETIME() in SQLite's case):

start, end = '2020-01-01', '2020-04-01'
query = f"""SELECT Timestamp, Value1 FROM MyTable WHERE Timestamp BETWEEN DATETIME('{start}') AND DATETIME('{end}')"""
df = pd.read_sql(query, engine)
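The IN-clause case can also be handled without interpolating any values: use the f-string only to generate one ? placeholder per value, and pass the values through params so the driver still does the quoting. A sketch against the same hypothetical MyTable, here rebuilt with the stdlib sqlite3 driver:

```python
import sqlite3

import pandas as pd

# Rebuild the hypothetical MyTable in an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
pd.DataFrame({"Value1": range(10)}).to_sql("MyTable", conn, index=False)

tpl = (1, 3, 5, 8, 9)
# The f-string inserts only "?" placeholders, never the values themselves,
# so each parameter is still escaped by the driver.
placeholders = ", ".join("?" for _ in tpl)
query = f"SELECT Value1 FROM MyTable WHERE Value1 IN ({placeholders})"
out = pd.read_sql(query, conn, params=tpl)
print(out)
```

This keeps the convenience of a dynamically sized parameter list while remaining safe for untrusted inputs.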