When using mysqldb and cursor.execute(), pass the value None, not "NULL":
value = None
cursor.execute("INSERT INTO table (`column1`) VALUES (%s)", (value,))
Found the answer here
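For context, a minimal end-to-end sketch (connection details, database, and table/column names are placeholders) showing the None arriving in the database as a real NULL:
import MySQLdb

# placeholder connection details
conn = MySQLdb.connect(host="localhost", user="user", passwd="password", db="mydb")
cursor = conn.cursor()

value = None  # stored as SQL NULL, not the string "NULL"
cursor.execute("INSERT INTO `table` (`column1`) VALUES (%s)", (value,))
conn.commit()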
Quick note about using parameters in SQL statements with Python. See the RealPython article on this topic, Preventing SQL Injection Attacks With Python. Here's another good article from TowardsDataScience.com, A Simple Approach To Templated SQL Queries In Python. These helped me with the same None/NULL issue.
Also, I found that if I put NULL (without quotes) directly into the INSERT query's VALUES clause, it was interpreted correctly by the SQL Server database. The translation problem only exists when you need to conditionally supply either NULL or a value via string interpolation (see the sketch after the examples below).
Examples:
cursor.execute("SELECT admin FROM users WHERE username = %s'", (username, ));
cursor.execute("SELECT admin FROM users WHERE username = %(username)s", {'username': username});
UPDATE: This StackOverflow discussion is more in line with what I'm trying to do and may help someone else.
Example:
import pypyodbc
myData = [
    (1, 'foo'),
    (2, None),
    (3, 'bar'),
]
connStr = """
DSN=myDb_SQLEXPRESS;
"""
cnxn = pypyodbc.connect(connStr)
crsr = cnxn.cursor()
sql = """
INSERT INTO myTable VALUES (?, ?)
"""
for dataRow in myData:
    print(dataRow)
    crsr.execute(sql, dataRow)
cnxn.commit()
crsr.close()
cnxn.close()
Insert null value using pyodbc
Does anyone know how I can insert null values with pyodbc into SQL Server? The version of SQL Server I am using is 2019, and I'm using the latest ODBC driver.
The connection and insert statements work, but when I try to insert None values, they get translated to some varchar value on the SQL Server side. I have seen an ODBC function called SQLDescribeParam that might determine the data type conversion into SQL Server, but I can't figure out if or how I can access this via the pyodbc connection.
My connection looks something like how the documentation describes:
conn = pyodbc.connect('DRIVER={ODBC Driver 18 for SQL Server};SERVER=test;DATABASE=test;UID=user;PWD=password')
Don't use dynamic SQL. Use a proper parameterized query:
# test data
name = "Gord"
key = None  # an actual None, not the string "NULL"
id = 1

crsr = conn.cursor()
query_update = "UPDATE table_name SET Name = ?, Key = ? WHERE Id = ?"
crsr.execute(query_update, name, key, id)
conn.commit()
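Regarding the SQLDescribeParam question above: if the driver's type guess for a None parameter is the problem, pyodbc also lets you declare parameter types explicitly with cursor.setinputsizes(). A minimal sketch, assuming Name is an NVARCHAR(50) column (the type and size here are illustrative assumptions):
crsr = conn.cursor()
# bind the parameter as NVARCHAR(50) instead of whatever the driver guesses
crsr.setinputsizes([(pyodbc.SQL_WVARCHAR, 50, 0)])
crsr.execute("INSERT INTO table_name (Name) VALUES (?)", (None,))
conn.commit()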
The problem is that your Null values are actually strings and not "real" NULLs.
If you want to insert NULL, your key should be equal to None.
You can convert it as follows:
Key = Key if Key != 'Null' else None
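If a whole row can contain that placeholder string, the same conversion can be applied in one pass before binding (the values below are made up):
# hypothetical values read from a text source where missing data shows up as 'Null'
values = ['Gord', 'Null', 1]
params = [None if v == 'Null' else v for v in values]
crsr.execute("UPDATE table_name SET Name = ?, Key = ? WHERE Id = ?", params)
conn.commit()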
To insert null values into the database you have two options:
- omit that field from your INSERT statement, or
- use None
Also: To guard against SQL-injection you should not use normal string interpolation for your queries.
You should pass two (2) arguments to execute(), e.g.:
mycursor.execute("""INSERT INTO products
(city_id, product_id, quantity, price)
VALUES (%s, %s, %s, %s)""",
(city_id, product_id, quantity, price))
Alternative #2:
user_id = None
mycursor.execute("""INSERT INTO products
(user_id, city_id, product_id, quantity, price)
VALUES (%s, %s, %s, %s, %s)""",
(user_id, city_id, product_id, quantity, price))
With the current psycopg, instead of None, use a variable set to 'NULL'.
variable = 'NULL'
insert_query = """insert into my_table values(date'{}',{},{})"""
format_query = insert_query.format('9999-12-31', variable, variable)
curr.execute(format_query)
conn.commit()
>> insert into my_table values(date'9999-12-31',NULL,NULL)
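For comparison, a parameterized psycopg2 sketch (assuming the same three-column my_table) that passes None directly and lets the driver render it as NULL:
import datetime
import psycopg2

conn = psycopg2.connect("dbname=test user=postgres")  # placeholder DSN
curr = conn.cursor()
curr.execute("insert into my_table values (%s, %s, %s)",
             (datetime.date(9999, 12, 31), None, None))  # None is adapted to NULL
conn.commit()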
Not sure if this is the place to ask.
Assuming the connections are OK, will this have issues with null values, if any? I've encountered many times that null values are painful. May I know if you have any suggestions or the usual practice to overcome this?
I know we can import into the database directly, BUT I need to automate this via Python in the future. Thanks.
# open the dataset csv file, skip the first row (header), and insert
# each line as a record into the covid_data table
with open("data/nyccovid_{}.csv".format(date.today().strftime("%Y%m%d"))) as f:
    next(f)
    for row in f:
        cursor.execute("""
            INSERT INTO covid_data
            VALUES ('{}', '{}', '{}', '{}')
        """.format(row.split(",")[0], row.split(",")[1], row.split(",")[2], row.split(",")[3]))

dbconnect.commit()
cursor.close()
dbconnect.close()
The error is because of NaN values in your data. The SQL Server NULL equivalent in Python is the None object, so you must replace all NaNs with None if you want to insert NULLs into SQL Server:
df = df.where(pd.notnull(df), None)
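A short usage sketch (the frame contents, cursor, connection, and table name are all made up) showing the replacement feeding a parameterized insert:
import pandas as pd

# hypothetical frame where one value arrives as NaN
df = pd.DataFrame({"city": ["NYC", float("nan")], "product": ["foo", "bar"]})
df = df.where(pd.notnull(df), None)  # NaN -> None so the driver sends NULL

for row in df.itertuples(index=False):
    cursor.execute("INSERT INTO myTable (city, product) VALUES (?, ?)",
                   (row.city, row.product))
cnxn.commit()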
Never mind... my bad, but it worked just fine with the None insertion.