jwilner's response is spot on. I was exploring to see if there's a faster option, since in my experience, summing flat arrays is (strangely) faster than counting. This code seems faster:

df.isnull().values.any()

import numpy as np
import pandas as pd
import perfplot


def setup(n):
    df = pd.DataFrame(np.random.randn(n))
    df[df > 0.9] = np.nan
    return df


def isnull_any(df):
    return df.isnull().any()


def isnull_values_sum(df):
    return df.isnull().values.sum() > 0


def isnull_sum(df):
    return df.isnull().sum() > 0


def isnull_values_any(df):
    return df.isnull().values.any()


perfplot.save(
    "out.png",
    setup=setup,
    kernels=[isnull_any, isnull_values_sum, isnull_sum, isnull_values_any],
    n_range=[2 ** k for k in range(25)],
)

df.isnull().sum().sum() is a bit slower, but of course, has additional information -- the number of NaNs.
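For illustration (not part of the benchmark above), a minimal sketch contrasting the two idioms on a tiny made-up frame:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"a": [1.0, np.nan], "b": [3.0, 4.0]})

print(df.isnull().values.any())  # fast yes/no check -> True
print(df.isnull().sum().sum())   # total NaN count   -> 1
```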

Answer from S Anand on Stack Overflow

You have a couple of options.

import pandas as pd
import numpy as np

df = pd.DataFrame(np.random.randn(10,6))
# Make a few areas have NaN values
df.iloc[1:3,1] = np.nan
df.iloc[5,3] = np.nan
df.iloc[7:9,5] = np.nan

Now the data frame looks something like this:

          0         1         2         3         4         5
0  0.520113  0.884000  1.260966 -0.236597  0.312972 -0.196281
1 -0.837552       NaN  0.143017  0.862355  0.346550  0.842952
2 -0.452595       NaN -0.420790  0.456215  1.203459  0.527425
3  0.317503 -0.917042  1.780938 -1.584102  0.432745  0.389797
4 -0.722852  1.704820 -0.113821 -1.466458  0.083002  0.011722
5 -0.622851 -0.251935 -1.498837       NaN  1.098323  0.273814
6  0.329585  0.075312 -0.690209 -3.807924  0.489317 -0.841368
7 -1.123433 -1.187496  1.868894 -2.046456 -0.949718       NaN
8  1.133880 -0.110447  0.050385 -1.158387  0.188222       NaN
9 -0.513741  1.196259  0.704537  0.982395 -0.585040 -1.693810
  • Option 1: df.isnull().any().any() - This returns a boolean value

Calling isnull() returns a DataFrame of booleans like this:

       0      1      2      3      4      5
0  False  False  False  False  False  False
1  False   True  False  False  False  False
2  False   True  False  False  False  False
3  False  False  False  False  False  False
4  False  False  False  False  False  False
5  False  False  False   True  False  False
6  False  False  False  False  False  False
7  False  False  False  False  False   True
8  False  False  False  False  False   True
9  False  False  False  False  False  False

If you make it df.isnull().any(), you can find just the columns that have NaN values:

0    False
1     True
2    False
3     True
4    False
5     True
dtype: bool

One more .any() tells you whether any of the above are True:

> df.isnull().any().any()
True
  • Option 2: df.isnull().sum().sum() - This returns an integer: the total number of NaN values

This works the same way as .any().any() does: first it sums the number of NaN values in each column, then it sums those column totals:

df.isnull().sum()
0    0
1    2
2    0
3    1
4    0
5    2
dtype: int64

Finally, to get the total number of NaN values in the DataFrame:

df.isnull().sum().sum()
5
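As a small aside not in the original answer: the same chain works per row if you pass axis=1, which is handy when you want to pull out the offending rows rather than columns. A minimal sketch on a made-up frame:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"a": [1.0, np.nan, 3.0], "b": [4.0, 5.0, np.nan]})

row_has_nan = df.isnull().any(axis=1)  # boolean Series, one entry per row
print(df[row_has_nan])                 # only the rows containing at least one NaN
```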

With pandas, you should avoid loops. Use boolean-mask filtering and slicing to fill your flag column. To detect null values, use .isnull() directly on a pandas DataFrame or Series (when you select a column), not on a single value as you did. Then use .fillna() if you want to replace null values with something else.

Based on your code (though I can't be sure it works without some sample input data and the expected output), the solution may look as follows.

First, create the empty column as you did:

data['Flags'] = None

Then fill this column based on conditions on the "Temperature phase" column (using fillna(0) to replace all null values with 0 lets you test only whether values are <= 0; this replacement is not applied to the final dataframe):

data.loc[data['Temperature phase'].fillna(0) <= 0, "Flags"] = 1
data.loc[data['Temperature phase'] > 200, "Flags"] = 2

And now replace the Temperature phase values.

For values equal to 0 or null, you seem to have chosen to replace them with the previous value in the dataframe. You could achieve this part with:

mask = data['Temperature phase'].isnull()
data.loc[mask, 'Temperature phase'] = data['Temperature phase'].shift()[mask]

This command uses .shift() to shift all values in the Temperature phase column down by one, then filters the rows where Temperature phase is null and fills them with the shifted value at the corresponding index.

Finally, replace the other Temperature phase values:

data.loc[data['Temperature phase'] < 0, "Temperature phase"] = 0
data.loc[data['Temperature phase'] > 200, "Temperature phase"] = 130

You don't need the flag indices and so on, as the Flags column is filled directly in the final dataframe.
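Not part of the original answer: putting the pieces together on a small frame (the "Temperature phase" values below are invented for illustration):

```python
import numpy as np
import pandas as pd

# Hypothetical sample data standing in for the asker's dataset
data = pd.DataFrame({"Temperature phase": [50.0, np.nan, -5.0, 250.0, 120.0]})

# Flag rows: 1 for null/non-positive values, 2 for values above 200
data["Flags"] = None
data.loc[data["Temperature phase"].fillna(0) <= 0, "Flags"] = 1
data.loc[data["Temperature phase"] > 200, "Flags"] = 2

# Fill nulls with the previous row's value, then clamp out-of-range values
mask = data["Temperature phase"].isnull()
data.loc[mask, "Temperature phase"] = data["Temperature phase"].shift()[mask]
data.loc[data["Temperature phase"] < 0, "Temperature phase"] = 0
data.loc[data["Temperature phase"] > 200, "Temperature phase"] = 130

print(data)
```

On this sample, the Temperature phase column ends up as [50, 50, 0, 130, 120] with Flags [None, 1, 1, 2, None].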
