How to Count the NaN Values in a DataFrame?


NaN values, also called missing values, simply indicate data we do not have. We do not like to have missing values in a dataset, but it's inevitable to have them in some cases.

The first step in handling missing values is to check how many there are. We often want to count the NaN values in a specific column to better understand the data.

This short how-to article will teach us how to count the missing values in Pandas and PySpark DataFrames.


Pandas

We can use the isna or isnull function to detect missing values. They return a DataFrame filled with boolean values (True or False) indicating where the missing values are. To count the missing values in each column separately, we chain the sum function after isna or isnull.
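For illustration, here is a minimal sketch of a sample DataFrame with columns f1 through f4 (the data values are hypothetical, chosen so the NaN counts match the output shown below):

```python
import numpy as np
import pandas as pd

# Hypothetical sample data: f1 and f2 each have 2 NaNs, f3 has 1, f4 has none
df = pd.DataFrame({
    "f1": [1.0, np.nan, 3.0, np.nan, 5.0],
    "f2": [np.nan, 2.0, np.nan, 4.0, 5.0],
    "f3": [1.0, 2.0, np.nan, 4.0, 5.0],
    "f4": [1.0, 2.0, 3.0, 4.0, 5.0],
})

# Count the missing values in each column
print(df.isna().sum())
```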

df.isna().sum()

f1    2
f2    2
f3    1
f4    0
dtype: int64

If we apply the sum function once more, we get the total number of missing values in the entire DataFrame.

df.isna().sum().sum()
5

PySpark

We can count the NaN values in each column separately in PySpark. The functions to use are select, count, when, and isnan.

from pyspark.sql import functions as F

df.select(
    F.count(F.when(F.isnan("number"), F.col("number"))).alias("NaN_count")
).show()

+---------+
|NaN_count|
+---------+
|        2|
+---------+

The isnan function checks whether a value is NaN, and the count and when functions together count the rows in which the condition is True. Note that isnan detects only NaN values; null values in a PySpark DataFrame require the isNull method instead.

This question is also being asked as:

  • Python DataFrame get null value counts
