Use parentheses to enforce the desired operator precedence:

F.when((df["col-1"] > 0.0) & (df["col-2"] > 0.0), 1).otherwise(0)
Answer from Ashalynd on Stack Overflow
Apache
pyspark.sql.functions.when — PySpark 4.1.1 documentation
pyspark.sql.functions.when(condition, value)[source]# Evaluates a list of conditions and returns one of multiple possible result expressions. If pyspark.sql.Column.otherwise() is not invoked, None is returned for unmatched conditions. New in version 1.4.0. Changed in version 3.4.0: Supports ...
Spark By {Examples}
PySpark When Otherwise | SQL Case When Usage - Spark By {Examples}
March 27, 2024 - PySpark When Otherwise: when() is a SQL function that returns a Column type, and otherwise() is a function of Column; if otherwise() is not used, it returns a None/NULL value.
Discussions

python - How do I use multiple conditions with pyspark.sql.functions.when()? - Stack Overflow
In PySpark, multiple conditions in when can be built using & (for and) and | (for or); it is important to enclose every expression that combines to form the condition in parentheses.
stackoverflow.com
python - PySpark: multiple conditions in when clause - Stack Overflow
I would like to modify the cell values of a dataframe column (Age) where currently it is blank and I would only do it if another column (Survived) has the value 0 for the corresponding row where i...
stackoverflow.com
python - PySpark: when function with multiple outputs - Stack Overflow
But that doesn't work since I can't put a tuple into the "otherwise" function. ...

from pyspark.sql import functions as F
df.withColumn('device_id', F.when(col('device')=='desktop', 1).when(col('device')=='mobile', 2).otherwise(None))
stackoverflow.com
July 10, 2019
ProjectPro
Explain PySpark When and Otherwise Function
February 6, 2024 - PySpark when and otherwise functions help you to perform intricate data transformations with ease. Whether you're dealing with conditional column creation, handling null values, or implementing complex logic, these functions are indispensable ...
Databricks
pyspark.sql.functions.when — PySpark master documentation
Evaluates a list of conditions and returns one of multiple possible result expressions. If pyspark.sql.Column.otherwise() is not invoked, None is returned for unmatched conditions. The condition argument is a boolean Column expression.
Arab Psychology
How Can I Use The PySpark "when" Function With An OR ...
February 3, 2026 - The PySpark “when” function is a powerful tool that allows users to apply conditional logic to their data in a Spark environment. This function can be used to create new columns or modify existing ones based on a specified condition.
Itversity
Using CASE and WHEN - Mastering Pyspark - ITVersity
>>> df.select(when(df['age'] == 2, 3).otherwise(4).alias("age")).collect()
[Row(age=3), Row(age=4)]
>>> df.select(when(df.age == 2, df.age + 1).alias("age")).collect()
[Row(age=3), Row(age=None)]
.. versionadded:: 1.4
Top answer (1 of 5) · 153

You get a SyntaxError exception because Python has no && operator. It has and and &, where the latter is the correct choice to create boolean expressions on Column (| for logical disjunction and ~ for logical negation).

The condition you created is also invalid because it doesn't consider operator precedence: & in Python has a higher precedence than ==, so the expression has to be parenthesized.

(col("Age") == "") & (col("Survived") == "0")
## Column<b'((Age = ) AND (Survived = 0))'>

On a side note, the when function is equivalent to a CASE expression, not a WHEN clause. Still, the same rules apply. Conjunction:

df.where((col("foo") > 0) & (col("bar") < 0))

Disjunction:

df.where((col("foo") > 0) | (col("bar") < 0))

You can of course define conditions separately to avoid brackets:

cond1 = col("Age") == "" 
cond2 = col("Survived") == "0"

cond1 & cond2
2 of 5 · 39

In PySpark, multiple conditions in when can be built using & (for and) and | (for or).

Note: in PySpark it is important to enclose every expression that combines to form the condition within parentheses ().

%pyspark
from pyspark.sql.functions import col, when

dataDF = spark.createDataFrame([(66, "a", "4"), 
                                (67, "a", "0"), 
                                (70, "b", "4"), 
                                (71, "d", "4")],
                                ("id", "code", "amt"))
dataDF.withColumn("new_column",
       when((col("code") == "a") | (col("code") == "d"), "A")
      .when((col("code") == "b") & (col("amt") == "4"), "B")
      .otherwise("A1")).show()

In Spark Scala code, (&&) and (||) conditions can be used within the when function:

//scala
val dataDF = Seq(
      (66, "a", "4"), (67, "a", "0"), (70, "b", "4"), (71, "d", "4")
    ).toDF("id", "code", "amt")
dataDF.withColumn("new_column",
       when(col("code") === "a" || col("code") === "d", "A")
      .when(col("code") === "b" && col("amt") === "4", "B")
      .otherwise("A1")).show()

=======================

Output:
+---+----+---+----------+
| id|code|amt|new_column|
+---+----+---+----------+
| 66|   a|  4|         A|
| 67|   a|  0|         A|
| 70|   b|  4|         B|
| 71|   d|  4|         A|
+---+----+---+----------+

This code snippet is copied from sparkbyexamples.com

EDUCBA
PySpark when | Learn the use of FROM in PySpark with Examples
April 10, 2023 - We can also use a case statement, as well as the SQL function otherwise with when, where rows that don't satisfy the condition fall through. The data can also be segregated with a case statement, where case followed by when filters the data out. when takes a value, checks it against the condition, and then outputs the new column based on the value satisfied. It is similar to an if-then clause in SQL. We can have multiple when statements with a PySpark DataFrame.
Medium
Mastering PySpark 'when' Statement: A Comprehensive Guide | Learn Data Transformation Techniques | Data And Beyond
December 12, 2024 - In today’s big data landscape, PySpark has emerged as a powerful tool for processing and analyzing massive datasets. One of the key features that make PySpark invaluable is its when statement, which enables conditional transformations on data. This comprehensive guide aims to equip you with a deep understanding of the PySpark when statement and its applications.
Statology
PySpark: How to Use When with AND Condition
November 13, 2023 -

import pyspark.sql.functions as F

#create new DataFrame
df_new = df.withColumn('B10', F.when((df.team=='B') & (df.points>10), 1).otherwise(0))

#view new DataFrame
df_new.show()

+----+--------+------+---+
|team|position|points|B10|
+----+--------+------+---+
|   A|   Guard|    11|  0|
|   A|   Guard|     8|  0|
|   A| Forward|    22|  0|
|   A| Forward|    22|  0|
|   B|   Guard|    14|  1|
|   B|   Guard|    14|  1|
|   B| Forward|    13|  1|
|   B| Forward|     7|  0|
+----+--------+------+---+
Saturn Cloud
PySpark - Multiple Conditions in When Clause: An Overview | Saturn Cloud Blog
January 13, 2024 -

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, when

spark = SparkSession.builder.appName("example").getOrCreate()
data = [("Alice", 25), ("Bob", 30), ("Charlie", 22)]
columns = ["Name", "Age"]
df = spark.createDataFrame(data, columns)
result_df = df.withColumn("Category", when(col("Age") < 25, "Young").otherwise("Adult"))
result_df.show()
Statology
PySpark: How to Use When with OR Condition
November 13, 2023 - This tutorial explains how to use the when function with OR conditions in PySpark, including an example.
Analytics Vidhya
PySpark Functions | 9 most useful functions for PySpark DataFrame
May 19, 2021 - when(): The when function is used to display output based on a particular condition. It evaluates the condition provided and then returns values accordingly. It is a SQL function that allows PySpark to check multiple conditions ...
Medium
PySpark Conditional Functions: A Comprehensive Guide | by Ahmed Uz Zaman | Medium
March 24, 2023 - Some examples of conditional functions in PySpark listed below. when() : This function allows you to define a condition and a value to be returned if the condition is true.
Rajanand
Spark: when function - Rajanand
from pyspark.sql.functions import concat, lit

# Add a new column 'Description' using `when` and `concat`
df_with_description = df.withColumn("Description",
    when(col("Age") < 30, concat(col("Name"), lit(" is young")))
    .otherwise(concat(col("Name"), lit(" is an adult"))))
df_with_description.show()