Just run this command in your Spark installation directory:

cp conf/log4j.properties.template conf/log4j.properties

Then edit conf/log4j.properties. By default it contains:

# Set everything to be logged to the console
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO

Replace the line

log4j.rootCategory=INFO, console

with:

log4j.rootCategory=WARN, console

Save the file and restart your shell. This works for me with Spark 1.1.0 and Spark 1.5.1 on OS X.

Answer from poiuytrez on Stack Overflow
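The copy-and-edit steps above can also be scripted. A minimal Python sketch (the function name set_root_log_level and its WARN default are mine, not part of the original answer):

```python
import re
import shutil
from pathlib import Path

def set_root_log_level(spark_home: str, level: str = "WARN") -> str:
    """Copy conf/log4j.properties.template to conf/log4j.properties
    and switch the root category to the given level.
    Returns the rewritten file contents."""
    conf = Path(spark_home) / "conf"
    target = conf / "log4j.properties"
    shutil.copyfile(conf / "log4j.properties.template", target)
    text = target.read_text()
    # Replace only the level on the rootCategory line; the appender
    # list after the comma (", console") is left untouched.
    text = re.sub(r"(?m)^(log4j\.rootCategory)=\w+", rf"\1={level}", text)
    target.write_text(text)
    return text
```

Run it once against your Spark home, then restart the shell as described above.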
Cloudera Community (community.cloudera.com): Spark job submit log messages on console (February 20, 2017)
This will show you all the logs, based on your configured levels. You can also switch to yarn-client mode to see more logs printed directly on the console; remember to switch back to yarn-cluster mode after you are done debugging. A follow-up asks whether this configuration applies only in yarn-cluster mode (where the driver and executors run under YARN on the cluster nodes): with spark-submit in yarn-client mode the driver is not managed by YARN, so the driver's logs go to the console of the server where spark-submit was run.

Stack Overflow (stackoverflow.com): How can I set the default Spark logging level? (pyspark)
I launch pyspark applications from PyCharm on my own workstation, to an 8-node cluster. This cluster also has settings encoded in spark-defaults.conf and spark-env.sh. This is how I obtain my Spark…
Stack Overflow (stackoverflow.com): How can I change the log level in spark-submit console output? (pyspark, January 20, 2020)
I'm new to Spark (PySpark), and when I run spark-submit the Spark log is very verbose. How can I change the log output?
Stack Overflow (stackoverflow.com): How to stop INFO messages displaying on spark console? (log4j)
Now just edit /var/lib/spark/latest/conf/log4j.properties (with the example from method #2) and all your apps will share this configuration. If you like solution #3 but want to customize it per application, you can copy the conf folder, edit its contents, and specify it as the root configuration during spark-submit…
Posit Community (forum.posit.co): Sparklyr - change log level during spark-submit (July 1, 2021)
Hi! I wonder if you can help me with the following problem. What I need to achieve is to silence all the information logged to the console during ${SPARK_HOME}/bin/spark-submit execution. I have already changed the log level for Spark itself by changing the appropriate line in ${SPARK_HOME}/conf/log4j.properties, but when I submit my R sparklyr script by providing sparklyr.jar I still get the following info: [user@host] ${SPARK_HOME}/bin/spark-submit --class sparklyr.Shell 'sparklyr.jar' script…
Apache Spark (spark.apache.org): Configuration - Spark 4.1.1 Documentation
The Spark shell and spark-submit tool support two ways to load configurations dynamically. The first is command line options, such as --master, as shown above. spark-submit can accept any Spark property using the --conf/-c flag, but uses special flags for properties that play a part in launching…
Kontext (kontext.tech): Turn off INFO logs in Spark - Spark & PySpark (December 15, 2020)
If you want to change the log level programmatically, try the following:

spark = SparkSession.builder.getOrCreate()
spark.sparkContext.setLogLevel("WARN")

If you use the Spark shell, you can directly access the SparkContext via sc…
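The Kontext snippet above can be expanded into a short, self-contained PySpark sketch (assumes a local PySpark installation; the app name is an illustrative choice of mine):

```python
# Per-application alternative to editing log4j.properties: raise the
# log level at runtime via SparkContext.setLogLevel.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("quiet-logs").getOrCreate()

# Valid levels: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
spark.sparkContext.setLogLevel("WARN")
```

Note that this takes effect only once the SparkContext exists, so messages logged during startup still appear; for a fully quiet launch you still need the log4j.properties edit above.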
The Internals of Spark SQL (jaceklaskowski.gitbooks.io): Logging
You can set up the default logging for the Spark shell in conf/log4j.properties. Use conf/log4j.properties.template as a starting point. Refer to Setting Default Log Level Programatically in SparkContext — Entry Point to Spark Core.
Open Source at AWS (aws.github.io): Change Log Level - EMR Containers Best Practices Guides
To obtain more detail about their application or job submission, Spark application developers can change the log level of their job depending on their requirements.
Rangareddy (rangareddy.github.io): Spark with custom logging (September 20, 2021)
This blog post shows how to customize the Spark logs for both driver and executor. In the following properties, I have modified the logging level from INFO to DEBUG.
Apache Spark (spark.apache.org): pyspark.SparkContext.setLogLevel
Control our logLevel. This overrides any user-defined log settings. Valid log levels include: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
IBM Cloud (cloud.ibm.com): Configuring Spark log level information (Analytics Engine)
Stack Overflow (stackoverflow.com): How can I change the log level in spark-submit console output? (January 20, 2020)
You'll find the file inside your Spark installation directory at spark/conf/log4j.properties:

# Define the root logger with Appender file
log4j.rootLogger=WARN, console
# Define the file appender
log4j.appender.FILE=org.apache.log4j.DailyRollingFileAppender
# Name of the log file
log4j.appender.FILE.File=/tmp/logfile.out
# Set immediate flush to true
log4j.appender.FILE.ImmediateFlush=true
# Set the threshold to DEBUG mode
log4j.appender.FILE.Threshold=debug
# Set File append to true.
IBM (ibm.com): Configuring Spark log level information (Cloud Pak for Data 4.8.x)
4.8.1 and later: Review the applications that run and identify the issues that are present by using the logs that the Analytics Engine powered by Apache Spark application generates. The standard logging levels available are ALL, TRACE, DEBUG, INFO, WARN, ERROR, FATAL, and OFF.
Spark Code Hub (sparkcodehub.com): Mastering Apache Spark’s Logging Configuration: A Comprehensive Guide to spark.logConf and Log Levels
Other settings: configure cores, instances, driver resources, parallelism, fault tolerance, memory management, shuffling, and event logging, as detailed in SparkConf.

spark-submit --class SalesAnalysis --master yarn --deploy-mode cluster \
  --conf spark.app.name=SalesAnalysis_2025_04_12 \
  --conf spark.logConf=true \
  --driver-java-options "-Dlog4j.configuration=file:/path/to/log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/path/to/log4j.properties" \
  --conf spark.executor.memory=8g \
  --conf spark.executor.cores=4 \
  --conf spark.executor.instances=10 \
  …
Programmersought (programmersought.com): Spark sets the log level with spark-submit
3. If you are not allowed to modify the configuration file in the cluster of the production environment, use the spark-submit --conf option mentioned above. When running a Spark program there is a lot of INFO output and it is very messy; you can avoid the INFO output by adjusting the log level.