Spark uses log4j for logging.

[[levels]] Logging Levels

The valid logging levels are log4j's levels (from most specific to least):

  • OFF (most specific, no logging)
  • FATAL (most specific, little data)
  • ERROR
  • WARN
  • INFO
  • DEBUG
  • TRACE (least specific, a lot of data)
  • ALL (least specific, all data)


You can set up the default logging for the Spark shell in conf/log4j.properties. Use conf/log4j.properties.template as a starting point.
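As a sketch of what such a file can contain (the exact contents of the shipped template may differ between Spark versions), the following log4j properties send WARN-and-above messages to the console:

```
# Log WARN and above to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```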

[[setting-default-log-level]] Setting Default Log Level Programmatically

Refer to [Setting Default Log Level Programmatically] in [SparkContext -- Entry Point to Spark Core].
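In short, SparkContext offers a setLogLevel method that changes the log level at runtime. For example, in the Spark shell (where sc is the predefined SparkContext):

```
sc.setLogLevel("INFO")
```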

[[setting-log-levels-applications]] Setting Log Levels in Spark Applications

In standalone Spark applications or in a [Spark Shell] session, use the following:

import org.apache.log4j.{Level, Logger}

// Silence the Spark and Akka loggers
Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)


[[sbt]] sbt

When running a Spark application from within sbt using the run task, you can use the following build.sbt to configure logging levels:

fork in run := true
javaOptions in run ++= Seq(
  "-Dlog4j.debug=true",
  "-Dlog4j.configuration=log4j.properties")
outputStrategy := Some(StdoutOutput)

With the above configuration, log4j.properties should be on the CLASSPATH. It can be placed in the src/main/resources directory (which is included on the CLASSPATH by default).

When run starts, you should see the following output in sbt:

[spark-activator]> run
[info] Running StreamingApp
log4j: Trying to find [] using context classloader sun.misc.Launcher$AppClassLoader@1b6d3586.
log4j: Using URL [file:/Users/jacek/dev/oss/spark-activator/target/scala-2.11/classes/] for automatic log4j configuration.
log4j: Reading configuration from URL file:/Users/jacek/dev/oss/spark-activator/target/scala-2.11/classes/

[[disable]] Disabling Logging

Use the following conf/log4j.properties to disable logging completely:
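The snippet itself is missing here; a minimal sketch, assuming log4j 1.x properties syntax, that switches off all logging:

```
# Turn off all logging
log4j.rootCategory=OFF
```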