Spark log4j custom logging settings
In order to run a Spark job with your own instance of the Spark Driver (started from your application) and with overridden log4j settings, you have to put a special log4j configuration file on the classpath.
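Running with "your own instance of the Spark Driver" means the application creates the SparkContext itself instead of being launched through spark-submit, so log4j is initialized from the application's classpath. A minimal sketch of such an application (class, app and master names are only placeholders):

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;

    public class MyApplication {
        public static void main(String[] args) {
            // The driver runs inside this JVM, so log4j picks up
            // whatever configuration is first on this classpath
            SparkConf conf = new SparkConf()
                    .setAppName("my-job")
                    .setMaster("local[*]");
            JavaSparkContext sc = new JavaSparkContext(conf);
            try {
                sc.parallelize(java.util.Arrays.asList(1, 2, 3)).count();
            } finally {
                sc.stop();
            }
        }
    }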
To achieve this:
- Create a new Java project with Gradle:

    $ gradle init --type java-library
- In the src/main/resources folder create a new file:

    org/apache/spark/log4j-defaults.properties
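At least in Spark 1.x this is the resource name Spark itself looks up on the classpath when log4j has not been configured yet, so whichever jar provides it first wins. The resulting file lands at:

    src/main/resources/org/apache/spark/log4j-defaults.properties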
- Add the log4j settings:

    # Set everything to be logged to the console
    log4j.rootCategory=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

    # Settings to quiet third party logs that are too verbose
    log4j.logger.org.spark-project.jetty=WARN
    log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
    log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO

    # Your job classes logging
    log4j.logger.your.job.class=DEBUG
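The last line is a placeholder: replace your.job.class with the package or class name of your own job. For example, with a (hypothetical) job class like the one below, the matching setting would be log4j.logger.com.example.MyJob=DEBUG:

    package com.example;

    import org.apache.log4j.Logger;

    public class MyJob {
        // The logger name equals the class name, so the
        // log4j.logger.com.example.MyJob=DEBUG entry applies to it
        private static final Logger LOG = Logger.getLogger(MyJob.class);

        public void run() {
            LOG.debug("visible because this logger is set to DEBUG");
        }
    }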
- In your build.gradle file change the name of the jar file. The name should be prefixed with aa_ so that it ends up at the beginning of the classpath:

    jar.archiveName = "aa_log4j-defaults.jar"
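Note: jar.archiveName is deprecated on newer Gradle versions; if your Gradle complains, the equivalent property (check the docs for your version) is:

    jar.archiveFileName = "aa_log4j-defaults.jar"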
- In your build.gradle add a task to copy the jar file with the log4j settings to the target directory:

    task copyJars(type: Copy, dependsOn: jar) {
        from "${buildDir}/libs"
        into project.file("${buildDir}/install/lib")
    }
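Since the task depends on jar, running it builds the jar first and leaves it at build/install/lib/aa_log4j-defaults.jar:

    $ gradle copyJars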
- Include the jar file in your application project.
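How to include it depends on how the application is built. One sketch, assuming the application also uses Gradle and the jar was copied into a libs/ directory of that project (the path and the compile configuration are assumptions, adjust to your setup):

    dependencies {
        // Must end up before the Spark jars on the classpath,
        // which is what the aa_ prefix is for
        compile files("libs/aa_log4j-defaults.jar")
    }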