Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2015/06/10 22:28:00 UTC

[jira] [Closed] (SPARK-7261) Change default log level to WARN in the REPL

     [ https://issues.apache.org/jira/browse/SPARK-7261?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or closed SPARK-7261.
----------------------------
       Resolution: Fixed
    Fix Version/s: 1.5.0

> Change default log level to WARN in the REPL
> --------------------------------------------
>
>                 Key: SPARK-7261
>                 URL: https://issues.apache.org/jira/browse/SPARK-7261
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Shell
>            Reporter: Patrick Wendell
>            Assignee: Shixiong Zhu
>            Priority: Blocker
>              Labels: starter
>             Fix For: 1.5.0
>
>
> We should add a log4j properties file for the REPL (log4j-defaults-repl.properties) that sets the default log level to WARN. The main reason for doing this is that we now display nice progress bars in the REPL, so the need for task-level INFO messages is much lower.
> The best way to accomplish this is the following:
> 1. Add a second logging defaults file called log4j-defaults-repl.properties that sets the log level to WARN, modeled on the existing defaults file (see the properties sketch below): https://github.com/apache/spark/blob/branch-1.4/core/src/main/resources/org/apache/spark/log4j-defaults.properties
> 2. When logging is initialized, check whether we are inside the REPL and, if so, use the REPL-specific defaults file instead (see the Scala sketch below):
> https://github.com/apache/spark/blob/branch-1.4/core/src/main/scala/org/apache/spark/Logging.scala#L124
> 3. The printed message should say something like:
> Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
> To adjust logging level use sc.setLogLevel("INFO")
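
For reference, here is a minimal sketch of what the new log4j-defaults-repl.properties from step 1 could look like. It assumes the same console appender and layout as the existing log4j-defaults.properties; the contents of the file that actually ships may differ:

    # Set everything to be logged to the console, but only at WARN and above
    log4j.rootCategory=WARN, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n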
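
And a minimal Scala sketch of the check in step 2, assuming log4j 1.x and a hypothetical isInsideRepl test (the real Logging.scala may detect the REPL differently):

    import org.apache.log4j.PropertyConfigurator

    object ReplLoggingInit {

      // Assumption for illustration: the shell sets a marker system property
      // that we can test to decide whether we are running inside the REPL.
      private def isInsideRepl: Boolean =
        sys.props.contains("spark.repl.class.uri")

      def initializeLogging(): Unit = {
        // Inside the REPL, prefer the WARN-level defaults file from step 1;
        // everywhere else, fall back to the regular INFO-level defaults.
        val resource =
          if (isInsideRepl) "org/apache/spark/log4j-defaults-repl.properties"
          else "org/apache/spark/log4j-defaults.properties"

        Option(getClass.getClassLoader.getResource(resource)).foreach { url =>
          PropertyConfigurator.configure(url)
          if (isInsideRepl) {
            System.err.println(s"Using Spark's repl log4j profile: $resource")
            System.err.println("""To adjust logging level use sc.setLogLevel("INFO")""")
          }
        }
      }
    }

A user who wants the old verbosity back can then call sc.setLogLevel("INFO") from the shell, as the printed message in step 3 suggests.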



