Posted to user@spark.apache.org by Max Schmidt <ma...@datapath.io> on 2016/01/11 10:56:41 UTC

Logger overridden when using JavaSparkContext

Hi there,

we're having a strange problem here using Spark in a Java application
that uses the JavaSparkContext:

We are using java.util.logging.* for logging in our application, with two
handlers (a ConsoleHandler and a FileHandler):

{{{
.handlers=java.util.logging.ConsoleHandler, java.util.logging.FileHandler

.level = FINE

java.util.logging.ConsoleHandler.level=INFO
java.util.logging.ConsoleHandler.formatter=java.util.logging.SimpleFormatter

java.util.logging.FileHandler.level= FINE
java.util.logging.FileHandler.formatter=java.util.logging.SimpleFormatter
java.util.logging.FileHandler.limit=10240000
java.util.logging.FileHandler.count=5
java.util.logging.FileHandler.append= true
java.util.logging.FileHandler.pattern=%t/delivery-model.%u.%g.txt

java.util.logging.SimpleFormatter.format=%1$tY-%1$tm-%1$td %1$tH:%1$tM:%1$tS %5$s%6$s%n
}}}

The thing is that when the JavaSparkContext is started, the logging stops.

The log4j.properties for Spark looks like this:

{{{
log4j.rootLogger=WARN, theConsoleAppender
log4j.additivity.io.datapath=false
log4j.appender.theConsoleAppender=org.apache.log4j.ConsoleAppender
log4j.appender.theConsoleAppender.layout=org.apache.log4j.PatternLayout
log4j.appender.theConsoleAppender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %m%n
}}}

Obviously I am not an expert in the logging architecture yet, but I really
need to understand how the handlers of our JUL logging are changed by the
Spark library.
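
For what it's worth, here is a minimal sketch of how one can check what
happens to the JUL root logger's handlers around context creation (the master
URL and app name below are just placeholders):

{{{
import java.util.logging.Handler;
import java.util.logging.Logger;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class HandlerDiagnostic {

    // The JUL root logger is addressed with the empty string "".
    private static void dumpRootHandlers(String when) {
        for (Handler h : Logger.getLogger("").getHandlers()) {
            System.out.println(when + ": " + h.getClass().getName());
        }
    }

    public static void main(String[] args) {
        dumpRootHandlers("before JavaSparkContext");

        // Placeholder master/app name; use your real cluster settings.
        JavaSparkContext sc = new JavaSparkContext(
                new SparkConf().setMaster("local[*]").setAppName("jul-handler-check"));

        dumpRootHandlers("after JavaSparkContext");
        sc.stop();
    }
}
}}}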

Thanks in advance!





Re: Logger overridden when using JavaSparkContext

Posted by Max Schmidt <ma...@datapath.io>.
Okay, I solved this problem...
It was my own fault: I had attached the handlers to the root logger of
java.util.logging. Using an explicit logger name for the handlers/level
solved it.
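
Concretely, that means prefixing the handler/level keys in logging.properties
with the logger name (e.g. io.datapath.level and io.datapath.handlers) instead
of the bare .level/.handlers keys of the root logger. Below is a rough
programmatic sketch of the same idea; the io.datapath logger name is only a
guess taken from the log4j.additivity line above, and the handler settings
mirror the logging.properties from the first mail:

{{{
import java.io.IOException;
import java.util.logging.ConsoleHandler;
import java.util.logging.FileHandler;
import java.util.logging.Level;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;

public final class AppLogging {

    // Keep a strong reference so the LogManager cannot garbage-collect the
    // logger (and with it the handler configuration).
    private static final Logger APP_LOGGER = configure();

    private static Logger configure() {
        // Handlers attached to a named logger are untouched when Spark/SLF4J
        // replaces the handlers of the root logger.
        Logger logger = Logger.getLogger("io.datapath");
        try {
            ConsoleHandler console = new ConsoleHandler();
            console.setLevel(Level.INFO);
            console.setFormatter(new SimpleFormatter());

            FileHandler file = new FileHandler("%t/delivery-model.%u.%g.txt",
                    10240000, 5, true);
            file.setLevel(Level.FINE);
            file.setFormatter(new SimpleFormatter());

            logger.setLevel(Level.FINE);
            logger.addHandler(console);
            logger.addHandler(file);
            // Do not delegate to the root logger, whose handlers Spark replaces.
            logger.setUseParentHandlers(false);
        } catch (IOException e) {
            throw new IllegalStateException("Could not set up file logging", e);
        }
        return logger;
    }

    public static Logger get() {
        return APP_LOGGER;
    }
}
}}}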

On 2016-01-11 at 12:33, Max Schmidt wrote:
> I checked the handlers of my rootLogger
> (java.util.logging.Logger.getLogger("")) which were
> a ConsoleHandler and a FileHandler.
>
> After the JavaSparkContext was created, the rootLogger only contained a
> 'org.slf4j.bridge.SLF4JBridgeHandler'.




Re: Logger overridden when using JavaSparkContext

Posted by Max Schmidt <ma...@datapath.io>.
I checked the handlers of my rootLogger
(java.util.logging.Logger.getLogger("")) which were
a ConsoleHandler and a FileHandler.

After the JavaSparkContext was created, the rootLogger only contained a
'org.slf4j.bridge.SLF4JBridgeHandler'.
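
That observation fits how the jul-to-slf4j bridge is normally installed: the
standard recipe removes every handler from the JUL root logger and then adds a
single SLF4JBridgeHandler, which routes java.util.logging records into SLF4J
(and from there into log4j). I have not checked exactly where Spark does this,
but the effect on the root logger would look like the following sketch:

{{{
import org.slf4j.bridge.SLF4JBridgeHandler;

public class BridgeInstallSketch {
    public static void main(String[] args) {
        // Removes the existing handlers (ConsoleHandler, FileHandler, ...)
        // from java.util.logging.Logger.getLogger("").
        SLF4JBridgeHandler.removeHandlersForRootLogger();

        // Registers one SLF4JBridgeHandler on the root logger, so JUL log
        // records are forwarded to SLF4J instead of the original handlers.
        SLF4JBridgeHandler.install();
    }
}
}}}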


