Posted to user@spark.apache.org by kant kodali <ka...@gmail.com> on 2016/10/06 16:27:56 UTC

How to disable or do minimal logging for the Apache Spark client driver program?

How to disable or do minimal logging for the Apache Spark client driver
program? I couldn't find this information in the docs. By driver program I
mean the Java program where I initialize the Spark context. It produces a lot
of INFO messages, but I would like to see output only when there is an error
or an exception such as a NullPointerException, and so on. I am also using
Spark standalone mode and I don't submit jobs through the command line. I
just invoke public static void main() of my driver program.

Re: How to disable or do minimal logging for the Apache Spark client driver program?

Posted by kant kodali <ka...@gmail.com>.
got it! Thanks!
 





On Fri, Oct 7, 2016 12:41 PM, Jakob Odersky jakob@odersky.com wrote:

Hi Kant,
job submission through the command line is not strictly required, although it
is the most common way (it's flexible and easy to use) to run applications
that depend on Spark. The shell script "spark-submit" ends up doing similar
things to what your code snippet shows.

I asked if you meant "local" mode when you wrote "I just invoke public static
void main() of my driver program" because I have seen people confuse "local"
and "standalone" in the past.

--Jakob

On Thu, Oct 6, 2016 at 10:30 PM, kant kodali <ka...@gmail.com> wrote:
> Hi Jakob,
>
> It is the biggest question for me too, since I seem to be on a different
> page than everyone else whenever I say "I am also using Spark standalone
> mode and I don't submit jobs through the command line. I just invoke public
> static void main() of my driver program."
>
> Everyone keeps talking about submitting jobs from the command line; even
> when the words "submit job" come up, people automatically assume it is
> happening from the command line. I just set up a standalone cluster and do
> this:
>
>         SparkConf sparkConf = config.buildSparkConfig();
>         sparkConf.setJars(JavaSparkContext.jarOfClass(SparkDriver.class));
>         JavaStreamingContext ssc = new JavaStreamingContext(sparkConf,
>                 new Duration(config.getSparkStremingBatchInterval()));
>         ssc.sparkContext().setLogLevel("ERROR");
>         Receiver receiver = new Receiver(config);
>         JavaReceiverInputDStream<String> jsonMessagesDStream =
>                 ssc.receiverStream(receiver);
>         jsonMessagesDStream.count();
>         ssc.start();
>         ssc.awaitTermination();
>
> so I assume submitting a job happens through this API; please correct me if
> I am wrong.
>
> Thanks
>
> On Thu, Oct 6, 2016 1:38 PM, Jakob Odersky jakob@odersky.com wrote:
>> You can change the kind of log messages that are shown by calling
>> "context.setLogLevel(<level>)" with an appropriate level:
>> ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
>> See
>> http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext@setLogLevel(logLevel:String):Unit
>> for further details.
>>
>> Just one nitpick: when you say "I am also using Spark standalone mode and
>> I don't submit jobs through the command line. I just invoke public static
>> void main() of my driver program," are you referring to Spark local mode?
>> It is possible to also run Spark applications in "distributed mode" (i.e.
>> standalone, YARN or Mesos) just from the command line; however, that will
>> require using Spark's launcher interface and bundling your application in
>> a jar.
>>
>> On Thu, Oct 6, 2016 at 9:27 AM, kant kodali <ka...@gmail.com> wrote:
>> > How to disable or do minimal logging for the Apache Spark client driver
>> > program?
>> > I couldn't find this information in the docs. By driver program I mean
>> > the Java program where I initialize the Spark context. It produces a lot
>> > of INFO messages, but I would like to see output only when there is an
>> > error or an exception such as a NullPointerException, and so on. I am
>> > also using Spark standalone mode and I don't submit jobs through the
>> > command line. I just invoke public static void main() of my driver
>> > program.

Re: How to disable or do minimal logging for the Apache Spark client driver program?

Posted by Jakob Odersky <ja...@odersky.com>.
Hi Kant,
job submission through the command line is not strictly required, although it
is the most common way (it's flexible and easy to use) to run applications
that depend on Spark. The shell script "spark-submit" ends up doing similar
things to what your code snippet shows.

I asked if you meant "local" mode when you wrote "I just invoke public
static void main() of my driver program" because I have seen people
confuse "local" and "standalone" in the past.

--Jakob


On Thu, Oct 6, 2016 at 10:30 PM, kant kodali <ka...@gmail.com> wrote:
> Hi Jakob,
>
> It is the biggest question for me too, since I seem to be on a different
> page than everyone else whenever I say "I am also using Spark standalone
> mode and I don't submit jobs through the command line. I just invoke public
> static void main() of my driver program."
>
> Everyone keeps talking about submitting jobs from the command line; even
> when the words "submit job" come up, people automatically assume it is
> happening from the command line. I just set up a standalone cluster and do
> this:
>
>         SparkConf sparkConf = config.buildSparkConfig();
>         sparkConf.setJars(JavaSparkContext.jarOfClass(SparkDriver.class));
>         JavaStreamingContext ssc = new JavaStreamingContext(sparkConf, new
> Duration(config.getSparkStremingBatchInterval()));
>         ssc.sparkContext().setLogLevel("ERROR");
>         Receiver receiver = new Receiver(config);
>         JavaReceiverInputDStream<String> jsonMessagesDStream =
> ssc.receiverStream(receiver);
>         jsonMessagesDStream.count();
>         ssc.start();
>         ssc.awaitTermination();
>
>
> so I assume submitting a job happens through this API; please correct me if
> I am wrong.
>
> Thanks
>
>
> On Thu, Oct 6, 2016 1:38 PM, Jakob Odersky jakob@odersky.com wrote:
>>
>> You can change the kind of log messages that are shown by calling
>> "context.setLogLevel(<level>)" with an appropriate level:
>> ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
>> See
>> http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext@setLogLevel(logLevel:String):Unit
>> for further details.
>>
>> Just one nitpick: when you say "I am also using Spark standalone mode and
>> I don't submit jobs through the command line. I just invoke public static
>> void main() of my driver program," are you referring to Spark local mode?
>> It is possible to also run Spark applications in "distributed mode" (i.e.
>> standalone, YARN or Mesos) just from the command line; however, that will
>> require using Spark's launcher interface and bundling your application in
>> a jar.
>>
>> On Thu, Oct 6, 2016 at 9:27 AM, kant kodali <ka...@gmail.com> wrote:
>> > How to disable or do minimal logging for the Apache Spark client driver
>> > program?
>> > I couldn't find this information in the docs. By driver program I mean
>> > the Java program where I initialize the Spark context. It produces a lot
>> > of INFO messages, but I would like to see output only when there is an
>> > error or an exception such as a NullPointerException, and so on. I am
>> > also using Spark standalone mode and I don't submit jobs through the
>> > command line. I just invoke public static void main() of my driver
>> > program.
>



Re: How to disable or do minimal logging for the Apache Spark client driver program?

Posted by kant kodali <ka...@gmail.com>.
Hi Jakob,
It is the biggest question for me too, since I seem to be on a different page
than everyone else whenever I say "I am also using Spark standalone mode and I
don't submit jobs through the command line. I just invoke public static void
main() of my driver program."
Everyone keeps talking about submitting jobs from the command line; even when
the words "submit job" come up, people automatically assume it is happening
from the command line. I just set up a standalone cluster and do this:




        SparkConf sparkConf = config.buildSparkConfig();
        sparkConf.setJars(JavaSparkContext.jarOfClass(SparkDriver.class));
        JavaStreamingContext ssc = new JavaStreamingContext(sparkConf,
                new Duration(config.getSparkStremingBatchInterval()));
        ssc.sparkContext().setLogLevel("ERROR");
        Receiver receiver = new Receiver(config);
        JavaReceiverInputDStream<String> jsonMessagesDStream =
                ssc.receiverStream(receiver);
        jsonMessagesDStream.count();
        ssc.start();
        ssc.awaitTermination();

so I assume submitting a job happens through this API; please correct me if I
am wrong.
Thanks
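
A minimal sketch of the kind of SparkConf a helper like buildSparkConfig()
might produce for a standalone cluster; the master URL and app name below are
hypothetical placeholders:

    import org.apache.spark.SparkConf;

    // Points the driver at a standalone master. The URL and app name are
    // made-up values for illustration; substitute your own.
    SparkConf sparkConf = new SparkConf()
            .setMaster("spark://master-host:7077")
            .setAppName("SparkDriver");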



On Thu, Oct 6, 2016 1:38 PM, Jakob Odersky jakob@odersky.com wrote:

You can change the kind of log messages that are shown by calling
"context.setLogLevel(<level>)" with an appropriate level:
ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
See
http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext@setLogLevel(logLevel:String):Unit
for further details.

Just one nitpick: when you say "I am also using Spark standalone mode and I
don't submit jobs through the command line. I just invoke public static void
main() of my driver program," are you referring to Spark local mode? It is
possible to also run Spark applications in "distributed mode" (i.e.
standalone, YARN or Mesos) just from the command line; however, that will
require using Spark's launcher interface and bundling your application in a
jar.

On Thu, Oct 6, 2016 at 9:27 AM, kant kodali <ka...@gmail.com> wrote:
> How to disable or do minimal logging for the Apache Spark client driver
> program?
> I couldn't find this information in the docs. By driver program I mean the
> Java program where I initialize the Spark context. It produces a lot of INFO
> messages, but I would like to see output only when there is an error or an
> exception such as a NullPointerException, and so on. I am also using Spark
> standalone mode and I don't submit jobs through the command line. I just
> invoke public static void main() of my driver program.

Re: How to disable or do minimal logging for the Apache Spark client driver program?

Posted by Mahendra Kutare <ma...@gmail.com>.
import ch.qos.logback.classic.Level;

sc.setLogLevel(Level.INFO.levelStr);

//Change the level to an appropriate level for your application.

Mahendra
about.me/mahendrakutare
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Only those who will risk going too far can possibly find out how far one
can go.

On Thu, Oct 6, 2016 at 1:38 PM, Jakob Odersky <ja...@odersky.com> wrote:

> You can change the kind of log messages that are shown by
> calling "context.setLogLevel(<level>)" with an appropriate level:
> ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
> See
> http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext@setLogLevel(logLevel:String):Unit
> for further details.
>
> Just one nitpick: when you say "I am also using Spark standalone
> mode and I don't submit jobs through the command line. I just invoke
> public static void main() of my driver program," are you
> referring to Spark local mode? It is possible to also run Spark
> applications in "distributed mode" (i.e. standalone, YARN or
> Mesos) just from the command line; however, that will require
> using Spark's launcher interface and bundling your application in
> a jar.
>
> On Thu, Oct 6, 2016 at 9:27 AM, kant kodali <ka...@gmail.com> wrote:
> > How to disable or do minimal logging for the Apache Spark client driver
> > program?
> > I couldn't find this information in the docs. By driver program I mean
> > the Java program where I initialize the Spark context. It produces a lot
> > of INFO messages, but I would like to see output only when there is an
> > error or an exception such as a NullPointerException, and so on. I am
> > also using Spark standalone mode and I don't submit jobs through the
> > command line. I just invoke public static void main() of my driver
> > program.
>

Re: How to disable or do minimal logging for the Apache Spark client driver program?

Posted by Jakob Odersky <ja...@odersky.com>.
You can change the kind of log messages that are shown by
calling "context.setLogLevel(<level>)" with an appropriate level:
ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
See http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext@setLogLevel(logLevel:String):Unit
for further details.
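
One caveat: setLogLevel() can only be called once the context exists, so INFO
messages printed while the context is starting up still appear at the default
level. A minimal sketch that silences those too, assuming Spark's default
log4j 1.x backend:

    import org.apache.log4j.Level;
    import org.apache.log4j.Logger;

    // Raise the root logger threshold before creating the SparkContext so
    // that startup messages are suppressed as well.
    Logger.getRootLogger().setLevel(Level.ERROR);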

Just one nitpick: when you say "I am also using Spark standalone mode and I
don't submit jobs through the command line. I just invoke public static void
main() of my driver program," are you referring to Spark local mode? It is
possible to also run Spark applications in "distributed mode" (i.e.
standalone, YARN or Mesos) just from the command line; however, that will
require using Spark's launcher interface and bundling your application in a
jar.
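
A minimal sketch of what using that launcher interface can look like, via
org.apache.spark.launcher.SparkLauncher; the jar path, main class and master
URL below are placeholders:

    import org.apache.spark.launcher.SparkLauncher;

    // Launches a bundled application against a standalone master from plain
    // Java code, much as spark-submit would; all names are hypothetical.
    Process spark = new SparkLauncher()
            .setAppResource("/path/to/my-app.jar")      // application jar
            .setMainClass("com.example.SparkDriver")    // driver main class
            .setMaster("spark://master-host:7077")      // standalone master
            .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
            .launch();
    spark.waitFor();  // block until the application process exits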

On Thu, Oct 6, 2016 at 9:27 AM, kant kodali <ka...@gmail.com> wrote:
> How to disable or do minimal logging for the Apache Spark client driver
> program?
> I couldn't find this information in the docs. By driver program I mean the
> Java program where I initialize the Spark context. It produces a lot of INFO
> messages, but I would like to see output only when there is an error or an
> exception such as a NullPointerException, and so on. I am also using Spark
> standalone mode and I don't submit jobs through the command line. I just
> invoke public static void main() of my driver program.

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org