Posted to user@spark.apache.org by Sivakumaran S <si...@me.com> on 2016/07/11 17:47:27 UTC

Question on Spark shell

Hello,

Is there a way to start the Spark server with the log output piped to the screen? I am currently running Spark in standalone mode on a single machine.

Regards,

Sivakumaran


---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: Question on Spark shell

Posted by Sivakumaran S <si...@me.com>.
That was my bad with the title. 

I am getting that output when I run my application, both from the IDE and in the console.

I want the server logs themselves displayed in the terminal from which I start the server. Right now, running ‘start-master.sh’ just returns the prompt. I want the Spark logs printed as events occur (INFO, WARN, ERROR), like a debug mode where the server output goes to the screen.

I have to edit the log4j properties file; that much I have learnt so far. Should be able to hack it from here. Thanks for the help. I guess just having to frame the question was enough to find the answer :)
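For reference, a minimal sketch of that log4j route, assuming the stock conf/log4j.properties.template shipped with Spark (log4j 1.x): copy the template to conf/log4j.properties and point the root logger at a console appender, e.g.

    # conf/log4j.properties -- log everything at INFO and above to the console
    log4j.rootCategory=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

Note that start-master.sh still daemonises the master and redirects its output to a file under $SPARK_HOME/logs, so this setting controls what gets logged rather than where you see it; to watch events live, tail that file or run the master in the foreground (see the spark-class example further down the thread).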




> On 11-Jul-2016, at 6:57 PM, Anthony May <an...@gmail.com> wrote:
> 
> I see. The title of your original email was "Spark Shell", which is Spark's REPL environment based on the Scala shell, hence my misunderstanding.
> 
> You should get the same output when starting the application from the console. Are you not seeing any output?
> 
> On Mon, 11 Jul 2016 at 11:55 Sivakumaran S <siva.kumaran@me.com> wrote:
> I am running a Spark Streaming application using Scala in the IntelliJ IDE. I can see the Spark output in the IDE itself (aggregation and stuff). I want the Spark server logging (INFO, WARN, etc.) displayed on screen when I start the master in the console. For example, when I start a Kafka cluster, the prompt is not returned and the debug log is printed to the terminal. I want that set-up with my Spark server.
> 
> I hope that explains my retrograde requirement :)
> 
> 
> 
>> On 11-Jul-2016, at 6:49 PM, Anthony May <anthonymay@gmail.com> wrote:
>> 
>> Starting the Spark Shell gives you a Spark Context to play with straight away. The output is printed to the console.
>> 
>> On Mon, 11 Jul 2016 at 11:47 Sivakumaran S <siva.kumaran@me.com> wrote:
>> Hello,
>> 
>> Is there a way to start the Spark server with the log output piped to the screen? I am currently running Spark in standalone mode on a single machine.
>> 
>> Regards,
>> 
>> Sivakumaran
>> 
>> 
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>> 
> 


Re: Question on Spark shell

Posted by Anthony May <an...@gmail.com>.
I see. The title of your original email was "Spark Shell", which is Spark's
REPL environment based on the Scala shell, hence my misunderstanding.

You should get the same output when starting the application from the
console. Are you not seeing any output?

On Mon, 11 Jul 2016 at 11:55 Sivakumaran S <si...@me.com> wrote:

> I am running a Spark Streaming application using Scala in the IntelliJ
> IDE. I can see the Spark output in the IDE itself (aggregation and stuff).
> I want the Spark server logging (INFO, WARN, etc.) displayed on screen
> when I start the master in the console. For example, when I start a Kafka
> cluster, the prompt is not returned and the debug log is printed to the
> terminal. I want that set-up with my Spark server.
>
> I hope that explains my retrograde requirement :)
>
>
>
> On 11-Jul-2016, at 6:49 PM, Anthony May <an...@gmail.com> wrote:
>
> Starting the Spark Shell gives you a Spark Context to play with straight
> away. The output is printed to the console.
>
> On Mon, 11 Jul 2016 at 11:47 Sivakumaran S <si...@me.com> wrote:
>
>> Hello,
>>
>> Is there a way to start the Spark server with the log output piped to
>> the screen? I am currently running Spark in standalone mode on a single
>> machine.
>>
>> Regards,
>>
>> Sivakumaran
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>>
>>
>

Re: Question on Spark shell

Posted by Sivakumaran S <si...@me.com>.
I am running a Spark Streaming application using Scala in the IntelliJ IDE. I can see the Spark output in the IDE itself (aggregation and stuff). I want the Spark server logging (INFO, WARN, etc.) displayed on screen when I start the master in the console. For example, when I start a Kafka cluster, the prompt is not returned and the debug log is printed to the terminal. I want that set-up with my Spark server.

I hope that explains my retrograde requirement :)
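One way to get that Kafka-like behaviour, as a sketch assuming a standalone master on the local machine with the default ports, is to skip the daemon script and launch the master class in the foreground with spark-class:

    # run the standalone master in the foreground: the prompt is not returned
    # and INFO/WARN/ERROR events stream to this terminal
    ./bin/spark-class org.apache.spark.deploy.master.Master \
      --host localhost --port 7077 --webui-port 8080

    # or keep the daemon script and follow its log file under $SPARK_HOME/logs
    # (the exact file name includes the local user and host names)
    ./sbin/start-master.sh
    tail -f $SPARK_HOME/logs/spark-*-org.apache.spark.deploy.master.Master-*.out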



> On 11-Jul-2016, at 6:49 PM, Anthony May <an...@gmail.com> wrote:
> 
> Starting the Spark Shell gives you a Spark Context to play with straight away. The output is printed to the console.
> 
> On Mon, 11 Jul 2016 at 11:47 Sivakumaran S <siva.kumaran@me.com> wrote:
> Hello,
> 
> Is there a way to start the Spark server with the log output piped to the screen? I am currently running Spark in standalone mode on a single machine.
> 
> Regards,
> 
> Sivakumaran
> 
> 
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
> 


Re: Question on Spark shell

Posted by Anthony May <an...@gmail.com>.
Starting the Spark Shell gives you a Spark Context to play with straight
away. The output is printed to the console.
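For example, a minimal illustration (assuming you are in the Spark installation directory; startup log lines omitted):

    $ ./bin/spark-shell
    ...
    scala> sc.parallelize(1 to 100).sum()
    res0: Double = 5050.0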

On Mon, 11 Jul 2016 at 11:47 Sivakumaran S <si...@me.com> wrote:

> Hello,
>
> Is there a way to start the Spark server with the log output piped to
> the screen? I am currently running Spark in standalone mode on a single
> machine.
>
> Regards,
>
> Sivakumaran
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscribe@spark.apache.org
>
>