Posted to dev@zeppelin.apache.org by Luciano Resende <lu...@gmail.com> on 2016/12/08 18:48:44 UTC

Handling spark-submit errors

I was playing with some new data sources in Zeppelin (master) and had an
issue with my --package declaration in zeppelin-env.sh.

First, the error stack was a little misleading, as it only reported
connection issues while trying to retrieve the paragraph status from Spark.

From my initial investigation, we pretty much invoke spark-submit from
interpreter.sh... and I was wondering if there is a good way to trap this
kind of issue and provide a better response to the user (or even just
better logging).

One option is to do something similar to what Spark does and actually have
a class that handles some of the interpreter integration logic.
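
For example (just a rough sketch; the class and method names here are made
up, not existing Zeppelin code), such a class could own the spark-submit
process, drain its output, and surface the real failure if it dies during
startup:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.concurrent.TimeUnit;

public class SparkSubmitLauncher {

  public Process launch(List<String> sparkSubmitCommand)
      throws IOException, InterruptedException {
    ProcessBuilder builder = new ProcessBuilder(sparkSubmitCommand);
    builder.redirectErrorStream(true);  // merge stderr into stdout so nothing is lost
    Process process = builder.start();

    StringBuilder output = new StringBuilder();

    // Drain the process output on a background thread so the child never
    // blocks on a full pipe, and so we have the text if it fails.
    Thread drainer = new Thread(() -> {
      try (BufferedReader reader = new BufferedReader(
          new InputStreamReader(process.getInputStream(), StandardCharsets.UTF_8))) {
        String line;
        while ((line = reader.readLine()) != null) {
          synchronized (output) {
            output.append(line).append('\n');
          }
        }
      } catch (IOException ignored) {
        // stream closes when the process ends
      }
    });
    drainer.setDaemon(true);
    drainer.start();

    // A failing package resolution makes spark-submit exit within seconds;
    // a healthy interpreter process keeps running past the grace period.
    if (process.waitFor(30, TimeUnit.SECONDS) && process.exitValue() != 0) {
      synchronized (output) {
        throw new IOException("spark-submit exited with code "
            + process.exitValue() + ":\n" + output);
      }
    }
    return process;  // still running: hand it back to whoever monitors it
  }
}

That way a bad --package coordinate would fail the paragraph with
spark-submit's own error text instead of a connection error while polling
for paragraph status.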

Thoughts? Any other possibilities?

-- 
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/

Re: Handling spark-submit errors

Posted by Jeff Zhang <zj...@gmail.com>.
Hi Luciano

Usually I change log4j.properties to capture the output of the
interpreter.sh process; that helps with most of the problems I hit in
interpreter.sh.

Here's the log4j.properties I use:

log4j.rootLogger = INFO, dailyfile

log4j.appender.stdout = org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout = org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%5p [%d] ({%t} %F[%M]:%L) - %m%n

log4j.appender.dailyfile.DatePattern=.yyyy-MM-dd
log4j.appender.dailyfile.Threshold = DEBUG
log4j.appender.dailyfile = org.apache.log4j.DailyRollingFileAppender
log4j.appender.dailyfile.File = ${zeppelin.log.file}
log4j.appender.dailyfile.layout = org.apache.log4j.PatternLayout
log4j.appender.dailyfile.layout.ConversionPattern=%5p [%d] ({%t} %F[%M]:%L) - %m%n


log4j.logger.org.apache.zeppelin.interpreter.InterpreterFactory=DEBUG
log4j.logger.org.apache.zeppelin.notebook.Paragraph=DEBUG
log4j.logger.org.apache.zeppelin.scheduler=DEBUG
log4j.logger.org.apache.zeppelin.livy=DEBUG
log4j.logger.org.apache.zeppelin.flink=DEBUG
log4j.logger.org.apache.zeppelin.spark=DEBUG
log4j.logger.org.apache.zeppelin.interpreter.util=DEBUG
log4j.logger.org.apache.zeppelin.interpreter.remote=DEBUG
log4j.logger.org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer=DEBUG

Luciano Resende <lu...@gmail.com> wrote on Friday, December 9, 2016 at 2:48 AM:

> I was playing with some new data sources in Zeppelin (master) and had an
> issue with my --package declaration in zeppelin-env.sh.
>
> First, the error stack was a little misleading, as it only reported
> connection issues while trying to retrieve the paragraph status from Spark.
>
> From my initial investigation, we pretty much invoke spark-submit from
> interpreter.sh... and I was wondering if there is a good way to trap this
> kind of issue and provide a better response to the user (or even just
> better logging).
>
> One option is to do something similar to what Spark does and actually have
> a class that handles some of the interpreter integration logic.
>
> Thoughts? Any other possibilities?
>
> --
> Luciano Resende
> http://twitter.com/lresende1975
> http://lresende.blogspot.com/
>