Posted to users@zeppelin.apache.org by Vikash Kumar <vi...@RESILINC.COM> on 2016/10/19 06:41:11 UTC

Netty error with spark interpreter

Hi all,
                I am trying Zeppelin with Spark, which is throwing the following error related to a Netty jar conflict. I checked my classpath carefully; there is only a single version each of netty-3.8.0 and netty-all-4.0.29.Final on it.

Other information:
                Spark 2.0.0
                Scala 2.11
                Zeppelin 0.6.2 snapshot
Command to build:
                mvn clean install -DskipTests -Drat.ignoreErrors=true -Dcheckstyle.skip=true -Denforcer.skip=true -Pspark-2.0 -Dspark.version=2.0.0 -Pscala-2.11 -Phadoop-2.7 -Pyarn
Queries:
sc.version works fine
sqlContext.sql("show tables").show throws the error

Running in local mode.
I am attaching my Spark interpreter log file [zeppelin-interpreter-spark-root.log] and the mvn dependency:tree output [dependencyTree.txt].
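
For reference, the tree output can be narrowed to just the Netty artifacts with the dependency plugin's standard -Dincludes filter (shown here only as an example invocation, not taken from the attached files):

                mvn dependency:tree -Dincludes=io.netty,org.jboss.netty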


So far I have not been able to solve this problem. :(


Thanks & Regards,
Vikash Kumar


RE: Netty error with spark interpreter

Posted by Vikash Kumar <vi...@RESILINC.COM>.
Hi all,
I solved this problem by excluding the netty-all dependencies from my external jars as well as from the spark-dependency project; Spark was pulling in two different versions. Adding a single netty-all-4.0.29.Final dependency to both projects worked fine.
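
Roughly, the change in the affected pom.xml files looks like the sketch below. The spark-sql_2.11 coordinates are only an example of a dependency that pulls in netty-all transitively; in practice the exclusion goes on whichever dependencies the tree shows bringing in conflicting Netty versions.

    <!-- Exclude the transitive netty-all from the conflicting dependency -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.11</artifactId>
      <version>2.0.0</version>
      <exclusions>
        <exclusion>
          <groupId>io.netty</groupId>
          <artifactId>netty-all</artifactId>
        </exclusion>
      </exclusions>
    </dependency>

    <!-- Then pin a single netty-all version for the module -->
    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-all</artifactId>
      <version>4.0.29.Final</version>
    </dependency>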

Thanks & Regards,
Vikash Kumar