Posted to user@spark.apache.org by Tobias Pfeiffer <tg...@preferred.jp> on 2014/11/04 07:50:02 UTC

netty on classpath when using spark-submit

Hi,

I tried hard to get a version of netty into my jar file created with sbt
assembly that works with all of my libraries. Now I have managed that and was
really happy, but it seems like spark-submit puts an older version of netty
on the classpath when submitting to a cluster, such that my code ends up
with a NoSuchMethodError:

Code:
  import java.io.File
  import org.jboss.netty.handler.codec.http.{DefaultHttpRequest, HttpMethod, HttpVersion}

  // print which jar the netty class is actually loaded from
  val a = new DefaultHttpRequest(HttpVersion.HTTP_1_1, HttpMethod.POST,
    "http://localhost")
  val f = new File(a.getClass.getProtectionDomain().
    getCodeSource().getLocation().getPath())
  println(f.getAbsolutePath)
  println("headers: " + a.headers())

When executed with "sbt run":
  ~/.ivy2/cache/io.netty/netty/bundles/netty-3.9.4.Final.jar
  headers: org.jboss.netty.handler.codec.http.DefaultHttpHeaders@64934069

When executed with "spark-submit":
  ~/spark-1.1.0-bin-hadoop2.4/lib/spark-assembly-1.1.0-hadoop2.4.0.jar
  Exception in thread "main" java.lang.NoSuchMethodError:
org.jboss.netty.handler.codec.http.DefaultHttpRequest.headers()Lorg/jboss/netty/handler/codec/http/HttpHeaders;
    ...

How can I get the old netty version off my classpath?
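
One workaround I have been considering is shading my own netty classes in
the assembly jar so they cannot collide with Spark's copy. Newer versions of
sbt-assembly support shade rules; a rough, untested sketch for build.sbt
(the target package "myshaded.netty" is just a made-up name):

  // relocate my netty 3.x classes into a private package so they cannot
  // collide with the netty bundled in the Spark assembly
  assemblyShadeRules in assembly := Seq(
    ShadeRule.rename("org.jboss.netty.**" -> "myshaded.netty.@1").inAll
  )

But I would prefer a solution that does not require rewriting class names.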

Thanks
Tobias

Re: netty on classpath when using spark-submit

Posted by Tobias Pfeiffer <tg...@preferred.jp>.
Markus,

On Tue, Nov 11, 2014 at 10:40 AM, M. Dale <me...@yahoo.com> wrote:
>
>   I never tried to use this property. I was hoping someone else would jump
> in. When I saw your original question I remembered that Hadoop has
> something similar. So I searched and found the link below. A quick JIRA
> search seems to indicate that there is another property:
>
> https://issues.apache.org/jira/browse/SPARK-2996
>
> Maybe that property will work with yarn: spark.yarn.user.classpath.first
>

Thank you very much! That property does in fact load the classes from my
jar file first when running on YARN, great!
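
For reference, I am passing the property on the command line via
spark-submit's --conf flag (the class and jar names below are placeholders):

  spark-submit \
    --master yarn-cluster \
    --conf spark.yarn.user.classpath.first=true \
    --class my.main.Class \
    my-assembly.jar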

However, in local[N] mode, neither that property nor
spark.files.userClassPathFirst works. So when using spark-submit
with "--master local[3]" instead of "--master yarn-cluster", the value
for spark.files.userClassPathFirst is displayed correctly in the web
interface, but the classes are still loaded from the wrong jar...
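
Since in local[N] mode everything runs in the driver JVM, one thing I may
still try is prepending my jar to the driver classpath explicitly. An
untested sketch (my.main.Class and my-assembly.jar are placeholders, and I
have not verified that this actually takes precedence over the Spark
assembly):

  spark-submit \
    --master local[3] \
    --driver-class-path my-assembly.jar \
    --class my.main.Class \
    my-assembly.jar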

Tobias

Re: netty on classpath when using spark-submit

Posted by Tobias Pfeiffer <tg...@preferred.jp>.
Hi,

On Wed, Nov 5, 2014 at 10:23 AM, Tobias Pfeiffer wrote:
>
> On Tue, Nov 4, 2014 at 8:33 PM, M. Dale wrote:
>
>>    From http://spark.apache.org/docs/latest/configuration.html it seems
>> that there is an experimental property:
>>
>> spark.files.userClassPathFirst
>>
>
> Thank you very much, I didn't know about this.  Unfortunately, it doesn't
> change anything.  Whether I set it to true or false (as indicated by the
> Spark web interface), and no matter whether "local[N]", "yarn-client", or
> "yarn-cluster" mode is used with spark-submit, the classpath looks the
> same and the netty class is loaded from the Spark jar. Can I use this
> setting with spark-submit at all?
>

Has anyone used this setting successfully, or can anyone advise me on how
to use it correctly?

Thanks
Tobias

Re: netty on classpath when using spark-submit

Posted by Tobias Pfeiffer <tg...@preferred.jp>.
Markus,

thanks for your help!

On Tue, Nov 4, 2014 at 8:33 PM, M. Dale <me...@yahoo.com.invalid> wrote:

>  Tobias,
>    From http://spark.apache.org/docs/latest/configuration.html it seems
> that there is an experimental property:
>
> spark.files.userClassPathFirst
>

Thank you very much, I didn't know about this.  Unfortunately, it doesn't
change anything.  Whether I set it to true or false (as indicated by the
Spark web interface), and no matter whether "local[N]", "yarn-client", or
"yarn-cluster" mode is used with spark-submit, the classpath looks the
same and the netty class is loaded from the Spark jar. Can I use this
setting with spark-submit at all?

Thanks
Tobias

Re: netty on classpath when using spark-submit

Posted by "M. Dale" <me...@yahoo.com.INVALID>.
Tobias,
    From http://spark.apache.org/docs/latest/configuration.html it seems 
that there is an experimental property:

spark.files.userClassPathFirst

Whether to give user-added jars precedence over Spark's own jars when 
loading classes in Executors. This feature can be used to mitigate 
conflicts between Spark's dependencies and user dependencies. It is 
currently an experimental feature.
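
I have not used it myself, but presumably you can set it on the command
line via spark-submit's --conf flag, something like (untested):

  spark-submit --conf spark.files.userClassPathFirst=true ...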

HTH,
Markus

On 11/04/2014 01:50 AM, Tobias Pfeiffer wrote:
> Hi,
>
> I tried hard to get a version of netty into my jar file created with
> sbt assembly that works with all of my libraries. Now I have managed that
> and was really happy, but it seems like spark-submit puts an older version
> of netty on the classpath when submitting to a cluster, such that my
> code ends up with a NoSuchMethodError:
>
> Code:
>   import java.io.File
>   import org.jboss.netty.handler.codec.http.{DefaultHttpRequest, HttpMethod, HttpVersion}
>
>   // print which jar the netty class is actually loaded from
>   val a = new DefaultHttpRequest(HttpVersion.HTTP_1_1, HttpMethod.POST,
>     "http://localhost")
>   val f = new File(a.getClass.getProtectionDomain().
>     getCodeSource().getLocation().getPath())
>   println(f.getAbsolutePath)
>   println("headers: " + a.headers())
>
> When executed with "sbt run":
>   ~/.ivy2/cache/io.netty/netty/bundles/netty-3.9.4.Final.jar
>   headers: org.jboss.netty.handler.codec.http.DefaultHttpHeaders@64934069
>
> When executed with "spark-submit":
>   ~/spark-1.1.0-bin-hadoop2.4/lib/spark-assembly-1.1.0-hadoop2.4.0.jar
>   Exception in thread "main" java.lang.NoSuchMethodError: 
> org.jboss.netty.handler.codec.http.DefaultHttpRequest.headers()Lorg/jboss/netty/handler/codec/http/HttpHeaders;
>     ...
>
> How can I get the old netty version off my classpath?
>
> Thanks
> Tobias
>