Posted to user@spark.apache.org by Robert James <sr...@gmail.com> on 2014/06/25 00:39:41 UTC

Upgrading to Spark 1.0.0 causes NoSuchMethodError

My app works fine under Spark 0.9.  I just tried upgrading to Spark
1.0 by downloading the Spark distro to a dir, changing the sbt file,
and running sbt assembly, but now I get NoSuchMethodErrors when trying
to use spark-submit.

I copied in the SimpleApp example from
http://spark.apache.org/docs/latest/quick-start.html and get the same
error:

$ /usr/local/share/spark/bin/spark-submit --class SimpleApp
target/scala-2.10/myproj-assembly-1.0.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.spark.SparkContext$.$lessinit$greater$default$2()Lscala/collection/Map;
	at SimpleApp$.main(SimpleApp.scala:10)
	at SimpleApp.main(SimpleApp.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:601)
	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

How can I migrate to Spark 1.0.0?
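
For reference, the sbt build definition that goes with the quick-start
example for Spark 1.0.0 is roughly the sketch below. This is a minimal
sketch, not the exact file from the post: the Scala patch version and the
"provided" scope are my assumptions ("provided" keeps Spark's own classes
out of the assembly jar, so spark-submit supplies the 1.0.0 classes at
runtime, which avoids exactly this kind of old-version mismatch).

    // build.sbt -- minimal sketch for a Spark 1.0.0 app packaged with sbt assembly.
    // The Scala version and the "provided" scope are assumptions, not details
    // taken from the post above.
    name := "myproj"

    version := "1.0"

    scalaVersion := "2.10.4"

    // "provided": don't bundle Spark into the assembly jar; spark-submit adds it.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"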

Re: Upgrading to Spark 1.0.0 causes NoSuchMethodError

Posted by Robert James <sr...@gmail.com>.
On 6/24/14, Peng Cheng <pc...@uow.edu.au> wrote:
> I got a 'NoSuchFieldError', which is the same kind of problem: it's
> almost certainly a dependency jar conflict. The Spark driver loads its
> own jars first, and recent versions pull in many dependencies that are
> 1-2 years old. If your newer dependency lives in the same package, the
> older copy shadows it (Java's first-come, first-served classloading)
> and the new method won't be found. Try using:
>
> mvn dependency:tree to find duplicate artifacts
>
> and use the maven-shade-plugin to relocate (rename the package of) your
> newer library. (IntelliJ doesn't officially support this plugin, so it
> may become quirky; if that happens, try re-importing the project.)
>

I'm using Scala and sbt. How can I do what you recommend (no maven)?

I tried doing an `sbt clean`, but it didn't help.
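
(For anyone hitting the same question: sbt has a rough equivalent of
`mvn dependency:tree` via the sbt-dependency-graph plugin. A sketch,
assuming sbt 0.13 and a plugin version from around that time; the
coordinates and task names are from memory, and the artifact in the
comment is just a placeholder, so check the plugin's README.)

    // project/plugins.sbt -- sketch: the sbt analogue of "mvn dependency:tree".
    addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.7.4")

    // Then, from the sbt shell:
    //   dependencyTree                           -- print the resolved dependency tree
    //   whatDependsOn org.slf4j slf4j-api 1.6.1  -- trace one suspect artifact (placeholder coords)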

Re: Upgrading to Spark 1.0.0 causes NoSuchMethodError

Posted by Peng Cheng <pc...@uow.edu.au>.
I got a 'NoSuchFieldError', which is the same kind of problem: it's
almost certainly a dependency jar conflict. The Spark driver loads its
own jars first, and recent versions pull in many dependencies that are
1-2 years old. If your newer dependency lives in the same package, the
older copy shadows it (Java's first-come, first-served classloading)
and the new method won't be found. Try using:

mvn dependency:tree to find duplicate artifacts

and use the maven-shade-plugin to relocate (rename the package of) your
newer library. (IntelliJ doesn't officially support this plugin, so it
may become quirky; if that happens, try re-importing the project.)
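
For sbt users, newer versions of the sbt-assembly plugin (0.14 and later)
offer a relocation facility similar to the maven-shade-plugin. A rough
sketch for build.sbt, with the package names purely illustrative (they are
not from this thread):

    // build.sbt fragment -- sketch of class relocation ("shading") with
    // sbt-assembly 0.14+. The package names below are illustrative placeholders.
    assemblyShadeRules in assembly := Seq(
      // Rename our newer copy of a conflicting library so it can't collide
      // with the older copy bundled inside the Spark assembly.
      ShadeRule.rename("com.example.newlib.**" -> "shaded.newlib.@1").inAll
    )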



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Upgrading-to-Spark-1-0-0-causes-NoSuchMethodError-tp8207p8220.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Upgrading to Spark 1.0.0 causes NoSuchMethodError

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Try deleting the .ivy2 directory in your home directory and then doing an
`sbt clean assembly`; that should solve this issue, I guess.

Thanks
Best Regards


On Thu, Jun 26, 2014 at 3:10 AM, Robert James <sr...@gmail.com>
wrote:

> In case anyone else is having this problem: deleting all of Ivy's cache,
> then doing an sbt clean, then recompiling, repackaging, and reassembling
> everything seems to have solved the problem.  (From the sbt docs, it
> seems that having to delete Ivy's cache indicates a bug in sbt.)
>
> On 6/25/14, Robert James <sr...@gmail.com> wrote:
> > Thanks Paul.  I'm unable to follow the discussion on SPARK-2075.  But
> > how would you recommend I test or follow up on that? Is there a
> > workaround?
> >
> > On 6/25/14, Paul Brown <pr...@mult.ifario.us> wrote:
> >> Hi, Robert --
> >>
> >> I wonder if this is an instance of SPARK-2075:
> >> https://issues.apache.org/jira/browse/SPARK-2075
> >>
> >> -- Paul
> >>
> >> —
> >> prb@mult.ifario.us | Multifarious, Inc. | http://mult.ifario.us/
> >>
> >>
> >> On Wed, Jun 25, 2014 at 6:28 AM, Robert James <sr...@gmail.com>
> >> wrote:
> >>
> >>> On 6/24/14, Robert James <sr...@gmail.com> wrote:
> >>> > My app works fine under Spark 0.9.  I just tried upgrading to Spark
> >>> > 1.0, by downloading the Spark distro to a dir, changing the sbt file,
> >>> > and running sbt assembly, but I get now NoSuchMethodErrors when
> trying
> >>> > to use spark-submit.
> >>> >
> >>> > I copied in the SimpleApp example from
> >>> > http://spark.apache.org/docs/latest/quick-start.html and get the
> same
> >>> > error:
> >>> >
> >>> > $/usr/local/share/spark/bin/spark-submit --class SimpleApp
> >>> > target/scala-2.10/myproj-assembly-1.0.jar
> >>> > Spark assembly has been built with Hive, including Datanucleus jars
> on
> >>> > classpath
> >>> > Exception in thread "main" java.lang.NoSuchMethodError:
> >>> >
> >>>
> org.apache.spark.SparkContext$.$lessinit$greater$default$2()Lscala/collection/Map;
> >>> >       at SimpleApp$.main(SimpleApp.scala:10)
> >>> >       at SimpleApp.main(SimpleApp.scala)
> >>> >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >>> >       at
> >>> >
> >>>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >>> >       at
> >>> >
> >>>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >>> >       at java.lang.reflect.Method.invoke(Method.java:601)
> >>> >       at
> >>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
> >>> >       at
> >>> > org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
> >>> >       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >>> >
> >>> > How can I migrate to Spark 1.0.0?
> >>> >
> >>>
> >>> I've done `sbt clean`, deleted the entire ivy2 cache, and still get
> >>> the above error on both my code and the official Spark example.  Can
> >>> anyone guide me on how to debug this?
> >>>
> >>> How does Spark find the /usr/local/share/spark directory? Is there a
> >>> variable somewhere I need to set to point to that, or that might point
> >>> to the old spark? I've left the old spark dir on the machine (just
> >>> changed the symlink) - can that be causing problems?
> >>>
> >>> How should I approach this?
> >>>
> >>
> >
>

Re: Upgrading to Spark 1.0.0 causes NoSuchMethodError

Posted by Robert James <sr...@gmail.com>.
In case anyone else is having this problem: deleting all of Ivy's cache,
then doing an sbt clean, then recompiling, repackaging, and reassembling
everything seems to have solved the problem.  (From the sbt docs, it
seems that having to delete Ivy's cache indicates a bug in sbt.)

On 6/25/14, Robert James <sr...@gmail.com> wrote:
> Thanks Paul.  I'm unable to follow the discussion on SPARK-2075.  But
> how would you recommend I test or follow up on that? Is there a
> workaround?
>
> On 6/25/14, Paul Brown <pr...@mult.ifario.us> wrote:
>> Hi, Robert --
>>
>> I wonder if this is an instance of SPARK-2075:
>> https://issues.apache.org/jira/browse/SPARK-2075
>>
>> -- Paul
>>
>> —
>> prb@mult.ifario.us | Multifarious, Inc. | http://mult.ifario.us/
>>
>>
>> On Wed, Jun 25, 2014 at 6:28 AM, Robert James <sr...@gmail.com>
>> wrote:
>>
>>> On 6/24/14, Robert James <sr...@gmail.com> wrote:
>>> > My app works fine under Spark 0.9.  I just tried upgrading to Spark
>>> > 1.0, by downloading the Spark distro to a dir, changing the sbt file,
>>> > and running sbt assembly, but I get now NoSuchMethodErrors when trying
>>> > to use spark-submit.
>>> >
>>> > I copied in the SimpleApp example from
>>> > http://spark.apache.org/docs/latest/quick-start.html and get the same
>>> > error:
>>> >
>>> > $/usr/local/share/spark/bin/spark-submit --class SimpleApp
>>> > target/scala-2.10/myproj-assembly-1.0.jar
>>> > Spark assembly has been built with Hive, including Datanucleus jars on
>>> > classpath
>>> > Exception in thread "main" java.lang.NoSuchMethodError:
>>> >
>>> org.apache.spark.SparkContext$.$lessinit$greater$default$2()Lscala/collection/Map;
>>> >       at SimpleApp$.main(SimpleApp.scala:10)
>>> >       at SimpleApp.main(SimpleApp.scala)
>>> >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> >       at
>>> >
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> >       at
>>> >
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> >       at java.lang.reflect.Method.invoke(Method.java:601)
>>> >       at
>>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
>>> >       at
>>> > org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
>>> >       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> >
>>> > How can I migrate to Spark 1.0.0?
>>> >
>>>
>>> I've done `sbt clean`, deleted the entire ivy2 cache, and still get
>>> the above error on both my code and the official Spark example.  Can
>>> anyone guide me on how to debug this?
>>>
>>> How does Spark find the /usr/local/share/spark directory? Is there a
>>> variable somewhere I need to set to point to that, or that might point
>>> to the old spark? I've left the old spark dir on the machine (just
>>> changed the symlink) - can that be causing problems?
>>>
>>> How should I approach this?
>>>
>>
>

Re: Upgrading to Spark 1.0.0 causes NoSuchMethodError

Posted by Robert James <sr...@gmail.com>.
Thanks Paul.  I'm unable to follow the discussion on SPARK-2075.  But
how would you recommend I test or follow up on that? Is there a
workaround?

On 6/25/14, Paul Brown <pr...@mult.ifario.us> wrote:
> Hi, Robert --
>
> I wonder if this is an instance of SPARK-2075:
> https://issues.apache.org/jira/browse/SPARK-2075
>
> -- Paul
>
> —
> prb@mult.ifario.us | Multifarious, Inc. | http://mult.ifario.us/
>
>
> On Wed, Jun 25, 2014 at 6:28 AM, Robert James <sr...@gmail.com>
> wrote:
>
>> On 6/24/14, Robert James <sr...@gmail.com> wrote:
>> > My app works fine under Spark 0.9.  I just tried upgrading to Spark
>> > 1.0, by downloading the Spark distro to a dir, changing the sbt file,
>> > and running sbt assembly, but I get now NoSuchMethodErrors when trying
>> > to use spark-submit.
>> >
>> > I copied in the SimpleApp example from
>> > http://spark.apache.org/docs/latest/quick-start.html and get the same
>> > error:
>> >
>> > $/usr/local/share/spark/bin/spark-submit --class SimpleApp
>> > target/scala-2.10/myproj-assembly-1.0.jar
>> > Spark assembly has been built with Hive, including Datanucleus jars on
>> > classpath
>> > Exception in thread "main" java.lang.NoSuchMethodError:
>> >
>> org.apache.spark.SparkContext$.$lessinit$greater$default$2()Lscala/collection/Map;
>> >       at SimpleApp$.main(SimpleApp.scala:10)
>> >       at SimpleApp.main(SimpleApp.scala)
>> >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >       at
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> >       at
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> >       at java.lang.reflect.Method.invoke(Method.java:601)
>> >       at
>> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
>> >       at
>> > org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
>> >       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> >
>> > How can I migrate to Spark 1.0.0?
>> >
>>
>> I've done `sbt clean`, deleted the entire ivy2 cache, and still get
>> the above error on both my code and the official Spark example.  Can
>> anyone guide me on how to debug this?
>>
>> How does Spark find the /usr/local/share/spark directory? Is there a
>> variable somewhere I need to set to point to that, or that might point
>> to the old spark? I've left the old spark dir on the machine (just
>> changed the symlink) - can that be causing problems?
>>
>> How should I approach this?
>>
>

Re: Upgrading to Spark 1.0.0 causes NoSuchMethodError

Posted by Paul Brown <pr...@mult.ifario.us>.
Hi, Robert --

I wonder if this is an instance of SPARK-2075:
https://issues.apache.org/jira/browse/SPARK-2075

-- Paul

—
prb@mult.ifario.us | Multifarious, Inc. | http://mult.ifario.us/


On Wed, Jun 25, 2014 at 6:28 AM, Robert James <sr...@gmail.com>
wrote:

> On 6/24/14, Robert James <sr...@gmail.com> wrote:
> > My app works fine under Spark 0.9.  I just tried upgrading to Spark
> > 1.0, by downloading the Spark distro to a dir, changing the sbt file,
> > and running sbt assembly, but I get now NoSuchMethodErrors when trying
> > to use spark-submit.
> >
> > I copied in the SimpleApp example from
> > http://spark.apache.org/docs/latest/quick-start.html and get the same
> > error:
> >
> > $/usr/local/share/spark/bin/spark-submit --class SimpleApp
> > target/scala-2.10/myproj-assembly-1.0.jar
> > Spark assembly has been built with Hive, including Datanucleus jars on
> > classpath
> > Exception in thread "main" java.lang.NoSuchMethodError:
> >
> org.apache.spark.SparkContext$.$lessinit$greater$default$2()Lscala/collection/Map;
> >       at SimpleApp$.main(SimpleApp.scala:10)
> >       at SimpleApp.main(SimpleApp.scala)
> >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >       at
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >       at
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >       at java.lang.reflect.Method.invoke(Method.java:601)
> >       at
> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
> >       at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
> >       at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> >
> > How can I migrate to Spark 1.0.0?
> >
>
> I've done `sbt clean`, deleted the entire ivy2 cache, and still get
> the above error on both my code and the official Spark example.  Can
> anyone guide me on how to debug this?
>
> How does Spark find the /usr/local/share/spark directory? Is there a
> variable somewhere I need to set to point to that, or that might point
> to the old spark? I've left the old spark dir on the machine (just
> changed the symlink) - can that be causing problems?
>
> How should I approach this?
>

Re: Upgrading to Spark 1.0.0 causes NoSuchMethodError

Posted by Robert James <sr...@gmail.com>.
On 6/24/14, Robert James <sr...@gmail.com> wrote:
> My app works fine under Spark 0.9.  I just tried upgrading to Spark
> 1.0, by downloading the Spark distro to a dir, changing the sbt file,
> and running sbt assembly, but I get now NoSuchMethodErrors when trying
> to use spark-submit.
>
> I copied in the SimpleApp example from
> http://spark.apache.org/docs/latest/quick-start.html and get the same
> error:
>
> $/usr/local/share/spark/bin/spark-submit --class SimpleApp
> target/scala-2.10/myproj-assembly-1.0.jar
> Spark assembly has been built with Hive, including Datanucleus jars on
> classpath
> Exception in thread "main" java.lang.NoSuchMethodError:
> org.apache.spark.SparkContext$.$lessinit$greater$default$2()Lscala/collection/Map;
> 	at SimpleApp$.main(SimpleApp.scala:10)
> 	at SimpleApp.main(SimpleApp.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:601)
> 	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:292)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> How can I migrate to Spark 1.0.0?
>

I've done `sbt clean`, deleted the entire ivy2 cache, and still get
the above error on both my code and the official Spark example.  Can
anyone guide me on how to debug this?

How does Spark find the /usr/local/share/spark directory? Is there a
variable somewhere I need to set to point to it, or one that might still
point to the old Spark? I've left the old Spark dir on the machine (just
changed the symlink) - can that be causing problems?

How should I approach this?