Posted to user@spark.apache.org by arjun biswas <ar...@gmail.com> on 2014/01/16 01:14:13 UTC

jarOfClass method not found in SparkContext

Hello All,

I have installed Spark on my machine and was successful in running sbt/sbt
package as well as sbt/sbt assembly. I am trying to run the Java examples
from Eclipse, specifically the JavaLogQuery example. The issue is that I
cannot resolve a compilation problem: jarOfClass is not available inside
the Java Spark Context. I am using Spark 0.8.1-incubating, the latest
release, with Scala 2.9.3. I have added all the necessary jars to the
classpath, to the point that I do not get any import errors. However,
JavaSparkContext.jarOfClass gives an error saying the jarOfClass method is
unavailable in JavaSparkContext. Has anyone tried to run the Java sample
examples from Eclipse? Please note that this is a compile-time error in
Eclipse.

Regards
Arjun

Re: jarOfClass method not found in SparkContext

Posted by arjun biswas <ar...@gmail.com>.
Thanks for pointing out that mistake. Yes, I was using the Spark
0.8.1-incubating jar with the master-branch code examples. I have
corrected the mistake.

Regards



Re: jarOfClass method not found in SparkContext

Posted by Patrick Wendell <pw...@gmail.com>.
Hm, are you sure you haven't somehow included the master branch of Spark
in your classpath? jarOfClass was added to the Java API in the master
branch and in Spark 0.9.0 (RC). So it seems very likely that you have a
newer (post-0.8.x) version of the examples.

- Patrick
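[A quick way to check which version of a class the JVM actually resolves is to probe it with reflection. A minimal, self-contained sketch (the Spark class name is taken from this thread; the check degrades gracefully when spark-core is not on the classpath):]

```java
import java.lang.reflect.Method;

// Probe a class at runtime for a public method by name. Useful when an
// error like "method unavailable" suggests the classpath holds a
// different version of a library than expected.
public class MethodCheck {
    static boolean hasMethod(String className, String methodName) {
        try {
            for (Method m : Class.forName(className).getMethods()) {
                if (m.getName().equals(methodName)) {
                    return true;
                }
            }
        } catch (ClassNotFoundException e) {
            System.out.println(className + " is not on the classpath");
        }
        return false;
    }

    public static void main(String[] args) {
        // Sanity check against a JDK class that is always present.
        System.out.println(hasMethod("java.lang.String", "substring"));
        // jarOfClass exists on JavaSparkContext only from Spark 0.9.0 on,
        // so this prints false when an 0.8.x spark-core jar is on the path.
        System.out.println(hasMethod(
            "org.apache.spark.api.java.JavaSparkContext", "jarOfClass"));
    }
}
```

[Run against the same classpath Eclipse uses; if the second check prints false against a jar you believed was 0.9.0+, an older jar is shadowing it.]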


Re: jarOfClass method not found in SparkContext

Posted by arjun biswas <ar...@gmail.com>.
> Could it be possible that you have an older version of JavaSparkContext
> (i.e. from an older version of Spark) in your path? Please check that
> there aren't two versions of Spark accidentally included in your class
> path used in Eclipse. It would not give errors in the import (as it finds
> the imported packages and classes) but would give such errors as it may
> be unfortunately finding an older version of the JavaSparkContext class
> in the class path.

I have the following three jars in the Eclipse classpath, and no other
jar is currently on the classpath:
1) google-collections-0.8.jar
2) scala-library.jar
3) spark-core_2.9.3-0.8.1-incubating.jar

Am I using the correct jar files to run the Java samples from Eclipse?

Regards





Re: jarOfClass method not found in SparkContext

Posted by Tathagata Das <ta...@gmail.com>.
Could it be that you have an older version of JavaSparkContext (i.e. from
an older version of Spark) on your path? Please check that there aren't
two versions of Spark accidentally included in the class path used in
Eclipse. It would not give errors on the import (as it still finds the
imported packages and classes), but it would give errors like this one if
it is unfortunately finding an older version of the JavaSparkContext
class on the class path.

TD
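[One way to verify this diagnosis is to print where the JVM actually loaded a class from. A minimal sketch using only the JDK; with spark-core on the classpath, the same call on JavaSparkContext.class would reveal which jar is winning:]

```java
import java.security.CodeSource;

// Print the jar or directory a class was loaded from, which exposes
// stale or duplicate jars hiding on the classpath.
public class WhereLoaded {
    static String locationOf(Class<?> cls) {
        CodeSource cs = cls.getProtectionDomain().getCodeSource();
        // Classes loaded by the JDK's bootstrap loader report no code source.
        return cs == null ? "bootstrap classloader (JDK)"
                          : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        System.out.println(locationOf(String.class));
        System.out.println(locationOf(WhereLoaded.class));
        // With Spark on the classpath, the interesting call would be:
        // System.out.println(locationOf(
        //     org.apache.spark.api.java.JavaSparkContext.class));
    }
}
```

[If the printed location is a different spark-core jar than the one you added deliberately, that is the stale copy Eclipse is compiling against.]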

