Posted to user@spark.apache.org by Andre Kuhnen <an...@gmail.com> on 2014/02/13 23:12:21 UTC

ADD_JARS not working on 0.9

Hello, my spark-shell tells me that the jars are added but it cannot import
any of my stuff


When I used the same steps on 0.8, everything worked fine

Thanks

Re: ADD_JARS not working on 0.9

Posted by Andrew Ash <an...@andrewash.com>.
Hi Andre,

I've also noticed this.  On 0.9 the jar needs to be added to SPARK_CLASSPATH
as well.

See
https://mail-archives.apache.org/mod_mbox/incubator-spark-user/201402.mbox/%3CCAJbo4neMLiTrnm1XbyqomWmp0m+EUcg4yE-txuRGSVKOb5KLeA@mail.gmail.com%3E
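
A rough sketch of the workaround (the jar path and spark-shell location are
illustrative):

# In 0.9, ADD_JARS alone no longer puts the jar on the REPL compiler's
# classpath, so list it in SPARK_CLASSPATH as well:
ADD_JARS=/path/to/my-app.jar \
SPARK_CLASSPATH=/path/to/my-app.jar \
./bin/spark-shell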


On Thu, Feb 13, 2014 at 2:12 PM, Andre Kuhnen <an...@gmail.com> wrote:

> Hello, my spark-shell tells me that the jars are added but it cannot
> import any of my stuff
>
>
> When I used the same steps on 0.8  everything worked fine
>
> Thanks
>
>

Re: ADD_JARS not working on 0.9

Posted by Nan Zhu <zh...@gmail.com>.
Would someone like to review the fix? https://github.com/apache/incubator-spark/pull/614

Best,  

--  
Nan Zhu


On Sunday, February 16, 2014 at 8:37 PM, Nan Zhu wrote:

> I’m interested in fixing this  
>  
> Can anyone assign the JIRA to me?
>  
> Best,  
>  
> --  
> Nan Zhu
>  
>  
> On Sunday, February 16, 2014 at 6:17 PM, Andrew Ash wrote:
>  
> > // cc Patrick, who I think helps with the Amplab Jira
> >  
> > Amplab Jira admins, can we make sure that newly-created users have comment permissions?  This has been standard in the open source Jira instances I've worked with in the past (like Hadoop).
> >  
> > Thanks!
> > Andrew
> >  
> >  
> >  
> > On Sat, Feb 15, 2014 at 4:25 AM, Vyacheslav Baranov <slavik.baranov@gmail.com> wrote:
> > > Andrew,
> > >  
> > > I've created an account on the Amplab Jira, but unfortunately I don't have permission to comment.
> > >  
> > > Vyacheslav
> > >  
> > >  
> > > On 15/02/14 00:28, Andrew Ash wrote:
> > > > Hi Vyacheslav,  
> > > >  
> > > > If you could add that to the ticket directly that would be valuable because you're more familiar with the specific problem than me!  
> > > >  
> > > > Andrew  
> > > >  
> > > >  
> > > > On Fri, Feb 14, 2014 at 8:10 AM, Vyacheslav Baranov <slavik.baranov@gmail.com> wrote:
> > > > > Hello Andrew,
> > > > >  
> > > > > I'm running into the same problem when I try to import a jar using the ':cp' repl command. This used to work on 0.8:
> > > > >  
> > > > > scala> import org.msgpack
> > > > > <console>:10: error: msgpack is not a member of org
> > > > >        import org.msgpack
> > > > >               ^
> > > > >  
> > > > > scala> :cp /path/to/msgpack-0.6.8.jar
> > > > > Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
> > > > > "/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.8.0-incubating-hadoop1.2.1.jar:/path/to/msgpack-0.6.8.jar"
> > > > > 14/02/14 20:04:00 INFO server.Server: jetty-7.x.y-SNAPSHOT
> > > > > 14/02/14 20:04:00 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:64454
> > > > >  
> > > > > scala> import org.msgpack
> > > > > import org.msgpack
> > > > >  
> > > > > And it's not working on 0.9:
> > > > >  
> > > > > scala> import org.msgpack
> > > > > <console>:10: error: object msgpack is not a member of package org
> > > > >        import org.msgpack
> > > > >               ^
> > > > >  
> > > > > scala> :cp /path/to/msgpack-0.6.8.jar
> > > > > Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
> > > > > "/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.9.0-incubating-hadoop2.2.0.jar:/path/to/msgpack-0.6.8.jar"
> > > > > Nothing to replay.
> > > > >  
> > > > > scala> import org.msgpack
> > > > > <console>:7: error: object msgpack is not a member of package org
> > > > >        import org.msgpack
> > > > >               ^
> > > > >  
> > > > > Probably it's worth adding this to the issue's comments
> > > > >  
> > > > > Thank you,
> > > > > Vyacheslav  
> > > > >  
> > > > >  
> > > > > On 14/02/14 02:26, Andrew Ash wrote:
> > > > > > I filed a bug so we can track the fix: https://spark-project.atlassian.net/browse/SPARK-1089  
> > > > > >  
> > > > > >  
> > > > > > On Thu, Feb 13, 2014 at 2:21 PM, Soumya Simanta <soumya.simanta@gmail.com> wrote:
> > > > > > > Use   
> > > > > > > SPARK_CLASSPATH along with ADD_JARS  
> > > > > > >  
> > > > > > >  
> > > > > > > On Thu, Feb 13, 2014 at 5:12 PM, Andre Kuhnen <andrekuhnen@gmail.com> wrote:
> > > > > > > > Hello, my spark-shell tells me that the jars are added but it cannot import any of my stuff
> > > > > > > >  
> > > > > > > >  
> > > > > > > > When I used the same steps on 0.8  everything worked fine  
> > > > > > > >  
> > > > > > > > Thanks  
> > > > > > > >  
> > > > > > >  
> > > > > >  
> > > > >  
> > > >  
> > >  
> >  
>  


Re: ADD_JARS not working on 0.9

Posted by Nan Zhu <zh...@gmail.com>.
I’m interested in fixing this  

Can anyone assign the JIRA to me?

Best,  

--  
Nan Zhu


On Sunday, February 16, 2014 at 6:17 PM, Andrew Ash wrote:

> // cc Patrick, who I think helps with the Amplab Jira
>  
> Amplab Jira admins, can we make sure that newly-created users have comment permissions?  This has been standard in the open source Jira instances I've worked with in the past (like Hadoop).
>  
> Thanks!
> Andrew
>  
>  
>  
> On Sat, Feb 15, 2014 at 4:25 AM, Vyacheslav Baranov <slavik.baranov@gmail.com> wrote:
> > Andrew,
> >  
> > I've created an account on the Amplab Jira, but unfortunately I don't have permission to comment.
> >  
> > Vyacheslav
> >  
> >  
> > On 15/02/14 00:28, Andrew Ash wrote:
> > > Hi Vyacheslav,  
> > >  
> > > If you could add that to the ticket directly that would be valuable because you're more familiar with the specific problem than me!  
> > >  
> > > Andrew  
> > >  
> > >  
> > > On Fri, Feb 14, 2014 at 8:10 AM, Vyacheslav Baranov <slavik.baranov@gmail.com> wrote:
> > > > Hello Andrew,
> > > >  
> > > > I'm running into the same problem when I try to import a jar using the ':cp' repl command. This used to work on 0.8:
> > > >  
> > > > scala> import org.msgpack
> > > > <console>:10: error: msgpack is not a member of org
> > > >        import org.msgpack
> > > >               ^
> > > >  
> > > > scala> :cp /path/to/msgpack-0.6.8.jar
> > > > Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
> > > > "/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.8.0-incubating-hadoop1.2.1.jar:/path/to/msgpack-0.6.8.jar"
> > > > 14/02/14 20:04:00 INFO server.Server: jetty-7.x.y-SNAPSHOT
> > > > 14/02/14 20:04:00 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:64454
> > > >  
> > > > scala> import org.msgpack
> > > > import org.msgpack
> > > >  
> > > > And it's not working on 0.9:
> > > >  
> > > > scala> import org.msgpack
> > > > <console>:10: error: object msgpack is not a member of package org
> > > >        import org.msgpack
> > > >               ^
> > > >  
> > > > scala> :cp /path/to/msgpack-0.6.8.jar
> > > > Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
> > > > "/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.9.0-incubating-hadoop2.2.0.jar:/path/to/msgpack-0.6.8.jar"
> > > > Nothing to replay.
> > > >  
> > > > scala> import org.msgpack
> > > > <console>:7: error: object msgpack is not a member of package org
> > > >        import org.msgpack
> > > >               ^
> > > >  
> > > > Probably it's worth adding this to the issue's comments
> > > >  
> > > > Thank you,
> > > > Vyacheslav  
> > > >  
> > > >  
> > > > On 14/02/14 02:26, Andrew Ash wrote:
> > > > > I filed a bug so we can track the fix: https://spark-project.atlassian.net/browse/SPARK-1089  
> > > > >  
> > > > >  
> > > > > On Thu, Feb 13, 2014 at 2:21 PM, Soumya Simanta <soumya.simanta@gmail.com> wrote:
> > > > > > Use   
> > > > > > SPARK_CLASSPATH along with ADD_JARS  
> > > > > >  
> > > > > >  
> > > > > > On Thu, Feb 13, 2014 at 5:12 PM, Andre Kuhnen <andrekuhnen@gmail.com> wrote:
> > > > > > > Hello, my spark-shell tells me that the jars are added but it cannot import any of my stuff
> > > > > > >  
> > > > > > >  
> > > > > > > When I used the same steps on 0.8  everything worked fine  
> > > > > > >  
> > > > > > > Thanks  
> > > > > > >  
> > > > > >  
> > > > >  
> > > >  
> > >  
> >  
>  


Re: ADD_JARS not working on 0.9

Posted by Andrew Ash <an...@andrewash.com>.
// cc Patrick, who I think helps with the Amplab Jira

Amplab Jira admins, can we make sure that newly-created users have comment
permissions?  This has been standard in the open source Jira instances I've
worked with in the past (like Hadoop).

Thanks!
Andrew


On Sat, Feb 15, 2014 at 4:25 AM, Vyacheslav Baranov <
slavik.baranov@gmail.com> wrote:

>  Andrew,
>
> I've created an account on the Amplab Jira, but unfortunately I don't have
> permission to comment.
>
> Vyacheslav
>
>
> On 15/02/14 00:28, Andrew Ash wrote:
>
> Hi Vyacheslav,
>
>  If you could add that to the ticket directly that would be valuable
> because you're more familiar with the specific problem than me!
>
>  Andrew
>
>
> On Fri, Feb 14, 2014 at 8:10 AM, Vyacheslav Baranov <
> slavik.baranov@gmail.com> wrote:
>
>>  Hello Andrew,
>>
>> I'm running into the same problem when I try to import a jar using the ':cp'
>> repl command. This used to work on 0.8:
>>
>> scala> import org.msgpack
>> <console>:10: error: msgpack is not a member of org
>>        import org.msgpack
>>               ^
>>
>> scala> :cp /path/to/msgpack-0.6.8.jar
>> Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
>>
>> "/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.8.0-incubating-hadoop1.2.1.jar:/path/to/msgpack-0.6.8.jar"
>> 14/02/14 20:04:00 INFO server.Server: jetty-7.x.y-SNAPSHOT
>> 14/02/14 20:04:00 INFO server.AbstractConnector: Started
>> SocketConnector@0.0.0.0:64454
>>
>> scala> import org.msgpack
>> import org.msgpack
>>
>> And it's not working on 0.9:
>>
>> scala> import org.msgpack
>> <console>:10: error: object msgpack is not a member of package org
>>        import org.msgpack
>>               ^
>>
>> scala> :cp /path/to/msgpack-0.6.8.jar
>> Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
>>
>> "/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.9.0-incubating-hadoop2.2.0.jar:/path/to/msgpack-0.6.8.jar"
>> Nothing to replay.
>>
>> scala> import org.msgpack
>> <console>:7: error: object msgpack is not a member of package org
>>        import org.msgpack
>>               ^
>>
>> Probably it's worth adding this to the issue's comments
>>
>> Thank you,
>> Vyacheslav
>>
>>
>> On 14/02/14 02:26, Andrew Ash wrote:
>>
>> I filed a bug so we can track the fix:
>> https://spark-project.atlassian.net/browse/SPARK-1089
>>
>>
>>  On Thu, Feb 13, 2014 at 2:21 PM, Soumya Simanta <
>> soumya.simanta@gmail.com> wrote:
>>
>>>  Use
>>> SPARK_CLASSPATH along with ADD_JARS
>>>
>>>
>>>  On Thu, Feb 13, 2014 at 5:12 PM, Andre Kuhnen <an...@gmail.com>wrote:
>>>
>>>> Hello, my spark-shell tells me that the jars are added but it cannot
>>>> import any of my stuff
>>>>
>>>>
>>>>  When I used the same steps on 0.8  everything worked fine
>>>>
>>>>  Thanks
>>>>
>>>>
>>>
>>
>>
>
>

Re: ADD_JARS not working on 0.9

Posted by Vyacheslav Baranov <sl...@gmail.com>.
Andrew,

I've created an account on the Amplab Jira, but unfortunately I don't have
permission to comment.

Vyacheslav

On 15/02/14 00:28, Andrew Ash wrote:
> Hi Vyacheslav,
>
> If you could add that to the ticket directly that would be valuable 
> because you're more familiar with the specific problem than me!
>
> Andrew
>
>
> On Fri, Feb 14, 2014 at 8:10 AM, Vyacheslav Baranov 
> <slavik.baranov@gmail.com> wrote:
>
>     Hello Andrew,
>
>     I'm running into the same problem when I try to import a jar using
>     the ':cp' repl command. This used to work on 0.8:
>
>     scala> import org.msgpack
>     <console>:10: error: msgpack is not a member of org
>            import org.msgpack
>                   ^
>
>     scala> :cp /path/to/msgpack-0.6.8.jar
>     Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
>     "/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.8.0-incubating-hadoop1.2.1.jar:/path/to/msgpack-0.6.8.jar"
>     14/02/14 20:04:00 INFO server.Server: jetty-7.x.y-SNAPSHOT
>     14/02/14 20:04:00 INFO server.AbstractConnector: Started
>     SocketConnector@0.0.0.0:64454
>
>     scala> import org.msgpack
>     import org.msgpack
>
>     And it's not working on 0.9:
>
>     scala> import org.msgpack
>     <console>:10: error: object msgpack is not a member of package org
>            import org.msgpack
>                   ^
>
>     scala> :cp /path/to/msgpack-0.6.8.jar
>     Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
>     "/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.9.0-incubating-hadoop2.2.0.jar:/path/to/msgpack-0.6.8.jar"
>     Nothing to replay.
>
>     scala> import org.msgpack
>     <console>:7: error: object msgpack is not a member of package org
>            import org.msgpack
>                   ^
>
>     Probably it's worth adding this to the issue's comments
>
>     Thank you,
>     Vyacheslav
>
>
>     On 14/02/14 02:26, Andrew Ash wrote:
>>     I filed a bug so we can track the fix:
>>     https://spark-project.atlassian.net/browse/SPARK-1089
>>
>>
>>     On Thu, Feb 13, 2014 at 2:21 PM, Soumya Simanta
>>     <soumya.simanta@gmail.com> wrote:
>>
>>         Use
>>         SPARK_CLASSPATH along with ADD_JARS
>>
>>
>>         On Thu, Feb 13, 2014 at 5:12 PM, Andre Kuhnen
>>         <andrekuhnen@gmail.com> wrote:
>>
>>             Hello, my spark-shell tells me that the jars are added but
>>             it cannot import any of my stuff
>>
>>
>>             When I used the same steps on 0.8  everything worked fine
>>
>>             Thanks
>>
>>
>>
>
>


Re: ADD_JARS not working on 0.9

Posted by Andrew Ash <an...@andrewash.com>.
Hi Vyacheslav,

If you could add that to the ticket directly that would be valuable because
you're more familiar with the specific problem than me!

Andrew


On Fri, Feb 14, 2014 at 8:10 AM, Vyacheslav Baranov <
slavik.baranov@gmail.com> wrote:

>  Hello Andrew,
>
> I'm running into the same problem when I try to import a jar using the ':cp'
> repl command. This used to work on 0.8:
>
> scala> import org.msgpack
> <console>:10: error: msgpack is not a member of org
>        import org.msgpack
>               ^
>
> scala> :cp /path/to/msgpack-0.6.8.jar
> Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
>
> "/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.8.0-incubating-hadoop1.2.1.jar:/path/to/msgpack-0.6.8.jar"
> 14/02/14 20:04:00 INFO server.Server: jetty-7.x.y-SNAPSHOT
> 14/02/14 20:04:00 INFO server.AbstractConnector: Started
> SocketConnector@0.0.0.0:64454
>
> scala> import org.msgpack
> import org.msgpack
>
> And it's not working on 0.9:
>
> scala> import org.msgpack
> <console>:10: error: object msgpack is not a member of package org
>        import org.msgpack
>               ^
>
> scala> :cp /path/to/msgpack-0.6.8.jar
> Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
>
> "/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.9.0-incubating-hadoop2.2.0.jar:/path/to/msgpack-0.6.8.jar"
> Nothing to replay.
>
> scala> import org.msgpack
> <console>:7: error: object msgpack is not a member of package org
>        import org.msgpack
>               ^
>
> Probably it's worth adding this to the issue's comments
>
> Thank you,
> Vyacheslav
>
>
> On 14/02/14 02:26, Andrew Ash wrote:
>
> I filed a bug so we can track the fix:
> https://spark-project.atlassian.net/browse/SPARK-1089
>
>
>  On Thu, Feb 13, 2014 at 2:21 PM, Soumya Simanta <soumya.simanta@gmail.com
> > wrote:
>
>>  Use
>> SPARK_CLASSPATH along with ADD_JARS
>>
>>
>>  On Thu, Feb 13, 2014 at 5:12 PM, Andre Kuhnen <an...@gmail.com>wrote:
>>
>>> Hello, my spark-shell tells me that the jars are added but it cannot
>>> import any of my stuff
>>>
>>>
>>>  When I used the same steps on 0.8  everything worked fine
>>>
>>>  Thanks
>>>
>>>
>>
>
>

Re: ADD_JARS not working on 0.9

Posted by Vyacheslav Baranov <sl...@gmail.com>.
Hello Andrew,

I'm running into the same problem when I try to import a jar using the ':cp'
repl command. This used to work on 0.8:

scala> import org.msgpack
<console>:10: error: msgpack is not a member of org
        import org.msgpack
               ^

scala> :cp /path/to/msgpack-0.6.8.jar
Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
"/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.8.0-incubating-hadoop1.2.1.jar:/path/to/msgpack-0.6.8.jar"
14/02/14 20:04:00 INFO server.Server: jetty-7.x.y-SNAPSHOT
14/02/14 20:04:00 INFO server.AbstractConnector: Started 
SocketConnector@0.0.0.0:64454

scala> import org.msgpack
import org.msgpack

And it's not working on 0.9:

scala> import org.msgpack
<console>:10: error: object msgpack is not a member of package org
        import org.msgpack
               ^

scala> :cp /path/to/msgpack-0.6.8.jar
Added '/path/to/msgpack-0.6.8.jar'.  Your new classpath is:
"/usr/share/lib/spark/*:/usr/lib/spark/conf:/usr/lib/spark/jars/spark-assembly-0.9.0-incubating-hadoop2.2.0.jar:/path/to/msgpack-0.6.8.jar"
Nothing to replay.

scala> import org.msgpack
<console>:7: error: object msgpack is not a member of package org
        import org.msgpack
               ^

Probably it's worth adding this to the issue's comments

Thank you,
Vyacheslav

On 14/02/14 02:26, Andrew Ash wrote:
> I filed a bug so we can track the fix: 
> https://spark-project.atlassian.net/browse/SPARK-1089
>
>
> On Thu, Feb 13, 2014 at 2:21 PM, Soumya Simanta 
> <soumya.simanta@gmail.com> wrote:
>
>     Use
>     SPARK_CLASSPATH along with ADD_JARS
>
>
>     On Thu, Feb 13, 2014 at 5:12 PM, Andre Kuhnen
>     <andrekuhnen@gmail.com> wrote:
>
>         Hello, my spark-shell tells me that the jars are added but it
>         cannot import any of my stuff
>
>
>         When I used the same steps on 0.8  everything worked fine
>
>         Thanks
>
>
>


Re: ADD_JARS not working on 0.9

Posted by Andre Kuhnen <an...@gmail.com>.
Solved, it was the sbt version.
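
For anyone hitting the same "Failed to initialize compiler" error, a minimal
sbt setup that keeps the build aligned with Spark 0.9's Scala 2.10 might look
like this (file contents and version numbers are illustrative):

project/build.properties:
    sbt.version=0.13.1

build.sbt:
    scalaVersion := "2.10.3"
    libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"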



2014-02-14 10:51 GMT-02:00 Andre Kuhnen <an...@gmail.com>:

> thanks guys, but now I am having this problem. I am compiling my jar
> with Scala version 2.10.3 and sbt 0.13.
> Any idea?
>
> Failed to initialize compiler: NoSuchMethodError.
> This is most often remedied by a full clean and recompile.
> Otherwise, your classpath may continue bytecode compiled by
> different and incompatible versions of scala.
>
>
>
> 2014-02-13 23:16 GMT-02:00 Andre Kuhnen <an...@gmail.com>:
>
> Thanks a lot.
>> On 13/02/2014 20:27, "Andrew Ash" <an...@andrewash.com> wrote:
>>
>> I filed a bug so we can track the fix:
>>> https://spark-project.atlassian.net/browse/SPARK-1089
>>>
>>>
>>> On Thu, Feb 13, 2014 at 2:21 PM, Soumya Simanta <
>>> soumya.simanta@gmail.com> wrote:
>>>
>>>> Use
>>>> SPARK_CLASSPATH along with ADD_JARS
>>>>
>>>>
>>>> On Thu, Feb 13, 2014 at 5:12 PM, Andre Kuhnen <an...@gmail.com>wrote:
>>>>
>>>>> Hello, my spark-shell tells me that the jars are added but it cannot
>>>>> import any of my stuff
>>>>>
>>>>>
>>>>> When I used the same steps on 0.8  everything worked fine
>>>>>
>>>>> Thanks
>>>>>
>>>>>
>>>>
>>>
>

Re: ADD_JARS not working on 0.9

Posted by Andre Kuhnen <an...@gmail.com>.
thanks guys, but now I am having this problem. I am compiling my jar
with Scala version 2.10.3 and sbt 0.13.
Any idea?

Failed to initialize compiler: NoSuchMethodError.
This is most often remedied by a full clean and recompile.
Otherwise, your classpath may continue bytecode compiled by
different and incompatible versions of scala.



2014-02-13 23:16 GMT-02:00 Andre Kuhnen <an...@gmail.com>:

> Thanks a lot.
> On 13/02/2014 20:27, "Andrew Ash" <an...@andrewash.com> wrote:
>
> I filed a bug so we can track the fix:
>> https://spark-project.atlassian.net/browse/SPARK-1089
>>
>>
>> On Thu, Feb 13, 2014 at 2:21 PM, Soumya Simanta <soumya.simanta@gmail.com
>> > wrote:
>>
>>> Use
>>> SPARK_CLASSPATH along with ADD_JARS
>>>
>>>
>>> On Thu, Feb 13, 2014 at 5:12 PM, Andre Kuhnen <an...@gmail.com>wrote:
>>>
>>>> Hello, my spark-shell tells me that the jars are added but it cannot
>>>> import any of my stuff
>>>>
>>>>
>>>> When I used the same steps on 0.8  everything worked fine
>>>>
>>>> Thanks
>>>>
>>>>
>>>
>>

Re: ADD_JARS not working on 0.9

Posted by Andre Kuhnen <an...@gmail.com>.
Thanks a lot.
On 13/02/2014 20:27, "Andrew Ash" <an...@andrewash.com> wrote:

> I filed a bug so we can track the fix:
> https://spark-project.atlassian.net/browse/SPARK-1089
>
>
> On Thu, Feb 13, 2014 at 2:21 PM, Soumya Simanta <so...@gmail.com>wrote:
>
>> Use
>> SPARK_CLASSPATH along with ADD_JARS
>>
>>
>> On Thu, Feb 13, 2014 at 5:12 PM, Andre Kuhnen <an...@gmail.com>wrote:
>>
>>> Hello, my spark-shell tells me that the jars are added but it cannot
>>> import any of my stuff
>>>
>>>
>>> When I used the same steps on 0.8  everything worked fine
>>>
>>> Thanks
>>>
>>>
>>
>

Re: ADD_JARS not working on 0.9

Posted by Andrew Ash <an...@andrewash.com>.
I filed a bug so we can track the fix:
https://spark-project.atlassian.net/browse/SPARK-1089


On Thu, Feb 13, 2014 at 2:21 PM, Soumya Simanta <so...@gmail.com>wrote:

> Use
> SPARK_CLASSPATH along with ADD_JARS
>
>
> On Thu, Feb 13, 2014 at 5:12 PM, Andre Kuhnen <an...@gmail.com>wrote:
>
>> Hello, my spark-shell tells me that the jars are added but it cannot
>> import any of my stuff
>>
>>
>> When I used the same steps on 0.8  everything worked fine
>>
>> Thanks
>>
>>
>

Re: ADD_JARS not working on 0.9

Posted by Soumya Simanta <so...@gmail.com>.
Use
SPARK_CLASSPATH along with ADD_JARS
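
A sketch of what that looks like in practice, reusing the msgpack jar mentioned
elsewhere in this thread (path illustrative); once the shell is launched with
both variables set, the import that previously failed should resolve:

ADD_JARS=/path/to/msgpack-0.6.8.jar \
SPARK_CLASSPATH=/path/to/msgpack-0.6.8.jar \
./bin/spark-shell

scala> import org.msgpack
import org.msgpack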


On Thu, Feb 13, 2014 at 5:12 PM, Andre Kuhnen <an...@gmail.com> wrote:

> Hello, my spark-shell tells me that the jars are added but it cannot
> import any of my stuff
>
>
> When I used the same steps on 0.8  everything worked fine
>
> Thanks
>
>