Posted to user@mahout.apache.org by Andrea Abelli <an...@teralytics.ch> on 2014/08/13 14:19:08 UTC

Spark Bindings

Hi

I hope you are well.
While following this tutorial
https://mahout.apache.org/users/sparkbindings/play-with-shell.html
I ran into some problems.
At point 4. of "Starting Mahout's Spark shell", executing `bin/mahout
spark-shell` returns
Error: Could not find or load main class
org.apache.mahout.sparkbindings.shell.Main
so I had a look at the classes folder's tree and at ./bin/mahout's source code.

vagrant@vagrant-ubuntu-trusty-64:~/tl/mahout$ ls -l
 $MAHOUT_HOME/spark-shell/target/
total 40
drwxrwxr-x 3 vagrant vagrant  4096 Aug 13 11:18 classes
-rw-rw-r-- 1 vagrant vagrant     1 Aug 13 11:18 classes.timestamp
-rw-rw-r-- 1 vagrant vagrant  3014 Aug 13 11:18
mahout-spark-shell_2.10-1.0-SNAPSHOT-sources.jar
-rw-rw-r-- 1 vagrant vagrant  3132 Aug 13 11:18
mahout-spark-shell_2.10-1.0-SNAPSHOT-tests.jar
-rw-rw-r-- 1 vagrant vagrant 14136 Aug 13 11:18
mahout-spark-shell_2.10-1.0-SNAPSHOT.jar
drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 maven-archiver
drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 test-classes

while line 180 in ./bin/mahout reads
    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do

Now, by applying the following diff

diff --git a/bin/mahout b/bin/mahout
index 5f54181..a6f4ba8 100755
--- a/bin/mahout
+++ b/bin/mahout
@@ -177,7 +177,7 @@ then
       CLASSPATH=${CLASSPATH}:$f;
     done

-    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do
+    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar ; do
        CLASSPATH=${CLASSPATH}:$f;
     done
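
As a quick sanity check (assuming MAHOUT_HOME is set), the old and new globs
can be compared directly; only the second matches the _2.10 jars listed above:

    ls $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar   # old pattern: no match
    ls $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar   # new pattern: matches the built jars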

I'm now able to get to mahout's shell after running `./bin/mahout
spark-shell`, but I get the following errors

Using Scala version 2.10.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_55)
Type in expressions to have them evaluated.
Type :help for more information.
<console>:9: error: object drm is not a member of package
org.apache.mahout.math
                @transient implicit val sdc:
org.apache.mahout.math.drm.DistributedContext =
                                                                    ^
<console>:10: error: type SparkDistributedContext is not a member of
package org.apache.mahout.sparkbindings
                   new
org.apache.mahout.sparkbindings.SparkDistributedContext(
                                                       ^
Mahout distributed context is available as "implicit val sdc".
<console>:13: error: not found: value scalabindings
       import scalabindings._
              ^
<console>:13: error: not found: value RLikeOps
       import RLikeOps._
              ^
<console>:13: error: not found: value drm
       import drm._
              ^
<console>:13: error: not found: value RLikeDrmOps
       import RLikeDrmOps._
              ^

Does anyone have any idea what's going wrong? Any hints on what I'm doing
wrong, or on how I could fix this?

Thanks in advance, and thanks for the awesome project.
Looking forward to participating.

Regards
Andrea

Re: Spark Bindings

Posted by Pat Ferrel <pa...@occamsmachete.com>.
The version of Spark used by Mahout is now 1.0.x.

It was just changed in the repo.
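
If you built against Spark 0.9.x, point SPARK_HOME at a 1.0.x build and
recompile Mahout. A minimal sketch, with illustrative paths:

    export SPARK_HOME=/opt/spark-1.0.1   # any Spark 1.0.x build
    cd $MAHOUT_HOME && mvn clean install -DskipTests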

On Aug 13, 2014, at 7:48 AM, Andrea Abelli <an...@teralytics.ch> wrote:

Hello again

I did some additional fiddling with ./bin/mahout:

vagrant@vagrant-ubuntu-trusty-64:~/tl/mahout$ git diff
diff --git a/bin/mahout b/bin/mahout
index 5f54181..0174b31 100755
--- a/bin/mahout
+++ b/bin/mahout
@@ -161,7 +161,7 @@ then
  fi

  # add scala dev target
-  for f in $MAHOUT_HOME/math-scala/target/mahout-math-scala-*.jar ; do
+  for f in $MAHOUT_HOME/math-scala/target/mahout-math-scala_*.jar ; do
     CLASSPATH=${CLASSPATH}:$f;
  done

@@ -173,11 +173,11 @@ then
      CLASSPATH=${CLASSPATH}:$f;
    done

-    for f in $MAHOUT_HOME/spark/target/mahout-spark-*.jar ; do
+    for f in $MAHOUT_HOME/spark/target/mahout-spark_*.jar ; do
      CLASSPATH=${CLASSPATH}:$f;
    done

-    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do
+    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar ; do
       CLASSPATH=${CLASSPATH}:$f;
    done


and got this error when running ./bin/mahout spark-shell

Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.spark.HttpServer.<init>(Ljava/io/File;)V
at org.apache.spark.repl.SparkIMain.<init>(SparkIMain.scala:100)
at
org.apache.spark.repl.SparkILoop$SparkILoopInterpreter.<init>(SparkILoop.scala:172)
at org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:191)
at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:883)
at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
at
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:881)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:973)
at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:31)
at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)

I then changed SPARK_HOME from Spark 0.9.1 to Spark 1.0.2, and now it seems
to work fine.
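
For reference, the change amounts to something like this (the paths are just
examples from my setup):

    # before: export SPARK_HOME=~/spark-0.9.1
    export SPARK_HOME=~/spark-1.0.2
    $MAHOUT_HOME/bin/mahout spark-shell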

Regards
Andrea



On Wed, Aug 13, 2014 at 2:19 PM, Andrea Abelli <an...@teralytics.ch>
wrote:

> Hi
> 
> I hope you are well.
> While following this tutorial
> https://mahout.apache.org/users/sparkbindings/play-with-shell.html
> I ran into some problems.
> At point 4. of "Starting Mahout's Spark shell", executing `bin/mahout
> spark-shell` returns
> Error: Could not find or load main class
> org.apache.mahout.sparkbindings.shell.Main
> so I had a look at the classes folder's tree and at ./bin/mahout's source code.
> 
> vagrant@vagrant-ubuntu-trusty-64:~/tl/mahout$ ls -l
> $MAHOUT_HOME/spark-shell/target/
> total 40
> drwxrwxr-x 3 vagrant vagrant  4096 Aug 13 11:18 classes
> -rw-rw-r-- 1 vagrant vagrant     1 Aug 13 11:18 classes.timestamp
> -rw-rw-r-- 1 vagrant vagrant  3014 Aug 13 11:18
> mahout-spark-shell_2.10-1.0-SNAPSHOT-sources.jar
> -rw-rw-r-- 1 vagrant vagrant  3132 Aug 13 11:18
> mahout-spark-shell_2.10-1.0-SNAPSHOT-tests.jar
> -rw-rw-r-- 1 vagrant vagrant 14136 Aug 13 11:18
> mahout-spark-shell_2.10-1.0-SNAPSHOT.jar
> drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 maven-archiver
> drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 test-classes
> 
> while line 180 in ./bin/mahout reads
>    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do
> 
> Now, by applying the following diff
> 
> diff --git a/bin/mahout b/bin/mahout
> index 5f54181..a6f4ba8 100755
> --- a/bin/mahout
> +++ b/bin/mahout
> @@ -177,7 +177,7 @@ then
>       CLASSPATH=${CLASSPATH}:$f;
>     done
> 
> -    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do
> +    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar ; do
>        CLASSPATH=${CLASSPATH}:$f;
>     done
> 
> I'm now able to get to mahout's shell after running `./bin/mahout
> spark-shell`, but I get the following errors
> 
> Using Scala version 2.10.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_55)
> Type in expressions to have them evaluated.
> Type :help for more information.
> <console>:9: error: object drm is not a member of package
> org.apache.mahout.math
>                @transient implicit val sdc:
> org.apache.mahout.math.drm.DistributedContext =
>                                                                    ^
> <console>:10: error: type SparkDistributedContext is not a member of
> package org.apache.mahout.sparkbindings
>                   new
> org.apache.mahout.sparkbindings.SparkDistributedContext(
>                                                       ^
> Mahout distributed context is available as "implicit val sdc".
> <console>:13: error: not found: value scalabindings
>       import scalabindings._
>              ^
> <console>:13: error: not found: value RLikeOps
>       import RLikeOps._
>              ^
> <console>:13: error: not found: value drm
>       import drm._
>              ^
> <console>:13: error: not found: value RLikeDrmOps
>       import RLikeDrmOps._
>              ^
> 
> Does anyone have any idea what's going wrong? Any hints on what I'm doing
> wrong, or on how I could fix this?
> 
> Thanks in advance, and thanks for the awesome project.
> Looking forward to participating.
> 
> Regards
> Andrea
> 



-- 

Andrea Abelli | TERALYTICS
*analytics scientist*

Teralytics AG | Zollstrasse 62 | 8005 Zurich | Switzerland
phone: +353 83 442 44 88
email: andrea.abelli@teralytics.ch
www.teralytics.net



Re: Spark Bindings

Posted by Dmitriy Lyubimov <dl...@gmail.com>.
Just ran the tutorial on distributed (standalone) Spark with 2 workers. Seems
to work like a charm; can't reproduce any of the problems.
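
For context, running against a standalone cluster just means exporting MASTER
before launching the shell; the master URL below is illustrative:

    export MASTER=spark://<master-host>:7077
    bin/mahout spark-shell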


On Thu, Aug 14, 2014 at 7:35 PM, Dmitriy Lyubimov <dl...@gmail.com> wrote:

> So, as I suspected, picking up the proper jars was broken in HEAD.
> I did a quick patch (and a unit test) to assert that the proper jars are
> being picked up to be shipped with the job, but I haven't tested with a
> fully distributed setup on 1.0.1 (haven't had time to set it up yet).
>
> Spark people are releasing too fast; we have barely migrated to 1.0.1 and
> they have already released 1.0.2. Well, technically a minor version bump
> should not matter: users can change the pom and recompile with 1.0.2, and
> my guess is it should work.
>
>
> On Thu, Aug 14, 2014 at 5:02 PM, Dmitriy Lyubimov <dl...@gmail.com>
> wrote:
>
>> If errors appear with MASTER=local, then something is wrong with the Spark
>> binaries, IMO. What I usually do is compile Spark myself against CDH4
>> (right now, it happens to be 4.3.1), set SPARK_HOME and MAHOUT_HOME,
>> compile Mahout HEAD with `mvn install -Dskip.tests=true`, and then just
>> run `bin/mahout spark-shell` from MAHOUT_HOME.
>>
>>
>>
>>
>> On Thu, Aug 14, 2014 at 4:55 PM, Dmitriy Lyubimov <dl...@gmail.com>
>> wrote:
>>
>>> Just spent the last 5 minutes cutting and pasting the tutorial on HEAD with
>>> 1.0.1 in local mode. Everything works without problems in local mode. What
>>> was used for the MASTER setting when this problem occurred?
>>>
>>>
>>> On Thu, Aug 14, 2014 at 11:29 AM, Dmitriy Lyubimov <dl...@gmail.com>
>>> wrote:
>>>
>>>> For the same reason, it may have screwed up Mahout context creation, so
>>>> that Mahout jars are now not shipped to the backend properly.
>>>>
>>>>
>>>> If the sole purpose of the exercise is to get the tutorial working, I'd
>>>> suggest just rolling back to the commit level before Anand's change and the
>>>> Spark 0.9.1 dependency; I am pretty sure it should work then. E.g., this one
>>>> should be the last good commit (this requires Spark 0.9.1):
>>>>
>>>> commit 7a50a291b4598e9809f9acf609b92175ce7f953b
>>>> Author: Dmitriy Lyubimov <dl...@apache.org>
>>>> Date:   Wed Aug 6 12:30:51 2014 -0700
>>>>
>>>>     MAHOUT-1597: A + 1.0 (fixes)
>>>>
>>>>
>>>> (use
>>>>
>>>> git reset 7a50a291 --hard
>>>>
>>>> to sync to this one)
>>>>
>>>>
>>>>
>>>> On Thu, Aug 14, 2014 at 11:20 AM, Dmitriy Lyubimov <dl...@gmail.com>
>>>> wrote:
>>>>
>>>>> Not sure either at this point. I guess the PR from Anand renaming
>>>>> artifacts created classpath problems, but somehow it did not necessarily
>>>>> manifest in my local tests, since my Maven repo holds the old ones as well.
>>>>>
>>>>>
>>>>> On Thu, Aug 14, 2014 at 9:55 AM, Pat Ferrel <pa...@occamsmachete.com>
>>>>> wrote:
>>>>>
>>>>>> There are two problems here:
>>>>>>
>>>>>> 1) a bug in the mahout script. Just pushed your fix, thx. The jars
>>>>>> got renamed, it seems.
>>>>>>
>>>>>> 2) not sure what’s happening with the array serializer, maybe Dmitriy
>>>>>> has an idea?
>>>>>>
>>>>>>
>>>>>> On Aug 14, 2014, at 8:13 AM, Andrea Abelli <
>>>>>> andrea.abelli@teralytics.ch> wrote:
>>>>>>
>>>>>> Hi Again
>>>>>>
>>>>>> New version of Spark, new stack trace:
>>>>>> http://pastebin.com/KPNZ3rYQ
>>>>>>
>>>>>> I'm going to have a look at it tomorrow.
>>>>>>
>>>>>> Good evening
>>>>>> Andrea
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Spark Bindings

Posted by Dmitriy Lyubimov <dl...@gmail.com>.
So, as I suspected, picking up the proper jars was broken in HEAD.
I did a quick patch (and a unit test) to assert that the proper jars are
being picked up to be shipped with the job, but I haven't tested with a
fully distributed setup on 1.0.1 (haven't had time to set it up yet).

Spark people are releasing too fast; we have barely migrated to 1.0.1 and
they have already released 1.0.2. Well, technically a minor version bump
should not matter: users can change the pom and recompile with 1.0.2, and
my guess is it should work.
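
Concretely, the recompile would look something like the following sketch; it
assumes the Spark version is pinned in a single <spark.version> property in
the top-level pom, which may not match the actual pom layout:

    sed -i 's/<spark.version>1.0.1</<spark.version>1.0.2</' pom.xml
    mvn clean install -DskipTests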


On Thu, Aug 14, 2014 at 5:02 PM, Dmitriy Lyubimov <dl...@gmail.com> wrote:

> If errors appear with MASTER=local, then something is wrong with the Spark
> binaries, IMO. What I usually do is compile Spark myself against CDH4
> (right now, it happens to be 4.3.1), set SPARK_HOME and MAHOUT_HOME,
> compile Mahout HEAD with `mvn install -Dskip.tests=true`, and then just
> run `bin/mahout spark-shell` from MAHOUT_HOME.
>
>
>
>
> On Thu, Aug 14, 2014 at 4:55 PM, Dmitriy Lyubimov <dl...@gmail.com>
> wrote:
>
>> Just spent the last 5 minutes cutting and pasting the tutorial on HEAD with
>> 1.0.1 in local mode. Everything works without problems in local mode. What
>> was used for the MASTER setting when this problem occurred?
>>
>>
>> On Thu, Aug 14, 2014 at 11:29 AM, Dmitriy Lyubimov <dl...@gmail.com>
>> wrote:
>>
>>> For the same reason, it may have screwed up Mahout context creation, so
>>> that Mahout jars are now not shipped to the backend properly.
>>>
>>>
>>> If the sole purpose of the exercise is to get the tutorial working, I'd
>>> suggest just rolling back to the commit level before Anand's change and the
>>> Spark 0.9.1 dependency; I am pretty sure it should work then. E.g., this one
>>> should be the last good commit (this requires Spark 0.9.1):
>>>
>>> commit 7a50a291b4598e9809f9acf609b92175ce7f953b
>>> Author: Dmitriy Lyubimov <dl...@apache.org>
>>> Date:   Wed Aug 6 12:30:51 2014 -0700
>>>
>>>     MAHOUT-1597: A + 1.0 (fixes)
>>>
>>>
>>> (use
>>>
>>> git reset 7a50a291 --hard
>>>
>>> to sync to this one)
>>>
>>>
>>>
>>> On Thu, Aug 14, 2014 at 11:20 AM, Dmitriy Lyubimov <dl...@gmail.com>
>>> wrote:
>>>
>>>> Not sure either at this point. I guess the PR from Anand renaming artifacts
>>>> created classpath problems, but somehow it did not necessarily manifest in
>>>> my local tests, since my Maven repo holds the old ones as well.
>>>>
>>>>
>>>> On Thu, Aug 14, 2014 at 9:55 AM, Pat Ferrel <pa...@occamsmachete.com>
>>>> wrote:
>>>>
>>>>> There are two problems here:
>>>>>
>>>>> 1) a bug in the mahout script. Just pushed your fix, thx. The jars got
>>>>> renamed, it seems.
>>>>>
>>>>> 2) not sure what’s happening with the array serializer, maybe Dmitriy
>>>>> has an idea?
>>>>>
>>>>>
>>>>> On Aug 14, 2014, at 8:13 AM, Andrea Abelli <
>>>>> andrea.abelli@teralytics.ch> wrote:
>>>>>
>>>>> Hi Again
>>>>>
>>>>> New version of Spark, new stack trace:
>>>>> http://pastebin.com/KPNZ3rYQ
>>>>>
>>>>> I'm going to have a look at it tomorrow.
>>>>>
>>>>> Good evening
>>>>> Andrea
>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Spark Bindings

Posted by Dmitriy Lyubimov <dl...@gmail.com>.
If errors appear with MASTER=local, then something is wrong with the Spark
binaries, IMO. What I usually do is compile Spark myself against CDH4
(right now, it happens to be 4.3.1), set SPARK_HOME and MAHOUT_HOME,
compile Mahout HEAD with `mvn install -Dskip.tests=true`, and then just
run `bin/mahout spark-shell` from MAHOUT_HOME.
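
Spelled out end to end, that workflow is roughly the following sketch;
versions and paths are illustrative, and the CDH4 Hadoop version string in
particular is an assumption:

    # build Spark against CDH4 and point SPARK_HOME at it
    cd ~/spark && SPARK_HADOOP_VERSION=2.0.0-mr1-cdh4.3.1 sbt/sbt assembly
    export SPARK_HOME=~/spark

    # build Mahout HEAD and launch the shell
    cd ~/mahout && mvn install -DskipTests
    export MAHOUT_HOME=~/mahout
    $MAHOUT_HOME/bin/mahout spark-shell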




On Thu, Aug 14, 2014 at 4:55 PM, Dmitriy Lyubimov <dl...@gmail.com> wrote:

> Just spent the last 5 minutes cutting and pasting the tutorial on HEAD with 1.0.1
> in local mode. Everything works without problems in local mode. What was
> used for the MASTER setting when this problem occurred?
>
>
> On Thu, Aug 14, 2014 at 11:29 AM, Dmitriy Lyubimov <dl...@gmail.com>
> wrote:
>
>> For the same reason, it may have screwed up Mahout context creation, so
>> that Mahout jars are now not shipped to the backend properly.
>>
>>
>> If the sole purpose of the exercise is to get the tutorial working, I'd
>> suggest just rolling back to the commit level before Anand's change and the
>> Spark 0.9.1 dependency; I am pretty sure it should work then. E.g., this one
>> should be the last good commit (this requires Spark 0.9.1):
>>
>> commit 7a50a291b4598e9809f9acf609b92175ce7f953b
>> Author: Dmitriy Lyubimov <dl...@apache.org>
>> Date:   Wed Aug 6 12:30:51 2014 -0700
>>
>>     MAHOUT-1597: A + 1.0 (fixes)
>>
>>
>> (use
>>
>> git reset 7a50a291 --hard
>>
>> to sync to this one)
>>
>>
>>
>> On Thu, Aug 14, 2014 at 11:20 AM, Dmitriy Lyubimov <dl...@gmail.com>
>> wrote:
>>
>>> Not sure either at this point. I guess the PR from Anand renaming artifacts
>>> created classpath problems, but somehow it did not necessarily manifest in
>>> my local tests, since my Maven repo holds the old ones as well.
>>>
>>>
>>> On Thu, Aug 14, 2014 at 9:55 AM, Pat Ferrel <pa...@occamsmachete.com>
>>> wrote:
>>>
>>>> There are two problems here:
>>>>
>>>> 1) a bug in the mahout script. Just pushed your fix, thx. The jars got
>>>> renamed, it seems.
>>>>
>>>> 2) not sure what’s happening with the array serializer, maybe Dmitriy
>>>> has an idea?
>>>>
>>>>
>>>> On Aug 14, 2014, at 8:13 AM, Andrea Abelli <an...@teralytics.ch>
>>>> wrote:
>>>>
>>>> Hi Again
>>>>
>>>> New version of Spark, new stack trace:
>>>> http://pastebin.com/KPNZ3rYQ
>>>>
>>>> I'm going to have a look at it tomorrow.
>>>>
>>>> Good evening
>>>> Andrea
>>>>
>>>>
>>>
>>
>

Re: Spark Bindings

Posted by Dmitriy Lyubimov <dl...@gmail.com>.
Just spent the last 5 minutes cutting and pasting the tutorial on HEAD with 1.0.1
in local mode. Everything works without problems in local mode. What was
used for the MASTER setting when this problem occurred?
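
For comparison, a local-mode run is presumably just:

    export MASTER=local
    bin/mahout spark-shell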


On Thu, Aug 14, 2014 at 11:29 AM, Dmitriy Lyubimov <dl...@gmail.com>
wrote:

> For the same reason, it may have screwed up Mahout context creation, so
> that Mahout jars are now not shipped to the backend properly.
>
>
> If the sole purpose of the exercise is to get the tutorial working, I'd
> suggest just rolling back to the commit level before Anand's change and the
> Spark 0.9.1 dependency; I am pretty sure it should work then. E.g., this one
> should be the last good commit (this requires Spark 0.9.1):
>
> commit 7a50a291b4598e9809f9acf609b92175ce7f953b
> Author: Dmitriy Lyubimov <dl...@apache.org>
> Date:   Wed Aug 6 12:30:51 2014 -0700
>
>     MAHOUT-1597: A + 1.0 (fixes)
>
>
> (use
>
> git reset 7a50a291 --hard
>
> to sync to this one)
>
>
>
> On Thu, Aug 14, 2014 at 11:20 AM, Dmitriy Lyubimov <dl...@gmail.com>
> wrote:
>
>> Not sure either at this point. I guess the PR from Anand renaming artifacts
>> created classpath problems, but somehow it did not necessarily manifest in
>> my local tests, since my Maven repo holds the old ones as well.
>>
>>
>> On Thu, Aug 14, 2014 at 9:55 AM, Pat Ferrel <pa...@occamsmachete.com>
>> wrote:
>>
>>> There are two problems here:
>>>
>>> 1) a bug in the mahout script. Just pushed your fix, thx. The jars got
>>> renamed, it seems.
>>>
>>> 2) not sure what’s happening with the array serializer, maybe Dmitriy
>>> has an idea?
>>>
>>>
>>> On Aug 14, 2014, at 8:13 AM, Andrea Abelli <an...@teralytics.ch>
>>> wrote:
>>>
>>> Hi Again
>>>
>>> New version of Spark, new stack trace:
>>> http://pastebin.com/KPNZ3rYQ
>>>
>>> I'm going to have a look at it tomorrow.
>>>
>>> Good evening
>>> Andrea
>>>
>>>
>>
>

Re: Spark Bindings

Posted by Dmitriy Lyubimov <dl...@gmail.com>.
For the same reason, it may have screwed up Mahout context creation, so
that Mahout jars are now not shipped to the backend properly.


If the sole purpose of the exercise is to get the tutorial working, I'd suggest
just rolling back to the commit level before Anand's change and the Spark 0.9.1
dependency; I am pretty sure it should work then. E.g., this one should be
the last good commit (this requires Spark 0.9.1):

commit 7a50a291b4598e9809f9acf609b92175ce7f953b
Author: Dmitriy Lyubimov <dl...@apache.org>
Date:   Wed Aug 6 12:30:51 2014 -0700

    MAHOUT-1597: A + 1.0 (fixes)


(use

git reset 7a50a291 --hard

to sync to this one)



On Thu, Aug 14, 2014 at 11:20 AM, Dmitriy Lyubimov <dl...@gmail.com>
wrote:

> Not sure either at this point. I guess the PR from Anand renaming artifacts
> created classpath problems, but somehow it did not necessarily manifest in
> my local tests, since my Maven repo holds the old ones as well.
>
>
> On Thu, Aug 14, 2014 at 9:55 AM, Pat Ferrel <pa...@occamsmachete.com> wrote:
>
>> There are two problems here:
>>
>> 1) a bug in the mahout script. Just pushed your fix, thx. The jars got
>> renamed, it seems.
>>
>> 2) not sure what’s happening with the array serializer, maybe Dmitriy has
>> an idea?
>>
>>
>> On Aug 14, 2014, at 8:13 AM, Andrea Abelli <an...@teralytics.ch>
>> wrote:
>>
>> Hi Again
>>
>> New version of Spark, new stack trace:
>> http://pastebin.com/KPNZ3rYQ
>>
>> I'm going to have a look at it tomorrow.
>>
>> Good evening
>> Andrea
>>
>>
>

Re: Spark Bindings

Posted by Dmitriy Lyubimov <dl...@gmail.com>.
Not sure either at this point. I guess the PR from Anand renaming artifacts
created classpath problems, but somehow it did not necessarily manifest in
my local tests, since my Maven repo holds the old ones as well.


On Thu, Aug 14, 2014 at 9:55 AM, Pat Ferrel <pa...@occamsmachete.com> wrote:

> There are two problems here:
>
> 1) a bug in the mahout script. Just pushed your fix, thx. The jars got
> renamed, it seems.
>
> 2) not sure what’s happening with the array serializer, maybe Dmitriy has
> an idea?
>
>
> On Aug 14, 2014, at 8:13 AM, Andrea Abelli <an...@teralytics.ch>
> wrote:
>
> Hi Again
>
> New version of Spark, new stack trace:
> http://pastebin.com/KPNZ3rYQ
>
> I'm going to have a look at it tomorrow.
>
> Good evening
> Andrea
>
>

Re: Spark Bindings

Posted by Pat Ferrel <pa...@occamsmachete.com>.
There are two problems here:

1) a bug in the mahout script. Just pushed your fix, thx. The jars got renamed, it seems.

2) not sure what’s happening with the array serializer, maybe Dmitriy has an idea?


On Aug 14, 2014, at 8:13 AM, Andrea Abelli <an...@teralytics.ch> wrote:

Hi Again

New version of Spark, new stack trace:
http://pastebin.com/KPNZ3rYQ

I'm going to have a look at it tomorrow.

Good evening
Andrea


Re: Spark Bindings

Posted by Andrea Abelli <an...@teralytics.ch>.
Hi Again

New version of Spark, new stack trace:
http://pastebin.com/KPNZ3rYQ

I'm going to have a look at it tomorrow.

Good evening
Andrea

Re: Spark Bindings

Posted by Andrea Abelli <an...@teralytics.ch>.
Pat, Dmitriy

thanks for your feedback.
I managed to get Mahout's spark-shell running, but got a stack trace
(java.io.InvalidClassException) after running


val drmX = drmData(::, 0 until 4)

That's probably because I'm using Spark 1.0.2. I'll do what Dmitriy said
and let you know the result.
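
Concretely, that means rolling back to the known-good commit Dmitriy posted
and rebuilding against Spark 0.9.1:

    git reset 7a50a291 --hard
    mvn clean install -DskipTests
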
Thanks again.

Good day
Andrea




On Wed, Aug 13, 2014 at 7:27 PM, Dmitriy Lyubimov <dl...@gmail.com> wrote:

> Emails 1 and 2 both seem to be classpath problems.
>
> Make sure Spark and Mahout are both compiled, that the Spark version
> corresponds to the one in Mahout (1.0.1 in the current head), and that
> SPARK_HOME and (I think) MAHOUT_HOME are set.
>
>
> On Wed, Aug 13, 2014 at 7:48 AM, Andrea Abelli <
> andrea.abelli@teralytics.ch>
> wrote:
>
> > Hello again
> >
> > I did some additional fiddling with ./bin/mahout:
> >
> > vagrant@vagrant-ubuntu-trusty-64:~/tl/mahout$ git diff
> > diff --git a/bin/mahout b/bin/mahout
> > index 5f54181..0174b31 100755
> > --- a/bin/mahout
> > +++ b/bin/mahout
> > @@ -161,7 +161,7 @@ then
> >    fi
> >
> >    # add scala dev target
> > -  for f in $MAHOUT_HOME/math-scala/target/mahout-math-scala-*.jar ; do
> > +  for f in $MAHOUT_HOME/math-scala/target/mahout-math-scala_*.jar ; do
> >       CLASSPATH=${CLASSPATH}:$f;
> >    done
> >
> > @@ -173,11 +173,11 @@ then
> >        CLASSPATH=${CLASSPATH}:$f;
> >      done
> >
> > -    for f in $MAHOUT_HOME/spark/target/mahout-spark-*.jar ; do
> > +    for f in $MAHOUT_HOME/spark/target/mahout-spark_*.jar ; do
> >        CLASSPATH=${CLASSPATH}:$f;
> >      done
> >
> > -    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ;
> do
> > +    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar ;
> do
> >         CLASSPATH=${CLASSPATH}:$f;
> >      done
> >
> >
> > and got this error when running ./bin/mahout spark-shell
> >
> > Exception in thread "main" java.lang.NoSuchMethodError:
> > org.apache.spark.HttpServer.<init>(Ljava/io/File;)V
> > at org.apache.spark.repl.SparkIMain.<init>(SparkIMain.scala:100)
> >  at
> >
> >
> org.apache.spark.repl.SparkILoop$SparkILoopInterpreter.<init>(SparkILoop.scala:172)
> > at
> org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:191)
> >  at
> >
> >
> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:883)
> > at
> >
> >
> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
> >  at
> >
> >
> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
> > at
> >
> >
> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> >  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:881)
> > at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:973)
> >  at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:31)
> > at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)
> >
> > I then changed SPARK_HOME from Spark 0.9.1 to Spark 1.0.2, and now it seems
> > to work fine.
> >
> > Regards
> > Andrea
> >
> >
> >
> > On Wed, Aug 13, 2014 at 2:19 PM, Andrea Abelli <
> > andrea.abelli@teralytics.ch>
> > wrote:
> >
> > > Hi
> > >
> > > I hope you are well.
> > > While following this tutorial
> > > https://mahout.apache.org/users/sparkbindings/play-with-shell.html
> > > I ran into some problems.
> > > At point 4. of "Starting Mahout's Spark shell", executing `bin/mahout
> > > spark-shell` returns
> > > Error: Could not find or load main class
> > > org.apache.mahout.sparkbindings.shell.Main
> > > so I had a look at the classes folder's tree and at ./bin/mahout's source
> code.
> > >
> > > vagrant@vagrant-ubuntu-trusty-64:~/tl/mahout$ ls -l
> > >  $MAHOUT_HOME/spark-shell/target/
> > > total 40
> > > drwxrwxr-x 3 vagrant vagrant  4096 Aug 13 11:18 classes
> > > -rw-rw-r-- 1 vagrant vagrant     1 Aug 13 11:18 classes.timestamp
> > > -rw-rw-r-- 1 vagrant vagrant  3014 Aug 13 11:18
> > > mahout-spark-shell_2.10-1.0-SNAPSHOT-sources.jar
> > > -rw-rw-r-- 1 vagrant vagrant  3132 Aug 13 11:18
> > > mahout-spark-shell_2.10-1.0-SNAPSHOT-tests.jar
> > > -rw-rw-r-- 1 vagrant vagrant 14136 Aug 13 11:18
> > > mahout-spark-shell_2.10-1.0-SNAPSHOT.jar
> > > drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 maven-archiver
> > > drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 test-classes
> > >
> > > while line 180 in ./bin/mahout reads
> > >     for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ;
> > do
> > >
> > > Now, by applying the following diff
> > >
> > > diff --git a/bin/mahout b/bin/mahout
> > > index 5f54181..a6f4ba8 100755
> > > --- a/bin/mahout
> > > +++ b/bin/mahout
> > > @@ -177,7 +177,7 @@ then
> > >        CLASSPATH=${CLASSPATH}:$f;
> > >      done
> > >
> > > -    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar
> ;
> > do
> > > +    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar
> ;
> > do
> > >         CLASSPATH=${CLASSPATH}:$f;
> > >      done
> > >
> > > I'm now able to get to mahout's shell after running `./bin/mahout
> > > spark-shell`, but I get the following errors
> > >
> > > Using Scala version 2.10.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_55)
> > > Type in expressions to have them evaluated.
> > > Type :help for more information.
> > > <console>:9: error: object drm is not a member of package
> > > org.apache.mahout.math
> > >                 @transient implicit val sdc:
> > > org.apache.mahout.math.drm.DistributedContext =
> > >                                                                     ^
> > > <console>:10: error: type SparkDistributedContext is not a member of
> > > package org.apache.mahout.sparkbindings
> > >                    new
> > > org.apache.mahout.sparkbindings.SparkDistributedContext(
> > >                                                        ^
> > > Mahout distributed context is available as "implicit val sdc".
> > > <console>:13: error: not found: value scalabindings
> > >        import scalabindings._
> > >               ^
> > > <console>:13: error: not found: value RLikeOps
> > >        import RLikeOps._
> > >               ^
> > > <console>:13: error: not found: value drm
> > >        import drm._
> > >               ^
> > > <console>:13: error: not found: value RLikeDrmOps
> > >        import RLikeDrmOps._
> > >               ^
> > >
> > > Does anyone have any idea what's going wrong? Any hints on what I'm
> doing
> > > wrong, or on how I could fix this?
> > >
> > > Thanks in advance, and thanks for the awesome project.
> > > Looking forward to participating.
> > >
> > > Regards
> > > Andrea
> > >
> >
> >
> >
> > --
> >
> > Andrea Abelli | TERALYTICS
> > *analytics scientist*
> >
> > Teralytics AG | Zollstrasse 62 | 8005 Zurich | Switzerland
> > phone: +353 83 442 44 88
> > email: andrea.abelli@teralytics.ch
> > www.teralytics.net
> >
> >
>



-- 

Andrea Abelli | TERALYTICS
*analytics scientist*

Teralytics AG | Zollstrasse 62 | 8005 Zurich | Switzerland
phone: +353 83 442 44 88
email: andrea.abelli@teralytics.ch
www.teralytics.net


Re: Spark Bindings

Posted by Dmitriy Lyubimov <dl...@gmail.com>.
Emails 1 and 2 both seem to be classpath problems.

Make sure Spark and Mahout are both compiled, that the Spark version
corresponds to the one in Mahout (1.0.1 in the current head), and that
SPARK_HOME and (I think) MAHOUT_HOME are set.
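
A quick way to check that alignment, assuming the Spark version is pinned in
the top-level pom (an assumption about the build layout):

    echo "SPARK_HOME=$SPARK_HOME  MAHOUT_HOME=$MAHOUT_HOME"
    grep -m1 '<spark.version>' $MAHOUT_HOME/pom.xml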


On Wed, Aug 13, 2014 at 7:48 AM, Andrea Abelli <an...@teralytics.ch>
wrote:

> Hello again
>
> I did some additional fiddling with ./bin/mahout:
>
> vagrant@vagrant-ubuntu-trusty-64:~/tl/mahout$ git diff
> diff --git a/bin/mahout b/bin/mahout
> index 5f54181..0174b31 100755
> --- a/bin/mahout
> +++ b/bin/mahout
> @@ -161,7 +161,7 @@ then
>    fi
>
>    # add scala dev target
> -  for f in $MAHOUT_HOME/math-scala/target/mahout-math-scala-*.jar ; do
> +  for f in $MAHOUT_HOME/math-scala/target/mahout-math-scala_*.jar ; do
>       CLASSPATH=${CLASSPATH}:$f;
>    done
>
> @@ -173,11 +173,11 @@ then
>        CLASSPATH=${CLASSPATH}:$f;
>      done
>
> -    for f in $MAHOUT_HOME/spark/target/mahout-spark-*.jar ; do
> +    for f in $MAHOUT_HOME/spark/target/mahout-spark_*.jar ; do
>        CLASSPATH=${CLASSPATH}:$f;
>      done
>
> -    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do
> +    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar ; do
>         CLASSPATH=${CLASSPATH}:$f;
>      done
>
>
> and got this error when running ./bin/mahout spark-shell
>
> Exception in thread "main" java.lang.NoSuchMethodError:
> org.apache.spark.HttpServer.<init>(Ljava/io/File;)V
> at org.apache.spark.repl.SparkIMain.<init>(SparkIMain.scala:100)
>  at
>
> org.apache.spark.repl.SparkILoop$SparkILoopInterpreter.<init>(SparkILoop.scala:172)
> at org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:191)
>  at
>
> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:883)
> at
>
> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
>  at
>
> org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
> at
>
> scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>  at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:881)
> at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:973)
>  at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:31)
> at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)
>
> I then changed SPARK_HOME from Spark 0.9.1 to Spark 1.0.2, and now it seems
> to work fine.
>
> Regards
> Andrea
>
>
>
> On Wed, Aug 13, 2014 at 2:19 PM, Andrea Abelli <
> andrea.abelli@teralytics.ch>
> wrote:
>
> > Hi
> >
> > I hope you are well.
> > While following this tutorial
> > https://mahout.apache.org/users/sparkbindings/play-with-shell.html
> > I ran into some problems.
> > At point 4. of "Starting Mahout's Spark shell", executing `bin/mahout
> > spark-shell` returns
> > Error: Could not find or load main class
> > org.apache.mahout.sparkbindings.shell.Main
> > so I had a look at the classes folder's tree and at ./bin/mahout's source code.
> >
> > vagrant@vagrant-ubuntu-trusty-64:~/tl/mahout$ ls -l
> >  $MAHOUT_HOME/spark-shell/target/
> > total 40
> > drwxrwxr-x 3 vagrant vagrant  4096 Aug 13 11:18 classes
> > -rw-rw-r-- 1 vagrant vagrant     1 Aug 13 11:18 classes.timestamp
> > -rw-rw-r-- 1 vagrant vagrant  3014 Aug 13 11:18
> > mahout-spark-shell_2.10-1.0-SNAPSHOT-sources.jar
> > -rw-rw-r-- 1 vagrant vagrant  3132 Aug 13 11:18
> > mahout-spark-shell_2.10-1.0-SNAPSHOT-tests.jar
> > -rw-rw-r-- 1 vagrant vagrant 14136 Aug 13 11:18
> > mahout-spark-shell_2.10-1.0-SNAPSHOT.jar
> > drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 maven-archiver
> > drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 test-classes
> >
> > while line 180 in ./bin/mahout reads
> >     for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ;
> do
> >
> > Now, by applying the following diff
> >
> > diff --git a/bin/mahout b/bin/mahout
> > index 5f54181..a6f4ba8 100755
> > --- a/bin/mahout
> > +++ b/bin/mahout
> > @@ -177,7 +177,7 @@ then
> >        CLASSPATH=${CLASSPATH}:$f;
> >      done
> >
> > -    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ;
> do
> > +    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar ;
> do
> >         CLASSPATH=${CLASSPATH}:$f;
> >      done
> >
> > I'm now able to get to mahout's shell after running `./bin/mahout
> > spark-shell`, but I get the following errors
> >
> > Using Scala version 2.10.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_55)
> > Type in expressions to have them evaluated.
> > Type :help for more information.
> > <console>:9: error: object drm is not a member of package
> > org.apache.mahout.math
> >                 @transient implicit val sdc:
> > org.apache.mahout.math.drm.DistributedContext =
> >                                                                     ^
> > <console>:10: error: type SparkDistributedContext is not a member of
> > package org.apache.mahout.sparkbindings
> >                    new
> > org.apache.mahout.sparkbindings.SparkDistributedContext(
> >                                                        ^
> > Mahout distributed context is available as "implicit val sdc".
> > <console>:13: error: not found: value scalabindings
> >        import scalabindings._
> >               ^
> > <console>:13: error: not found: value RLikeOps
> >        import RLikeOps._
> >               ^
> > <console>:13: error: not found: value drm
> >        import drm._
> >               ^
> > <console>:13: error: not found: value RLikeDrmOps
> >        import RLikeDrmOps._
> >               ^
> >
> > Does anyone have any idea what's going wrong? Any hints on what I'm doing
> > wrong, or on how I could fix this?
> >
> > Thanks in advance, and thanks for the awesome project.
> > Looking forward to participating.
> >
> > Regards
> > Andrea
> >
>
>
>
> --
>
> Andrea Abelli | TERALYTICS
> *analytics scientist*
>
> Teralytics AG | Zollstrasse 62 | 8005 Zurich | Switzerland
> phone: +353 83 442 44 88
> email: andrea.abelli@teralytics.ch
> www.teralytics.net
>
>

Re: Spark Bindings

Posted by Andrea Abelli <an...@teralytics.ch>.
Hello again

I did some additional fiddling with ./bin/mahout:

vagrant@vagrant-ubuntu-trusty-64:~/tl/mahout$ git diff
diff --git a/bin/mahout b/bin/mahout
index 5f54181..0174b31 100755
--- a/bin/mahout
+++ b/bin/mahout
@@ -161,7 +161,7 @@ then
   fi

   # add scala dev target
-  for f in $MAHOUT_HOME/math-scala/target/mahout-math-scala-*.jar ; do
+  for f in $MAHOUT_HOME/math-scala/target/mahout-math-scala_*.jar ; do
      CLASSPATH=${CLASSPATH}:$f;
   done

@@ -173,11 +173,11 @@ then
       CLASSPATH=${CLASSPATH}:$f;
     done

-    for f in $MAHOUT_HOME/spark/target/mahout-spark-*.jar ; do
+    for f in $MAHOUT_HOME/spark/target/mahout-spark_*.jar ; do
       CLASSPATH=${CLASSPATH}:$f;
     done

-    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do
+    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar ; do
        CLASSPATH=${CLASSPATH}:$f;
     done


and got this error when running ./bin/mahout spark-shell

Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.spark.HttpServer.<init>(Ljava/io/File;)V
at org.apache.spark.repl.SparkIMain.<init>(SparkIMain.scala:100)
 at
org.apache.spark.repl.SparkILoop$SparkILoopInterpreter.<init>(SparkILoop.scala:172)
at org.apache.spark.repl.SparkILoop.createInterpreter(SparkILoop.scala:191)
 at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:883)
at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
 at
org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
at
scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
 at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:881)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:973)
 at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:31)
at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)

I then changed SPARK_HOME from Spark 0.9.1 to Spark 1.0.2, and now it seems
to work fine.

Regards
Andrea



On Wed, Aug 13, 2014 at 2:19 PM, Andrea Abelli <an...@teralytics.ch>
wrote:

> Hi
>
> I hope you are well.
> While following this tutorial
> https://mahout.apache.org/users/sparkbindings/play-with-shell.html
> I ran into some problems.
> At point 4. of "Starting Mahout's Spark shell", executing `bin/mahout
> spark-shell` returns
> Error: Could not find or load main class
> org.apache.mahout.sparkbindings.shell.Main
> so I had a look at the classes folder's tree and at ./bin/mahout's source code.
>
> vagrant@vagrant-ubuntu-trusty-64:~/tl/mahout$ ls -l
>  $MAHOUT_HOME/spark-shell/target/
> total 40
> drwxrwxr-x 3 vagrant vagrant  4096 Aug 13 11:18 classes
> -rw-rw-r-- 1 vagrant vagrant     1 Aug 13 11:18 classes.timestamp
> -rw-rw-r-- 1 vagrant vagrant  3014 Aug 13 11:18
> mahout-spark-shell_2.10-1.0-SNAPSHOT-sources.jar
> -rw-rw-r-- 1 vagrant vagrant  3132 Aug 13 11:18
> mahout-spark-shell_2.10-1.0-SNAPSHOT-tests.jar
> -rw-rw-r-- 1 vagrant vagrant 14136 Aug 13 11:18
> mahout-spark-shell_2.10-1.0-SNAPSHOT.jar
> drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 maven-archiver
> drwxrwxr-x 2 vagrant vagrant  4096 Aug 13 11:18 test-classes
>
> while line 180 in ./bin/mahout reads
>     for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do
>
> Now, by applying the following diff
>
> diff --git a/bin/mahout b/bin/mahout
> index 5f54181..a6f4ba8 100755
> --- a/bin/mahout
> +++ b/bin/mahout
> @@ -177,7 +177,7 @@ then
>        CLASSPATH=${CLASSPATH}:$f;
>      done
>
> -    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell-*.jar ; do
> +    for f in $MAHOUT_HOME/spark-shell/target/mahout-spark-shell_*.jar ; do
>         CLASSPATH=${CLASSPATH}:$f;
>      done
>
> I'm now able to get to mahout's shell after running `./bin/mahout
> spark-shell`, but I get the following errors
>
> Using Scala version 2.10.3 (OpenJDK 64-Bit Server VM, Java 1.7.0_55)
> Type in expressions to have them evaluated.
> Type :help for more information.
> <console>:9: error: object drm is not a member of package
> org.apache.mahout.math
>                 @transient implicit val sdc:
> org.apache.mahout.math.drm.DistributedContext =
>                                                                     ^
> <console>:10: error: type SparkDistributedContext is not a member of
> package org.apache.mahout.sparkbindings
>                    new
> org.apache.mahout.sparkbindings.SparkDistributedContext(
>                                                        ^
> Mahout distributed context is available as "implicit val sdc".
> <console>:13: error: not found: value scalabindings
>        import scalabindings._
>               ^
> <console>:13: error: not found: value RLikeOps
>        import RLikeOps._
>               ^
> <console>:13: error: not found: value drm
>        import drm._
>               ^
> <console>:13: error: not found: value RLikeDrmOps
>        import RLikeDrmOps._
>               ^
>
> Does anyone have any idea what's going wrong? Any hints on what I'm doing
> wrong, or on how I could fix this?
>
> Thanks in advance, and thanks for the awesome project.
> Looking forward to participating.
>
> Regards
> Andrea
>



-- 

Andrea Abelli | TERALYTICS
*analytics scientist*

Teralytics AG | Zollstrasse 62 | 8005 Zurich | Switzerland
phone: +353 83 442 44 88
email: andrea.abelli@teralytics.ch
www.teralytics.net
