Posted to dev@mahout.apache.org by "JP Bordenave (JIRA)" <ji...@apache.org> on 2015/07/17 23:29:05 UTC

[jira] [Updated] (MAHOUT-1758) mahout spark-shell - get illegal access error at startup

     [ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

JP Bordenave updated MAHOUT-1758:
---------------------------------
    Summary: mahout spark-shell - get illegal access error at startup  (was: mahout spark-shell - get illegal acces eror at startup)

> mahout spark-shell - get illegal access error at startup
> --------------------------------------------------------
>
>                 Key: MAHOUT-1758
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1758
>             Project: Mahout
>          Issue Type: Bug
>         Environment: Linux Ubuntu 14.04, cluster with 1 master PC and 2 slave PCs, 16 GB RAM per node.
> Hadoop 2.6
> Spark 1.4.1
> Mahout 0.10.1
> R 3.0.2/RHadoop
> Scala 2.10
>            Reporter: JP Bordenave
>            Priority: Critical
>
> Hello,
> I installed Hadoop 2.6 and Spark 1.4; SparkR and PySpark are working fine, no issues.
> Scala 2.10.
> Now I am trying to configure Mahout against my Spark/Hadoop cluster, but when I start
> the Mahout spark-shell I get an IllegalAccessError. When I try to start in local mode I get the
> same error. It looks like Spark 1.4.x and Mahout 0.10.1 are incompatible.
> Can you confirm? Is there a patch?
> Edit: I saw in the Mahout 0.10.1 release notes that compatibility is with Spark 1.2.2 or earlier.
> Thanks for your info,
> JP
> I set my variables for my Spark cluster:
> export SPARK_HOME=/usr/local/spark
> export MASTER=spark://stargate:7077
> {noformat}
> hduser@stargate:~$ mahout spark-shell
> MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/mahout-examples-0.10.1-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/mahout-mr-0.10.1-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 15/07/17 23:17:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>                          _                 _
>          _ __ ___   __ _| |__   ___  _   _| |_
>         | '_ ` _ \ / _` | '_ \ / _ \| | | | __|
>         | | | | | | (_| | | | | (_) | |_| | |_
>         |_| |_| |_|\__,_|_| |_|\___/ \__,_|\__|  version 0.10.0
> Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_79)
> Type in expressions to have them evaluated.
> Type :help for more information.
> java.lang.IllegalAccessError: tried to access method org.apache.spark.repl.SparkIMain.classServer()Lorg/apache/spark/HttpServer; from class org.apache.mahout.sparkbindings.shell.MahoutSparkILoop
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.createSparkContext(MahoutSparkILoop.scala:42)
>         at $iwC$$iwC.<init>(<console>:11)
>         at $iwC.<init>(<console>:18)
>         at <init>(<console>:20)
>         at .<init>(<console>:24)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(MahoutSparkILoop.scala:63)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply(MahoutSparkILoop.scala:62)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply(MahoutSparkILoop.scala:62)
>         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.initializeSpark(MahoutSparkILoop.scala:62)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.postInitialization(MahoutSparkILoop.scala:24)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:39)
>         at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)
> Mahout distributed context is available as "implicit val sdc".
> {noformat}
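> For reference, the IllegalAccessError above is the classic symptom of binary incompatibility: the Mahout 0.10.1 shell was compiled against a Spark where SparkIMain.classServer() was accessible to MahoutSparkILoop, and the Spark 1.4.1 assembly on the classpath no longer exposes it that way. A quick, purely illustrative check (the jar path is copied from the log above; everything else is an assumption):
> {noformat}
> # Probe the Spark assembly for the REPL method Mahout's shell calls.
> # javap lists only non-private members, so no output suggests the method
> # is gone or no longer accessible, which matches the error above.
> SPARK_ASSEMBLY=/usr/local/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar
> javap -classpath "$SPARK_ASSEMBLY" org.apache.spark.repl.SparkIMain | grep classServer \
>   || echo "classServer() not visible -- Mahout 0.10.x spark-shell will fail here"
> {noformat}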



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

Re: [jira] [Updated] (MAHOUT-1758) mahout spark-shell - get illegal access error at startup

Posted by Andrew Palumbo <ap...@outlook.com>.
I'm surprised that it made it that far on Spark 1.4.

Biting my tongue before I say this, but the 0.11 branch may be pretty 
close to 1.4-compatible:

I tried testing 0.11 on Spark 1.4. Everything builds and tests fine, but 
unfortunately

     $SPARK_HOME/bin/compute-classpath.sh

has been removed in Spark 1.4: 
https://issues.apache.org/jira/browse/SPARK-4924

So we will at least need to do some work to add the Spark jars to 
bin/mahout in order to be Spark 1.4-compatible.
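
A rough sketch of the kind of fallback bin/mahout might need -- purely 
illustrative, with paths and layout assumed from the Spark 1.4 binary 
distribution, not an actual patch:

     # Keep using the old helper where it exists (Spark <= 1.3); otherwise
     # fall back to the assembly jar that the 1.4 binary distribution
     # ships under $SPARK_HOME/lib.
     if [ -x "$SPARK_HOME/bin/compute-classpath.sh" ]; then
       SPARK_CP=$("$SPARK_HOME/bin/compute-classpath.sh")
     else
       SPARK_CP=$(echo "$SPARK_HOME"/lib/spark-assembly-*.jar | tr ' ' ':')
     fi
     CLASSPATH="$CLASSPATH:$SPARK_CP"

compute-classpath.sh used to assemble all of this for us, so the real fix 
will probably need to cover more than just the assembly jar, but the idea 
is the same.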



On 07/17/2015 06:33 PM, Dmitriy Lyubimov wrote:
> We don't support Spark 1.4 yet. The 0.10.x branch is compatible with Spark
> 0.9..1.2, and 0.11.0 (master) is compatible with 1.3. I don't think anyone
> has looked at 1.4 yet. So, OK, we can start working on this with this issue.
> Thank you.


Re: [jira] [Updated] (MAHOUT-1758) mahout spark-shell - get illegal access error at startup

Posted by Dmitriy Lyubimov <dl...@gmail.com>.
We don't support Spark 1.4 yet. The 0.10.x branch is compatible with Spark
0.9..1.2, and 0.11.0 (master) is compatible with 1.3. I don't think anyone
has looked at 1.4 yet. So, OK, we can start working on this with this issue.
Thank you.
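
In the meantime, a possible workaround (untested sketch, install paths are
just examples) is to point the Mahout 0.10.x shell at a Spark build it was
compiled for, e.g. a 1.2.x install, instead of the 1.4.1 one:

     # Use a separate Spark 1.2.x install just for the Mahout shell and
     # leave the 1.4.1 cluster untouched; local[*] is enough to verify
     # that the shell starts without the IllegalAccessError.
     export SPARK_HOME=/usr/local/spark-1.2.2-bin-hadoop2.4
     export MASTER=local[*]
     mahout spark-shell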
