Posted to dev@mahout.apache.org by "JP Bordenave (JIRA)" <ji...@apache.org> on 2015/08/09 10:12:46 UTC

[jira] [Comment Edited] (MAHOUT-1758) mahout spark-shell - get illegal access error at startup

    [ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14663320#comment-14663320 ] 

JP Bordenave edited comment on MAHOUT-1758 at 8/9/15 8:12 AM:
--------------------------------------------------------------

Hello,

Thanks for the answer.

I already solved my issue by removing Mahout from my global Hadoop 2.7.1/Spark ecosystem installation, because I was not able to resolve the incompatibility. I am continuing with the rest of my ecosystem (R, Spark, Pig on Hadoop, and other tools); they all work fine together.

KR
JP





was (Author: jpbordi):

Hello,

Thanks for the answer.

I already removed Mahout from my global Hadoop 2.7.1/Spark ecosystem installation because I was not able to resolve the incompatibility. I am continuing with the rest of my ecosystem (R, Spark, Pig on Hadoop, and other tools); they all work fine together.

KR
JP
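
For readers hitting the same problem, here is a minimal sketch of what "removing Mahout from the global environment" typically means in practice, assuming Mahout was wired in via MAHOUT_HOME and a PATH entry in a shell profile (the profile file and the exact entries below are assumptions, not taken from this report):

{noformat}
# Hypothetical cleanup in ~/.bashrc: comment out or delete the Mahout entries
# (MAHOUT_HOME and the PATH addition are assumed names; adapt to your own profile),
# then reload the profile so Mahout's jars stop leaking onto other tools' classpaths.
# export MAHOUT_HOME=/usr/local/apache-mahout-distribution-0.10.1
# export PATH=$PATH:$MAHOUT_HOME/bin
source ~/.bashrc
{noformat}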




> mahout spark-shell - get illegal access error at startup
> --------------------------------------------------------
>
>                 Key: MAHOUT-1758
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1758
>             Project: Mahout
>          Issue Type: Bug
>    Affects Versions: 0.10.1
>         Environment: Linux Ubuntu 14.04, cluster of 1 master PC and 2 slave PCs, 16 GB RAM per node.
> Hadoop 2.6
> Spark 1.4.1
> Mahout 0.10.1
> R 3.0.2/RHadoop
> Scala 2.10
>            Reporter: JP Bordenave
>            Assignee: Suneel Marthi
>            Priority: Critical
>             Fix For: 0.11.0
>
>
> Hello,
> I installed Hadoop 2.6, Spark 1.4, and Scala 2.10; SparkR and PySpark are working fine, no issue.
> Now I am trying to configure Mahout with my Spark/Hadoop cluster, but when I start
> Mahout, I get an IllegalAccessError. If I try to start in local mode, I get the same error. Mahout 0.10.1 looks to be incompatible with Spark 1.4.x.
> Can you confirm? Is there a patch?
> Edit: I saw in the Mahout 0.10.1 release notes that it is compatible with Spark 1.2.2 or lower.
> Thanks for your info,
> JP
> I set my environment variables for my Spark cluster:
> export SPARK_HOME=/usr/local/spark
> export MASTER=spark://stargate:7077
> {noformat}
> hduser@stargate:~$ mahout spark-shell
> MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/mahout-examples-0.10.1-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/mahout-mr-0.10.1-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 15/07/17 23:17:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>                          _                 _
>          _ __ ___   __ _| |__   ___  _   _| |_
>         | '_ ` _ \ / _` | '_ \ / _ \| | | | __|
>         | | | | | | (_| | | | | (_) | |_| | |_
>         |_| |_| |_|\__,_|_| |_|\___/ \__,_|\__|  version 0.10.0
> Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_79)
> Type in expressions to have them evaluated.
> Type :help for more information.
> java.lang.IllegalAccessError: tried to access method org.apache.spark.repl.SparkIMain.classServer()Lorg/apache/spark/HttpServer; from class org.apache.mahout.sparkbindings.shell.MahoutSparkILoop
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.createSparkContext(MahoutSparkILoop.scala:42)
>         at $iwC$$iwC.<init>(<console>:11)
>         at $iwC.<init>(<console>:18)
>         at <init>(<console>:20)
>         at .<init>(<console>:24)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(MahoutSparkILoop.scala:63)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply(MahoutSparkILoop.scala:62)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply(MahoutSparkILoop.scala:62)
>         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.initializeSpark(MahoutSparkILoop.scala:62)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.postInitialization(MahoutSparkILoop.scala:24)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:39)
>         at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)
> Mahout distributed context is available as "implicit val sdc".
> {noformat}
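
The stack trace above is a classic binary-incompatibility failure: MahoutSparkILoop was compiled against a Spark version in which org.apache.spark.repl.SparkIMain.classServer() was accessible, and the JVM's link-time access check on that precompiled call site fails under Spark 1.4.1, which is why the error surfaces as an IllegalAccessError at shell startup rather than as a compile error. One way to check this is sketched below with javap; the assembly jar path is taken from the SLF4J lines in the log above, and the exact output depends on the Spark build:

{noformat}
# Dump SparkIMain's members, including private ones (-p), and inspect classServer.
# If it shows as private here but public in a Spark 1.2.x assembly, that matches
# the IllegalAccessError above.
javap -p -cp /usr/local/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar \
  org.apache.spark.repl.SparkIMain | grep classServer
{noformat}

Given the release-note compatibility window quoted in the report (Spark 1.2.2 or lower for Mahout 0.10.1) and the "Fix For: 0.11.0" field, the immediate workaround is to point SPARK_HOME at a supported Spark build until Mahout 0.11.0 lands. A sketch, assuming a Spark 1.2.2 distribution unpacked at a hypothetical path:

{noformat}
# Hypothetical install path: point SPARK_HOME at a Spark release inside
# Mahout 0.10.1's supported range (1.2.2 or lower per the release notes), then retry.
export SPARK_HOME=/usr/local/spark-1.2.2-bin-hadoop2.6
export MASTER=spark://stargate:7077
mahout spark-shell
{noformat}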



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)