Posted to user@hadoop.apache.org by Ashish Kumar9 <as...@in.ibm.com> on 2016/09/15 12:23:28 UTC
Spark 2.0.0 against Hadoop 2.7.2 - spark-shell error
I have built Spark 2.0.0 against Hadoop 2.7.2 and the build was successful.
Hadoop is running fine as well. However, when I run Spark I get the
runtime exception below.
Please suggest the required classpath settings. I have included all the
Hadoop libraries in the classpath and I still get this error.
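(Editor's note: one quick way to verify that a jar providing the missing class is actually present is to search Hadoop's common libs. The directory layout below is an assumption for a stock Hadoop 2.7.2 tarball; the class `org.apache.commons.configuration.Configuration` ships in commons-configuration 1.x.)

```shell
# Look for the commons-configuration jar under Hadoop's common libs;
# this jar provides org.apache.commons.configuration.Configuration.
# HADOOP_HOME default below is an assumed install path.
find "${HADOOP_HOME:-/opt/hadoop-2.7.2}/share/hadoop/common/lib" \
     -name 'commons-configuration*.jar'
```

If the jar is found but the error persists, the directory is simply not on the classpath Spark actually uses at launch.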
[hadoop@sys-77402 sbin]$ cd $SPARK_HOME
[hadoop@sys-77402 spark-2.0.0-bin-spark-2.0.0-hadoop2.7-ppc64le]$ spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
  at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:38)
  at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:36)
  at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:120)
  at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:236)
  at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2245)
  at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2245)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2245)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:297)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
  at org.apache.spark.repl.Main$.createSparkSession(Main.scala:101)
  ... 47 elided
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
  at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
  ... 62 more
<console>:14: error: not found: value spark
import spark.implicits._
^
<console>:14: error: not found: value spark
import spark.sql
^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0
      /_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_65)
Type in expressions to have them evaluated.
Re: Spark 2.0.0 against Hadoop 2.7.2 - spark-shell error
Posted by Ashish Kumar9 <as...@in.ibm.com>.
I had to set SPARK_CLASSPATH to include the Hadoop and Spark libraries,
and it works now.
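(Editor's note: a minimal sketch of such a setting, assuming a Hadoop 2.7.2 install under /opt/hadoop-2.7.2 — the path and jar layout are assumptions; adjust for your environment. Note that SPARK_CLASSPATH is deprecated in Spark 2.x in favor of the spark.driver.extraClassPath and spark.executor.extraClassPath properties.)

```shell
# Assumed Hadoop install location -- adjust for your environment.
export HADOOP_HOME=/opt/hadoop-2.7.2
# Hadoop's common libs bundle commons-configuration, which provides the
# missing org.apache.commons.configuration.Configuration class.
export SPARK_CLASSPATH="$HADOOP_HOME/share/hadoop/common/lib/*:$HADOOP_HOME/share/hadoop/common/*"
```

The supported equivalent would be the spark.driver.extraClassPath property in conf/spark-defaults.conf, or building Spark with -Phadoop-provided unset so the Hadoop client jars are bundled.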
From: Ashish Kumar9/India/IBM@IBMIN
To: "user @spark" <us...@spark.apache.org>, user@hadoop.apache.org
Date: 09/15/2016 05:54 PM
Subject: Spark 2.0.0 against Hadoop 2.7.2 - spark-shell error