Posted to user@spark.apache.org by Hu...@Dell.com on 2013/11/21 23:13:54 UTC

building spark 0.8.0 against CDH 4.4.0 fails

Any advice on what the issue could be here?

My SPARK_HADOOP_VERSION=2.0.0-cdh4.2.0

[root@kserv06 spark-0.8.0-incubating]# sbt/sbt assembly
[info] Loading project definition from /home/spark/spark-0.8.0-incubating/project/project
[info] Loading project definition from /home/spark/spark-0.8.0-incubating/project
[info] Set current project to root (in build file:/home/spark/spark-0.8.0-incubating/)
[info] Compiling 258 Scala sources and 16 Java sources to /home/spark/spark-0.8.0-incubating/core/target/scala-2.9.3/classes...
[warn] /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/hadoop/mapred/SparkHadoopMapRedUtil.scala:34: constructor TaskAttemptID in class TaskAttemptID is deprecated: see corresponding Javadoc for more information.
[warn]     new TaskAttemptID(jtIdentifier, jobId, isMap, taskId, attemptId)
[warn]     ^
[warn] /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkContext.scala:401: constructor Job in class Job is deprecated: see corresponding Javadoc for more information.
[warn]     val job = new NewHadoopJob(conf)
[warn]               ^
[warn] /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:129: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information.
[warn]     getOutputCommitter().cleanupJob(getJobContext())
[warn]                          ^
[warn] /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:170: constructor TaskID in class TaskID is deprecated: see corresponding Javadoc for more information.
[warn]         new TaskAttemptID(new TaskID(jID.value, true, splitID), attemptID))
[warn]                           ^
[warn] /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:198: method makeQualified in class Path is deprecated: see corresponding Javadoc for more information.
[warn]     outputPath = outputPath.makeQualified(fs)
[warn]                             ^
[warn] /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/rdd/CheckpointRDD.scala:102: method getDefaultReplication in class FileSystem is deprecated: see corresponding Javadoc for more information.
[warn]       fs.create(tempOutputPath, false, bufferSize, fs.getDefaultReplication, blockSize)
[warn]                                                       ^
[warn] /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:554: constructor Job in class Job is deprecated: see corresponding Javadoc for more information.
[warn]     val job = new NewAPIHadoopJob(conf)
[warn]               ^
[warn] /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:592: method cleanupJob in class OutputCommitter is deprecated: see corresponding Javadoc for more information.
[warn]     jobCommitter.cleanupJob(jobTaskContext)
[warn]                  ^
[warn] /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/scheduler/InputFormatInfo.scala:98: constructor Job in class Job is deprecated: see corresponding Javadoc for more information.
[warn]     val job = new Job(conf)
[warn]               ^
[warn] 9 warnings found
[error] ----------
[error] 1. WARNING in /home/spark/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 22)
[error]         import io.netty.channel.ChannelFuture;
[error]                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[error] The import io.netty.channel.ChannelFuture is never used
[error] ----------
[error] 2. WARNING in /home/spark/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileClient.java (at line 23)
[error]         import io.netty.channel.ChannelFutureListener;
[error]                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[error] The import io.netty.channel.ChannelFutureListener is never used
[error] ----------
[error] ----------
[error] 3. WARNING in /home/spark/spark-0.8.0-incubating/core/src/main/java/org/apache/spark/network/netty/FileServer.java (at line 23)
[error]         import io.netty.channel.Channel;
[error]                ^^^^^^^^^^^^^^^^^^^^^^^^
[error] The import io.netty.channel.Channel is never used
[error] ----------
[error] ----------
[error] 4. WARNING in /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/JavaSparkContextVarargsWorkaround.java (at line 20)
[error]         import java.util.Arrays;
[error]                ^^^^^^^^^^^^^^^^
[error] The import java.util.Arrays is never used
[error] ----------
[error] ----------
[error] 5. WARNING in /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/function/DoubleFlatMapFunction.java (at line 30)
[error]         public abstract class DoubleFlatMapFunction<T> extends AbstractFunction1<T, Iterable<Double>>
[error]                               ^^^^^^^^^^^^^^^^^^^^^
[error] The serializable class DoubleFlatMapFunction does not declare a static final serialVersionUID field of type long
[error] ----------
[error] 6. ERROR in /home/spark/spark-0.8.0-incubating/core/src/main/scala/org/apache/spark/api/java/function/DoubleFlatMapFunction.java (at line 36)
[error]         public final Iterable<Double> apply(T t) { return call(t); }
[error]                                       ^^^^^^^^^^
[error] The method apply(T) of type DoubleFlatMapFunction<T> must override a superclass method
[error] ----------
[error] 6 problems (1 error, 5 warnings)
[error] (core/compile:compile) javac returned nonzero exit code
[error] Total time: 133 s, completed Nov 21, 2013 2:12:11 PM
[root@kserv06 spark-0.8.0-incubating]#
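
One thing worth checking in the session above: SPARK_HADOOP_VERSION does not appear on the sbt command line itself. The Spark 0.8.0 build reads it from the environment, so it has to be exported or set inline in the same invocation; otherwise the build falls back to its default Hadoop 1.x dependency. A minimal sketch of both forms (same paths as in the log above):

    # Set the variable inline, for this invocation only
    SPARK_HADOOP_VERSION=2.0.0-cdh4.2.0 sbt/sbt assembly

    # Or export it so every later invocation in this shell picks it up
    export SPARK_HADOOP_VERSION=2.0.0-cdh4.2.0
    sbt/sbt assembly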


RE: building spark 0.8.0 against CDH 4.4.0 fails

Posted by Hu...@Dell.com.
I did not use CDH4 YARN, so it worked without SPARK_YARN as you pointed out, but that was in a different build environment.
I believe my current build environment was not set up correctly to pull the needed Hadoop 2.x libraries.
Thanks,
Hussam
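
If the build environment is the suspect, a clean rebuild is a reasonable first step. A sketch, assuming the default sbt and Ivy cache locations:

    # Drop previously compiled classes and any stale assembly
    sbt/sbt clean

    # Optionally clear cached Hadoop artifacts so the CDH versions are re-resolved
    # (~/.ivy2 is Ivy's default cache location; adjust if yours lives elsewhere)
    rm -rf ~/.ivy2/cache/org.apache.hadoop

    # Rebuild against the intended Hadoop version
    SPARK_HADOOP_VERSION=2.0.0-cdh4.2.0 sbt/sbt assembly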

-----Original Message-----
From: Injun Song [mailto:ijsong@gmail.com]
Sent: Friday, November 22, 2013 4:03 PM
To: user@spark.incubator.apache.org
Subject: Re: building spark 0.8.0 against CDH 4.4.0 fails

Hi,
Did you install CDH 4.2.0 MR2 (YARN)?
If so, you should run:
"SPARK_HADOOP_VERSION=2.0.0-cdh4.2.0 SPARK_YARN=true sbt/sbt assembly"
Please try it.




On Nov 22, 2013, at 7:13 AM, Hussam_Jarada@Dell.com wrote:

>
> Any advice on what the issue could be here?
>
> My SPARK_HADOOP_VERSION=2.0.0-cdh4.2.0
>
> [... full build log snipped; identical to the log in the original message above ...]


Re: building spark 0.8.0 against CDH 4.4.0 fails

Posted by Injun Song <ij...@gmail.com>.
Hi,
Did you install CDH 4.2.0 MR2 (YARN)?
If so, you should run:
"SPARK_HADOOP_VERSION=2.0.0-cdh4.2.0 SPARK_YARN=true sbt/sbt assembly"
Please try it.
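
Once the assembly succeeds, the Hadoop version it was built against can be confirmed directly, since it is encoded in the assembly jar's file name. A quick check (the scala-2.9.3 path matches the log in this thread; it may differ in other setups):

    # Expect something like spark-assembly-0.8.0-incubating-hadoop2.0.0-cdh4.2.0.jar
    ls assembly/target/scala-2.9.3/spark-assembly-*.jar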





On Nov 22, 2013, at 7:13 AM, Hussam_Jarada@Dell.com wrote:

>  
> Any advice on what the issue could be here?
>  
> My SPARK_HADOOP_VERSION=2.0.0-cdh4.2.0
>  
> [... full build log snipped; identical to the log in the original message above ...]