Posted to dev@spark.apache.org by "Ulanov, Alexander" <al...@hp.com> on 2014/08/05 18:04:20 UTC

Spark maven project with the latest Spark jars

Hi,

I'm trying to create a Maven project that references the latest build of Spark.
1) Downloaded the sources and compiled the latest version of Spark.
2) Added the new spark-core jar to a new local Maven repo.
3) Created a Scala Maven project with net.alchim31.maven (scala-archetype-simple v 1.5).
4) Added a dependency on the new spark-core in the pom.xml (sketched at the end of this message).
5) Created a SparkContext in the project code: val sc = new SparkContext("local", "test") (full snippet below).
6) When I run it, I get the error:
Error:scalac: bad symbolic reference. A signature in RDD.class refers to term io
in package org.apache.hadoop which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.
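
The code being compiled is essentially just this minimal test; the surrounding object is boilerplate I added for illustration:

import org.apache.spark.SparkContext

object SparkTest {
  def main(args: Array[String]): Unit = {
    // step 5: create a local-mode context to check that the dependency resolves
    val sc = new SparkContext("local", "test")
    sc.stop()
  }
}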

This problem doesn't occur if I reference spark-core from the Maven repo. What am I doing wrong?
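
For reference, steps 2 and 4 look roughly like this; the jar path, coordinates and version below are placeholders for whatever my local build produced, not exact values:

mvn install:install-file -Dfile=core/target/spark-core_2.10-1.1.0-SNAPSHOT.jar \
  -DgroupId=org.apache.spark -DartifactId=spark-core_2.10 \
  -Dversion=1.1.0-SNAPSHOT -Dpackaging=jar

and in the pom.xml of the new project:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.1.0-SNAPSHOT</version>
</dependency>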

Best regards, Alexander