Posted to issues@spark.apache.org by "Radim Kolar (JIRA)" <ji...@apache.org> on 2014/09/14 23:20:33 UTC
[jira] [Created] (SPARK-3521) Missing modules in 1.1.0 source distribution - can't be built with Maven
Radim Kolar created SPARK-3521:
----------------------------------
Summary: Missing modules in 1.1.0 source distribution - can't be built with Maven
Key: SPARK-3521
URL: https://issues.apache.org/jira/browse/SPARK-3521
Project: Spark
Issue Type: Bug
Components: Build
Affects Versions: 1.1.0
Reporter: Radim Kolar
Priority: Minor
The modules {{bagel}}, {{mllib}}, {{flume-sink}} and {{flume}} are missing from the source distribution, so Spark can't be built with Maven. It can't be built with {{sbt/sbt}} either, due to a separate bug (_java.lang.IllegalStateException: impossible to get artifacts when data has not been loaded. IvyNode = org.slf4j#slf4j-api;1.6.1_).
(hsn@sanatana:pts/6):work/spark-1.1.0% mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.1 -DskipTests clean package
[INFO] Scanning for projects...
[ERROR] The build could not read 1 project -> [Help 1]
[ERROR]
[ERROR] The project org.apache.spark:spark-parent:1.1.0 (/home/hsn/myports/spark11/work/spark-1.1.0/pom.xml) has 4 errors
[ERROR] Child module /home/hsn/myports/spark11/work/spark-1.1.0/bagel of /home/hsn/myports/spark11/work/spark-1.1.0/pom.xml does not exist
[ERROR] Child module /home/hsn/myports/spark11/work/spark-1.1.0/mllib of /home/hsn/myports/spark11/work/spark-1.1.0/pom.xml does not exist
[ERROR] Child module /home/hsn/myports/spark11/work/spark-1.1.0/external/flume of /home/hsn/myports/spark11/work/spark-1.1.0/pom.xml does not exist
[ERROR] Child module /home/hsn/myports/spark11/work/spark-1.1.0/external/flume-sink/pom.xml of /home/hsn/myports/spark11/work/spark-1.1.0/pom.xml does not exist
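For anyone hitting this, a quick way to confirm which declared modules were dropped from the tarball is to compare the parent POM's <module> entries against the directory tree. A rough sketch, assuming you are in the extracted spark-1.1.0 directory and that each <module> element sits on its own line, as in Spark's pom.xml:

```shell
# List every <module> declared in the top-level pom.xml and report
# any whose path is absent from the source distribution.
sed -n 's/.*<module>\(.*\)<\/module>.*/\1/p' pom.xml |
while read -r m; do
  [ -e "$m" ] || echo "missing: $m"
done
```

On an affected 1.1.0 tarball this should print the four paths named in the Maven errors above (bagel, mllib, external/flume, external/flume-sink).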
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org