Posted to user@spark.apache.org by Kayode Odeyemi <dr...@gmail.com> on 2015/10/23 15:14:32 UTC

Maven build failed (Spark master)

Hi,

I can't seem to get a successful Maven build. Please see the command output
below:

bash-3.2$ ./make-distribution.sh --name spark-latest --tgz --mvn mvn
-Dhadoop.version=2.7.0 -Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests
clean package
+++ dirname ./make-distribution.sh
++ cd .
++ pwd
+ SPARK_HOME=/usr/local/spark-latest
+ DISTDIR=/usr/local/spark-latest/dist
+ SPARK_TACHYON=false
+ TACHYON_VERSION=0.7.1
+ TACHYON_TGZ=tachyon-0.7.1-bin.tar.gz
+ TACHYON_URL=
https://github.com/amplab/tachyon/releases/download/v0.7.1/tachyon-0.7.1-bin.tar.gz
+ MAKE_TGZ=false
+ NAME=none
+ MVN=/usr/local/spark-latest/build/mvn
+ ((  12  ))
+ case $1 in
+ NAME=spark-latest
+ shift
+ shift
+ ((  10  ))
+ case $1 in
+ MAKE_TGZ=true
+ shift
+ ((  9  ))
+ case $1 in
+ MVN=mvn
+ shift
+ shift
+ ((  7  ))
+ case $1 in
+ break
+ '[' -z /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
+ '[' -z /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
++ command -v git
+ '[' /usr/bin/git ']'
++ git rev-parse --short HEAD
+ GITREV=487d409
+ '[' '!' -z 487d409 ']'
+ GITREVSTRING=' (git revision 487d409)'
+ unset GITREV
++ command -v mvn
+ '[' '!' /usr/bin/mvn ']'
++ mvn help:evaluate -Dexpression=project.version -Dhadoop.version=2.7.0
-Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests clean package
++ grep -v INFO
++ tail -n 1
+ VERSION='[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException'

The same error output occurs with JDK 7.

Appreciate your help.
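For context on what is going wrong here: as the trace above shows, make-distribution.sh derives the Spark version by piping `mvn help:evaluate -Dexpression=project.version` through `grep -v INFO | tail -n 1`. When the Maven invocation fails, the last non-INFO line is an [ERROR] message, and that is what ends up in VERSION (and later in directory names). A minimal sketch of that pipeline, with illustrative sample outputs:

```shell
# Sketch of the version-extraction pipeline seen in the trace
# (grep -v INFO | tail -n 1); the sample Maven outputs are illustrative.
extract_version() {
  printf '%s\n' "$1" | grep -v INFO | tail -n 1
}

ok="[INFO] Scanning for projects...
1.6.0-SNAPSHOT"
bad="[INFO] Scanning for projects...
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException"

extract_version "$ok"    # prints: 1.6.0-SNAPSHOT
extract_version "$bad"   # prints the [ERROR] line, which leaks into VERSION
```

So whenever VERSION contains "[ERROR]", the real problem is that the underlying `mvn help:evaluate` run failed, and that is where to look first.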

Re: Maven build failed (Spark master)

Posted by Kayode Odeyemi <dr...@gmail.com>.
Thanks, gents.

Removing 'clean package -U' from the command made the difference.
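For anyone hitting this later: the trailing Maven goals are the likely culprit because, as the trace in the first message shows, make-distribution.sh forwards its leftover arguments to the internal `mvn help:evaluate` call as well as to the real build, so stray goals like `clean package` turn the version probe into a full build that can fail. A hypothetical illustration (the function name is mine, not from the script):

```shell
# Hypothetical sketch: leftover command-line args get appended to the
# version probe, so extra goals ('clean package') run during
# help:evaluate too instead of only during the real build.
version_probe_cmd() {
  echo "mvn help:evaluate -Dexpression=project.version $*"
}

version_probe_cmd -Phadoop-2.7 -DskipTests clean package
# -> mvn help:evaluate -Dexpression=project.version -Phadoop-2.7 -DskipTests clean package
version_probe_cmd -Phadoop-2.7 -DskipTests
# -> mvn help:evaluate -Dexpression=project.version -Phadoop-2.7 -DskipTests
```

Passing only profiles and -D flags, and letting the script choose its own goals, avoids the problem.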

On Tue, Oct 27, 2015 at 6:39 PM, Todd Nist <ts...@gmail.com> wrote:

> I issued the same basic command and it worked fine.
>
> RADTech-MBP:spark $ ./make-distribution.sh --name hadoop-2.6 --tgz -Pyarn
> -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver -DskipTests
>
> Which created: spark-1.6.0-SNAPSHOT-bin-hadoop-2.6.tgz in the root
> directory of the project.
>
> FWIW, the environment was an MBP with OS X 10.10.5 and Java:
>
> java version "1.8.0_51"
> Java(TM) SE Runtime Environment (build 1.8.0_51-b16)
> Java HotSpot(TM) 64-Bit Server VM (build 25.51-b03, mixed mode)
>
> -Todd
>
> On Tue, Oct 27, 2015 at 12:17 PM, Ted Yu <yu...@gmail.com> wrote:
>
>> I used the following command:
>> make-distribution.sh --name custom-spark --tgz -Phadoop-2.4 -Phive
>> -Phive-thriftserver -Pyarn
>>
>> spark-1.6.0-SNAPSHOT-bin-custom-spark.tgz was generated (with patch from
>> SPARK-11348)
>>
>> Can you try above command ?
>>
>> Thanks
>>
>> On Tue, Oct 27, 2015 at 7:03 AM, Kayode Odeyemi <dr...@gmail.com>
>> wrote:
>>
>>> Ted, I switched to this:
>>>
>>> ./make-distribution.sh --name spark-latest --tgz -Dhadoop.version=2.6.0
>>> -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn -DskipTests clean package -U
>>>
>>> Same error. No .gz file. Here's the bottom output log:
>>>
>>> + rm -rf /home/emperor/javaprojects/spark/dist
>>> + mkdir -p /home/emperor/javaprojects/spark/dist/lib
>>> + echo 'Spark [WARNING] See
>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin (git revision
>>> 3689beb) built for Hadoop [WARNING] See
>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Pl
>>> + echo 'Build flags: -Dhadoop.version=2.6.0' -Phadoop-2.6 -Phive
>>> -Phive-thriftserver -Pyarn -DskipTests clean package -U
>>> + cp
>>> /home/emperor/javaprojects/spark/assembly/target/scala-2.10/spark-assembly-1.6.0-SNAPSHOT-hadoop2.6.0.jar
>>> /home/emperor/javaprojects/spark/dist/lib/
>>> + cp
>>> /home/emperor/javaprojects/spark/examples/target/scala-2.10/spark-examples-1.6.0-SNAPSHOT-hadoop2.6.0.jar
>>> /home/emperor/javaprojects/spark/dist/lib/
>>> + cp
>>> /home/emperor/javaprojects/spark/network/yarn/target/scala-2.10/spark-1.6.0-SNAPSHOT-yarn-shuffle.jar
>>> /home/emperor/javaprojects/spark/dist/lib/
>>> + mkdir -p /home/emperor/javaprojects/spark/dist/examples/src/main
>>> + cp -r /home/emperor/javaprojects/spark/examples/src/main
>>> /home/emperor/javaprojects/spark/dist/examples/src/
>>> + '[' 1 == 1 ']'
>>> + cp
>>> /home/emperor/javaprojects/spark/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar
>>> /home/emperor/javaprojects/spark/lib_managed/jars/datanucleus-core-3.2.10.jar
>>> /home/emperor/javaprojects
>>> ed/jars/datanucleus-rdbms-3.2.9.jar
>>> /home/emperor/javaprojects/spark/dist/lib/
>>> + cp /home/emperor/javaprojects/spark/LICENSE
>>> /home/emperor/javaprojects/spark/dist
>>> + cp -r /home/emperor/javaprojects/spark/licenses
>>> /home/emperor/javaprojects/spark/dist
>>> + cp /home/emperor/javaprojects/spark/NOTICE
>>> /home/emperor/javaprojects/spark/dist
>>> + '[' -e /home/emperor/javaprojects/spark/CHANGES.txt ']'
>>> + cp -r /home/emperor/javaprojects/spark/data
>>> /home/emperor/javaprojects/spark/dist
>>> + mkdir /home/emperor/javaprojects/spark/dist/conf
>>> + cp /home/emperor/javaprojects/spark/conf/docker.properties.template
>>> /home/emperor/javaprojects/spark/conf/fairscheduler.xml.template
>>> /home/emperor/javaprojects/spark/conf/log4j.properties
>>> emperor/javaprojects/spark/conf/metrics.properties.template
>>> /home/emperor/javaprojects/spark/conf/slaves.template
>>> /home/emperor/javaprojects/spark/conf/spark-defaults.conf.template /home/em
>>> ts/spark/conf/spark-env.sh.template
>>> /home/emperor/javaprojects/spark/dist/conf
>>> + cp /home/emperor/javaprojects/spark/README.md
>>> /home/emperor/javaprojects/spark/dist
>>> + cp -r /home/emperor/javaprojects/spark/bin
>>> /home/emperor/javaprojects/spark/dist
>>> + cp -r /home/emperor/javaprojects/spark/python
>>> /home/emperor/javaprojects/spark/dist
>>> + cp -r /home/emperor/javaprojects/spark/sbin
>>> /home/emperor/javaprojects/spark/dist
>>> + cp -r /home/emperor/javaprojects/spark/ec2
>>> /home/emperor/javaprojects/spark/dist
>>> + '[' -d /home/emperor/javaprojects/spark/R/lib/SparkR ']'
>>> + '[' false == true ']'
>>> + '[' true == true ']'
>>> + TARDIR_NAME='spark-[WARNING] See
>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest
>>> '
>>> + TARDIR='/home/emperor/javaprojects/spark/spark-[WARNING] See
>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest
>>> '
>>> + rm -rf '/home/emperor/javaprojects/spark/spark-[WARNING] See
>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest
>>> '
>>> + cp -r /home/emperor/javaprojects/spark/dist
>>> '/home/emperor/javaprojects/spark/spark-[WARNING] See
>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest
>>> '
>>> cp: cannot create directory
>>> `/home/emperor/javaprojects/spark/spark-[WARNING] See
>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest':
>>> No such file or directory
>>>
>>>
>>> On Tue, Oct 27, 2015 at 2:14 PM, Ted Yu <yu...@gmail.com> wrote:
>>>
>>>> Can you try the same command shown in the pull request ?
>>>>
>>>> Thanks
>>>>
>>>> On Oct 27, 2015, at 12:40 AM, Kayode Odeyemi <dr...@gmail.com> wrote:
>>>>
>>>> Thank you.
>>>>
>>>> But I'm getting the same warnings and they're still preventing the
>>>> archive from being generated.
>>>>
>>>> I've run this on both OS X Lion and Ubuntu 12. Same error. No .gz file
>>>>
>>>> On Mon, Oct 26, 2015 at 9:10 PM, Ted Yu <yu...@gmail.com> wrote:
>>>>
>>>>> Looks like '-Pyarn' was missing in your command.
>>>>>
>>>>> On Mon, Oct 26, 2015 at 12:06 PM, Kayode Odeyemi <dr...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> I used this command, which is similar to what you have:
>>>>>>
>>>>>> ./make-distribution.sh --name spark-latest --tgz --mvn mvn
>>>>>> -Dhadoop.version=2.6.0 -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests
>>>>>> clean package -U
>>>>>>
>>>>>> But I still see WARNINGS like this in the output and no .gz file
>>>>>> created:
>>>>>>
>>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/.part-r-00005.gz.parquet.crc:
>>>>>> No such file or directory
>>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet:
>>>>>> No such file or directory
>>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9:
>>>>>> No such file or directory
>>>>>> cp:
>>>>>> /usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9:
>>>>>> unable to copy extended attributes to
>>>>>> /usr/local/spark-latest/spark-[WARNING] See
>>>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9:
>>>>>> No such file or directory
>>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
>>>>>> No such file or directory
>>>>>> cp:
>>>>>> /usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
>>>>>> unable to copy extended attributes to
>>>>>> /usr/local/spark-latest/spark-[WARNING] See
>>>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
>>>>>> No such file or directory
>>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/.part-r-00007.gz.parquet.crc:
>>>>>> No such file or directory
>>>>>>
>>>>>> On Mon, Oct 26, 2015 at 8:58 PM, Ted Yu <yu...@gmail.com> wrote:
>>>>>>
>>>>>>> If you use the command shown in:
>>>>>>> https://github.com/apache/spark/pull/9281
>>>>>>>
>>>>>>> You should have got the following:
>>>>>>>
>>>>>>>
>>>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2014/month=9/day=1/part-r-00008.gz.parquet
>>>>>>>
>>>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/part-r-00007.gz.parquet
>>>>>>>
>>>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00004.gz.parquet
>>>>>>>
>>>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00002.gz.parquet
>>>>>>>
>>>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet
>>>>>>>
>>>>>>> On Mon, Oct 26, 2015 at 11:47 AM, Kayode Odeyemi <dr...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> I see a lot of output like this after an otherwise successful Maven build:
>>>>>>>>
>>>>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>>>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2014/month=9/day=1/
>>>>>>>> part-r-00008.gz.parquet: No such file or directory
>>>>>>>>
>>>>>>>> Seems it fails when it tries to package the build as an archive.
>>>>>>>>
>>>>>>>> I'm using the latest code on github master.
>>>>>>>>
>>>>>>>> Any ideas please?
>>>>>>>>
>>>>>>>> On Mon, Oct 26, 2015 at 6:20 PM, Yana Kadiyska <
>>>>>>>> yana.kadiyska@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> In 1.4 ./make_distribution produces a .tgz file in the root
>>>>>>>>> directory (same directory that make_distribution is in)
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Mon, Oct 26, 2015 at 8:46 AM, Kayode Odeyemi <dreyemi@gmail.com
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> Hi,
>>>>>>>>>>
>>>>>>>>>> The ./make_distribution task completed. However, I can't seem to
>>>>>>>>>> locate the
>>>>>>>>>> .tar.gz file.
>>>>>>>>>>
>>>>>>>>>> Where does Spark save this? Or should I just work with the dist
>>>>>>>>>> directory?
>>>>>>>>>>
>>>>>>>>>> On Fri, Oct 23, 2015 at 4:23 PM, Kayode Odeyemi <
>>>>>>>>>> dreyemi@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> I saw this when I tested manually (without ./make-distribution)
>>>>>>>>>>>
>>>>>>>>>>> Detected Maven Version: 3.2.2 is not in the allowed range 3.3.3.
>>>>>>>>>>>
>>>>>>>>>>> So I simply upgraded maven to 3.3.3.
>>>>>>>>>>>
>>>>>>>>>>> Resolved. Thanks
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Oct 23, 2015 at 3:17 PM, Sean Owen <so...@cloudera.com>
>>>>>>>>>>> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> This doesn't show the actual error output from Maven. I have a
>>>>>>>>>>>> strong
>>>>>>>>>>>> guess that you haven't set MAVEN_OPTS to increase the memory
>>>>>>>>>>>> Maven can
>>>>>>>>>>>> use.
>>>>>>>>>>>>
>>>>>>>>>>>> On Fri, Oct 23, 2015 at 6:14 AM, Kayode Odeyemi <
>>>>>>>>>>>> dreyemi@gmail.com> wrote:
>>>>>>>>>>>> > Hi,
>>>>>>>>>>>> >
>>>>>>>>>>>> > I can't seem to get a successful maven build. Please see
>>>>>>>>>>>> command output
>>>>>>>>>>>> > below:
>>>>>>>>>>>> >
>>>>>>>>>>>> > bash-3.2$ ./make-distribution.sh --name spark-latest --tgz
>>>>>>>>>>>> --mvn mvn
>>>>>>>>>>>> > -Dhadoop.version=2.7.0 -Phadoop-2.7 -Phive
>>>>>>>>>>>> -Phive-thriftserver -DskipTests
>>>>>>>>>>>> > clean package
>>>>>>>>>>>> > +++ dirname ./make-distribution.sh
>>>>>>>>>>>> > ++ cd .
>>>>>>>>>>>> > ++ pwd
>>>>>>>>>>>> > + SPARK_HOME=/usr/local/spark-latest
>>>>>>>>>>>> > + DISTDIR=/usr/local/spark-latest/dist
>>>>>>>>>>>> > + SPARK_TACHYON=false
>>>>>>>>>>>> > + TACHYON_VERSION=0.7.1
>>>>>>>>>>>> > + TACHYON_TGZ=tachyon-0.7.1-bin.tar.gz
>>>>>>>>>>>> > +
>>>>>>>>>>>> > TACHYON_URL=
>>>>>>>>>>>> https://github.com/amplab/tachyon/releases/download/v0.7.1/tachyon-0.7.1-bin.tar.gz
>>>>>>>>>>>> > + MAKE_TGZ=false
>>>>>>>>>>>> > + NAME=none
>>>>>>>>>>>> > + MVN=/usr/local/spark-latest/build/mvn
>>>>>>>>>>>> > + ((  12  ))
>>>>>>>>>>>> > + case $1 in
>>>>>>>>>>>> > + NAME=spark-latest
>>>>>>>>>>>> > + shift
>>>>>>>>>>>> > + shift
>>>>>>>>>>>> > + ((  10  ))
>>>>>>>>>>>> > + case $1 in
>>>>>>>>>>>> > + MAKE_TGZ=true
>>>>>>>>>>>> > + shift
>>>>>>>>>>>> > + ((  9  ))
>>>>>>>>>>>> > + case $1 in
>>>>>>>>>>>> > + MVN=mvn
>>>>>>>>>>>> > + shift
>>>>>>>>>>>> > + shift
>>>>>>>>>>>> > + ((  7  ))
>>>>>>>>>>>> > + case $1 in
>>>>>>>>>>>> > + break
>>>>>>>>>>>> > + '[' -z
>>>>>>>>>>>> /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
>>>>>>>>>>>> > + '[' -z
>>>>>>>>>>>> /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
>>>>>>>>>>>> > ++ command -v git
>>>>>>>>>>>> > + '[' /usr/bin/git ']'
>>>>>>>>>>>> > ++ git rev-parse --short HEAD
>>>>>>>>>>>> > + GITREV=487d409
>>>>>>>>>>>> > + '[' '!' -z 487d409 ']'
>>>>>>>>>>>> > + GITREVSTRING=' (git revision 487d409)'
>>>>>>>>>>>> > + unset GITREV
>>>>>>>>>>>> > ++ command -v mvn
>>>>>>>>>>>> > + '[' '!' /usr/bin/mvn ']'
>>>>>>>>>>>> > ++ mvn help:evaluate -Dexpression=project.version
>>>>>>>>>>>> -Dhadoop.version=2.7.0
>>>>>>>>>>>> > -Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests clean
>>>>>>>>>>>> package
>>>>>>>>>>>> > ++ grep -v INFO
>>>>>>>>>>>> > ++ tail -n 1
>>>>>>>>>>>> > + VERSION='[ERROR] [Help 1]
>>>>>>>>>>>> >
>>>>>>>>>>>> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
>>>>>>>>>>>> '
>>>>>>>>>>>> >
>>>>>>>>>>>> > Same output error with JDK 7
>>>>>>>>>>>> >
>>>>>>>>>>>> > Appreciate your help.
>>>>>>>>>>>> >
>>>>>>>>>>>> >
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
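One more note on Sean Owen's MAVEN_OPTS point in the thread above: the Spark build of this era needs Maven's heap raised before running make-distribution.sh, or the build can fail in ways that surface as the confusing VERSION errors seen here. The values below follow the Spark 1.x building documentation as best I recall; treat them as a starting point rather than exact requirements:

```shell
# Raise Maven's memory before building. These values follow the Spark 1.x
# build docs as best I recall; -XX:MaxPermSize is a JDK 7 flag that JDK 8
# ignores with a warning.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
echo "$MAVEN_OPTS"
```

Then re-run the distribution script with profiles and -D flags only, e.g. ./make-distribution.sh --name spark-latest --tgz -Phadoop-2.7 -Dhadoop.version=2.7.0 -Phive -Phive-thriftserver -Pyarn -DskipTests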

>>>>
>>>> ./make-distribution.sh --name spark-latest --tgz --mvn mvn
>>>> -Dhadoop.version=2.6.0 -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests
>>>> clean package -U
>>>>
>>>> But I still see WARNINGS like this in the output and no .gz file
>>>> created:
>>>>
>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/.part-r-00005.gz.parquet.crc:
>>>> No such file or directory
>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet:
>>>> No such file or directory
>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9:
>>>> No such file or directory
>>>> cp:
>>>> /usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9:
>>>> unable to copy extended attributes to
>>>> /usr/local/spark-latest/spark-[WARNING] See
>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9:
>>>> No such file or directory
>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
>>>> No such file or directory
>>>> cp:
>>>> /usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
>>>> unable to copy extended attributes to
>>>> /usr/local/spark-latest/spark-[WARNING] See
>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
>>>> No such file or directory
>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/.part-r-00007.gz.parquet.crc:
>>>> No such file or directory
>>>>
>>>> On Mon, Oct 26, 2015 at 8:58 PM, Ted Yu <yu...@gmail.com> wrote:
>>>>
>>>>> If you use the command shown in:
>>>>> https://github.com/apache/spark/pull/9281
>>>>>
>>>>> You should have got the following:
>>>>>
>>>>>
>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2014/month=9/day=1/part-r-00008.gz.parquet
>>>>>
>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/part-r-00007.gz.parquet
>>>>>
>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00004.gz.parquet
>>>>>
>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00002.gz.parquet
>>>>>
>>>>> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet
>>>>>
>>>>> On Mon, Oct 26, 2015 at 11:47 AM, Kayode Odeyemi <dr...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> I see a lot of stuffs like this after the a successful maven build:
>>>>>>
>>>>>> cp: /usr/local/spark-latest/spark-[WARNING] See
>>>>>> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2014/month=9/day=1/
>>>>>> part-r-00008.gz.parquet: No such file or directory
>>>>>>
>>>>>> Seems it fails when it tries to package the build as an archive.
>>>>>>
>>>>>> I'm using the latest code on github master.
>>>>>>
>>>>>> Any ideas please?
>>>>>>
>>>>>> On Mon, Oct 26, 2015 at 6:20 PM, Yana Kadiyska <
>>>>>> yana.kadiyska@gmail.com> wrote:
>>>>>>
>>>>>>> In 1.4 ./make_distribution produces a .tgz file in the root
>>>>>>> directory (same directory that make_distribution is in)
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Mon, Oct 26, 2015 at 8:46 AM, Kayode Odeyemi <dr...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> The ./make_distribution task completed. However, I can't seem to
>>>>>>>> locate the
>>>>>>>> .tar.gz file.
>>>>>>>>
>>>>>>>> Where does Spark save this? or should I just work with the dist
>>>>>>>> directory?
>>>>>>>>
>>>>>>>> On Fri, Oct 23, 2015 at 4:23 PM, Kayode Odeyemi <dr...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> I saw this when I tested manually (without ./make-distribution)
>>>>>>>>>
>>>>>>>>> Detected Maven Version: 3.2.2 is not in the allowed range 3.3.3.
>>>>>>>>>
>>>>>>>>> So I simply upgraded maven to 3.3.3.
>>>>>>>>>
>>>>>>>>> Resolved. Thanks
>>>>>>>>>
>>>>>>>>> On Fri, Oct 23, 2015 at 3:17 PM, Sean Owen <so...@cloudera.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>> This doesn't show the actual error output from Maven. I have a
>>>>>>>>>> strong
>>>>>>>>>> guess that you haven't set MAVEN_OPTS to increase the memory
>>>>>>>>>> Maven can
>>>>>>>>>> use.
>>>>>>>>>>
>>>>>>>>>> On Fri, Oct 23, 2015 at 6:14 AM, Kayode Odeyemi <
>>>>>>>>>> dreyemi@gmail.com> wrote:
>>>>>>>>>> > Hi,
>>>>>>>>>> >
>>>>>>>>>> > I can't seem to get a successful maven build. Please see
>>>>>>>>>> command output
>>>>>>>>>> > below:
>>>>>>>>>> >
>>>>>>>>>> > bash-3.2$ ./make-distribution.sh --name spark-latest --tgz
>>>>>>>>>> --mvn mvn
>>>>>>>>>> > -Dhadoop.version=2.7.0 -Phadoop-2.7 -Phive -Phive-thriftserver
>>>>>>>>>> -DskipTests
>>>>>>>>>> > clean package
>>>>>>>>>> > +++ dirname ./make-distribution.sh
>>>>>>>>>> > ++ cd .
>>>>>>>>>> > ++ pwd
>>>>>>>>>> > + SPARK_HOME=/usr/local/spark-latest
>>>>>>>>>> > + DISTDIR=/usr/local/spark-latest/dist
>>>>>>>>>> > + SPARK_TACHYON=false
>>>>>>>>>> > + TACHYON_VERSION=0.7.1
>>>>>>>>>> > + TACHYON_TGZ=tachyon-0.7.1-bin.tar.gz
>>>>>>>>>> > +
>>>>>>>>>> > TACHYON_URL=
>>>>>>>>>> https://github.com/amplab/tachyon/releases/download/v0.7.1/tachyon-0.7.1-bin.tar.gz
>>>>>>>>>> > + MAKE_TGZ=false
>>>>>>>>>> > + NAME=none
>>>>>>>>>> > + MVN=/usr/local/spark-latest/build/mvn
>>>>>>>>>> > + ((  12  ))
>>>>>>>>>> > + case $1 in
>>>>>>>>>> > + NAME=spark-latest
>>>>>>>>>> > + shift
>>>>>>>>>> > + shift
>>>>>>>>>> > + ((  10  ))
>>>>>>>>>> > + case $1 in
>>>>>>>>>> > + MAKE_TGZ=true
>>>>>>>>>> > + shift
>>>>>>>>>> > + ((  9  ))
>>>>>>>>>> > + case $1 in
>>>>>>>>>> > + MVN=mvn
>>>>>>>>>> > + shift
>>>>>>>>>> > + shift
>>>>>>>>>> > + ((  7  ))
>>>>>>>>>> > + case $1 in
>>>>>>>>>> > + break
>>>>>>>>>> > + '[' -z
>>>>>>>>>> /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
>>>>>>>>>> > + '[' -z
>>>>>>>>>> /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
>>>>>>>>>> > ++ command -v git
>>>>>>>>>> > + '[' /usr/bin/git ']'
>>>>>>>>>> > ++ git rev-parse --short HEAD
>>>>>>>>>> > + GITREV=487d409
>>>>>>>>>> > + '[' '!' -z 487d409 ']'
>>>>>>>>>> > + GITREVSTRING=' (git revision 487d409)'
>>>>>>>>>> > + unset GITREV
>>>>>>>>>> > ++ command -v mvn
>>>>>>>>>> > + '[' '!' /usr/bin/mvn ']'
>>>>>>>>>> > ++ mvn help:evaluate -Dexpression=project.version
>>>>>>>>>> -Dhadoop.version=2.7.0
>>>>>>>>>> > -Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests clean
>>>>>>>>>> package
>>>>>>>>>> > ++ grep -v INFO
>>>>>>>>>> > ++ tail -n 1
>>>>>>>>>> > + VERSION='[ERROR] [Help 1]
>>>>>>>>>> >
>>>>>>>>>> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
>>>>>>>>>> '
>>>>>>>>>> >
>>>>>>>>>> > Same output error with JDK 7
>>>>>>>>>> >
>>>>>>>>>> > Appreciate your help.
>>>>>>>>>> >
>>>>>>>>>> >
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: Maven build failed (Spark master)

Posted by Kayode Odeyemi <dr...@gmail.com>.
Seems the build and directory structure in dist is similar to that of the .gz file
downloaded from the downloads page. Can the dist directory be used as is?
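[Editor's sketch] The mangled "spark-[WARNING] See http://..." paths quoted throughout this thread stem from make-distribution.sh taking the last non-INFO line of the `mvn help:evaluate` output as the project version. A more defensive extraction — purely illustrative, not what the script actually ships — would keep only version-shaped lines. The sample mvn output below is hypothetical:

```shell
# Hypothetical mvn help:evaluate output polluted by shade-plugin warnings:
polluted_output='[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin
1.6.0-SNAPSHOT
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException'

# Keep only lines that start like a version number before taking the last one:
VERSION=$(printf '%s\n' "$polluted_output" | grep -E '^[0-9]+\.[0-9]+' | tail -n 1)
echo "$VERSION"   # prints: 1.6.0-SNAPSHOT
```

With this filter the stray [WARNING]/[ERROR] lines can no longer leak into directory names such as `spark-$VERSION-bin-$NAME`.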

On Tue, Oct 27, 2015 at 4:03 PM, Kayode Odeyemi <dr...@gmail.com> wrote:

> Ted, I switched to this:
>
> ./make-distribution.sh --name spark-latest --tgz -Dhadoop.version=2.6.0
> -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn -DskipTests clean package -U

Re: Maven build failed (Spark master)

Posted by Kayode Odeyemi <dr...@gmail.com>.
Ted, I switched to this:

./make-distribution.sh --name spark-latest --tgz -Dhadoop.version=2.6.0
-Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn -DskipTests clean package -U

Same error. No .gz file. Here's the bottom output log:

+ rm -rf /home/emperor/javaprojects/spark/dist
+ mkdir -p /home/emperor/javaprojects/spark/dist/lib
+ echo 'Spark [WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin (git revision
3689beb) built for Hadoop [WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Pl
+ echo 'Build flags: -Dhadoop.version=2.6.0' -Phadoop-2.6 -Phive
-Phive-thriftserver -Pyarn -DskipTests clean package -U
+ cp
/home/emperor/javaprojects/spark/assembly/target/scala-2.10/spark-assembly-1.6.0-SNAPSHOT-hadoop2.6.0.jar
/home/emperor/javaprojects/spark/dist/lib/
+ cp
/home/emperor/javaprojects/spark/examples/target/scala-2.10/spark-examples-1.6.0-SNAPSHOT-hadoop2.6.0.jar
/home/emperor/javaprojects/spark/dist/lib/
+ cp
/home/emperor/javaprojects/spark/network/yarn/target/scala-2.10/spark-1.6.0-SNAPSHOT-yarn-shuffle.jar
/home/emperor/javaprojects/spark/dist/lib/
+ mkdir -p /home/emperor/javaprojects/spark/dist/examples/src/main
+ cp -r /home/emperor/javaprojects/spark/examples/src/main
/home/emperor/javaprojects/spark/dist/examples/src/
+ '[' 1 == 1 ']'
+ cp
/home/emperor/javaprojects/spark/lib_managed/jars/datanucleus-api-jdo-3.2.6.jar
/home/emperor/javaprojects/spark/lib_managed/jars/datanucleus-core-3.2.10.jar
/home/emperor/javaprojects
ed/jars/datanucleus-rdbms-3.2.9.jar
/home/emperor/javaprojects/spark/dist/lib/
+ cp /home/emperor/javaprojects/spark/LICENSE
/home/emperor/javaprojects/spark/dist
+ cp -r /home/emperor/javaprojects/spark/licenses
/home/emperor/javaprojects/spark/dist
+ cp /home/emperor/javaprojects/spark/NOTICE
/home/emperor/javaprojects/spark/dist
+ '[' -e /home/emperor/javaprojects/spark/CHANGES.txt ']'
+ cp -r /home/emperor/javaprojects/spark/data
/home/emperor/javaprojects/spark/dist
+ mkdir /home/emperor/javaprojects/spark/dist/conf
+ cp /home/emperor/javaprojects/spark/conf/docker.properties.template
/home/emperor/javaprojects/spark/conf/fairscheduler.xml.template
/home/emperor/javaprojects/spark/conf/log4j.properties
emperor/javaprojects/spark/conf/metrics.properties.template
/home/emperor/javaprojects/spark/conf/slaves.template
/home/emperor/javaprojects/spark/conf/spark-defaults.conf.template /home/em
ts/spark/conf/spark-env.sh.template
/home/emperor/javaprojects/spark/dist/conf
+ cp /home/emperor/javaprojects/spark/README.md
/home/emperor/javaprojects/spark/dist
+ cp -r /home/emperor/javaprojects/spark/bin
/home/emperor/javaprojects/spark/dist
+ cp -r /home/emperor/javaprojects/spark/python
/home/emperor/javaprojects/spark/dist
+ cp -r /home/emperor/javaprojects/spark/sbin
/home/emperor/javaprojects/spark/dist
+ cp -r /home/emperor/javaprojects/spark/ec2
/home/emperor/javaprojects/spark/dist
+ '[' -d /home/emperor/javaprojects/spark/R/lib/SparkR ']'
+ '[' false == true ']'
+ '[' true == true ']'
+ TARDIR_NAME='spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest'
+ TARDIR='/home/emperor/javaprojects/spark/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest'
+ rm -rf '/home/emperor/javaprojects/spark/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest'
+ cp -r /home/emperor/javaprojects/spark/dist
'/home/emperor/javaprojects/spark/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest'
cp: cannot create directory
`/home/emperor/javaprojects/spark/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest':
No such file or directory
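[Editor's sketch] The "spark-[WARNING] See http://..." directory name in the log above is the version string itself gone bad: the script filters the mvn output with `grep -v INFO` and takes the last line, so any stray [WARNING] or [ERROR] line becomes the "version". A minimal reproduction of that failure mode — the mvn output lines here are hypothetical samples, mirroring the `++ grep -v INFO` / `++ tail -n 1` steps traced in the quoted script output:

```shell
# Hypothetical tail of 'mvn help:evaluate -Dexpression=project.version ...' output:
mvn_output='[INFO] Scanning for projects...
[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin
1.6.0-SNAPSHOT
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException'

# The same pipeline the script uses: drop INFO lines, keep the last remaining line.
VERSION=$(printf '%s\n' "$mvn_output" | grep -v INFO | tail -n 1)
echo "$VERSION"   # prints the [ERROR] line, not 1.6.0-SNAPSHOT
```

This is why appending build goals (which trigger a full, warning-producing build) to the help:evaluate invocation corrupts VERSION, and why dropping them fixed the archive step.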


On Tue, Oct 27, 2015 at 2:14 PM, Ted Yu <yu...@gmail.com> wrote:

> Can you try the same command shown in the pull request ?
>
> Thanks

Re: Maven build failed (Spark master)

Posted by Ted Yu <yu...@gmail.com>.
Can you try the same command shown in the pull request?

Thanks

> On Oct 27, 2015, at 12:40 AM, Kayode Odeyemi <dr...@gmail.com> wrote:
> 
> Thank you.
> 
> But I'm getting same warnings and it's still preventing the archive from being generated.
> 
> I've ran this on both OSX Lion and Ubuntu 12. Same error. No .gz file
> 

Re: Maven build failed (Spark master)

Posted by Kayode Odeyemi <dr...@gmail.com>.
Thank you.

But I'm getting the same warnings, and they're still preventing the archive
from being generated.

I've run this on both OS X Lion and Ubuntu 12. Same error. No .gz file
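
[Editor's note: if the underlying failure is Maven running out of memory, as Sean suggested earlier in the thread, the usual fix is to raise MAVEN_OPTS before building. A minimal sketch — the sizes below are only a guess, tune them to your machine:]

```shell
# Give Maven more heap, permgen (relevant on JDK 7), and code cache
# before invoking make-distribution.sh or mvn directly.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
echo "$MAVEN_OPTS"
```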

On Mon, Oct 26, 2015 at 9:10 PM, Ted Yu <yu...@gmail.com> wrote:

> Looks like '-Pyarn' was missing in your command.

Re: Maven build failed (Spark master)

Posted by Ted Yu <yu...@gmail.com>.
Looks like '-Pyarn' was missing in your command.
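
[Editor's note: for reference, the earlier invocation with the profile added would look like this — other flags kept as in the original command, adjust to taste:]

```
./make-distribution.sh --name spark-latest --tgz --mvn mvn \
  -Pyarn -Dhadoop.version=2.6.0 -Phadoop-2.6 -Phive -Phive-thriftserver \
  -DskipTests clean package -U
```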

On Mon, Oct 26, 2015 at 12:06 PM, Kayode Odeyemi <dr...@gmail.com> wrote:

> I used this command which is synonymous to what you have:
>
> ./make-distribution.sh --name spark-latest --tgz --mvn mvn
> -Dhadoop.version=2.6.0 -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests
> clean package -U
>
> But I still see WARNINGS like this in the output and no .gz file created:
>
> cp: /usr/local/spark-latest/spark-[WARNING] See
> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/.part-r-00005.gz.parquet.crc:
> No such file or directory
> cp: /usr/local/spark-latest/spark-[WARNING] See
> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet:
> No such file or directory
> cp: /usr/local/spark-latest/spark-[WARNING] See
> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9:
> No such file or directory
> cp:
> /usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9:
> unable to copy extended attributes to
> /usr/local/spark-latest/spark-[WARNING] See
> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9:
> No such file or directory
> cp: /usr/local/spark-latest/spark-[WARNING] See
> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
> No such file or directory
> cp:
> /usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
> unable to copy extended attributes to
> /usr/local/spark-latest/spark-[WARNING] See
> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
> No such file or directory
> cp: /usr/local/spark-latest/spark-[WARNING] See
> http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/.part-r-00007.gz.parquet.crc:
> No such file or directory
>

Re: Maven build failed (Spark master)

Posted by Kayode Odeyemi <dr...@gmail.com>.
I used this command, which is similar to what you have:

./make-distribution.sh --name spark-latest --tgz --mvn mvn
-Dhadoop.version=2.6.0 -Phadoop-2.6 -Phive -Phive-thriftserver -DskipTests
clean package -U

But I still see WARNINGS like this in the output, and no .gz file is created:

cp: /usr/local/spark-latest/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/.part-r-00005.gz.parquet.crc:
No such file or directory
cp: /usr/local/spark-latest/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet:
No such file or directory
cp: /usr/local/spark-latest/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9:
No such file or directory
cp:
/usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9:
unable to copy extended attributes to
/usr/local/spark-latest/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9:
No such file or directory
cp: /usr/local/spark-latest/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
No such file or directory
cp:
/usr/local/spark-latest/dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
unable to copy extended attributes to
/usr/local/spark-latest/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1:
No such file or directory
cp: /usr/local/spark-latest/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/.part-r-00007.gz.parquet.crc:
No such file or directory
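
[Editor's note: the mangled "spark-[WARNING] See http://..." paths above suggest the shade plugin's [WARNING] line is leaking into the version string that make-distribution.sh extracts from Maven's output, which then gets spliced into every dist path. A minimal sketch of that failure mode — the version number and the surrounding lines are invented for illustration:]

```shell
# Simulated tail of `mvn help:evaluate -Dexpression=project.version` output,
# where the shade plugin emits a WARNING line after the version line.
mvn_output='[INFO] Evaluating expression
1.6.0-SNAPSHOT
[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin'

# What make-distribution.sh effectively does: drop INFO lines, keep the last line.
version=$(printf '%s\n' "$mvn_output" | grep -v INFO | tail -n 1)
echo "$version"   # picks up the WARNING line, not the version

# Also filtering WARNING/ERROR lines recovers the actual version.
version=$(printf '%s\n' "$mvn_output" | grep -vE 'INFO|WARNING|ERROR' | tail -n 1)
echo "$version"
```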

On Mon, Oct 26, 2015 at 8:58 PM, Ted Yu <yu...@gmail.com> wrote:

> If you use the command shown in:
> https://github.com/apache/spark/pull/9281
>
> You should have got the following:
>
>
> ./dist/python/test_support/sql/parquet_partitioned/year=2014/month=9/day=1/part-r-00008.gz.parquet
>
> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/part-r-00007.gz.parquet
>
> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00004.gz.parquet
>
> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00002.gz.parquet
>
> ./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet
>

Re: Maven build failed (Spark master)

Posted by Ted Yu <yu...@gmail.com>.
If you use the command shown in:
https://github.com/apache/spark/pull/9281

You should have got the following:

./dist/python/test_support/sql/parquet_partitioned/year=2014/month=9/day=1/part-r-00008.gz.parquet
./dist/python/test_support/sql/parquet_partitioned/year=2015/month=9/day=1/part-r-00007.gz.parquet
./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00004.gz.parquet
./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=25/part-r-00002.gz.parquet
./dist/python/test_support/sql/parquet_partitioned/year=2015/month=10/day=26/part-r-00005.gz.parquet

On Mon, Oct 26, 2015 at 11:47 AM, Kayode Odeyemi <dr...@gmail.com> wrote:


Re: Maven build failed (Spark master)

Posted by Kayode Odeyemi <dr...@gmail.com>.
I see a lot of output like this after an otherwise successful maven build:

cp: /usr/local/spark-latest/spark-[WARNING] See
http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest/python/test_support/sql/parquet_partitioned/year=2014/month=9/day=1/
part-r-00008.gz.parquet: No such file or directory

It seems to fail when it tries to package the build as an archive.

I'm using the latest code on github master.

Any ideas please?
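(The mangled `cp` path above is consistent with how make-distribution.sh derives the version: it runs `mvn help:evaluate -Dexpression=project.version`, filters out `INFO` lines, and takes the last remaining line, so a stray `[WARNING]` line after the version leaks into `VERSION`. A minimal sketch of that failure mode, with the mvn output fabricated for illustration:)

```shell
# Simulated tail of `mvn help:evaluate -Dexpression=project.version` output;
# a [WARNING] line printed after the version is enough to break things.
mvn_output='[INFO] Evaluating expression: project.version
1.6.0-SNAPSHOT
[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin'

# This mirrors the extraction in make-distribution.sh:
version=$(printf '%s\n' "$mvn_output" | grep -v INFO | tail -n 1)

# The WARNING line ends up inside the distribution path:
echo "spark-$version-bin-spark-latest"
# -> spark-[WARNING] See http://docs.codehaus.org/display/MAVENUSER/Shade+Plugin-bin-spark-latest
```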

On Mon, Oct 26, 2015 at 6:20 PM, Yana Kadiyska <ya...@gmail.com>
wrote:


Re: Maven build failed (Spark master)

Posted by Yana Kadiyska <ya...@gmail.com>.
In 1.4, ./make-distribution.sh produces a .tgz file in the root directory
(the same directory that make-distribution.sh is in)
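(A quick way to locate it from the checkout root; the exact file name depends on the Spark version and the `--name` you passed, so the name below is a stand-in:)

```shell
# Look for the distribution tarball next to make-distribution.sh.
# A stand-in file is created here so the command has something to find;
# a real build would produce e.g. spark-1.5.1-bin-spark-latest.tgz.
cd "$(mktemp -d)"
touch spark-1.5.1-bin-spark-latest.tgz
find . -maxdepth 1 -name 'spark-*-bin-*.tgz'
# -> ./spark-1.5.1-bin-spark-latest.tgz
```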



On Mon, Oct 26, 2015 at 8:46 AM, Kayode Odeyemi <dr...@gmail.com> wrote:


Re: Maven build failed (Spark master)

Posted by Kayode Odeyemi <dr...@gmail.com>.
Hi,

The ./make-distribution.sh task completed. However, I can't seem to locate
the .tgz file.

Where does Spark save this, or should I just work with the dist directory?

On Fri, Oct 23, 2015 at 4:23 PM, Kayode Odeyemi <dr...@gmail.com> wrote:


Re: Maven build failed (Spark master)

Posted by Kayode Odeyemi <dr...@gmail.com>.
I saw this when I tested manually (without ./make-distribution.sh):

Detected Maven Version: 3.2.2 is not in the allowed range 3.3.3.

So I simply upgraded Maven to 3.3.3.

Resolved. Thanks
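(That message comes from Maven's enforcer plugin checking the running Maven against the minimum declared in Spark's pom. The comparison can be sketched in plain shell; the versions are hard-coded here for illustration, and `sort -V` requires GNU coreutils:)

```shell
required="3.3.3"
detected="3.2.2"   # in practice: mvn -v | awk 'NR==1 {print $3}'

# sort -V orders version strings numerically; if the required version is
# not the lowest of the pair, the detected Maven is too old.
lowest=$(printf '%s\n%s\n' "$required" "$detected" | sort -V | head -n 1)
if [ "$lowest" != "$required" ]; then
  echo "Detected Maven Version: $detected is not in the allowed range $required."
fi
# -> Detected Maven Version: 3.2.2 is not in the allowed range 3.3.3.
```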

On Fri, Oct 23, 2015 at 3:17 PM, Sean Owen <so...@cloudera.com> wrote:


Re: Maven build failed (Spark master)

Posted by Sean Owen <so...@cloudera.com>.
This doesn't show the actual error output from Maven. I have a strong
guess that you haven't set MAVEN_OPTS to increase the memory Maven can
use.
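(For reference, the Spark build docs of that era suggested settings along these lines before invoking Maven; the exact values are a suggestion to tune for your machine, and `MaxPermSize` only matters on Java 7 and earlier:)

```shell
# Give Maven enough heap and code cache for a full Spark build,
# then run the build as usual.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
echo "$MAVEN_OPTS"
```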

On Fri, Oct 23, 2015 at 6:14 AM, Kayode Odeyemi <dr...@gmail.com> wrote:
> Hi,
>
> I can't seem to get a successful maven build. Please see command output
> below:
>
> bash-3.2$ ./make-distribution.sh --name spark-latest --tgz --mvn mvn
> -Dhadoop.version=2.7.0 -Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests
> clean package
> +++ dirname ./make-distribution.sh
> ++ cd .
> ++ pwd
> + SPARK_HOME=/usr/local/spark-latest
> + DISTDIR=/usr/local/spark-latest/dist
> + SPARK_TACHYON=false
> + TACHYON_VERSION=0.7.1
> + TACHYON_TGZ=tachyon-0.7.1-bin.tar.gz
> +
> TACHYON_URL=https://github.com/amplab/tachyon/releases/download/v0.7.1/tachyon-0.7.1-bin.tar.gz
> + MAKE_TGZ=false
> + NAME=none
> + MVN=/usr/local/spark-latest/build/mvn
> + ((  12  ))
> + case $1 in
> + NAME=spark-latest
> + shift
> + shift
> + ((  10  ))
> + case $1 in
> + MAKE_TGZ=true
> + shift
> + ((  9  ))
> + case $1 in
> + MVN=mvn
> + shift
> + shift
> + ((  7  ))
> + case $1 in
> + break
> + '[' -z /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
> + '[' -z /Library/Java/JavaVirtualMachines/jdk1.8.0_20.jdk/Contents/Home ']'
> ++ command -v git
> + '[' /usr/bin/git ']'
> ++ git rev-parse --short HEAD
> + GITREV=487d409
> + '[' '!' -z 487d409 ']'
> + GITREVSTRING=' (git revision 487d409)'
> + unset GITREV
> ++ command -v mvn
> + '[' '!' /usr/bin/mvn ']'
> ++ mvn help:evaluate -Dexpression=project.version -Dhadoop.version=2.7.0
> -Phadoop-2.7 -Phive -Phive-thriftserver -DskipTests clean package
> ++ grep -v INFO
> ++ tail -n 1
> + VERSION='[ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException'
>
> Same output error with JDK 7
>
> Appreciate your help.
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org