Posted to dev@spark.apache.org by Chester Chen <ch...@alpinenow.com> on 2015/10/08 19:35:17 UTC

Build spark 1.5.1 branch fails

Question regarding the branch-1.5 build.

I noticed that the Spark project no longer publishes the spark-assembly jar, so
we have to build it ourselves (until we find a way to avoid depending on the
assembly jar).


I checked out the v1.5.1 release tag and built it with sbt, and I get the
following error:

build/sbt -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive
-Phive-thriftserver -DskipTests clean package assembly


[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] ::          UNRESOLVED DEPENDENCIES         ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-network-common_2.10;1.5.1: configuration
not public in org.apache.spark#spark-network-common_2.10;1.5.1: 'test'. It
was required from org.apache.spark#spark-network-shuffle_2.10;1.5.1 test
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] org.apache.spark:spark-network-common_2.10:1.5.1
((com.typesafe.sbt.pom.MavenHelper) MavenHelper.scala#L76)
[warn]  +- org.apache.spark:spark-network-shuffle_2.10:1.5.1
[info] Packaging
/Users/chester/projects/alpine/apache/spark/launcher/target/scala-2.10/spark-launcher_2.10-1.5.1.jar
...
[info] Done packaging.
[warn] four warnings found
[warn] Note: Some input files use unchecked or unsafe operations.
[warn] Note: Recompile with -Xlint:unchecked for details.
[warn] No main class detected
[info] Packaging
/Users/chester/projects/alpine/apache/spark/external/flume-sink/target/scala-2.10/spark-streaming-flume-sink_2.10-1.5.1.jar
...
[info] Done packaging.
sbt.ResolveException: unresolved dependency:
org.apache.spark#spark-network-common_2.10;1.5.1: configuration not public
in org.apache.spark#spark-network-common_2.10;1.5.1: 'test'. It was
required from org.apache.spark#spark-network-shuffle_2.10;1.5.1 test


Somehow spark-network-shuffle can't resolve the test configuration of
spark-network-common (not sure why the test configuration is still needed,
given that -DskipTests is already specified).

I tried the Maven command as well (without the assembly step), and that build also failed:

mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver
-DskipTests clean package

[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-enforcer-plugin:1.4:enforce
(enforce-versions) on project spark-parent_2.10: Some Enforcer rules have
failed. Look above for specific messages explaining why the rule failed. ->
[Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions,
please read the following articles:
[ERROR] [Help 1]
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException



I checked out branch-1.5 and replaced "1.5.2-SNAPSHOT" with "1.5.1", and
build/sbt still fails (same sbt error as above).

But if I keep the version string as "1.5.2-SNAPSHOT", build/sbt works
fine.
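For what it's worth, the bulk edit of the version string can be scripted. Here is a minimal sketch that demonstrates the substitution on a throwaway pom.xml rather than a real checkout; for a real multi-module build, the Versions Maven plugin's versions:set goal is the more robust route.

```shell
# Demonstrate the 1.5.2-SNAPSHOT -> 1.5.1 swap on a scratch directory,
# not on an actual Spark checkout.
tmpdir=$(mktemp -d)
printf '<version>1.5.2-SNAPSHOT</version>\n' > "$tmpdir/pom.xml"

# Rewrite the version string in every pom.xml under the tree.
# sed -i.bak works with both GNU and BSD sed.
find "$tmpdir" -name pom.xml -exec sed -i.bak 's/1\.5\.2-SNAPSHOT/1.5.1/g' {} +

result=$(cat "$tmpdir/pom.xml")
echo "$result"
rm -rf "$tmpdir"
```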


Any ideas?

Chester

Re: Build spark 1.5.1 branch fails

Posted by Steve Loughran <st...@hortonworks.com>.
On 18 Oct 2015, at 11:09, Sean Owen <so...@cloudera.com> wrote:


These are still too low I think. Try 4g heap and 1g permgen. That's what the error tells you right?

On Sat, Oct 17, 2015, 10:58 PM Chester Chen <ch...@alpinenow.com> wrote:
Yes, I have tried MAVEN_OPTS with

-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m

-Xmx4g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m

-Xmx2g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=512m

None of them worked; all failed with the same error.

thanks



FWIW, here's mine. The headless flag is a relic of the Apple JDK 6 era that I could probably cut now.

$ echo $MAVEN_OPTS
-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m -Xms256m -Djava.awt.headless=true


Re: Build spark 1.5.1 branch fails

Posted by Sean Owen <so...@cloudera.com>.
These are still too low, I think. Try a 4g heap and 1g permgen; that's what
the error tells you, right?
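Spelled out as a shell fragment (the ReservedCodeCacheSize value is carried over from the settings already tried in this thread; note that MaxPermSize only applies to Java 7 and earlier, since Java 8 removed the permanent generation):

```shell
# Suggested settings: 4g heap, 1g permgen.
export MAVEN_OPTS="-Xmx4g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=512m"
echo "$MAVEN_OPTS"
```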

On Sat, Oct 17, 2015, 10:58 PM Chester Chen <ch...@alpinenow.com> wrote:

> Yes, I have tried MAVEN_OPTS with
>
> -Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
>
> -Xmx4g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
>
> -Xmx2g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=512m
>
> None of them works. All failed with the same error.
>
> thanks

Re: Build spark 1.5.1 branch fails

Posted by Chester Chen <ch...@alpinenow.com>.
Yes, I have tried MAVEN_OPTS with

-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m

-Xmx4g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m

-Xmx2g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=512m

None of them worked; all failed with the same error.

thanks






On Sat, Oct 17, 2015 at 2:44 PM, Ted Yu <yu...@gmail.com> wrote:

> Have you set MAVEN_OPTS with the following ?
> -Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m
>
> Cheers

Re: Build spark 1.5.1 branch fails

Posted by Ted Yu <yu...@gmail.com>.
Have you set MAVEN_OPTS with the following ?
-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m

Cheers

On Sat, Oct 17, 2015 at 2:35 PM, Chester Chen <ch...@alpinenow.com> wrote:

> I was using jdk 1.7 and maven version is the same as pom file.
>
> $ java -version
> java version "1.7.0_51"
> Java(TM) SE Runtime Environment (build 1.7.0_51-b13)
> Java HotSpot(TM) 64-Bit Server VM (build 24.51-b03, mixed mode)
>
> Using build/sbt still fail the same with -Denforcer.skip, with mvn build,
> it fails with
>
>
> [ERROR] PermGen space -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging
>
> I am giving up on this. Just using 1.5.2-SNAPSHOT for now.
>
> Chester

Re: Build spark 1.5.1 branch fails

Posted by Chester Chen <ch...@alpinenow.com>.
I was using JDK 1.7, and my Maven version matches the one specified in the pom file.

$ java -version
java version "1.7.0_51"
Java(TM) SE Runtime Environment (build 1.7.0_51-b13)
Java HotSpot(TM) 64-Bit Server VM (build 24.51-b03, mixed mode)

build/sbt still fails the same way even with -Denforcer.skip. With the mvn
build, it fails with:


[ERROR] PermGen space -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging

I am giving up on this. Just using 1.5.2-SNAPSHOT for now.

Chester


On Mon, Oct 12, 2015 at 12:05 AM, Xiao Li <ga...@gmail.com> wrote:

> Hi, Chester,
>
> Please check your pom.xml. Your java.version and maven.version might not
> match your build environment.
>
> Or using -Denforcer.skip=true from the command line to skip it.
>
> Good luck,
>
> Xiao Li

Re: Build spark 1.5.1 branch fails

Posted by Xiao Li <ga...@gmail.com>.
Hi, Chester,

Please check your pom.xml. Your java.version and maven.version might not
match your build environment.
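The enforce-versions rule is essentially a comparison of the detected Java and Maven versions against the minimums declared in the pom. A small illustrative sketch of that check (the canned sample string stands in for real `java -version` output; the real plugin does a proper version-range comparison):

```shell
# Mimic the version check that maven-enforcer-plugin's enforce-versions
# rule performs, using a canned version string instead of probing the JVM.
sample='java version "1.7.0_51"'
# Extract "1.7" (major.minor) from the quoted version string.
actual=$(printf '%s\n' "$sample" | awk -F'"' '{print $2}' | cut -d. -f1-2)
required="1.7"
if [ "$actual" = "$required" ]; then
  echo "java.version OK ($actual)"
else
  echo "java.version mismatch: found $actual, need $required"
fi
```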

Or pass -Denforcer.skip=true on the command line to skip the check.

Good luck,

Xiao Li

2015-10-08 10:35 GMT-07:00 Chester Chen <ch...@alpinenow.com>:
