Posted to user@spark.apache.org by Marco Mistroni <mm...@gmail.com> on 2016/10/04 20:21:15 UTC

building Spark 2.1 with Java 1.8 on Ubuntu 16.04

Hi all,
My mvn build of Spark 2.1 using Java 1.8 is running out of memory, failing
with an error saying it cannot allocate enough memory during Maven compilation.

The instructions (on the Spark 2.0 build page) say that MAVEN_OPTS is not
needed for Java 1.8 and that, according to my understanding, the Spark build
process will set it itself during the build via mvn.
Note: I am not using Zinc. Rather, I am using my own Maven version (3.3.9),
launching this command from the main Spark directory. The same build works
when I use Java 1.7 (and MAVEN_OPTS):

mvn -Pyarn -Dscala-2.11 -DskipTests clean package

Could anyone assist?
Kr
Marco
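
For context, the working Java 7 baseline described above amounts to
something like the following sketch; the JDK path is an assumption for a
typical Ubuntu install, and the MAVEN_OPTS value is the one suggested
elsewhere in this thread:

# Working baseline: Java 7 plus explicit Maven JVM options (path assumed)
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
# run from the top of the Spark source tree
mvn -Pyarn -Dscala-2.11 -DskipTests clean package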

Re: building Spark 2.1 with Java 1.8 on Ubuntu 16.04

Posted by Marco Mistroni <mm...@gmail.com>.
Thanks Fred for the pointers... so far I have only been able to build 2.1
with Java 7 and no Zinc.
I will try the options you suggest. FYI, building with sbt ends in an OOM
even with Java 7.
I will try and update this thread.
Kr

Re: building Spark 2.1 with Java 1.8 on Ubuntu 16.04

Posted by Fred Reiss <fr...@gmail.com>.
There's no option to prevent build/mvn from starting the zinc server, but
you should be able to prevent the maven build from using the zinc server by
changing the <useZincServer> option at line 1935 of the master pom.xml.
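
One way to flip that flag without editing by hand is a substitution along
these lines (a sketch: it assumes the flag currently reads true and appears
only in that plugin's configuration):

sed -i 's|<useZincServer>true</useZincServer>|<useZincServer>false</useZincServer>|' pom.xml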

Note that the zinc-based compile works on my Ubuntu 16.04 box. You might be
able to get zinc-based compiles working by tweaking your settings. A few
things to try:
-- Make sure another build hasn't left a second, incompatible copy of zinc
squatting on the port that Spark expects to use
-- Try setting the environment variable JAVA_7_HOME to point to an OpenJDK
7 installation. build/mvn runs zinc with Java 7 if that is available.

Note that setting JAVA_7_HOME will break incremental compilation for
sbt-based builds. Use that environment variable with restraint.
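
A minimal sketch of those two checks, assuming Zinc's default port of 3030
(the port build/mvn uses unless ZINC_PORT overrides it) and a typical
Ubuntu OpenJDK 7 path:

# look for a zinc server already listening on the expected port
lsof -i :3030
# if a stale instance from another build is holding it, shut it down via
# the zinc launcher that build/mvn downloads
build/zinc-*/bin/zinc -shutdown -port 3030
# point zinc at an OpenJDK 7 install (mind the sbt caveat above)
export JAVA_7_HOME=/usr/lib/jvm/java-7-openjdk-amd64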

Fred

Re: building Spark 2.1 with Java 1.8 on Ubuntu 16.04

Posted by Marco Mistroni <mm...@gmail.com>.
Thanks Fred,
build/mvn will trigger compilation using Zinc, and I want to avoid that:
every time I have tried it, it runs into errors while compiling spark-core.
How can I disable Zinc by default?
Kr

Re: building Spark 2.1 with Java 1.8 on Ubuntu 16.04

Posted by Fred Reiss <fr...@gmail.com>.
Actually the memory options *are* required for Java 1.8. Without them the
build will fail intermittently. We just updated the documentation with
regard to this fact in Spark 2.0.1. Relevant PR is here:
https://github.com/apache/spark/pull/15005

Your best bet as the project transitions from Java 7 to Java 8 is to use
the scripts build/mvn and build/sbt, which should be updated on a regular
basis with safe JVM options.
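
Concretely, that means invoking the wrapper scripts from the top of the
source tree instead of a system-wide mvn or sbt, along these lines (a
sketch):

# build/mvn fetches its own Maven (plus zinc and scala) and sets safe JVM
# options itself
./build/mvn -Pyarn -Dscala-2.11 -DskipTests clean package
# the sbt equivalent
./build/sbt -Pyarn package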

Fred

Re: building Spark 2.1 with Java 1.8 on Ubuntu 16.04

Posted by Marco Mistroni <mm...@gmail.com>.
Thanks Richard. The page also says that for Java 1.8 MAVEN_OPTS is not
required... unless I misinterpreted the instructions.
Kr

Re: building Spark 2.1 with Java 1.8 on Ubuntu 16.04

Posted by Richard Siebeling <rs...@gmail.com>.
Sorry, now with the link included; see
http://spark.apache.org/docs/latest/building-spark.html

Re: building Spark 2.1 with Java 1.8 on Ubuntu 16.04

Posted by Richard Siebeling <rs...@gmail.com>.
Hi,

Did you set the following option: export MAVEN_OPTS="-Xmx2g
-XX:ReservedCodeCacheSize=512m"
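
Spelled out end to end for a Java 8 build (a sketch; these are the same
options the updated build documentation recommends, per Fred's reply
earlier in this thread):

export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"
mvn -Pyarn -Dscala-2.11 -DskipTests clean package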

kind regards,
Richard