Posted to dev@spark.apache.org by Soren Macbeth <so...@yieldbot.com> on 2014/06/01 04:06:57 UTC

SCALA_HOME or SCALA_LIBRARY_PATH not set during build

Hello,

Following the instructions for building spark 1.0.0, I encountered the
following error:

[ERROR] Failed to execute goal
org.apache.maven.plugins:maven-antrun-plugin:1.7:run (default) on project
spark-core_2.10: An Ant BuildException has occured: Please set the
SCALA_HOME (or SCALA_LIBRARY_PATH if scala is on the path) environment
variables and retry.
[ERROR] around Ant part ...<fail message="Please set the SCALA_HOME (or
SCALA_LIBRARY_PATH if scala is on the path) environment variables and
retry.">... @ 6:126 in
/Users/soren/src/spark-1.0.0/core/target/antrun/build-main.xml

Nowhere does the documentation mention that Scala needs to be installed,
that either of these env vars needs to be set, or which Scala version is
required. Setting these env vars wasn't required for 0.9.1 with sbt.

I was able to get past it by downloading the scala 2.10.4 binary package to
a temp dir and setting SCALA_HOME to that dir.
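For anyone else who hits this, the workaround looks roughly like the
following (the archive URL and paths are illustrative, not taken from any
docs):

```shell
# Fetch the Scala 2.10.4 binary package into a temp dir and unpack it.
# (Illustrative URL; check scala-lang.org for the real download location.)
#   curl -LO http://www.scala-lang.org/files/archive/scala-2.10.4.tgz
#   tar -xzf scala-2.10.4.tgz -C /tmp

# Point SCALA_HOME at the unpacked directory, then build as before.
export SCALA_HOME=/tmp/scala-2.10.4
echo "SCALA_HOME=$SCALA_HOME"
#   mvn -DskipTests clean package
```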

Ideally, people shouldn't need a standalone Scala installation at all,
but at a minimum this requirement should be documented in the build
instructions, no?

-Soren

Re: SCALA_HOME or SCALA_LIBRARY_PATH not set during build

Posted by Colin McCabe <cm...@alumni.cmu.edu>.
Cool.  Nice to not have to set this any more.

best,
Colin



Re: SCALA_HOME or SCALA_LIBRARY_PATH not set during build

Posted by Soren Macbeth <so...@yieldbot.com>.
Cheers, I didn't think it was needed, but just wanted to point it out.



Re: SCALA_HOME or SCALA_LIBRARY_PATH not set during build

Posted by Patrick Wendell <pw...@gmail.com>.
I went ahead and created a JIRA for this and backported the
improvement into branch-1.0. This wasn't a regression per se, because
the behavior existed in all previous versions, but it's annoying
behavior, so best to fix it.

https://issues.apache.org/jira/browse/SPARK-1984

- Patrick


Re: SCALA_HOME or SCALA_LIBRARY_PATH not set during build

Posted by Patrick Wendell <pw...@gmail.com>.
This is actually a false error message: the Maven build no longer
requires SCALA_HOME, but the check/message was still there. This was
fixed recently in master:

https://github.com/apache/spark/commit/d8c005d5371f81a2a06c5d27c7021e1ae43d7193

I can backport that fix into branch-1.0 so it will be in 1.0.1 as
well. For other people running into this, you can export SCALA_HOME to
any value and it will work.
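Concretely, a throwaway value is enough to satisfy the stale check (the
path below is arbitrary):

```shell
# The check only tests that SCALA_HOME is set; the value is never used
# by the Maven build, so any placeholder works until the fix is released.
export SCALA_HOME=/tmp/placeholder
echo "$SCALA_HOME"
#   mvn -DskipTests clean package
```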

- Patrick


Re: SCALA_HOME or SCALA_LIBRARY_PATH not set during build

Posted by Colin McCabe <cm...@alumni.cmu.edu>.
Spark currently supports two build systems, sbt and Maven. sbt will
download the correct version of Scala, but with Maven you need to supply
it yourself and set SCALA_HOME.
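Roughly, the two invocations differ like this (a sketch: sbt/sbt was the
in-tree launcher script at the time, and the Scala path is illustrative):

```shell
# sbt: fetches the matching Scala compiler/library on its own.
#   sbt/sbt assembly

# Maven: expects a local Scala install, located via SCALA_HOME.
export SCALA_HOME=/usr/local/scala-2.10.4   # illustrative path
echo "building with Scala at $SCALA_HOME"
#   mvn -DskipTests clean package
```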

It sounds like the instructions need to be updated -- perhaps create a JIRA?

best,
Colin

