Posted to users@zeppelin.apache.org by RJ Nowling <rn...@gmail.com> on 2015/03/23 15:48:43 UTC

Maven profiles

Hi all,

I wanted to clarify how the Maven profiles work, especially with reference
to cluster vs local mode.

The documentation [1] specifies that local mode can be compiled like so:

$ mvn install -DskipTests

and cluster mode like so:

$ mvn install -DskipTests -Dspark.version=1.1.0 -Dhadoop.version=2.2.0

I may want to specify different Spark or Hadoop versions for local mode,
though.  In the pom.xml file, I saw that there are different Spark and
Hadoop profiles specified.

Questions:
1. Are the Maven profiles mutually exclusive?  Can I specify -Pspark-1.3
-Phadoop-2.4?

2. What differentiates the local vs cluster mode in the build?

Thanks,
RJ

[1] https://zeppelin.incubator.apache.org/docs/install/install.html
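Concretely, the combined invocation I have in mind would be something like
the following (profile ids taken from my reading of the pom.xml, so treat
this as a sketch, not a verified command):

```shell
# Sketch: activate the Spark 1.3 and Hadoop 2.4 profiles together,
# skipping tests; profile ids assumed from the pom.xml
mvn clean install -DskipTests -Pspark-1.3 -Phadoop-2.4
```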

Re: Maven profiles

Posted by RJ Nowling <rn...@gmail.com>.
My thinking is that since local vs. cluster mode at runtime depends on
whether the MASTER variable is set in the conf, the local/cluster
distinction in the build must come down to whether or not dependencies are
bundled.  If so, that's a documentation error.  Otherwise, separate
explicit profiles are needed.
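
If explicit profiles were added, I'd imagine something along these lines in
the pom.xml (purely hypothetical; the property name spark.scope and both
profile ids are my invention, not in the current pom.xml):

```xml
<!-- Hypothetical sketch: explicit local/cluster profiles that switch the
     Maven scope of the Spark dependencies -->
<profile>
  <id>local</id>
  <properties>
    <!-- compile scope bundles the Spark jars into the Zeppelin build -->
    <spark.scope>compile</spark.scope>
  </properties>
</profile>
<profile>
  <id>cluster</id>
  <properties>
    <!-- provided scope expects an external Spark assembly at runtime -->
    <spark.scope>provided</spark.scope>
  </properties>
</profile>
```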

On Mon, Mar 23, 2015 at 12:07 PM, RJ Nowling <rn...@gmail.com> wrote:


Re: Maven profiles

Posted by RJ Nowling <rn...@gmail.com>.
Thanks, Nirav.

It seems to me that there should be explicit profiles for local vs cluster
mode separate from the Spark/Hadoop profiles.  For example, what if I want
to set the version of Spark used in the local mode?


On Mon, Mar 23, 2015 at 11:54 AM, Nirav Mehta <me...@gmail.com> wrote:


Re: Maven profiles

Posted by Nirav Mehta <me...@gmail.com>.
1. Yes, the profiles can be combined; they configure the build accordingly.
I just built it with Spark 1.3.0 and Hadoop 2.6.0.
2. Local mode builds Zeppelin with Spark so that it starts a local context
with local threads. In cluster mode, it expects the master, Hadoop conf,
and Spark assembly settings to be specified in zeppelin-env.sh.

I've just started using Zeppelin myself, so experts please validate my
claim.
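
If it helps, point 2 would translate into a conf/zeppelin-env.sh roughly
like this (variable names as I understand them; the host and paths are
placeholders, so please verify against your own setup):

```shell
# Sketch of conf/zeppelin-env.sh for cluster mode; values are placeholders
export MASTER=spark://your-master-host:7077   # Spark master URL
export HADOOP_CONF_DIR=/etc/hadoop/conf       # Hadoop client configuration
export SPARK_HOME=/opt/spark                  # external Spark installation
# For local mode, leave MASTER unset and Zeppelin uses a local context
```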

On Mon, Mar 23, 2015 at 10:48 AM, RJ Nowling <rn...@gmail.com> wrote:
