Posted to users@zeppelin.apache.org by Victor Coustenoble <vi...@datastax.com> on 2016/01/12 00:45:32 UTC

Default Hadoop and Spark versions?

A few questions on build options:


- To build Zeppelin, Spark and Hadoop are only needed as client libraries,
right? And with SPARK_HOME set, a different client library version can be
used, right?

- If I don't specify any options, are default Spark and Hadoop versions
embedded? I can't find them in the pom file.

- Is it possible to check which Spark client library version a notebook uses?

- If I only specify the cassandra-spark option, do I also get the
corresponding Spark client library version?


Thanks

Victor

Re: Default Hadoop and Spark versions?

Posted by Victor Coustenoble <vi...@datastax.com>.
Thanks Junaid,

And I have found how to check the installed Spark version: a simple
sc.version!
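
For reference, that check is just a paragraph in any notebook bound to the
Spark interpreter (a minimal sketch; the %spark prefix assumes the default
Spark interpreter group):

```
%spark
sc.version   // returns the version string of the Spark client library Zeppelin is running against
```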

Victor



Re: Default Hadoop and Spark versions?

Posted by Junaid Shaikh J <ju...@ericsson.com>.
Answers to your questions:

You do not need to install Spark and Hadoop explicitly before building Zeppelin: they can be embedded, and while building Zeppelin you can specify the Spark and Hadoop versions you need. Yes, SPARK_HOME can be used to point to an external Spark and Hadoop installation.
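
As a sketch of the SPARK_HOME route (file name and paths assumed from the standard Zeppelin layout; adjust to your installation):

```shell
# conf/zeppelin-env.sh -- point Zeppelin at an external Spark instead of the embedded one
export SPARK_HOME=/opt/spark-1.6.0
# Optionally set HADOOP_CONF_DIR so Spark can find the Hadoop configuration
export HADOOP_CONF_DIR=/etc/hadoop/conf
```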

To build, you may specify any Spark and Hadoop versions. For example:

mvn clean package -Pspark-1.6 -Phadoop-2.4 -Pyarn -Ppyspark


For Cassandra integration, build using the matching profile: -Pcassandra-spark-xx (where xx is the Spark version)
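
For example, pairing the Cassandra profile with the rest of the build line (the exact profile name is an assumption; check the pom.xml of your Zeppelin branch for the available cassandra-spark-* profiles):

```shell
# Hypothetical build line: cassandra-spark profile matching Spark 1.6
mvn clean package -Pcassandra-spark-1.6 -Phadoop-2.4 -Pyarn -Ppyspark
```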

/Junaid

