Posted to users@zeppelin.apache.org by Jonathan Haddad <jo...@jonhaddad.com> on 2015/05/05 01:44:10 UTC

setting --driver-class-path in spark?

Currently I use the following command to start pyspark, in order to use the
Cassandra connector:

pyspark \
    --jars ${PYSPARK_ROOT}/pyspark_cassandra-0.1.4.jar  \
    --driver-class-path ${PYSPARK_ROOT}/pyspark_cassandra-0.1.4.jar \
    --py-files ${PYSPARK_ROOT}/pyspark_cassandra-0.1.4-py2.7.egg \
    --conf spark.cassandra.connection.host=127.0.0.1 \
    --master spark://127.0.0.1:7077

Is there a way to do this with Zeppelin?  Sorry if I'm missing something
obvious, I'm new to the project.

Jon

Re: setting --driver-class-path in spark?

Posted by moon soo Lee <mo...@apache.org>.
Hi,

I haven't tested this, but in conf/zeppelin-env.sh you can get an effect
similar to the --jars and --driver-class-path options using -Dspark.jars,
like:

export ZEPPELIN_JAVA_OPTS="-Dspark.jars=${PYSPARK_ROOT}/pyspark_cassandra-0.1.4.jar"
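
If the driver JVM's classpath itself also needs the jar (which is what your
--driver-class-path flag does), Spark's spark.driver.extraClassPath
property should be the closer equivalent. An untested sketch, reusing the
same jar path:

export ZEPPELIN_JAVA_OPTS="-Dspark.jars=${PYSPARK_ROOT}/pyspark_cassandra-0.1.4.jar -Dspark.driver.extraClassPath=${PYSPARK_ROOT}/pyspark_cassandra-0.1.4.jar"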


Not sure about the .egg, but you can try putting it on PYTHONPATH:
export PYTHONPATH=${PYSPARK_ROOT}/pyspark_cassandra-0.1.4-py2.7.egg

And the --conf and --master options can be set on the interpreter setting
page.
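
For example, in the Interpreter menu you can edit the spark interpreter and
set the properties below (values taken from your pyspark command):

    master                             spark://127.0.0.1:7077
    spark.cassandra.connection.host    127.0.0.1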

Hope this helps.

Thanks,
moon


