Posted to users@zeppelin.apache.org by Manuel Sopena Ballesteros <ma...@garvan.org.au> on 2019/11/15 01:29:40 UTC

send parameters to pyspark

Dear zeppelin community,

I need to pass some parameters to pyspark so that it can find extra jars.

This is an example of the parameters I need to send to pyspark:

pyspark \
  --jars /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar \
  --conf spark.driver.extraClassPath=/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar \
  --conf spark.executor.extraClassPath=/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --conf spark.kryo.registrator=is.hail.kryo.HailKryoRegistrator

How can I configure my Spark interpreter to do this?

Thank you very much
NOTICE
Please consider the environment before printing this email. This message and any attachments are intended for the addressee named and may contain legally privileged/confidential/copyright information. If you are not the intended recipient, you should not read, use, disclose, copy or distribute this communication. If you have received this message in error please notify us at once by return email and then delete both messages. We accept no liability for the distribution of viruses or similar in electronic communications. This notice should not be removed.

Re: send parameters to pyspark

Posted by Jeff Zhang <zj...@gmail.com>.
You can specify Spark properties as described here:
http://spark.apache.org/docs/latest/configuration.html
They will be passed to Spark via --conf.
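Concretely, in the Zeppelin interpreter settings (Interpreter menu, spark interpreter, edit) the flags from the original message map to interpreter properties. This is a sketch based on the command quoted above; the paths are the ones given there:

```properties
spark.jars=/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar
spark.driver.extraClassPath=/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar
spark.executor.extraClassPath=/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar
spark.serializer=org.apache.spark.serializer.KryoSerializer
spark.kryo.registrator=is.hail.kryo.HailKryoRegistrator
```

Restart the interpreter after saving so the new properties take effect on the next paragraph run.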



Manuel Sopena Ballesteros <ma...@garvan.org.au> wrote on Fri, Nov 15, 2019 at 11:19 AM:

> Thank you very much, that worked.
>
> What about passing the --conf flag to pyspark?
>
> Manuel


-- 
Best Regards

Jeff Zhang

RE: send parameters to pyspark

Posted by Manuel Sopena Ballesteros <ma...@garvan.org.au>.
Thank you very much, that worked.

What about passing the --conf flag to pyspark?

Manuel

From: Jeff Zhang [mailto:zjffdu@gmail.com]
Sent: Friday, November 15, 2019 12:35 PM
To: users
Subject: Re: send parameters to pyspark

you can set property spark.jars


Re: send parameters to pyspark

Posted by Jeff Zhang <zj...@gmail.com>.
You can set the property spark.jars.
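Besides the interpreter settings page, recent Zeppelin versions (0.8+) also let you set such properties per note with a %spark.conf paragraph, run before the first %spark or %pyspark paragraph. A sketch, reusing the jar path from the original message:

```text
%spark.conf
spark.jars /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/hail-all-spark.jar
```

Properties set this way apply only after the interpreter is (re)started for that note.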



-- 
Best Regards

Jeff Zhang