Posted to dev@sedona.apache.org by Yutian Pang <yp...@asu.edu> on 2020/10/01 23:38:48 UTC

Re: Using GeoSpark with pip installed pyspark

Hello,

I'm forwarding my email to this address, since issues@sedona.apache.org
is an invalid address.

Best,
Yutian

On Thu, Oct 1, 2020 at 4:35 PM Yutian Pang <yp...@asu.edu> wrote:

> Hello,
>
> I encountered a problem while trying to install GeoSpark and upload the
> Java packages from GeoSpark to Spark via upload_jars(). The error
> messages are:
>
> ---------------------------------------------------------------------------
> ValueError                                Traceback (most recent call last)
> <ipython-input-53-f81c7dc7072b> in <module>
> ----> 1 upload_jars()
>       2 GeoSparkRegistrator.registerAll(spark)
>
> ~/anaconda3/lib/python3.7/site-packages/geospark/register/uploading.py in upload_jars()
>      37     module_path = get_module_path(get_abs_path())
>      38     upload_jars_based_on_spark_version(module_path)
> ---> 39     findspark.init()
>      40     return True
>
> ~/anaconda3/lib/python3.7/site-packages/findspark.py in init(spark_home, python_path, edit_rc, edit_profile)
>     127
>     128     if not spark_home:
> --> 129         spark_home = find()
>     130
>     131     if not python_path:
>
> ~/anaconda3/lib/python3.7/site-packages/findspark.py in find()
>      34     if not spark_home:
>      35         raise ValueError(
> ---> 36             "Couldn't find Spark, make sure SPARK_HOME env is set"
>      37             " or Spark is in an expected location (e.g. from homebrew installation)."
>      38         )
>
> ValueError: Couldn't find Spark, make sure SPARK_HOME env is set or Spark is in an expected location (e.g. from homebrew installation).
>
>
> The problem comes from findspark being unable to find the SPARK_HOME
> that is set in ~/.bashrc.
>
> I installed pyspark with pip install, which doesn't require setting any
> global environment variables during installation.
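>
> For reference, since the pip package knows its own location, something
> like the following sketch should resolve where pyspark lives (the path
> in the comment is just what it comes out to on my machine):
>
>     # Sketch: derive SPARK_HOME from the pip-installed pyspark package
>     # and export it for the current process before calling upload_jars().
>     import os
>     import pyspark
>
>     spark_home = os.path.dirname(pyspark.__file__)
>     os.environ["SPARK_HOME"] = spark_home
>     # e.g. /home/ypang6/anaconda3/lib/python3.7/site-packages/pyspark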
>
> And the problem still exists even though I have manually set
> SPARK_HOME=/home/ypang6/anaconda3/lib/python3.7/site-packages/pyspark in
> ~/.bashrc on my machine.
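>
> I wonder whether I instead need to set the variable directly in the
> session, since an already-running Jupyter kernel won't re-read
> ~/.bashrc. Roughly this (a sketch; per the signature in the traceback
> above, findspark.init() also accepts an explicit spark_home path):
>
>     # Sketch: point findspark at the pip-installed pyspark explicitly,
>     # so later lookups can find SPARK_HOME in the environment.
>     import findspark
>
>     findspark.init("/home/ypang6/anaconda3/lib/python3.7/site-packages/pyspark")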
>
> I'm confused about how I should pass SPARK_HOME along so that the Java
> packages get loaded. Could you please give me a hint?
>
> Best regards,
> Yutian
>
>