Posted to dev@sdap.apache.org by "Joseph C. Jacob (Jira)" <ji...@apache.org> on 2021/09/21 03:41:00 UTC

[jira] [Created] (SDAP-346) PySpark environment variables incorrectly set

Joseph C. Jacob created SDAP-346:
------------------------------------

             Summary: PySpark environment variables incorrectly set
                 Key: SDAP-346
                 URL: https://issues.apache.org/jira/browse/SDAP-346
             Project: Apache Science Data Analytics Platform
          Issue Type: Bug
          Components: nexus
            Reporter: Joseph C. Jacob
            Assignee: Joseph C. Jacob


SDAP deployment fails because the PYSPARK_PYTHON and PYSPARK_DRIVER_PYTHON environment variables are incorrectly set to a directory instead of the Python executable in incubator-sdap-nexus/docker/nexus-webapp/Dockerfile:

PYSPARK_DRIVER_PYTHON=/opt/conda/lib/python3.8
PYSPARK_PYTHON=/opt/conda/lib/python3.8

The correct settings point to the Python executable:

PYSPARK_DRIVER_PYTHON=/opt/conda/bin/python3.8
PYSPARK_PYTHON=/opt/conda/bin/python3.8
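For illustration, the corresponding ENV lines in the Dockerfile would then read as follows (a sketch only; the exact placement and surrounding lines in docker/nexus-webapp/Dockerfile are not shown here):

ENV PYSPARK_DRIVER_PYTHON=/opt/conda/bin/python3.8
ENV PYSPARK_PYTHON=/opt/conda/bin/python3.8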

These can be set correctly by overriding them under webapp.distributed.driver.env and webapp.distributed.executor.env in the Helm chart values.yaml, but the purpose of this ticket is to fix the defaults so that no override in values.yaml is needed.
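For reference, a sketch of the workaround override in values.yaml, assuming the chart passes these env entries through as standard Kubernetes name/value pairs (the key layout shown here is illustrative, not confirmed against the chart):

# illustrative structure; the actual values.yaml schema may differ
webapp:
  distributed:
    driver:
      env:
        - name: PYSPARK_DRIVER_PYTHON
          value: /opt/conda/bin/python3.8
        - name: PYSPARK_PYTHON
          value: /opt/conda/bin/python3.8
    executor:
      env:
        - name: PYSPARK_PYTHON
          value: /opt/conda/bin/python3.8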

--
This message was sent by Atlassian Jira
(v8.3.4#803005)