Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/11/15 22:42:00 UTC
[jira] [Assigned] (SPARK-26083) Pyspark command is not working properly with default Docker Image build
[ https://issues.apache.org/jira/browse/SPARK-26083?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-26083:
------------------------------------
Assignee: (was: Apache Spark)
> Pyspark command is not working properly with default Docker Image build
> -----------------------------------------------------------------------
>
> Key: SPARK-26083
> URL: https://issues.apache.org/jira/browse/SPARK-26083
> Project: Spark
> Issue Type: Bug
> Components: Kubernetes
> Affects Versions: 2.4.0
> Reporter: Qi Shao
> Priority: Minor
> Labels: easyfix, newbie, patch, pull-request-available
> Fix For: 2.4.1
>
>
> When I try to run
> {code:java}
> ./bin/pyspark{code}
> in a pod on Kubernetes (image built unchanged from the pyspark Dockerfile), I get an error:
> {code:java}
> $SPARK_HOME/bin/pyspark --deploy-mode client --master k8s://https://$KUBERNETES_SERVICE_HOST:$KUBERNETES_SERVICE_PORT_HTTPS ...
> Python 2.7.15 (default, Aug 22 2018, 13:24:18) [GCC 6.4.0] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
> Could not open PYTHONSTARTUP
> IOError: [Errno 2] No such file or directory: '/opt/spark/python/pyspark/shell.py'{code}
> This is because the {{pyspark}} folder doesn't exist under {{/opt/spark/python/}}.
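For context on the error above: the pyspark launcher points the PYTHONSTARTUP environment variable at shell.py, and the Python interpreter simply tries to open that file on startup, printing "Could not open PYTHONSTARTUP" when it cannot. A minimal sketch of that behavior (not Spark's actual code; `startup_status` is a hypothetical helper used only for illustration):

```python
def startup_status(path):
    """Mimic the interpreter's PYTHONSTARTUP handling: try to open the
    startup script and report the same style of error on failure."""
    try:
        with open(path):
            return "ok"
    except IOError as e:  # IOError is an alias of OSError on Python 3
        return "Could not open PYTHONSTARTUP\nIOError: [Errno %d] %s: %r" % (
            e.errno, e.strerror, path)

# Path from the traceback above; absent in the unpatched image, so this
# reproduces the two-line error seen in the report.
print(startup_status("/opt/spark/python/pyspark/shell.py"))
```

In the broken image the directory is missing entirely, so no value of PYTHONSTARTUP under /opt/spark/python/pyspark/ can succeed; the fix has to put the pyspark sources into the image rather than change the variable.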
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org