Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2022/07/21 13:44:00 UTC

[jira] [Commented] (SPARK-39817) Missing sbin scripts in PySpark packages

    [ https://issues.apache.org/jira/browse/SPARK-39817?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17569425#comment-17569425 ] 

Hyukjin Kwon commented on SPARK-39817:
--------------------------------------

pip is designed for use within Python. I would prefer to avoid having people create a Spark cluster with pip.

> Missing sbin scripts in PySpark packages
> ----------------------------------------
>
>                 Key: SPARK-39817
>                 URL: https://issues.apache.org/jira/browse/SPARK-39817
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.2.0, 3.2.1, 3.3.0, 3.2.2
>            Reporter: F. H.
>            Priority: Major
>              Labels: easyfix
>   Original Estimate: 5m
>  Remaining Estimate: 5m
>
> In the PySpark setup.py, only a subset of the scripts is included.
> In particular, I'm missing the `submit-all.sh` script:
> {code:python}
>         package_data={
>             'pyspark.jars': ['*.jar'],
>             'pyspark.bin': ['*'],
>             'pyspark.sbin': ['spark-config.sh', 'spark-daemon.sh',
>                              'start-history-server.sh',
>                              'stop-history-server.sh', ],
>             [...]
>         },
> {code}
>  
> The fix is simple: just change 'pyspark.sbin' to:
> {code:python}
> 'pyspark.sbin': ['*'],
> {code}
>  
> I would happily submit a PR on GitHub, but I have no clue about the organizational details.
> It would be great to get this backported to pyspark 3.2.x as well as 3.3.x soon.
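> The effect of the one-line change can be sketched with Python's glob matching (the file list below is an illustrative subset of sbin/, not the authoritative contents of a Spark distribution):
> {code:python}
> import fnmatch
>
> # Illustrative subset of scripts under sbin/ in a Spark distribution.
> sbin_files = ["spark-config.sh", "spark-daemon.sh",
>               "start-history-server.sh", "stop-history-server.sh",
>               "start-master.sh", "stop-master.sh", "start-worker.sh"]
>
> # The current package_data whitelist ships only these four scripts:
> whitelist = ["spark-config.sh", "spark-daemon.sh",
>              "start-history-server.sh", "stop-history-server.sh"]
> shipped_now = [f for f in sbin_files if f in whitelist]
>
> # With 'pyspark.sbin': ['*'], every file in the directory matches:
> shipped_fixed = [f for f in sbin_files if fnmatch.fnmatch(f, "*")]
>
> print(len(shipped_now), len(shipped_fixed))  # 4 7
> {code}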



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org