Posted to issues@spark.apache.org by "Lingaraj Gowdar (Jira)" <ji...@apache.org> on 2022/11/12 19:50:00 UTC

[jira] [Created] (SPARK-41119) Python application to be passed as argument using commandline option for PySpark

Lingaraj Gowdar created SPARK-41119:
---------------------------------------

             Summary: Python application to be passed as argument using commandline option for PySpark
                 Key: SPARK-41119
                 URL: https://issues.apache.org/jira/browse/SPARK-41119
             Project: Spark
          Issue Type: Improvement
          Components: PySpark
    Affects Versions: 3.2.2
            Reporter: Lingaraj Gowdar


*Background -*

I was trying to run multiple tests by passing a script / file to the different Spark shells. Every shell except PySpark accepts a script as a command-line argument and processes it. The only alternative for PySpark is to redirect the file into the shell with the (<) operator, like below:

_# pyspark --master yarn --deploy-mode client < python-example-script.py_
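
For illustration, here is a minimal sketch of a script that could be fed in this way. The file name python-example-script.py is the one from the command above, and the sketch assumes the SparkSession named spark that the PySpark shell pre-defines, so it needs no session setup of its own:

# python-example-script.py (hypothetical contents)
# The PySpark shell pre-defines a SparkSession as `spark`, so a script
# fed in via input redirection can use it directly.
df = spark.range(10)   # DataFrame with a single `id` column, rows 0-9
print(df.count())      # prints 10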

 

*Improvement suggested -*

PySpark can be used only in client mode. If the same functionality were added to PySpark, it would help anyone who wants to try out just PySpark instead of running the script via spark-submit.

spark-submit supports everything the Spark shells can do _(including accepting scripts with the -i or -f option)_, but if the other Spark shells (spark-shell, spark-sql) already accept a script / file as an argument, the same option could be provided for PySpark.
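
For comparison, here is a sketch of the existing options alongside what a PySpark equivalent might look like. The script file names are illustrative, the first three commands exist today, and the -i option for pyspark is only the hypothetical improvement requested here:

_# spark-shell -i scala-example-script.scala_
_# spark-sql -f sql-example-script.sql_
_# spark-submit --master yarn --deploy-mode client python-example-script.py_
_# pyspark --master yarn --deploy-mode client -i python-example-script.py_ (proposed, does not exist yet)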

 

 



