Posted to issues@spark.apache.org by "shanyu zhao (JIRA)" <ji...@apache.org> on 2018/11/12 02:53:00 UTC

[jira] [Created] (SPARK-26011) pyspark app with "spark.jars.packages" config does not work

shanyu zhao created SPARK-26011:
-----------------------------------

             Summary: pyspark app with "spark.jars.packages" config does not work
                 Key: SPARK-26011
                 URL: https://issues.apache.org/jira/browse/SPARK-26011
             Project: Spark
          Issue Type: Bug
          Components: Spark Submit
    Affects Versions: 2.4.0, 2.3.2
            Reporter: shanyu zhao


Command "pyspark --packages" works as expected, but if submitting a livy pyspark job with "spark.jars.packages" config, the downloaded packages are not added to python's sys.path therefore the package is not available to use.

For example, this command works:

pyspark --packages Azure:mmlspark:0.14

However, using a Jupyter notebook with the sparkmagic kernel to open a pyspark session fails:

%%configure -f {"conf": {"spark.jars.packages": "Azure:mmlspark:0.14"}}
import mmlspark

The root cause is that SparkSubmit decides whether an application is a PySpark app from the suffix of the primary resource, but Livy passes "spark-internal" as the primary resource when calling spark-submit, so the args.isPython check in SparkSubmit.scala evaluates to false.
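
For illustration, here is a minimal Scala sketch of that check, paraphrased from SparkSubmit's argument handling (the helper name isPythonApp is made up for this sketch; the real logic lives in SparkSubmit.scala and uses the PYSPARK_SHELL constant):

// Paraphrased sketch, not the actual Spark source:
def isPythonApp(primaryResource: String): Boolean =
  primaryResource != null &&
    (primaryResource.endsWith(".py") || primaryResource == "pyspark-shell")

isPythonApp("app.py")          // true  - submitting a .py file
isPythonApp("pyspark-shell")   // true  - the pyspark shell
isPythonApp("spark-internal")  // false - Livy's primary resource, so the
                               //         PySpark setup (including sys.path
                               //         handling for downloaded packages)
                               //         is skipped

Note that "spark-internal" is the value of SparkLauncher.NO_RESOURCE, which launchers such as Livy pass when there is no user-supplied primary resource.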



