Posted to dev@zeppelin.apache.org by "Jerzy J. Gangi (JIRA)" <ji...@apache.org> on 2016/11/30 19:26:59 UTC

[jira] [Created] (ZEPPELIN-1741) JARs specified with spark.jars in spark-defaults.conf do not affect %pyspark interpreter

Jerzy J. Gangi created ZEPPELIN-1741:
----------------------------------------

             Summary: JARs specified with spark.jars in spark-defaults.conf do not affect %pyspark interpreter
                 Key: ZEPPELIN-1741
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-1741
             Project: Zeppelin
          Issue Type: Bug
          Components: Interpreters
    Affects Versions: 0.6.2
         Environment: Zeppelin 0.6.2
Spark 1.6.2
            Reporter: Jerzy J. Gangi


If you specify JARs with `spark.jars` in your `spark-defaults.conf`, the `%pyspark` interpreter does not load them.
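
For example, a `spark-defaults.conf` along these lines (the JAR path below is only an illustration, not taken from this report) has no effect on `%pyspark`:

    # spark-defaults.conf -- hypothetical example entry
    spark.jars /opt/libs/my-library.jar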

Currently, a note in the Spark interpreter documentation says, "Note that adding jar to pyspark is only available via `%dep` interpreter at the moment."
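
That documented workaround amounts to running a `%dep` paragraph before the Spark context starts, roughly like this sketch (the local path and Maven coordinates are placeholders, not from this report):

    %dep
    z.reset()
    // load a local jar, or a Maven artifact, before the %pyspark context is created
    z.load("/opt/libs/my-library.jar")
    z.load("com.example:my-library:1.0.0")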

This is undesirable for two reasons:

1) `%dep` is deprecated
2) Sysadmins configure `spark-defaults.conf` and expect those settings to be honored regardless of how Spark is executed.

As a Zeppelin user, I expect that if I configure JARs in `spark-defaults.conf`, those JARs will be available when the `%pyspark` interpreter runs.
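
A rough way to observe the gap from a note, assuming the `sc` SparkContext that Zeppelin injects into `%pyspark` paragraphs:

    %pyspark
    # Check whether the spark.jars value from spark-defaults.conf reached this interpreter's SparkConf
    print(sc.getConf().get("spark.jars", "not set"))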


