Posted to users@zeppelin.apache.org by Ruslan Dautkhanov <da...@gmail.com> on 2018/03/29 22:15:23 UTC

%spark.dep for %pyspark

Were you guys able to use %spark.dep for %pyspark?

According to documentation this should work:
https://zeppelin.apache.org/docs/0.7.2/interpreter/spark.html#dependency-management
" Note: %spark.dep interpreter loads libraries to %spark and %spark.pyspark but
not to %spark.sql interpreter.  "
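For reference, the documented pattern looks like this (the artifact coordinates below are just an illustration; per the docs, the %spark.dep paragraph must run before the Spark interpreter starts, otherwise you need to restart the interpreter first):

```
%spark.dep
z.reset()
z.load("com.databricks:spark-csv_2.11:1.5.0")
```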

In practice, for some reason, it doesn't work (tested on a recent master build).

(As a workaround I add a local jar to the --jars option in the
spark-submit options, but using %spark.dep would be so much nicer.)
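In case it helps anyone, the workaround I'm using is roughly this (set in conf/zeppelin-env.sh; the jar path is just a placeholder):

```
# Pass a local jar to every Spark interpreter via spark-submit
export SPARK_SUBMIT_OPTIONS="--jars /path/to/mylib.jar"
```

It works, but it requires a Zeppelin restart and applies globally, unlike per-note %spark.dep loading.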


Thanks,
Ruslan