Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:20:04 UTC

[jira] [Updated] (SPARK-13975) Cannot specify extra libs for executor from /extra-lib

     [ https://issues.apache.org/jira/browse/SPARK-13975?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-13975:
---------------------------------
    Labels: bulk-closed  (was: )

> Cannot specify extra libs for executor from /extra-lib
> ------------------------------------------------------
>
>                 Key: SPARK-13975
>                 URL: https://issues.apache.org/jira/browse/SPARK-13975
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit
>    Affects Versions: 1.6.1
>            Reporter: Leonid Poliakov
>            Priority: Minor
>              Labels: bulk-closed
>
> If you build a framework on top of Spark and want to bundle it with Spark, there is no easy way to add your framework libs to the executor classpath.
> Let's say I want to add my custom libs to an {{/extra-lib}} folder, ship the new bundle (with my libs in it) to the nodes, and run the bundle. I want executors on each node to automatically load my libs from {{/extra-lib}}, because that is how future developers would use the framework out-of-the-box.
> The config doc says you can specify {{spark.executor.extraClassPath}} in {{spark-defaults.conf}}, which is good because a custom config can be shipped in the framework bundle, but the syntax of the property is unclear.
> You basically specify a value that is appended to {{-cp}} of the executor Java process, so it follows the usual Java classpath rules, leaving two options:
> 1. specify absolute path
> bq. spark.executor.extraClassPath /home/user/Apps/spark-bundled/extra-lib/*
> 2. specify relative path
> bq. spark.executor.extraClassPath ../../../extra-lib/*
> Neither option looks good: an absolute path won't work at all since you cannot know where users will put the bundle, and a relative path is fragile because the executor's working directory is set to something like {{/work/app-20160316070310-0002/0}}, so the path breaks if a custom worker folder is configured (see the sketch below).
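> A quick illustration of why the relative form is fragile (a sketch, not Spark code; the /opt/spark and /data/spark-work paths are made-up examples):
> {code:scala}
> import java.nio.file.Paths
>
> object RelativeClasspathDemo {
>   def main(args: Array[String]): Unit = {
>     // Default layout: the executor work dir lives under $SPARK_HOME/work,
>     // so ../../../extra-lib happens to land back inside $SPARK_HOME.
>     val defaultWorkDir = "/opt/spark/work/app-20160316070310-0002/0"
>     println(Paths.get(defaultWorkDir, "../../../extra-lib").normalize())
>     // prints /opt/spark/extra-lib
>
>     // With a custom SPARK_WORKER_DIR the same relative entry points elsewhere.
>     val customWorkDir = "/data/spark-work/app-20160316070310-0002/0"
>     println(Paths.get(customWorkDir, "../../../extra-lib").normalize())
>     // prints /data/extra-lib -- no longer under $SPARK_HOME
>   }
> }
> {code}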
> So a proper way is needed to bundle custom libs and set the executor classpath to load them.
> *Expected*: you can specify {{spark.executor.extraClassPath}} relative to {{$SPARK_HOME}} using placeholders, e.g. with the following syntax:
> bq. spark.executor.extraClassPath ${home}/extra-lib/*
> The code would resolve such placeholders in the property to a proper path (a rough sketch follows below).
> The executor would then receive an absolute path in {{-cp}}.
> *Actual*: you cannot specify extra libs for the executor relative to {{$SPARK_HOME}}.
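> A minimal sketch of how such placeholder resolution could look (the resolvePlaceholders helper and the ${home} key are hypothetical, not an existing Spark API):
> {code:scala}
> object ClassPathPlaceholderDemo {
>   // Hypothetical helper: expands ${home} in a classpath entry to the
>   // worker's SPARK_HOME before the entry is appended to the executor -cp.
>   def resolvePlaceholders(entry: String, sparkHome: String): String =
>     entry.replace("${home}", sparkHome)
>
>   def main(args: Array[String]): Unit = {
>     val sparkHome = sys.env.getOrElse("SPARK_HOME", ".")
>     // "${home}/extra-lib/*" -> e.g. "/opt/spark/extra-lib/*" on the worker
>     println(resolvePlaceholders("${home}/extra-lib/*", sparkHome))
>   }
> }
> {code}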



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org