Posted to dev@oozie.apache.org by "Andras Piros (JIRA)" <ji...@apache.org> on 2018/05/11 13:07:00 UTC
[jira] [Commented] (OOZIE-3228) [Spark action] Can't load properties from spark-defaults.conf
[ https://issues.apache.org/jira/browse/OOZIE-3228?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16471880#comment-16471880 ]
Andras Piros commented on OOZIE-3228:
-------------------------------------
The following parameters are checked for presence in the user- or server-provided {{spark-defaults.conf}}, and their values are prepended to the existing Spark arguments:
* {{spark.executor.extraClassPath}}: separated by {{":"}}
* {{spark.driver.extraClassPath}}: separated by {{":"}}
* {{spark.executor.extraJavaOptions}}: separated by {{" "}}
* {{spark.driver.extraJavaOptions}}: separated by {{" "}}
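The prepend behavior above can be sketched as follows. This is a minimal illustration, not Oozie's actual implementation (the class and method names here are hypothetical): a value found in {{spark-defaults.conf}} is joined in front of the value already present in the Spark arguments, using the separator appropriate to each property.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of the merge rule described above: prepend a
// spark-defaults.conf value to an already-present Spark argument, using
// ":" for classpath properties and " " for Java option properties.
public class SparkDefaultsMerge {

    // Separators per property, as listed in the comment above.
    private static final Map<String, String> SEPARATORS = new LinkedHashMap<>();
    static {
        SEPARATORS.put("spark.executor.extraClassPath", ":");
        SEPARATORS.put("spark.driver.extraClassPath", ":");
        SEPARATORS.put("spark.executor.extraJavaOptions", " ");
        SEPARATORS.put("spark.driver.extraJavaOptions", " ");
    }

    /** Prepends defaultValue to existingValue with the property's separator. */
    static String prepend(String property, String defaultValue, String existingValue) {
        String separator = SEPARATORS.get(property);
        if (separator == null || existingValue == null || existingValue.isEmpty()) {
            return defaultValue;
        }
        return defaultValue + separator + existingValue;
    }

    public static void main(String[] args) {
        // Classpath from spark-defaults.conf merged with the value Oozie
        // already passes via --conf (see the issue description below).
        System.out.println(prepend("spark.executor.extraClassPath",
                "/etc/hbase/conf:/etc/hive/conf", "$PWD/*"));
        // -> /etc/hbase/conf:/etc/hive/conf:$PWD/*
    }
}
```

With this rule, the classpath entries from {{spark-defaults.conf}} end up ahead of the {{$PWD/*}} entries that the Spark action sets by default, so both are visible to the job.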
> [Spark action] Can't load properties from spark-defaults.conf
> -------------------------------------------------------------
>
> Key: OOZIE-3228
> URL: https://issues.apache.org/jira/browse/OOZIE-3228
> Project: Oozie
> Issue Type: Bug
> Components: action
> Affects Versions: 5.0.0, 4.3.1
> Reporter: Tang Yan
> Assignee: Andras Piros
> Priority: Major
>
> When I create an Oozie workflow to launch a Spark action, the Spark job can't load the properties configured in {{spark-defaults.conf}}. I've configured each NodeManager as the Spark gateway role, so {{spark-defaults.conf}} is generated in {{/etc/spark/conf/}} on each worker node.
> I've set some configuration in {{spark-defaults.conf}}:
> {noformat}
> spark.executor.extraClassPath=/etc/hbase/conf:/etc/hive/conf
> spark.driver.extraClassPath=/etc/hbase/conf:/etc/hive/conf
> {noformat}
> But in the Oozie Spark job they're not loaded automatically; only the following are set:
> {noformat}
> --conf spark.executor.extraClassPath=$PWD/*
> --conf spark.driver.extraClassPath=$PWD/*
> {noformat}
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)