Posted to dev@oozie.apache.org by "Andras Piros (JIRA)" <ji...@apache.org> on 2018/05/02 08:26:00 UTC
[jira] [Updated] (OOZIE-3228) Oozie Spark Action - the Spark job can't load the properties in spark-defaults.conf.
[ https://issues.apache.org/jira/browse/OOZIE-3228?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Andras Piros updated OOZIE-3228:
--------------------------------
Description:
When I create an Oozie workflow to launch a Spark action, the Spark job can't load the properties configured in {{spark-defaults.conf}}. I've configured each NodeManager with the Spark gateway role, so {{spark-defaults.conf}} is generated in {{/etc/spark/conf/}} on each worker node.
I've set the following configuration in {{spark-defaults.conf}}:
{noformat}
spark.executor.extraClassPath=/etc/hbase/conf:/etc/hive/conf
spark.driver.extraClassPath=/etc/hbase/conf:/etc/hive/conf
{noformat}
But the Oozie Spark job doesn't pick them up automatically; it only sets:
{noformat}
--conf spark.executor.extraClassPath=$PWD/*
--conf spark.driver.extraClassPath=$PWD/*
{noformat}
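Until the Spark action loads the gateway's {{spark-defaults.conf}} itself, one workaround is to pass the needed settings explicitly through {{<spark-opts>}} in the workflow definition. The sketch below assumes a YARN cluster deployment; the action name, main class, and jar path are illustrative placeholders, not taken from this issue:

{code:xml}
<!-- Workaround sketch: repeat the spark-defaults.conf settings
     as explicit -\-conf arguments so the launcher forwards them.
     Action name, class, and jar path are hypothetical. -->
<action name="spark-node">
    <spark xmlns="uri:oozie:spark-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <master>yarn-cluster</master>
        <name>MySparkJob</name>
        <class>com.example.MySparkJob</class>
        <jar>${nameNode}/apps/my-spark-job.jar</jar>
        <spark-opts>--conf spark.executor.extraClassPath=/etc/hbase/conf:/etc/hive/conf --conf spark.driver.extraClassPath=/etc/hbase/conf:/etc/hive/conf</spark-opts>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
</action>
{code}

Note that values given in {{<spark-opts>}} may still be merged with (or overridden by) the classpath entries Oozie injects for the launcher's working directory.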
> Oozie Spark Action - the spark job can't load the properties in spark-defaults.conf.
> ------------------------------------------------------------------------------------
>
> Key: OOZIE-3228
> URL: https://issues.apache.org/jira/browse/OOZIE-3228
> Project: Oozie
> Issue Type: Bug
> Components: action
> Affects Versions: 4.3.1
> Reporter: Tang Yan
> Assignee: Andras Piros
> Priority: Major
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)