Posted to issues@spark.apache.org by "Doug Balog (JIRA)" <ji...@apache.org> on 2016/06/30 13:17:10 UTC

[jira] [Commented] (SPARK-16298) spark.yarn.principal not working

    [ https://issues.apache.org/jira/browse/SPARK-16298?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15357064#comment-15357064 ] 

Doug Balog commented on SPARK-16298:
------------------------------------

Which version?
Are you running in client mode or cluster mode?
I believe it doesn't work in cluster mode because YARN has already started your application master and executors before your code runs.
Try setting `spark.yarn.principal` and `spark.yarn.keytab` on the command line via `spark-submit --conf`.
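For example (the principal, keytab path, application class, and jar name below are placeholders):

    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --conf spark.yarn.principal=user@EXAMPLE.COM \
      --conf spark.yarn.keytab=/etc/security/keytabs/user.keytab \
      --class com.example.MyApp \
      my-app.jar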



> spark.yarn.principal not working
> --------------------------------
>
>                 Key: SPARK-16298
>                 URL: https://issues.apache.org/jira/browse/SPARK-16298
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Partha Pratim Ghosh
>
> I am creating a Spark configuration with spark.yarn.principal and spark.yarn.keytab set. However, this does not authenticate to the underlying HDFS with that principal and keytab; instead, it seems to pick up credentials from the ticket cache. Without this working, spark.yarn.principal and spark.yarn.keytab don't seem to serve much purpose.
> Sample code:
> SparkConf conf = new SparkConf()
>         .setMaster("yarn-client")
>         .setAppName("spark-test")
>         .set("spark.repl.class.uri", classServerUri);
> conf.set("spark.yarn.principal", principal);
> conf.set("spark.yarn.keytab", keytab);
> conf.setSparkHome(sparkBasePath);
> if (execUri != null) {
>     conf.set("spark.executor.uri", execUri);
> }
> conf.set("spark.executor.memory", "8g");
> conf.set("spark.scheduler.mode", "FAIR");
> SparkContext sparkContext = new SparkContext(conf);
> Please advise how this can be achieved.
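
In client mode, one possible workaround is to log in to Kerberos explicitly before constructing the SparkContext. A minimal sketch, assuming Hadoop's UserGroupInformation API and a kerberized core-site.xml are on the classpath; the class name, principal, and keytab path are placeholders:

    import org.apache.hadoop.security.UserGroupInformation;
    import org.apache.spark.SparkConf;
    import org.apache.spark.SparkContext;

    public class KerberosLoginSketch {
        public static void main(String[] args) throws Exception {
            // Log in via Hadoop's UGI API so the driver's HDFS client
            // uses the keytab rather than the local ticket cache.
            // Principal and keytab path below are placeholders.
            UserGroupInformation.loginUserFromKeytab(
                    "user@EXAMPLE.COM",
                    "/etc/security/keytabs/user.keytab");

            SparkConf conf = new SparkConf()
                    .setMaster("yarn-client")
                    .setAppName("spark-test");
            SparkContext sc = new SparkContext(conf);
            sc.stop();
        }
    }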



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org