Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/06/14 08:15:11 UTC

[jira] [Resolved] (SPARK-15940) Simultaneous multiple spark context with different kerberos FID credentials from within one id

     [ https://issues.apache.org/jira/browse/SPARK-15940?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-15940.
-------------------------------
    Resolution: Not A Problem

Multiple Spark contexts in a single JVM are not supported.
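The usual approach is one JVM (and one Spark context) per functional id, with each process logging in from its own keytab rather than from the shared ticket cache. A minimal sketch, assuming YARN and a Spark version that supports spark.yarn.principal / spark.yarn.keytab (1.5+); the principal names and keytab paths below are hypothetical:

    import org.apache.spark.{SparkConf, SparkContext}

    // One JVM per functional id: this process belongs to FID1; a second,
    // separate process would do the same for FID2. The principal and
    // keytab path are hypothetical placeholders.
    val conf = new SparkConf()
      .setAppName("fid1-jobs")
      .set("spark.yarn.principal", "FID1@EXAMPLE.COM")
      .set("spark.yarn.keytab", "/etc/security/keytabs/fid1.keytab")

    // The YARN client logs in from the keytab at startup, so this context
    // never depends on whichever FID the last kinit left in the ticket cache.
    val sc = new SparkContext(conf)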

> Simultaneous multiple spark context with different kerberos FID credentials from within one id
> ----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-15940
>                 URL: https://issues.apache.org/jira/browse/SPARK-15940
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Partha Pratim Ghosh
>
> Spark access can be authorized through Kerberos authentication. If we want two groups of users working on Spark contexts that are authorized by different ids, how do we go about that?
> Our J2EE application runs as user1. From there we are opening two Spark contexts, and we need to open them as follows -
> Spark Context (SC1) - kinit Functional Id (FID1)
> Spark Context (SC2) - kinit Functional Id (FID2)
> We can open the Spark contexts one at a time, but once they are up we often use them in parallel, so there are scenarios where we fire SC1 and SC2 jobs concurrently. In that case, if the default kinit (the last kinit that was done) is for FID2 and we run a job on SC1, we get an exception because FID1 is not validated.
> How do we maintain two Spark contexts through one single user (user1), authorized by two different FIDs? We need this functionality because the authorization differs. Please advise.
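If the application itself has to bind credentials, the explicit-login route the question is reaching for goes through Hadoop's UserGroupInformation rather than the process-wide kinit ticket cache. A sketch of that pattern, assuming keytabs are available for the FIDs (the principal and path below are hypothetical):

    import java.security.PrivilegedExceptionAction
    import org.apache.hadoop.security.UserGroupInformation
    import org.apache.spark.{SparkConf, SparkContext}

    // Log FID1 in explicitly from a keytab instead of relying on the JVM-wide
    // default ticket cache populated by the last kinit. Principal and keytab
    // path are hypothetical placeholders.
    val fid1 = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
      "FID1@EXAMPLE.COM", "/etc/security/keytabs/fid1.keytab")

    // Everything inside run() executes with FID1's credentials.
    val sc1 = fid1.doAs(new PrivilegedExceptionAction[SparkContext] {
      override def run(): SparkContext =
        new SparkContext(new SparkConf().setAppName("fid1"))
    })

Even with explicit logins, the resolution above still applies: a second SparkContext in the same JVM is unsupported, so SC2 under FID2 belongs in its own process. The point of the sketch is only that credentials can be bound to a login object rather than to whatever the last kinit left behind.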



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
