Posted to dev@accumulo.apache.org by Jorge Machado <jo...@me.com> on 2018/03/23 06:29:32 UTC

Spark and Accumulo Delegation tokens

Hi Guys, 

I’m in the middle of writing a DataSource connector for Apache Spark to connect to Accumulo tablets. Because we have Kerberos it gets a little tricky, since Spark only handles delegation tokens for HBase, Hive, and HDFS.

Would a PR with an implementation of HadoopDelegationTokenProvider for Accumulo be accepted?


Jorge Machado






Re: Spark and Accumulo Delegation tokens

Posted by Jorge Machado <jo...@hotmail.com>.
Thanks!

Should we contribute something like that?

Jorge Machado 
Jorge@jmachado.me


> Am 23.03.2018 um 12:24 schrieb Saisai Shao <sa...@gmail.com>:
> 
> It is in yarn module.
> "org.apache.spark.deploy.yarn.security.ServiceCredentialProvider".
> 
> 2018-03-23 15:10 GMT+08:00 Jorge Machado <jo...@me.com>:
> 
>> Hi Jerry,
>> 
>> where do you see that Class on Spark ? I only found HadoopDelegationTokenManager
>> and I don’t see any way to add my Provider into it.
>> 
>> private def getDelegationTokenProviders: Map[String, HadoopDelegationTokenProvider] = {
>>  val providers = List(new HadoopFSDelegationTokenProvider(fileSystems),
>>    new HiveDelegationTokenProvider,
>>    new HBaseDelegationTokenProvider)
>> 
>>  // Filter out providers for which spark.security.credentials.{service}.enabled is false.
>>  providers
>>    .filter { p => isServiceEnabled(p.serviceName) }
>>    .map { p => (p.serviceName, p) }
>>    .toMap
>> }
>> 
>> 
>> If you could give me a tipp there would be great.
>> Thanks
>> 
>> Jorge Machado
>> 
>> 
>> 
>> 
>> 
>> On 23 Mar 2018, at 07:38, Saisai Shao <sa...@gmail.com> wrote:
>> 
>> I think you can build your own Accumulo credential provider as similar to
>> HadoopDelegationTokenProvider out of Spark, Spark already provided an
>> interface "ServiceCredentialProvider" for user to plug-in customized
>> credential provider.
>> 
>> Thanks
>> Jerry
>> 
>> 2018-03-23 14:29 GMT+08:00 Jorge Machado <jo...@me.com>:
>> 
>> Hi Guys,
>> 
>> I’m on the middle of writing a spark Datasource connector for Apache Spark
>> to connect to Accumulo Tablets, because we have Kerberos it get’s a little
>> trick because Spark only handles the Delegation Tokens from Hbase, hive and
>> hdfs.
>> 
>> Would be a PR for a implementation of HadoopDelegationTokenProvider for
>> Accumulo be accepted ?
>> 
>> 
>> Jorge Machado
>> 
>> 
>> 
>> 
>> 
>> 
>> 
>> 

Re: Spark and Accumulo Delegation tokens

Posted by Saisai Shao <sa...@gmail.com>.
It is in the yarn module:
"org.apache.spark.deploy.yarn.security.ServiceCredentialProvider".

2018-03-23 15:10 GMT+08:00 Jorge Machado <jo...@me.com>:

> Hi Jerry,
>
> where do you see that Class on Spark ? I only found HadoopDelegationTokenManager
> and I don’t see any way to add my Provider into it.
>
> private def getDelegationTokenProviders: Map[String, HadoopDelegationTokenProvider] = {
>   val providers = List(new HadoopFSDelegationTokenProvider(fileSystems),
>     new HiveDelegationTokenProvider,
>     new HBaseDelegationTokenProvider)
>
>   // Filter out providers for which spark.security.credentials.{service}.enabled is false.
>   providers
>     .filter { p => isServiceEnabled(p.serviceName) }
>     .map { p => (p.serviceName, p) }
>     .toMap
> }
>
>
> If you could give me a tipp there would be great.
> Thanks
>
> Jorge Machado
>
>
>
>
>
> On 23 Mar 2018, at 07:38, Saisai Shao <sa...@gmail.com> wrote:
>
> I think you can build your own Accumulo credential provider as similar to
> HadoopDelegationTokenProvider out of Spark, Spark already provided an
> interface "ServiceCredentialProvider" for user to plug-in customized
> credential provider.
>
> Thanks
> Jerry
>
> 2018-03-23 14:29 GMT+08:00 Jorge Machado <jo...@me.com>:
>
> Hi Guys,
>
> I’m on the middle of writing a spark Datasource connector for Apache Spark
> to connect to Accumulo Tablets, because we have Kerberos it get’s a little
> trick because Spark only handles the Delegation Tokens from Hbase, hive and
> hdfs.
>
> Would be a PR for a implementation of HadoopDelegationTokenProvider for
> Accumulo be accepted ?
>
>
> Jorge Machado
>
>
>
>
>
>
>
>

Re: Spark and Accumulo Delegation tokens

Posted by Jorge Machado <jo...@me.com>.
Hi Jerry, 

Where do you see that class in Spark? I only found HadoopDelegationTokenManager, and I don’t see any way to add my provider to it:

private def getDelegationTokenProviders: Map[String, HadoopDelegationTokenProvider] = {
  val providers = List(new HadoopFSDelegationTokenProvider(fileSystems),
    new HiveDelegationTokenProvider,
    new HBaseDelegationTokenProvider)

  // Filter out providers for which spark.security.credentials.{service}.enabled is false.
  providers
    .filter { p => isServiceEnabled(p.serviceName) }
    .map { p => (p.serviceName, p) }
    .toMap
}
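
For context: this hard-coded list is not the extension point. Spark discovers external ServiceCredentialProvider implementations through java.util.ServiceLoader, so a provider shipped outside Spark is registered with a provider-configuration file on the classpath rather than added to this list. A sketch, with a hypothetical provider class name:

```
# src/main/resources/META-INF/services/org.apache.spark.deploy.yarn.security.ServiceCredentialProvider
com.example.accumulo.AccumuloCredentialProvider
```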

If you could give me a tip, that would be great. 
Thanks 

Jorge Machado





> On 23 Mar 2018, at 07:38, Saisai Shao <sa...@gmail.com> wrote:
> 
> I think you can build your own Accumulo credential provider as similar to
> HadoopDelegationTokenProvider out of Spark, Spark already provided an
> interface "ServiceCredentialProvider" for user to plug-in customized
> credential provider.
> 
> Thanks
> Jerry
> 
> 2018-03-23 14:29 GMT+08:00 Jorge Machado <jo...@me.com>:
> 
>> Hi Guys,
>> 
>> I’m on the middle of writing a spark Datasource connector for Apache Spark
>> to connect to Accumulo Tablets, because we have Kerberos it get’s a little
>> trick because Spark only handles the Delegation Tokens from Hbase, hive and
>> hdfs.
>> 
>> Would be a PR for a implementation of HadoopDelegationTokenProvider for
>> Accumulo be accepted ?
>> 
>> 
>> Jorge Machado
>> 
>> 
>> 
>> 
>> 
>> 


Re: Spark and Accumulo Delegation tokens

Posted by Saisai Shao <sa...@gmail.com>.
I think you can build your own Accumulo credential provider, similar to
HadoopDelegationTokenProvider, outside of Spark. Spark already provides an
interface, "ServiceCredentialProvider", for users to plug in a customized
credential provider.

Thanks
Jerry
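
A minimal sketch of what such a plug-in could look like, assuming Spark 2.x's YARN ServiceCredentialProvider trait and the Accumulo 1.7+ delegation-token API. The package, class name, and the body of obtainCredentials are illustrative, not a tested implementation:

```scala
package com.example.accumulo

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.{Credentials, UserGroupInformation}
import org.apache.spark.SparkConf
import org.apache.spark.deploy.yarn.security.ServiceCredentialProvider

// Hypothetical Accumulo credential provider, registered through a
// META-INF/services file so Spark's ServiceLoader can discover it.
class AccumuloCredentialProvider extends ServiceCredentialProvider {

  // Name used in spark.security.credentials.<serviceName>.enabled
  override def serviceName: String = "accumulo"

  // Only fetch tokens when the cluster actually runs Kerberos.
  override def credentialsRequired(hadoopConf: Configuration): Boolean =
    UserGroupInformation.isSecurityEnabled

  override def obtainCredentials(
      hadoopConf: Configuration,
      sparkConf: SparkConf,
      creds: Credentials): Option[Long] = {
    // Sketch: connect to Accumulo as the Kerberos login user, request a
    // delegation token (SecurityOperations.getDelegationToken in Accumulo
    // 1.7+), and add the resulting Hadoop token to `creds`.
    // Return the next renewal time in milliseconds, or None if the token
    // cannot be renewed here.
    None
  }
}
```

Enabling it should then be a matter of shipping the jar with the application; spark.security.credentials.accumulo.enabled defaults to true.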

2018-03-23 14:29 GMT+08:00 Jorge Machado <jo...@me.com>:

> Hi Guys,
>
> I’m on the middle of writing a spark Datasource connector for Apache Spark
> to connect to Accumulo Tablets, because we have Kerberos it get’s a little
> trick because Spark only handles the Delegation Tokens from Hbase, hive and
> hdfs.
>
> Would be a PR for a implementation of HadoopDelegationTokenProvider for
> Accumulo be accepted ?
>
>
> Jorge Machado
>
>
>
>
>
>
