Posted to user@phoenix.apache.org by Akhilesh Pathodia <pa...@gmail.com> on 2015/12/07 20:54:00 UTC

Spark on hbase using Phoenix in secure cluster

Hi,

I am running a Spark job on YARN in cluster mode on a secured cluster. I am
trying to run Spark on HBase using Phoenix, but the Spark executors are unable
to get an HBase connection through Phoenix. I run the kinit command to get the
ticket before starting the job, and the keytab file and principal are
correctly specified in the connection URL. But the Spark job on each node
still throws the error below:

15/12/01 03:23:15 ERROR ipc.AbstractRpcClient: SASL authentication failed.
The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by
GSSException: No valid credentials provided (Mechanism level: Failed to
find any Kerberos tgt)]
        at
com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
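
For reference, this is roughly how the connection is being made. The sketch below builds a Phoenix JDBC URL with the principal and keytab embedded, following Phoenix's documented `jdbc:phoenix:<zk quorum>:<port>:<zk root>:<principal>:<keytab>` form; all host names, realms, and paths are hypothetical placeholders, not the actual cluster's values.

```java
// Sketch: a Phoenix JDBC URL that embeds the Kerberos principal and the
// keytab path. Hosts, realm, and file paths below are made-up examples.
public class PhoenixSecureUrl {

    static String phoenixUrl(String zkQuorum, int zkPort, String zkRoot,
                             String principal, String keytabPath) {
        // Phoenix URL segments are colon-separated.
        return "jdbc:phoenix:" + zkQuorum + ":" + zkPort + ":" + zkRoot
                + ":" + principal + ":" + keytabPath;
    }

    public static void main(String[] args) {
        String url = phoenixUrl("zk1.example.com", 2181, "/hbase-secure",
                "user@EXAMPLE.COM", "/etc/security/keytabs/user.keytab");
        System.out.println(url);
        // On a real cluster one would then open the connection, e.g.:
        // Connection conn = DriverManager.getConnection(url);
    }
}
```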

I am using Spark 1.3.1, HBase 1.0.0, and Phoenix 4.3. I am able to run Spark on
HBase (without Phoenix) successfully in yarn-client mode as described in
this link:

https://github.com/cloudera-labs/SparkOnHBase#scan-that-works-on-kerberos

Also, I found that there is a known issue with yarn-cluster mode in Spark
1.3.1:

https://issues.apache.org/jira/browse/SPARK-6918
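
(For reference: later Spark releases, 1.4 and up, added `--principal` and `--keytab` options to spark-submit for YARN so the application logs in from the keytab on the cluster side, since the driver machine's local kinit ticket cache is not visible to executors. A hypothetical invocation; the job class, jar, principal, and paths are placeholders:)

```shell
# Hypothetical spark-submit invocation using the Spark 1.4+ Kerberos options.
# The keytab is shipped to the cluster and used to log in where the job runs,
# instead of relying on the local ticket cache obtained via kinit.
spark-submit \
  --master yarn-cluster \
  --principal user@EXAMPLE.COM \
  --keytab /etc/security/keytabs/user.keytab \
  --class com.example.PhoenixJob \
  phoenix-job.jar
```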

Has anybody been successful in running Spark on HBase using Phoenix in
yarn-cluster or yarn-client mode?

Thanks,
Akhilesh Pathodia

Re: Spark on hbase using Phoenix in secure cluster

Posted by Akhilesh Pathodia <pa...@gmail.com>.
I had obtained the Kerberos TGT, as I had run the kinit command with the right
keytab file and principal before starting the Spark job. Moreover, I am able
to get the HBase connection when running Spark on HBase (without Phoenix) in
yarn-client mode, but I am unable to connect to HBase when using Phoenix.

Thanks,
Akhilesh

On Tue, Dec 8, 2015 at 1:27 AM, James Heather <ja...@mendeley.com>
wrote:

> I have no idea what the right way to solve it is, but this is a Kerberos
> error: the cluster is expecting you to have a Kerberos ticket-granting
> ticket ('tgt') but you haven't got one.
>
> Its suggestion of using 'kinit' is pointing you towards a way of getting
> such a ticket: 'kinit' is the Linux command for starting a Kerberos session
> and retrieving a ticket. But to use it, you need to have the right Kerberos
> config on the client.
>
> James

Re: Spark on hbase using Phoenix in secure cluster

Posted by James Heather <ja...@mendeley.com>.
I have no idea what the right way to solve it is, but this is a Kerberos 
error: the cluster is expecting you to have a Kerberos ticket-granting 
ticket ('tgt') but you haven't got one.

Its suggestion of using 'kinit' is pointing you towards a way of getting 
such a ticket: 'kinit' is the Linux command for starting a Kerberos 
session and retrieving a ticket. But to use it, you need to have the 
right Kerberos config on the client.

James
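
(The usual sequence looks something like the following; the principal and keytab path are placeholders. `kinit -kt` obtains the ticket non-interactively from a keytab, and `klist` shows what is currently in the ticket cache:)

```shell
# Obtain a TGT from a keytab instead of typing a password
# (principal and keytab path are made-up examples):
kinit -kt /etc/security/keytabs/user.keytab user@EXAMPLE.COM

# Verify that a valid ticket is now in the credential cache:
klist
```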
