Posted to issues@hbase.apache.org by "Binzi Cao (JIRA)" <ji...@apache.org> on 2016/11/07 22:06:58 UTC

[jira] [Comment Edited] (HBASE-17040) HBase Spark does not work in Kerberos and yarn-master mode

    [ https://issues.apache.org/jira/browse/HBASE-17040?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15645605#comment-15645605 ] 

Binzi Cao edited comment on HBASE-17040 at 11/7/16 10:06 PM:
-------------------------------------------------------------

My {{spark-submit}} command is as follows:

{code}
HADOOP_USER_NAME=spark spark-submit   \
--jars "local:///opt/cloudera/parcels/CDH/jars/hbase-spark-1.2.0-cdh5.8.2.jar" \
--driver-memory 5G \
--executor-memory 5G \
--num-executors  5 \
--deploy-mode cluster \
--files "my_application.conf,hbase.keytab"  \
--class "MySparkApp" --master yarn  \
--driver-java-options "-Dconfig.file=my_application.conf" \
MyHBaseApp.jar
{code}
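
For comparison, on YARN the keytab can also be handed to Spark itself at submit time with {{--principal}} and {{--keytab}} (supported since Spark 1.4), so that Spark logs in and obtains the needed tokens on the driver; whether it also fetches an HBase delegation token depends on the HBase configuration and classes being visible at submit time. A sketch of that variant, with a placeholder principal:

{code}
spark-submit \
--master yarn \
--deploy-mode cluster \
--principal spark@EXAMPLE.COM \
--keytab hbase.keytab \
--class "MySparkApp" \
MyHBaseApp.jar
{code}
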
I tried two different ways to start the job:

1. Run {{kinit}} first as a user with Spark and HBase permissions. The Spark job starts successfully but fails to create the {{HBaseContext}} with the exceptions above.

The submit command works if I change {{--deploy-mode}} to {{client}}.

2. Pass the keytab file to the job and load it in code. The principal/keytab are loaded properly, but the {{HBaseContext}} still cannot be created, because it always uses the current user of the job instead of the keytab credentials (a sketch of this attempt follows below).
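
For reference, a minimal sketch of attempt 2 in Scala, assuming the keytab login goes through Hadoop's {{UserGroupInformation}} and the {{HBaseContext}} is built inside {{doAs}}; the object name, principal, and keytab path are placeholders:

{code}
import java.security.PrivilegedExceptionAction

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.hadoop.security.UserGroupInformation
import org.apache.spark.SparkContext

object KeytabHBaseContext {
  // Hypothetical helper: log in from the keytab and build the HBaseContext
  // as that user, so the token-obtaining call in its constructor
  // (TableMapReduceUtil.initCredentials, per the stack trace below) runs
  // with the keytab credentials rather than the job's current user.
  // The principal and keytab path are placeholders.
  def create(sc: SparkContext): HBaseContext = {
    val ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
      "spark@EXAMPLE.COM", "hbase.keytab")
    ugi.doAs(new PrivilegedExceptionAction[HBaseContext] {
      override def run(): HBaseContext =
        new HBaseContext(sc, HBaseConfiguration.create())
    })
  }
}
{code}

As noted above, in cluster mode the token was still requested as the job's current user, so this sketch alone did not resolve the failure.
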







> HBase Spark does not work in Kerberos and yarn-master mode
> ----------------------------------------------------------
>
>                 Key: HBASE-17040
>                 URL: https://issues.apache.org/jira/browse/HBASE-17040
>             Project: HBase
>          Issue Type: Bug
>          Components: spark
>    Affects Versions: 2.0.0
>         Environment: HBase
> Kerberos
> Yarn
> Cloudera
>            Reporter: Binzi Cao
>
> We are loading HBase records into an RDD with the hbase-spark library on Cloudera. 
> The hbase-spark code works if we submit the job in client mode, but does not work in cluster mode. We get the exceptions below:
> {code}
> 16/11/07 05:43:28 WARN security.UserGroupInformation: PriviledgedActionException as:spark (auth:SIMPLE) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 16/11/07 05:43:28 WARN ipc.RpcClientImpl: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 16/11/07 05:43:28 ERROR ipc.RpcClientImpl: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
> javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
> 	at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
> 	at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:181)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:617)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:162)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:743)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:740)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:740)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
> 	at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
> 	at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:226)
> 	at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:331)
> 	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:34118)
> 	at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1627)
> 	at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:92)
> 	at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:89)
> 	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
> 	at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:95)
> 	at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callBlockingMethod(CoprocessorRpcChannel.java:73)
> 	at org.apache.hadoop.hbase.protobuf.generated.AuthenticationProtos$AuthenticationService$BlockingStub.getAuthenticationToken(AuthenticationProtos.java:4512)
> 	at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:86)
> 	at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:111)
> 	at org.apache.hadoop.hbase.security.token.TokenUtil$1.run(TokenUtil.java:108)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
> 	at org.apache.hadoop.hbase.security.User$SecureHadoopUser.runAs(User.java:340)
> 	at org.apache.hadoop.hbase.security.token.TokenUtil.obtainToken(TokenUtil.java:108)
> 	at org.apache.hadoop.hbase.security.token.TokenUtil.addTokenForJob(TokenUtil.java:329)
> 	at org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.initCredentials(TableMapReduceUtil.java:490)
> 	at org.apache.hadoop.hbase.spark.HBaseContext.<init>(HBaseContext.scala:70)
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)