Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2019/06/24 18:40:00 UTC
[jira] [Created] (SPARK-28150) Failure to create multiple contexts in same JVM with Kerberos auth
Marcelo Vanzin created SPARK-28150:
--------------------------------------
Summary: Failure to create multiple contexts in same JVM with Kerberos auth
Key: SPARK-28150
URL: https://issues.apache.org/jira/browse/SPARK-28150
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 3.0.0
Reporter: Marcelo Vanzin
Take the following small app that creates multiple contexts (not concurrently):
{code}
from pyspark.context import SparkContext
import time

for i in range(2):
    with SparkContext() as sc:
        pass
    time.sleep(5)
{code}
This fails when Kerberos (without delegation token renewal) is being used:
{noformat}
19/06/24 11:33:58 ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.security.HBaseDelegationTokenProvider.obtainDelegationTokens(HBaseDelegationTokenProvider.scala:49)
Caused by: org.apache.hadoop.hbase.shaded.com.google.protobuf.ServiceException: Error calling method hbase.pb.AuthenticationService.GetAuthenticationToken
at org.apache.hadoop.hbase.client.SyncCoprocessorRpcChannel.callBlockingMethod(SyncCoprocessorRpcChannel.java:71)
{noformat}
If you enable delegation token renewal, things work, since the code takes a slightly different path when generating the initial delegation tokens.
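For reference, a minimal sketch of how renewal can be enabled at submit time by providing a principal and keytab (the keytab path and principal below are placeholders, not values from this report):

```shell
# Hypothetical submission enabling delegation token renewal;
# principal and keytab path are placeholders for your environment.
spark-submit \
  --principal user@EXAMPLE.COM \
  --keytab /path/to/user.keytab \
  repro.py
```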
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org