Posted to dev@kylin.apache.org by "Michael James Arockiam (Jira)" <ji...@apache.org> on 2020/08/08 09:28:00 UTC

[jira] [Created] (KYLIN-4690) BUILD CUBE - job fails in Spark cluster mode - #7 Step Name: Build Cube with Spark

Michael James Arockiam created KYLIN-4690:
---------------------------------------------

             Summary: BUILD CUBE - job fails in Spark cluster mode - #7 Step Name: Build Cube with Spark
                 Key: KYLIN-4690
                 URL: https://issues.apache.org/jira/browse/KYLIN-4690
             Project: Kylin
          Issue Type: Bug
          Components: Spark Engine
    Affects Versions: v3.1.0
            Reporter: Michael James Arockiam


BUILD CUBE - the job fails in Spark cluster mode at #7 Step Name: Build Cube with Spark.

Executed command:

export HADOOP_CONF_DIR=/app/kylin/apache-kylin-3.1.0-bin-hbase1x/kylin_hadoop_conf_dir && \
/usr/hdp/current/spark2-client/bin/spark-submit \
  --class org.apache.kylin.common.util.SparkEntry \
  --name "Build Cube with Spark:CBE_DEV[20200102000000_20200103000000]" \
  --conf spark.executor.cores=5 \
  --conf spark.hadoop.yarn.timeline-service.enabled=false \
  --conf spark.hadoop.mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.DefaultCodec \
  --conf spark.executor.memoryOverhead=1024 \
  --conf spark.executor.extraJavaOptions=-Dhdp.version=2.6.4.149-3 \
  --conf spark.master=yarn \
  --conf spark.hadoop.mapreduce.output.fileoutputformat.compress=true \
  --conf spark.executor.instances=5 \
  --conf spark.kryo.register=org.apache.spark.internal.io.FileCommitProtocol.TaskCommitMessage \
  --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=2.6.4.149-3 \
  --conf spark.executor.memory=4G \
  --conf spark.yarn.queue=sgz1-criskapp-haas_dev \
  --conf spark.submit.deployMode=cluster \
  --conf spark.dynamicAllocation.minExecutors=0 \
  --conf spark.network.timeout=600 \
  --conf spark.hadoop.dfs.replication=2 \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  --conf spark.dynamicAllocation.executorIdleTimeout=300 \
  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history \
  --conf spark.driver.memory=5G \
  --conf spark.driver.extraJavaOptions=-Dhdp.version=2.6.4.149-3 \
  --conf spark.io.compression.codec=org.apache.spark.io.SnappyCompressionCodec \
  --conf spark.eventLog.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.eventLog.dir=hdfs:///kylin/spark-history \
  --conf spark.dynamicAllocation.maxExecutors=15 \
  --conf spark.dynamicAllocation.enabled=true \
  --jars /app/kylin/apache-kylin-3.1.0-bin-hbase1x/lib/kylin-job-3.1.0.jar \
  /app/kylin/apache-kylin-3.1.0-bin-hbase1x/lib/kylin-job-3.1.0.jar \
  -className org.apache.kylin.engine.spark.SparkCubingByLayer \
  -hiveTable kylin310.kylin_intermediate_cbe_dev_02f32a29_1d51_0cb0_37ba_825333d38c8d \
  -output hdfs://<IP:PORT>/dev/kylin310/kylin-0f5b105d-4794-e7ce-b329-fd7a83cb1aa2/CBE_DEV/cuboid/ \
  -input hdfs://<IP:PORT>/dev/kylin310/kylin-0f5b105d-4794-e7ce-b329-fd7a83cb1aa2/kylin_intermediate_cbe_dev_02f32a29_1d51_0cb0_37ba_825333d38c8d \
  -segmentId 02f32a29-1d51-0cb0-37ba-825333d38c8d \
  -metaUrl crr_kylin_dev240@hdfs,path=hdfs://<IP:PORT>/dev/kylin310/kylin-0f5b105d-4794-e7ce-b329-fd7a83cb1aa2/CBE_DEV/metadata \
  -cubename CBE_DEV
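
Note: the command above carries no Kerberos credentials. On a kerberized cluster, a cluster-mode submission typically has to pass a principal and keytab so that the driver, which starts inside YARN rather than on the Kylin server, can authenticate and obtain delegation tokens itself. A minimal sketch of what that would look like; the principal and keytab path are placeholders, not values from this environment:

    # Hypothetical sketch: spark-submit's --principal/--keytab options let the
    # YARN ApplicationMaster log in on its own and fetch HDFS/Hive delegation
    # tokens. kylin@EXAMPLE.COM and the keytab path are placeholders.
    /usr/hdp/current/spark2-client/bin/spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --principal kylin@EXAMPLE.COM \
      --keytab /etc/security/keytabs/kylin.keytab \
      ...   # remaining Kylin arguments unchanged from the command above

If that helps, the equivalent Spark properties can presumably be injected through kylin.properties with the kylin.engine.spark-conf. prefix, the same mechanism that produced the --conf entries above, but that is an assumption to verify against your Kylin configuration.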

 

Step Name:

#7 Step Name: Build Cube with Spark:CBE_DEV[20200102000000_20200103000000]

 

Error:

20/08/08 09:23:54 ERROR ApplicationMaster: User class threw exception: java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer. Root cause: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer. Root cause: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
    at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
    at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:646)
Caused by: java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1075)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:142)
    at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:141)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:141)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:138)
    at org.apache.spark.sql.SparkSession.table(SparkSession.scala:619)
    at org.apache.kylin.engine.spark.SparkUtil.getOtherFormatHiveInput(SparkUtil.java:173)
    at org.apache.kylin.engine.spark.SparkUtil.hiveRecordInputRDD(SparkUtil.java:153)
    at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:168)
    at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
    ... 6 more
Caused by: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token can only be issued over connection with kerberos authentication;
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
    at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
    at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105)
    at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35)
    at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289)
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1072)
    ... 16 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token can only be issued over connection with kerberos authentication
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:535)
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:191)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:268)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362)
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
    at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
    at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
    at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
    ... 25 more
Caused by: org.apache.hadoop.ipc.RemoteException(java.io.IOException): Delegation Token can only be issued over connection with kerberos authentication
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1554)
    at org.apache.hadoop.ipc.Client.call(Client.java:1498)
    at org.apache.hadoop.ipc.Client.call(Client.java:1398)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at com.sun.proxy.$Proxy10.getDelegationToken(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getDelegationToken(ClientNamenodeProtocolTranslatorPB.java:985)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:291)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:203)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:185)
    at com.sun.proxy.$Proxy11.getDelegationToken(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getDelegationToken(DFSClient.java:1042)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getDelegationToken(DistributedFileSystem.java:1689)
    at org.apache.hadoop.fs.FileSystem.collectDelegationTokens(FileSystem.java:549)
    at org.apache.hadoop.fs.FileSystem.addDelegationTokens(FileSystem.java:527)
    at org.apache.hadoop.hdfs.DistributedFileSystem.addDelegationTokens(DistributedFileSystem.java:2400)
    at org.apache.tez.common.security.TokenCache.obtainTokensForFileSystemsInternal(TokenCache.java:119)
    at org.apache.tez.common.security.TokenCache.obtainTokensForFileSystemsInternal(TokenCache.java:98)
    at org.apache.tez.common.security.TokenCache.obtainTokensForFileSystems(TokenCache.java:76)
    at org.apache.tez.client.TezClientUtils.setupTezJarsLocalResources(TezClientUtils.java:200)
    at org.apache.tez.client.TezClient.getTezJarResources(TezClient.java:831)
    at org.apache.tez.client.TezClient.start(TezClient.java:355)
    at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:184)
    at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.open(TezSessionState.java:116)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:532)
    ... 39 more

20/08/08 09:23:54 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer. Root cause: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder':)

20/08/08 09:23:54 INFO SparkContext: Invoking stop() from shutdown hook

PS: The same job executes successfully in Spark local mode.
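
The local-mode observation fits the failure mode: in local mode the driver runs inside the Kylin server's JVM and can reuse that process's existing Kerberos ticket, whereas in YARN cluster mode the driver starts on an arbitrary node with no ticket. The stack trace shows the Hive session opened by Spark spinning up a Tez client (TezSessionState.open), and it is Tez's TokenCache that is refused the HDFS delegation token. Two quick checks, sketched below; the keytab path and principal are placeholders, and the hive-site.xml location assumes Kylin staged it into the kylin_hadoop_conf_dir used in the command above:

    # 1) Confirm the user launching the job actually holds a valid TGT
    #    (placeholder principal and keytab path).
    klist || kinit -kt /etc/security/keytabs/kylin.keytab kylin@EXAMPLE.COM

    # 2) Inspect the hive-site.xml the Spark job picks up; the trace shows the
    #    Hive client opening a Tez session, which triggers the token request.
    #    (Assumes hive-site.xml was copied into Kylin's kylin_hadoop_conf_dir.)
    grep -B1 -A2 'hive.execution.engine' \
      /app/kylin/apache-kylin-3.1.0-bin-hbase1x/kylin_hadoop_conf_dir/hive-site.xml

Setting hive.execution.engine=mr in the hive-site.xml visible to the Spark job has been suggested as a workaround for this Tez-triggered token request, but treat that as an assumption to verify on your cluster rather than a confirmed fix.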



--
This message was sent by Atlassian Jira
(v8.3.4#803005)