Posted to issues@kylin.apache.org by "Shaofeng SHI (JIRA)" <ji...@apache.org> on 2018/12/04 01:08:00 UTC

[jira] [Commented] (KYLIN-3607) can't build cube with spark in v2.5.0

    [ https://issues.apache.org/jira/browse/KYLIN-3607?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16708040#comment-16708040 ] 

Shaofeng SHI commented on KYLIN-3607:
-------------------------------------

Interesting. I see your HBase version is 1.1.2, but in the HBase source code of that version there is no "MetricsIO.java", so where does it come from? Can you double-check that there are no other versions of the HBase jar files on your cluster?
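
One quick way to do that check (a suggestion on top of the thread, using only standard JDK APIs) is to ask the running JVM which jar a suspect class was actually loaded from, e.g. from a small driver program run with the same classpath as the job. "WhichJar" is an illustrative name, not something from this thread:

    // Minimal sketch: print the jar that a given class was loaded from,
    // to spot mixed HBase versions on the cluster.
    public class WhichJar {
        public static void main(String[] args) throws Exception {
            Class<?> c = Class.forName("org.apache.hadoop.hbase.io.hfile.HFile");
            // getCodeSource() is null only for bootstrap classes, so for a
            // jar-provided class this prints the jar's file URL.
            System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
        }
    }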

> can't build cube with spark in v2.5.0
> -------------------------------------
>
>                 Key: KYLIN-3607
>                 URL: https://issues.apache.org/jira/browse/KYLIN-3607
>             Project: Kylin
>          Issue Type: Bug
>            Reporter: ANIL KUMAR
>            Priority: Major
>
> In Kylin v2.5.0, the cube cannot be built at step 8 "Convert Cuboid Data to HFile"; the following is the related exception:
>  
> ERROR yarn.ApplicationMaster: User class threw exception: java.lang.RuntimeException: error execute org.apache.kylin.storage.hbase.steps.SparkCubeHFile. Root cause: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, java.lang.ExceptionInInitializerError
>  at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.getNewWriter(HFileOutputFormat2.java:247)
>  at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:194)
>  at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:152)
>  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply$mcV$sp(PairRDDFunctions.scala:1125)
>  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1123)
>  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12$$anonfun$apply$4.apply(PairRDDFunctions.scala:1123)
>  at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1353)
>  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1131)
>  at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1$$anonfun$12.apply(PairRDDFunctions.scala:1102)
>  at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
>  at org.apache.spark.scheduler.Task.run(Task.scala:99)
>  at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:325)
>  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>  at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException: Could not create interface org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory Is the hadoop compatibility jar on the classpath?
>  at org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:73)
>  at org.apache.hadoop.hbase.io.MetricsIO.<init>(MetricsIO.java:31)
>  at org.apache.hadoop.hbase.io.hfile.HFile.<clinit>(HFile.java:192)
>  ... 15 more
> Caused by: java.util.NoSuchElementException
>  at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:365)
>  at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
>  at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
>  at org.apache.hadoop.hbase.CompatibilitySingletonFactory.getInstance(CompatibilitySingletonFactory.java:59)
>  ... 17 more
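
For context on the bottom of this trace: java.util.ServiceLoader throws NoSuchElementException from nextService() when no provider of the requested interface is registered on the classpath, and HBase's CompatibilitySingletonFactory uses exactly that mechanism to locate MetricsRegionServerSourceFactory (the implementation ships in the hadoop compatibility jar, hence the "Is the hadoop compatibility jar on the classpath?" hint in the message). A minimal sketch of the failing pattern, with an illustrative interface name rather than HBase's real class layout:

    import java.util.Iterator;
    import java.util.ServiceLoader;

    // Illustrative stand-in for
    // org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactory.
    interface MetricsFactory {}

    public class ServiceLoaderDemo {
        public static void main(String[] args) {
            // ServiceLoader scans META-INF/services/<interface-name> entries
            // on the classpath for provider implementations.
            Iterator<MetricsFactory> it =
                    ServiceLoader.load(MetricsFactory.class).iterator();
            // With no provider jar on the classpath, next() throws
            // java.util.NoSuchElementException -- the same exception seen at
            // the bottom of the stack trace above.
            MetricsFactory factory = it.next();
        }
    }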



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)