Posted to issues@kylin.apache.org by "Chao Long (JIRA)" <ji...@apache.org> on 2018/11/23 06:55:00 UTC

[jira] [Issue Comment Deleted] (KYLIN-3699) SparkCubingByLayer. Root cause: null

     [ https://issues.apache.org/jira/browse/KYLIN-3699?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chao Long updated KYLIN-3699:
-----------------------------
    Comment: was deleted

(was: I checked the code, and it does look like a bug here. Could you fix it and then contribute a patch to Kylin? Thank you! :D)

> SparkCubingByLayer. Root cause: null 
> -------------------------------------
>
>                 Key: KYLIN-3699
>                 URL: https://issues.apache.org/jira/browse/KYLIN-3699
>             Project: Kylin
>          Issue Type: Bug
>          Components: Spark Engine
>    Affects Versions: v2.5.0
>         Environment: hdp.version: 2.5.0.0-1245
> kylin: 2.5.0 (apache-kylin-2.5.0-bin-hbase1x)
>            Reporter: 风语者
>            Priority: Major
>         Attachments: Error.png, 异常信息.png, 异常步骤.png
>
>
> Hello,
>       I have recently been using Kylin, and an exception frequently occurs when building a cube with the Spark engine. I do not know how to resolve it. The exception is as follows:
> Exception in thread "main" java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer. Root cause: null
> 	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
> 	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:744)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.NullPointerException
> 	at org.apache.kylin.engine.mr.common.CubeStatsReader.estimateLayerSize(CubeStatsReader.java:297)
> 	at org.apache.kylin.engine.spark.SparkUtil.estimateLayerPartitionNum(SparkUtil.java:108)
> 	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:182)
> 	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
>         I found that this exception is triggered when the join between the fact table and the lookup table produces 0 records. The error is reported at {color:#FF0000}*8 Step Name: Build Cube with Spark*{color}. The Spark engine is the Spark 2.1.2 bundled with Kylin. Cluster environment: hdp.version: 2.5.0.0-1245
>        If I build with the MapReduce engine instead, the build succeeds: even with no data, the job flow completes normally.
>         I do not know how to solve this problem and hope to get a reply.
>         Thank you very much.
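For illustration only, here is a minimal, hypothetical Java sketch (not the actual Kylin code) of the kind of defensive guard that would avoid a NullPointerException of this shape: when the joined flat table has zero rows, a per-cuboid statistics map can have no entry for a layer, and unboxing the missing value throws. All class and field names below other than those in the stack trace are assumptions.

    import java.util.Collections;
    import java.util.Map;

    // Hypothetical stand-in for the statistics lookup that fails in
    // CubeStatsReader.estimateLayerSize when the build input has zero rows.
    public class LayerSizeEstimator {

        private final Map<Long, Double> cuboidSizeMap; // cuboid id -> estimated size in MB

        public LayerSizeEstimator(Map<Long, Double> cuboidSizeMap) {
            // Treat missing statistics as an empty map rather than null.
            this.cuboidSizeMap = cuboidSizeMap == null
                    ? Collections.<Long, Double>emptyMap() : cuboidSizeMap;
        }

        // Returns the estimated size of one layer; yields 0 when no statistics
        // exist instead of unboxing a null Double and throwing NullPointerException.
        public double estimateLayerSize(Iterable<Long> cuboidsOfLayer) {
            double total = 0;
            for (Long cuboid : cuboidsOfLayer) {
                Double size = cuboidSizeMap.get(cuboid); // null when stats are missing
                if (size != null) {
                    total += size;
                }
            }
            return total;
        }
    }

With an empty statistics map (the zero-row case described above) this returns 0 instead of throwing, which matches the behaviour the reporter observes from the MapReduce engine.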



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)