Posted to user@kylin.apache.org by "java_program@aliyun.com" <ja...@aliyun.com> on 2017/06/28 08:58:37 UTC

Failed to find metadata store by url: kylin_metadata@hbase

Hi,
    I upgraded Kylin to 2.0.
    I can successfully run builds with the MapReduce engine now, but when I build a cube with the Spark engine, it shows the errors below.
How can I solve it? Thanks.


Exception in thread "main" java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer
at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Failed to find metadata store by url: kylin_metadata@hbase
at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:99)
at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
... 10 more
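
For context, a sketch of where that URL comes from, assuming the usual Kylin 2.0 convention (the part before the @ is the metadata table/prefix in HBase, the part after is the store type):

    grep 'kylin.metadata.url' $KYLIN_HOME/conf/kylin.properties
    # typically prints the default: kylin.metadata.url=kylin_metadata@hbase
    # the Spark job must be able to resolve and reach this same store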



java_program@aliyun.com

Re: Re: Failed to find metadata store by url: kylin_metadata@hbase

Posted by Li Yang <li...@apache.org>.
> Caused by: java.lang.IllegalArgumentException: Failed to find metadata
store by url: kylin_metadata@hbase

There should be another exception in the log (typically before this line) telling
the root cause of why connecting to HBase failed. Please look around.
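
For example, assuming the default log location under the Kylin install directory (adjust the path if yours differs), the context just before the failure can be pulled out with:

    grep -n -B 40 "Failed to find metadata store" /usr/local/kylin/logs/kylin.log

The wrapped HBase or ZooKeeper error, if there is one, usually shows up in those preceding lines.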

Yang

On Thu, Jun 29, 2017 at 6:58 PM, java_program@aliyun.com <
java_program@aliyun.com> wrote:

> Hi,
> Here is my configuration in kylin.properties:
>
> kylin.env.hadoop-conf-dir=/usr/local/kylin/hadoop-conf
>
> [kylin@gateway conf]$ ll /usr/local/kylin/hadoop-conf
> total 20
> lrwxrwxrwx 1 kylin kylin    42 Jun 28 13:31 core-site.xml -> /usr/local/hadoop/etc/hadoop/core-site.xml
> lrwxrwxrwx 1 kylin kylin    36 Jun 28 13:32 hbase-site.xml -> /usr/local/hbase/conf/hbase-site.xml
> lrwxrwxrwx 1 kylin kylin    42 Jun 28 13:31 hdfs-site.xml -> /usr/local/hadoop/etc/hadoop/hdfs-site.xml
> -rw-r--r-- 1 kylin kylin 17924 Jun 28 13:33 hive-site.xml
> lrwxrwxrwx 1 kylin kylin    42 Jun 28 13:31 yarn-site.xml -> /usr/local/hadoop/etc/hadoop/yarn-site.xml
>
> and the full log is below:
>
> OS command error exit with 1 -- export HADOOP_CONF_DIR=/usr/local/kylin/hadoop-conf && /usr/local/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1  --conf spark.yarn.jar=hdfs://ns1/kylin/spark/spark-assembly-1.6.0-cdh5.9.0-hadoop2.6.0-cdh5.9.0.jar  --conf spark.yarn.queue=default  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf spark.master=yarn  --conf spark.executor.memory=4G  --conf spark.eventLog.enabled=true  --conf spark.eventLog.dir=hdfs:///kylin/spark-history  --conf spark.executor.cores=2  --conf spark.submit.deployMode=client --files /usr/local/hbase/conf/hbase-site.xml --jars /usr/local/hbase/lib/htrace-core-3.2.0-incubating.jar,/usr/local/hbase/lib/hbase-client-1.2.0-cdh5.9.0.jar,/usr/local/hbase/lib/hbase-common-1.2.0-cdh5.9.0.jar,/usr/local/hbase/lib/hbase-protocol-1.2.0-cdh5.9.0.jar,/usr/local/hbase/lib/metrics-core-2.2.0.jar,/usr/local/hbase/lib/guava-12.0.1.jar, /usr/local/kylin/lib/kylin-job-2.0.0.jar -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_kylin_sales_cube_d7955f9a_d290_4479_866c_5745f7880b81 -output hdfs:///kylin/kylin_metadata/kylin-09df3734-6dcf-41b2-99a1-02e2b4e16f9f/kylin_sales_cube/cuboid/ -segmentId d7955f9a-d290-4479-866c-5745f7880b81 -confPath /usr/local/kylin/conf -cubename kylin_sales_cube
> SparkEntry args:-className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_kylin_sales_cube_d7955f9a_d290_4479_866c_5745f7880b81 -output hdfs:///kylin/kylin_metadata/kylin-09df3734-6dcf-41b2-99a1-02e2b4e16f9f/kylin_sales_cube/cuboid/ -segmentId d7955f9a-d290-4479-866c-5745f7880b81 -confPath /usr/local/kylin/conf -cubename kylin_sales_cube
> Abstract Application args:-hiveTable default.kylin_intermediate_kylin_sales_cube_d7955f9a_d290_4479_866c_5745f7880b81 -output hdfs:///kylin/kylin_metadata/kylin-09df3734-6dcf-41b2-99a1-02e2b4e16f9f/kylin_sales_cube/cuboid/ -segmentId d7955f9a-d290-4479-866c-5745f7880b81 -confPath /usr/local/kylin/conf -cubename kylin_sales_cube
> spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
> spark.driver.cores is set but does not apply in client mode.
> Hive history file=/usr/local/hive2/logs/kylin/hive_job_log_fe557857-7a8e-407f-aeab-94044b59931d_2113658910.txt
> Hive history file=/usr/local/hive2/logs/kylin/hive_job_log_df0eaa5c-b047-43a3-bd07-0cc8c0ec148c_2093334660.txt
> Exception in thread "main" java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer
> 	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
> 	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.IllegalArgumentException: Failed to find metadata store by url: kylin_metadata@hbase
> 	at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:99)
> 	at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
> 	at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
> 	at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
> 	at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
> 	at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
> 	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
> 	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
> 	... 10 more
>
>
> Thank you very much!
>
> ------------------------------
> java_program@aliyun.com
>
>
> *From:* ShaoFeng Shi <sh...@apache.org>
> *Date:* 2017-06-28 19:49
> *To:* user <us...@kylin.apache.org>
> *Subject:* Re: Failed to find metadata store by url: kylin_metadata@hbase
> Hi,
>
> Could you please provide the full log? I'm afraid the root cause wasn't
> included here; there should be another error before or after this. Please
> also check whether hbase-site.xml is in the HADOOP_CONF folder and whether
> the Spark executors can connect to the HBase cluster.
>
> Here is a similar question before: http://apache-kylin.74782.x6.nabble.com/A-problem-in-cube-building-with-SPARK-td8009.html
>
> 2017-06-28 16:58 GMT+08:00 java_program@aliyun.com <
> java_program@aliyun.com>:
>
>> Hi,
>>     I upgraded Kylin to 2.0.
>>     I can successfully run builds with the MapReduce engine now, but when I
>> build a cube with the Spark engine, it shows the errors below.
>> How can I solve it? Thanks.
>>
>>
>> Exception in thread "main" java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer
>> at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
>> at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
>> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> Caused by: java.lang.IllegalArgumentException: Failed to find metadata store by url: kylin_metadata@hbase
>> at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:99)
>> at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
>> at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
>> at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
>> at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
>> at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
>> at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
>> at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
>> ... 10 more
>>
>> ------------------------------
>> java_program@aliyun.com
>>
>
>
>
> --
> Best regards,
>
> Shaofeng Shi 史少锋
>
>

Re: Re: Failed to find metadata store by url: kylin_metadata@hbase

Posted by "java_program@aliyun.com" <ja...@aliyun.com>.
Hi,
Here is my configuration in kylin.properties:
 
kylin.env.hadoop-conf-dir=/usr/local/kylin/hadoop-conf

[kylin@gateway conf]$ ll /usr/local/kylin/hadoop-conf
total 20
lrwxrwxrwx 1 kylin kylin    42 Jun 28 13:31 core-site.xml -> /usr/local/hadoop/etc/hadoop/core-site.xml
lrwxrwxrwx 1 kylin kylin    36 Jun 28 13:32 hbase-site.xml -> /usr/local/hbase/conf/hbase-site.xml
lrwxrwxrwx 1 kylin kylin    42 Jun 28 13:31 hdfs-site.xml -> /usr/local/hadoop/etc/hadoop/hdfs-site.xml
-rw-r--r-- 1 kylin kylin 17924 Jun 28 13:33 hive-site.xml
lrwxrwxrwx 1 kylin kylin    42 Jun 28 13:31 yarn-site.xml -> /usr/local/hadoop/etc/hadoop/yarn-site.xml
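
For reference, a layout like the one above can be assembled with symlinks; the paths below mirror this environment, and the hive-site.xml source directory is a guess (in the listing it is a plain copy rather than a link):

    mkdir -p /usr/local/kylin/hadoop-conf
    ln -s /usr/local/hadoop/etc/hadoop/core-site.xml /usr/local/kylin/hadoop-conf/core-site.xml
    ln -s /usr/local/hadoop/etc/hadoop/hdfs-site.xml /usr/local/kylin/hadoop-conf/hdfs-site.xml
    ln -s /usr/local/hadoop/etc/hadoop/yarn-site.xml /usr/local/kylin/hadoop-conf/yarn-site.xml
    ln -s /usr/local/hbase/conf/hbase-site.xml /usr/local/kylin/hadoop-conf/hbase-site.xml
    cp /usr/local/hive2/conf/hive-site.xml /usr/local/kylin/hadoop-conf/hive-site.xml   # source path is an assumption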

and the full log is below:

OS command error exit with 1 -- export HADOOP_CONF_DIR=/usr/local/kylin/hadoop-conf && /usr/local/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1  --conf spark.yarn.jar=hdfs://ns1/kylin/spark/spark-assembly-1.6.0-cdh5.9.0-hadoop2.6.0-cdh5.9.0.jar  --conf spark.yarn.queue=default  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf spark.master=yarn  --conf spark.executor.memory=4G  --conf spark.eventLog.enabled=true  --conf spark.eventLog.dir=hdfs:///kylin/spark-history  --conf spark.executor.cores=2  --conf spark.submit.deployMode=client --files /usr/local/hbase/conf/hbase-site.xml --jars /usr/local/hbase/lib/htrace-core-3.2.0-incubating.jar,/usr/local/hbase/lib/hbase-client-1.2.0-cdh5.9.0.jar,/usr/local/hbase/lib/hbase-common-1.2.0-cdh5.9.0.jar,/usr/local/hbase/lib/hbase-protocol-1.2.0-cdh5.9.0.jar,/usr/local/hbase/lib/metrics-core-2.2.0.jar,/usr/local/hbase/lib/guava-12.0.1.jar, /usr/local/kylin/lib/kylin-job-2.0.0.jar -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_kylin_sales_cube_d7955f9a_d290_4479_866c_5745f7880b81 -output hdfs:///kylin/kylin_metadata/kylin-09df3734-6dcf-41b2-99a1-02e2b4e16f9f/kylin_sales_cube/cuboid/ -segmentId d7955f9a-d290-4479-866c-5745f7880b81 -confPath /usr/local/kylin/conf -cubename kylin_sales_cube
SparkEntry args:-className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_kylin_sales_cube_d7955f9a_d290_4479_866c_5745f7880b81 -output hdfs:///kylin/kylin_metadata/kylin-09df3734-6dcf-41b2-99a1-02e2b4e16f9f/kylin_sales_cube/cuboid/ -segmentId d7955f9a-d290-4479-866c-5745f7880b81 -confPath /usr/local/kylin/conf -cubename kylin_sales_cube
Abstract Application args:-hiveTable default.kylin_intermediate_kylin_sales_cube_d7955f9a_d290_4479_866c_5745f7880b81 -output hdfs:///kylin/kylin_metadata/kylin-09df3734-6dcf-41b2-99a1-02e2b4e16f9f/kylin_sales_cube/cuboid/ -segmentId d7955f9a-d290-4479-866c-5745f7880b81 -confPath /usr/local/kylin/conf -cubename kylin_sales_cube
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
spark.driver.cores is set but does not apply in client mode.
Hive history file=/usr/local/hive2/logs/kylin/hive_job_log_fe557857-7a8e-407f-aeab-94044b59931d_2113658910.txt
Hive history file=/usr/local/hive2/logs/kylin/hive_job_log_df0eaa5c-b047-43a3-bd07-0cc8c0ec148c_2093334660.txt
Exception in thread "main" java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer
	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Failed to find metadata store by url: kylin_metadata@hbase
	at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:99)
	at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
	at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
	at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
	at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
	at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
	... 10 more
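
One sanity check worth doing at this point (a sketch; the paths are taken from the command above): the job runs in client mode, so the driver itself needs to read the HBase configuration locally, not only the executors that receive it via --files. For example:

    ls -l /usr/local/kylin/hadoop-conf/hbase-site.xml
    # should exist and be readable by the user running spark-submit
    grep -A 2 'hbase.zookeeper.quorum' /usr/local/hbase/conf/hbase-site.xml
    # shows the ZooKeeper quorum the HBase client will try to reach from this node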

Thank you very much!



java_program@aliyun.com
 
From: ShaoFeng Shi
Date: 2017-06-28 19:49
To: user
Subject: Re: Failed to find metadata store by url: kylin_metadata@hbase
Hi, 

Could you please provide the full log? I'm afraid the root cause wasn't included here; there should be another error before or after this. Please also check whether hbase-site.xml is in the HADOOP_CONF folder and whether the Spark executors can connect to the HBase cluster.

Here is a similar question before: http://apache-kylin.74782.x6.nabble.com/A-problem-in-cube-building-with-SPARK-td8009.html

2017-06-28 16:58 GMT+08:00 java_program@aliyun.com <ja...@aliyun.com>:
Hi,
    I upgraded Kylin to 2.0.
    I can successfully run builds with the MapReduce engine now, but when I build a cube with the Spark engine, it shows the errors below.
How can I solve it? Thanks.


Exception in thread "main" java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer
at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Failed to find metadata store by url: kylin_metadata@hbase
at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:99)
at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
... 10 more



java_program@aliyun.com



-- 
Best regards,

Shaofeng Shi 史少锋


Re: Failed to find metadata store by url: kylin_metadata@hbase

Posted by ShaoFeng Shi <sh...@apache.org>.
Hi,

Could you please provide the full log? I'm afraid the root cause wasn't
included here; there should be another error before or after this. Please
also check whether hbase-site.xml is in the HADOOP_CONF folder and whether
the Spark executors can connect to the HBase cluster.
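
For example, connectivity can be verified from the node that runs spark-submit with something like the following (a sketch; it assumes the hbase client command is on the PATH and that the metadata table has the default name seen in the error message):

    ls -l $HADOOP_CONF_DIR/hbase-site.xml
    echo "exists 'kylin_metadata'" | hbase shell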

Here is a similar question before:
http://apache-kylin.74782.x6.nabble.com/A-problem-in-cube-building-with-SPARK-td8009.html

2017-06-28 16:58 GMT+08:00 java_program@aliyun.com <ja...@aliyun.com>
:

> Hi,
>     I upgraded Kylin to 2.0.
>     I can successfully run builds with the MapReduce engine now, but
> when I build a cube with the Spark engine, it shows the errors below.
> How can I solve it? Thanks.
>
>
> Exception in thread "main" java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer
> at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
> at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.IllegalArgumentException: Failed to find metadata store by url: kylin_metadata@hbase
> at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:99)
> at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
> at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
> at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
> at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
> at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
> at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
> at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
> ... 10 more
>
> ------------------------------
> java_program@aliyun.com
>



-- 
Best regards,

Shaofeng Shi 史少锋