Posted to user@kylin.apache.org by Tony Lee <bt...@gmail.com> on 2017/06/01 04:01:02 UTC

Failed to find metadata while building with Spark

Hi team,

I am using Kylin 2.0.0 with HBase 1.2.4 and Hadoop 2.6.2.
I got this error at step 7, "Build Cube with Spark". The MapReduce engine
works fine.

OS command error exit with 1 -- export
HADOOP_CONF_DIR=/opt/kylin/conf/hadoop &&
/opt/kylin/spark/bin/spark-submit --class
org.apache.kylin.common.util.SparkEntry  --conf
spark.executor.instances=1  --conf
spark.yarn.jar=hdfs://hans/kylin/spark/spark-assembly-1.6.3-hadoop2.6.0.jar
 --conf spark.yarn.queue=default  --conf
spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf
spark.io.compression.codec=org.apache.spark.io.SnappyCompressionCodec
--conf spark.master=yarn  --conf spark.executor.memory=4G  --conf
spark.eventLog.enabled=true  --conf
spark.eventLog.dir=hdfs:///kylin/spark-history  --conf
spark.executor.cores=2  --conf spark.submit.deployMode=cluster --files
/app/hbase-1.2.4/conf/hbase-site.xml --jars
/app/hadoop-2.6.2/share/hadoop/common/lib/htrace-core-3.0.4.jar,/app/hbase-1.2.4/lib/htrace-core-3.1.0-incubating.jar,/app/hbase-1.2.4/lib/hbase-client-1.2.4.jar,/app/hbase-1.2.4/lib/hbase-common-1.2.4.jar,/app/hbase-1.2.4/lib/hbase-protocol-1.2.4.jar,/app/hbase-1.2.4/lib/metrics-core-2.2.0.jar,/app/hbase-1.2.4/lib/guava-12.0.1.jar,
/opt/kylin/lib/kylin-job-2.0.0.jar -className
org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable
default.kylin_intermediate_medusa_play_cube_7d81350f_0c27_4125_abc1_6ecf556647ad
-output hdfs:///kylin/kylin_metadata/kylin-0cc443b3-e5aa-4262-8c7d-e9138183bd80/medusa_play_cube/cuboid/
-segmentId 7d81350f-0c27-4125-abc1-6ecf556647ad -confPath
/opt/kylin/conf -cubename medusa_play_cube
SparkEntry args:-className
org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable
default.kylin_intermediate_medusa_play_cube_7d81350f_0c27_4125_abc1_6ecf556647ad
-output hdfs:///kylin/kylin_metadata/kylin-0cc443b3-e5aa-4262-8c7d-e9138183bd80/medusa_play_cube/cuboid/
-segmentId 7d81350f-0c27-4125-abc1-6ecf556647ad -confPath
/opt/kylin/conf -cubename medusa_play_cube
Abstract Application args:-hiveTable
default.kylin_intermediate_medusa_play_cube_7d81350f_0c27_4125_abc1_6ecf556647ad
-output hdfs:///kylin/kylin_metadata/kylin-0cc443b3-e5aa-4262-8c7d-e9138183bd80/medusa_play_cube/cuboid/
-segmentId 7d81350f-0c27-4125-abc1-6ecf556647ad -confPath
/opt/kylin/conf -cubename medusa_play_cube
spark.driver.cores is set but does not apply in client mode.
17/06/01 11:24:30 ERROR ResourceStore: Create new store instance failed
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
	at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:91)
	at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
	at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
	at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
	at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
	at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: File not exist by
'kylin_metadata@hbase':
/app/apache-kylin-2.0.0-bin/kylin_metadata@hbase
	at org.apache.kylin.common.persistence.FileResourceStore.<init>(FileResourceStore.java:49)
	... 22 more
17/06/01 11:24:30 ERROR ResourceStore: Create new store instance failed
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
	at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:91)
	at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
	at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
	at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
	at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
	at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException:
Can't get the locations
	at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:319)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:210)
	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:327)
	at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:302)
	at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:167)
	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:162)
	at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:797)
	at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:602)
	at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:366)
	at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:406)
	at org.apache.kylin.storage.hbase.HBaseConnection.tableExists(HBaseConnection.java:251)
	at org.apache.kylin.storage.hbase.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:273)
	at org.apache.kylin.storage.hbase.HBaseResourceStore.createHTableIfNeeded(HBaseResourceStore.java:90)
	at org.apache.kylin.storage.hbase.HBaseResourceStore.<init>(HBaseResourceStore.java:86)
	... 22 more
Exception in thread "main" java.lang.RuntimeException: error execute
org.apache.kylin.engine.spark.SparkCubingByLayer
	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalArgumentException: Failed to find metadata
store by url: kylin_metadata@hbase
	at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:99)
	at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
	at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
	at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
	at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
	at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
	... 10 more


I have already put hbase-site.xml into the directory
HADOOP_CONF_DIR=/opt/kylin/conf/hadoop:


$ ll
total 8
lrwxrwxrwx. 1 hadoop hadoop   36 May 31 19:07 core-site.xml ->
/opt/hadoop/etc/hadoop/core-site.xml
lrwxrwxrwx. 1 hadoop hadoop   30 May 31 19:10 hbase-site.xml ->
/opt/hbase/conf/hbase-site.xml
lrwxrwxrwx. 1 hadoop hadoop   36 May 31 19:07 hdfs-site.xml ->
/opt/hadoop/etc/hadoop/hdfs-site.xml
-rwxr-xr-x. 1 hadoop hadoop 4559 May 31 19:24 hive-site.xml
lrwxrwxrwx. 1 hadoop hadoop   38 May 31 19:07 mapred-site.xml ->
/opt/hadoop/etc/hadoop/mapred-site.xml
lrwxrwxrwx. 1 hadoop hadoop   36 May 31 19:08 yarn-site.xml ->
/opt/hadoop/etc/hadoop/yarn-site.xml
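
One way to sanity-check which HBase cluster a given hbase-site.xml points at is to extract its hbase.zookeeper.quorum value. The sample file and quorum hosts below are made up so the commands run as-is; on the real cluster, grep the actual file at /opt/kylin/conf/hadoop/hbase-site.xml instead.

```shell
# Write a minimal sample hbase-site.xml (hypothetical contents) so the
# commands below are self-contained; on a real cluster, grep the actual
# config file instead of this sample.
cat > /tmp/hbase-site-sample.xml <<'EOF'
<configuration>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>zk1,zk2,zk3</value>
  </property>
</configuration>
EOF

# Extract the ZooKeeper quorum the HBase client would connect to.
grep -A1 'hbase.zookeeper.quorum' /tmp/hbase-site-sample.xml \
  | grep -o '<value>[^<]*</value>'
```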

What did I miss?
Any suggestions would help. Thanks.

Re: Failed to find metadata while building with Spark

Posted by ShaoFeng Shi <sh...@apache.org>.
Hi Tony,

The error indicates the executors couldn't connect to the HBase cluster; the
reason might be that a wrong config file was passed (see
https://community.hortonworks.com/articles/74998/while-trying-to-connect-to-the-hbase-cluster-from.html
).

From the info you provided I see there are two hbase-site.xml files; please
check whether they are the same:

diff /opt/hbase/conf/hbase-site.xml /app/hbase-1.2.4/conf/hbase-site.xml
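
As a sketch of what a mismatch would look like (the files and quorum values below are made up for illustration; on the real cluster, diff the two paths above directly):

```shell
# Create two hypothetical hbase-site.xml fragments with different
# ZooKeeper quorums, to illustrate the kind of mismatch that makes
# executors target the wrong cluster.
mkdir -p /tmp/conf-a /tmp/conf-b
printf '<property><name>hbase.zookeeper.quorum</name><value>zk-old</value></property>\n' \
  > /tmp/conf-a/hbase-site.xml
printf '<property><name>hbase.zookeeper.quorum</name><value>zk-new</value></property>\n' \
  > /tmp/conf-b/hbase-site.xml

# diff exits non-zero when the files differ; flag it explicitly.
if ! diff -q /tmp/conf-a/hbase-site.xml /tmp/conf-b/hbase-site.xml >/dev/null; then
  echo "hbase-site.xml files differ -- executors may target the wrong cluster"
fi
```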


2017-06-01 12:01 GMT+08:00 Tony Lee <bt...@gmail.com>:

> Hi, team
>
> I am using kylin 2.0.0 with hbase 1.2.4 and hadoop 2.6.2.
> I got this error at  step 7: Build Cube with Spark.  MapReduce engine works
> fine .
>
> OS command error exit with 1 -- export HADOOP_CONF_DIR=/opt/kylin/conf/hadoop && /opt/kylin/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1  --conf spark.yarn.jar=hdfs://hans/kylin/spark/spark-assembly-1.6.3-hadoop2.6.0.jar  --conf spark.yarn.queue=default  --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf spark.io.compression.codec=org.apache.spark.io.SnappyCompressionCodec  --conf spark.master=yarn  --conf spark.executor.memory=4G  --conf spark.eventLog.enabled=true  --conf spark.eventLog.dir=hdfs:///kylin/spark-history  --conf spark.executor.cores=2  --conf spark.submit.deployMode=cluster --files /app/hbase-1.2.4/conf/hbase-site.xml --jars /app/hadoop-2.6.2/share/hadoop/common/lib/htrace-core-3.0.4.jar,/app/hbase-1.2.4/lib/htrace-core-3.1.0-incubating.jar,/app/hbase-1.2.4/lib/hbase-client-1.2.4.jar,/app/hbase-1.2.4/lib/hbase-common-1.2.4.jar,/app/hbase-1.2.4/lib/hbase-protocol-1.2.4.jar,/app/hbase-1.2.4/lib/metrics-core-2.2.0.jar,/app/hbase-1.2.4/lib/guava-12.0.1.jar, /opt/kylin/lib/kylin-job-2.0.0.jar -className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_medusa_play_cube_7d81350f_0c27_4125_abc1_6ecf556647ad -output hdfs:///kylin/kylin_metadata/kylin-0cc443b3-e5aa-4262-8c7d-e9138183bd80/medusa_play_cube/cuboid/ -segmentId 7d81350f-0c27-4125-abc1-6ecf556647ad -confPath /opt/kylin/conf -cubename medusa_play_cube
> SparkEntry args:-className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable default.kylin_intermediate_medusa_play_cube_7d81350f_0c27_4125_abc1_6ecf556647ad -output hdfs:///kylin/kylin_metadata/kylin-0cc443b3-e5aa-4262-8c7d-e9138183bd80/medusa_play_cube/cuboid/ -segmentId 7d81350f-0c27-4125-abc1-6ecf556647ad -confPath /opt/kylin/conf -cubename medusa_play_cube
> Abstract Application args:-hiveTable default.kylin_intermediate_medusa_play_cube_7d81350f_0c27_4125_abc1_6ecf556647ad -output hdfs:///kylin/kylin_metadata/kylin-0cc443b3-e5aa-4262-8c7d-e9138183bd80/medusa_play_cube/cuboid/ -segmentId 7d81350f-0c27-4125-abc1-6ecf556647ad -confPath /opt/kylin/conf -cubename medusa_play_cube
> spark.driver.cores is set but does not apply in client mode.
> 17/06/01 11:24:30 ERROR ResourceStore: Create new store instance failed
> java.lang.reflect.InvocationTargetException
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> 	at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:91)
> 	at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
> 	at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
> 	at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
> 	at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
> 	at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
> 	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
> 	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
> 	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.IllegalArgumentException: File not exist by 'kylin_metadata@hbase': /app/apache-kylin-2.0.0-bin/kylin_metadata@hbase
> 	at org.apache.kylin.common.persistence.FileResourceStore.<init>(FileResourceStore.java:49)
> 	... 22 more
> 17/06/01 11:24:30 ERROR ResourceStore: Create new store instance failed
> java.lang.reflect.InvocationTargetException
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> 	at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:91)
> 	at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
> 	at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
> 	at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
> 	at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
> 	at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
> 	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
> 	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
> 	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
> 	at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:319)
> 	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
> 	at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
> 	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:210)
> 	at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:327)
> 	at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:302)
> 	at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:167)
> 	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:162)
> 	at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:797)
> 	at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:602)
> 	at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:366)
> 	at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:406)
> 	at org.apache.kylin.storage.hbase.HBaseConnection.tableExists(HBaseConnection.java:251)
> 	at org.apache.kylin.storage.hbase.HBaseConnection.createHTableIfNeeded(HBaseConnection.java:273)
> 	at org.apache.kylin.storage.hbase.HBaseResourceStore.createHTableIfNeeded(HBaseResourceStore.java:90)
> 	at org.apache.kylin.storage.hbase.HBaseResourceStore.<init>(HBaseResourceStore.java:86)
> 	... 22 more
> Exception in thread "main" java.lang.RuntimeException: error execute org.apache.kylin.engine.spark.SparkCubingByLayer
> 	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:42)
> 	at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:497)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.IllegalArgumentException: Failed to find metadata store by url: kylin_metadata@hbase
> 	at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:99)
> 	at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
> 	at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
> 	at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
> 	at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
> 	at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
> 	at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
> 	at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
> 	... 10 more
>
>
> i have already put hbase-site.xml into dir HADOOP_CONF_DIR=/opt/kylin/conf/hadoop
>
>
> $ ll
> total 8
> lrwxrwxrwx. 1 hadoop hadoop   36 May 31 19:07 core-site.xml ->
> /opt/hadoop/etc/hadoop/core-site.xml
> lrwxrwxrwx. 1 hadoop hadoop   30 May 31 19:10 hbase-site.xml ->
> /opt/hbase/conf/hbase-site.xml
> lrwxrwxrwx. 1 hadoop hadoop   36 May 31 19:07 hdfs-site.xml ->
> /opt/hadoop/etc/hadoop/hdfs-site.xml
> -rwxr-xr-x. 1 hadoop hadoop 4559 May 31 19:24 hive-site.xml
> lrwxrwxrwx. 1 hadoop hadoop   38 May 31 19:07 mapred-site.xml ->
> /opt/hadoop/etc/hadoop/mapred-site.xml
> lrwxrwxrwx. 1 hadoop hadoop   36 May 31 19:08 yarn-site.xml ->
> /opt/hadoop/etc/hadoop/yarn-site.xml
>
> What did i missed?
> Any suggestion will help, thanks
>



-- 
Best regards,

Shaofeng Shi 史少锋