Posted to dev@kylin.apache.org by XiongQiuQiu <zh...@gmail.com> on 2017/03/17 03:00:24 UTC
Getting an error when build cube at Convert Cuboid Data to HFile
I ran sample.sh and then built the cube kylin_sales_cube.
The build fails at step #17, "Convert Cuboid Data to HFile".
Here is the log:
java.lang.IllegalArgumentException: Wrong FS: file:/opt/hadoop/tmp/partitions_48a40473-5dc9-4c50-b367-0a765137d33c, expected: hdfs://n-vsan-12:9000
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:643)
    at org.apache.hadoop.fs.FileSystem.makeQualified(FileSystem.java:463)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configurePartitioner(HFileOutputFormat2.java:672)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:517)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2.configureIncrementalLoad(HFileOutputFormat2.java:476)
    at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat.configureIncrementalLoad(HFileOutputFormat.java:91)
    at org.apache.kylin.storage.hbase.steps.CubeHFileJob.run(CubeHFileJob.java:99)
    at org.apache.kylin.engine.mr.MRUtil.runMRJob(MRUtil.java:92)
    at org.apache.kylin.engine.mr.common.MapReduceExecutable.doWork(MapReduceExecutable.java:120)
    at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:113)
    at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:57)
    at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:113)
    at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:136)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
How can I fix this? Thank you.
--
View this message in context: http://apache-kylin.74782.x6.nabble.com/Getting-an-error-when-build-cube-at-Convert-Cuboid-Data-to-HFile-tp7456.html
Sent from the Apache Kylin mailing list archive at Nabble.com.
Re: Getting an error when build cube at Convert Cuboid Data to HFile
Posted by Li Yang <li...@apache.org>.
> java.lang.IllegalArgumentException: Wrong FS: file:/opt/hadoop/tmp/partitions_48a40473-5dc9-4c50-b367-0a765137d33c, expected: hdfs://n-vsan-12:9000
Try analyzing your config and see where the raw file system path "file:/opt/hadoop/..." comes in. It is not expected.
Yang
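To see why Hadoop rejects the path, here is a minimal, self-contained sketch of the scheme check that FileSystem.checkPath performs: a path's URI scheme must match the file system's own URI, so a file:/ path handed to an HDFS FileSystem instance throws "Wrong FS". The class WrongFsDemo and its checkPath method are illustrative simplifications, not Hadoop's actual code; the partitions file name is shortened for the example.

```java
import java.net.URI;

public class WrongFsDemo {
    // Simplified version of Hadoop's FileSystem.checkPath: reject any path
    // whose URI scheme differs from the file system's configured URI.
    static void checkPath(URI fsUri, URI path) {
        String pathScheme = path.getScheme();
        if (pathScheme != null && !pathScheme.equalsIgnoreCase(fsUri.getScheme())) {
            throw new IllegalArgumentException(
                "Wrong FS: " + path + ", expected: " + fsUri);
        }
    }

    public static void main(String[] args) {
        URI hdfs = URI.create("hdfs://n-vsan-12:9000");

        // The partitions file was written with a local file:/ scheme, so the
        // HDFS FileSystem rejects it, reproducing the error from the log.
        try {
            checkPath(hdfs, URI.create("file:/opt/hadoop/tmp/partitions_example"));
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }

        // The same path qualified against HDFS passes the check.
        checkPath(hdfs, URI.create("hdfs://n-vsan-12:9000/tmp/partitions_example"));
        System.out.println("hdfs path accepted");
    }
}
```

In other words, the fix is in configuration, not code: whichever Hadoop/HBase config Kylin picks up is resolving the default file system to the local disk, so the partitions file lands on file:/ while the job expects hdfs://n-vsan-12:9000.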