Posted to dev@kylin.apache.org by ShaoFeng Shi <sh...@apache.org> on 2019/03/19 09:00:07 UTC

Re: Hbase table is always empty when build with spark

Hi Alex,

Could you please report a JIRA issue to Kylin, or send a pull request if you
already have a hot-fix? Thank you!

Best regards,

Shaofeng Shi 史少锋
Apache Kylin PMC
Email: shaofengshi@apache.org

Apache Kylin FAQ: https://kylin.apache.org/docs/gettingstarted/faq.html
Join Kylin user mail group: user-subscribe@kylin.apache.org
Join Kylin dev mail group: dev-subscribe@kylin.apache.org




mailpig <al...@163.com> wrote on Monday, February 25, 2019 at 5:18 PM:

> Sure, the Hive table is not empty, and the HFile output directory also
> contains data.
>
> <http://apache-kylin.74782.x6.nabble.com/file/t635/IMG20190225_171051.png>
>
>
> After setting mapreduce.job.outputformat.class in the job config, loading
> the HFiles into HBase succeeds.
> Besides that, I found that the source code set this config in the first
> commit:
> ..............................
> HTable table = new HTable(hbaseConf, cubeSegment.getStorageLocationIdentifier());
> try {
>     HFileOutputFormat2.configureIncrementalLoadMap(job, table);
> } catch (IOException ioe) {
>     // this can be ignored.
>     logger.debug(ioe.getMessage(), ioe);
> }
> ...............................
> But after commit 76c9c960be542c919301c72b34c7ae5ce6f1ec1c, this config was
> removed; I don't know why. Please check.
>
> --
> Sent from: http://apache-kylin.74782.x6.nabble.com/
>
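[Editor's note] For readers following the quoted code: a minimal sketch of the kind of job setup being discussed. HFileOutputFormat2.configureIncrementalLoadMap() sets, among other things, the job's output format class (the mapreduce.job.outputformat.class property the poster found missing). This is an illustration, not the exact Kylin code: the class name HFileJobConfigSketch, the method name configure, and the parameter names hbaseConf and storageLocation are made up here, and the configureIncrementalLoadMap(Job, Table) signature shown matches the HBase 1.x API used in the quoted snippet (HBase 2.x takes a TableDescriptor instead). It requires HBase client/MapReduce jars and a live cluster, so it is a configuration fragment rather than something runnable standalone.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
import org.apache.hadoop.mapreduce.Job;

public class HFileJobConfigSketch {

    /**
     * Configure an MR job to emit HFiles suitable for bulk loading into the
     * given HBase table. Without this step the job's output format is not set
     * to HFileOutputFormat2, which matches the symptom in this thread: the
     * job "succeeds" but the table stays empty after the bulk load.
     */
    public static void configure(Configuration hbaseConf, Job job,
                                 String storageLocation) throws IOException {
        // Connection/Table replace the deprecated HTable constructor from
        // the quoted snippet; functionally equivalent for this purpose.
        try (Connection conn = ConnectionFactory.createConnection(hbaseConf);
             Table table = conn.getTable(TableName.valueOf(storageLocation))) {
            // Sets the output format class, map output key/value classes,
            // and the KeyValue comparator needed to produce valid HFiles.
            HFileOutputFormat2.configureIncrementalLoadMap(job, table);
        }
    }
}
```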