Posted to dev@kylin.apache.org by 林琰文 <ly...@gmail.com> on 2019/03/18 17:04:51 UTC

Build buildSupportsSnappy Error When Doing Integration Testing

Hi all,
I am currently running the integration tests, but I hit the following
error. Could you please share some suggestions?
*1. Command*:
mvn verify -fae -Dhdp.version=3.0.1.0-187 -P sandbox
*2. Error message from Yarn Container Attempt:*

2019-03-18 16:43:25,583 INFO [main] org.apache.kylin.engine.mr.KylinMapper:
Accepting Mapper Key with ordinal: 1

2019-03-18 16:43:25,583 INFO [main] org.apache.kylin.engine.mr.KylinMapper:
Do map, available memory: 322m

2019-03-18 16:43:25,596 INFO [main] org.apache.kylin.common.KylinConfig:
Creating new manager instance of class
org.apache.kylin.cube.cuboid.CuboidManager

2019-03-18 16:43:25,599 INFO [main]
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File Output
Committer Algorithm version is 1

2019-03-18 16:43:25,599 INFO [main]
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter:
FileOutputCommitter skip cleanup _temporary folders under output
directory:false, ignore cleanup failures: false

2019-03-18 16:43:25,795 ERROR [main] org.apache.kylin.engine.mr.KylinMapper:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
 at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
 at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
 at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
 at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
 at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:168)
 at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1304)
 at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1192)
 at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1552)
 at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:289)
 at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:542)
 at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:64)
 at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
 at org.apache.hadoop.mapreduce.lib.output.LazyOutputFormat$LazyRecordWriter.write(LazyOutputFormat.java:113)
 at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.write(MultipleOutputs.java:468)
 at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:85)
 at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:44)
 at org.apache.kylin.engine.mr.KylinMapper.map(KylinMapper.java:77)
 at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
 at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
 at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
 at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:422)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
 at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)

2019-03-18 16:43:25,797 INFO [main] org.apache.kylin.engine.mr.KylinMapper:
Do cleanup, available memory: 318m

2019-03-18 16:43:25,813 INFO [main] org.apache.kylin.engine.mr.KylinMapper:
Total rows: 1

2019-03-18 16:43:25,813 ERROR [main] org.apache.hadoop.mapred.YarnChild:
Error running child : java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
 at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
 at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
 at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
 at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
 at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:168)
 at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1304)
 at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1192)
 at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1552)
 at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:289)
 at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:542)
 at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:64)
 at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
 at org.apache.hadoop.mapreduce.lib.output.LazyOutputFormat$LazyRecordWriter.write(LazyOutputFormat.java:113)
 at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.write(MultipleOutputs.java:468)
 at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:85)
 at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:44)
 at org.apache.kylin.engine.mr.KylinMapper.map(KylinMapper.java:77)
 at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
 at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
 at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
 at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:422)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
 at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)


2019-03-18 16:43:25,926 INFO [main]
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping MapTask metrics
system...

2019-03-18 16:43:25,927 INFO [main]
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system
stopped.

2019-03-18 16:43:25,927 INFO [main]
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system
shutdown complete.
*3. What I have tried (but did not work):*
I have made sure the following files contain the following properties:
3.1 core-site.xml
File: {kylin_root}/examples/test_case_data/sandbox/core-site.xml
File: HDP HDFS core-site.xml (via Ambari Web UI)
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
3.2 mapred-site.xml
File: {kylin_root}/examples/test_case_data/sandbox/mapred-site.xml
File: HDP MapReduce2 mapred-site.xml (via Ambari Web UI)
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapred.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/3.0.1.0-187/hadoop/lib/native</value>
</property>
3.3 libsnappy.so
I have checked that libsnappy.so is located at /usr/hdp/3.0.1.0-187/hadoop/lib/native.

Thanks!

Best,
Yanwen Lin

Re: Build buildSupportsSnappy Error When Doing Integration Testing

Posted by 林琰文 <ly...@gmail.com>.
Thanks!
I am using Hadoop 3.x. I agree this may lead to the error.
In the link you shared with me, I saw this (translated from the Chinese):


>
>
> *mapreduce.admin.user.env: if map and reduce tasks access native libraries
> (compression, etc.), the original value must be kept. When this value is
> empty, the command that sets up the execution environment depends on the
> OS. Linux: LD_LIBRARY_PATH=$HADOOP_COMMON_HOME/lib/native. Windows: PATH
> =%PATH%;%HADOOP_COMMON_HOME%\\bin.*


It says that if we want to access the native library, we have to keep the
original value of *mapreduce.admin.user.env*. I checked this property in
the Ambari Web UI, and I think its value is just the original value, right?
I also checked it in
<kylin_root>/examples/test_case_data/sandbox/mapred-site.xml, and it is the
same as in the Ambari Web UI. Therefore, the map and reduce tasks should be
able to load the native library successfully.


>
>
>
> *<property><name>mapreduce.admin.user.env</name><value>LD_LIBRARY_PATH=/usr/hdp/${hdp.version}/hadoop/lib/native:/usr/hdp/${hdp.version}/hadoop/lib/native/Linux-amd64-64</value> </property>*


However, I checked the dumped YARN container log; the content below is from
the syslog:


> *2019-03-20 11:54:24,901 WARN [main]
> org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop
> library for your platform... using builtin-java classes where applicable*
> *2019-03-21 07:57:10,482 DEBUG [main]
> org.apache.hadoop.util.NativeCodeLoader:
> java.library.path=/hadoop/yarn/local/usercache/root/appcache/application_1553152585911_0025/container_e08_1553152585911_0025_01_000005:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib*


As it shows, Hadoop still cannot load the native library (I have checked
that *libsnappy.so* does exist in
*/usr/hdp/${hdp.version}/hadoop/lib/native* and is a 64-bit file compatible
with my OS).
Also, I checked the dumped content of launch_container.sh:


> *export PWD="/hadoop/yarn/local/usercache/root/appcache/*
> *# Omits some other commands*
> *export LD_LIBRARY_PATH="$PWD"*


As you can see, LD_LIBRARY_PATH is actually set to "$PWD" instead of what
is specified in the Ambari Web UI.
If this problem is related to the Hadoop version, I think Hadoop 3.x
effectively deprecates *mapreduce.admin.user.env*, since it cannot take
effect during the YARN container runtime. Do you agree?
Sorry for keeping asking about this, but there is little useful
documentation on this Hadoop property. Thanks for your patience!
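To make the startup-time behavior above concrete, here is a small standalone diagnostic (my own sketch, not part of Kylin or Hadoop) that prints the java.library.path the JVM actually captured and reproduces the same UnsatisfiedLinkError path that NativeCodeLoader hits when a native library is not on it. The probed library name is deliberately nonexistent, so the failure branch always runs:

```java
public class NativeProbe {
    public static void main(String[] args) {
        // java.library.path is captured once at JVM startup, from
        // LD_LIBRARY_PATH and/or -Djava.library.path; exporting the
        // variable after the container JVM has started has no effect.
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        try {
            // System.loadLibrary is the same mechanism NativeCodeLoader
            // uses to resolve libhadoop/libsnappy; this name is
            // deliberately missing, so the load always fails here.
            System.loadLibrary("deliberately_missing_lib");
            System.out.println("loaded");
        } catch (UnsatisfiedLinkError e) {
            System.out.println("UnsatisfiedLinkError: not on java.library.path");
        }
    }
}
```

Printing the same two values from inside a mapper would show whether the HDP native directory ever made it into the container's search path.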

On Fri, Mar 22, 2019 at 3:18 PM Na Zhai <na...@kyligence.io> wrote:

> Hi, 林琰文.
>
>
>
> What’s your Hadoop version? 2.x and 3.x may have a different
> configuration. I do some search about this problem. Hope this can help you.
> http://www.aboutyun.com/thread-23759-1-1.html.
>
>
>
>
>
> Sent from Mail for Windows 10 <https://go.microsoft.com/fwlink/?LinkId=550986>
>
>
>
> ________________________________
> From: 林琰文 <ly...@gmail.com>
> Sent: Friday, March 22, 2019 10:47:07 AM
> To: dev@kylin.apache.org
> Subject: Re: Build buildSupportsSnappy Error When Doing Integration Testing
>
> Hi Na Zhai,
>
> Thank you for your replying!
>
> Actually I've checked this link previously but it did not work. When I
> checked the log output from YARN MapReduce Job container I found that the
> *LD_LIBRARY_PATH* was not set correctly and MapReduce log still showed it
> could not load native hadoop library. However, I did specify the*
> LD_LIBRARY_PATH=/usr/hdp/3.0.1.0-187/hadoop/lib/native* in
> *mapreduce.admin.user.env
> *in* mapred-site.xml*. So It was quite weird. My hacked way to tackle this
> problem is to add
>
> *-Djava.library.path="${java.library.path}:/usr/hdp/${hdp.version}/hadoop/lib/native
> *option in the *mapreduce.admin.map.child.java.opts* in *mapred-site.xml*.
> So I don't know why adding *LD_LIBRARY_PATH to the mapreduce.admin.user.env
> in mapred-site.xml* does not work*. *Do you have any idea on this?
>
> Thanks again!
>
> Best,
> Yanwen Lin
>
> On Thu, Mar 21, 2019 at 10:44 PM Na Zhai <na...@kyligence.io> wrote:
>
> > Hi, 林琰文.
> >
> >
> >
> > You should add native dependencies. Hope this can help you.
> >
> >
> >
> https://stackoverflow.com/questions/22150417/hadoop-mapreduce-java-lang-unsatisfiedlinkerror-org-apache-hadoop-util-nativec
> > .
> >
> >
> >
> >
> >
> > Sent from Mail for Windows 10 <https://go.microsoft.com/fwlink/?LinkId=550986>

Re: Build buildSupportsSnappy Error When Doing Integration Testing

Posted by Na Zhai <na...@kyligence.io>.
Hi, 林琰文.



What's your Hadoop version? 2.x and 3.x may have different configurations. I did some searching on this problem; hope this helps: http://www.aboutyun.com/thread-23759-1-1.html.





Sent from Mail for Windows 10 <https://go.microsoft.com/fwlink/?LinkId=550986>




Re: Build buildSupportsSnappy Error When Doing Integration Testing

Posted by 林琰文 <ly...@gmail.com>.
Hi Na Zhai,

Thank you for your reply!

Actually, I had checked that link previously, but it did not work. When I
checked the log output from the YARN MapReduce job container, I found that
*LD_LIBRARY_PATH* was not set correctly, and the MapReduce log still showed
that it could not load the native Hadoop library. However, I did specify
*LD_LIBRARY_PATH=/usr/hdp/3.0.1.0-187/hadoop/lib/native* in
*mapreduce.admin.user.env* in *mapred-site.xml*, so it was quite weird. My
hacky workaround is to add the
*-Djava.library.path="${java.library.path}:/usr/hdp/${hdp.version}/hadoop/lib/native"*
option to *mapreduce.admin.map.child.java.opts* in *mapred-site.xml*. So I
don't know why adding *LD_LIBRARY_PATH* to *mapreduce.admin.user.env* in
*mapred-site.xml* does not work. Do you have any idea about this?
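For reference, the workaround described above corresponds to a mapred-site.xml fragment along these lines. This is only a sketch: the HDP path is the one used in this thread, and any JVM options your cluster already sets in this property should be kept alongside the added flag:

```xml
<property>
  <name>mapreduce.admin.map.child.java.opts</name>
  <value>-Djava.library.path=${java.library.path}:/usr/hdp/3.0.1.0-187/hadoop/lib/native</value>
</property>
```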

Thanks again!

Best,
Yanwen Lin

On Thu, Mar 21, 2019 at 10:44 PM Na Zhai <na...@kyligence.io> wrote:

> Hi, 林琰文.
>
>
>
> You should add native dependencies. Hope this can help you.
>
>
> https://stackoverflow.com/questions/22150417/hadoop-mapreduce-java-lang-unsatisfiedlinkerror-org-apache-hadoop-util-nativec
> .
>
>
>
>
>
> 发送自 Windows 10 版邮件<https://go.microsoft.com/fwlink/?LinkId=550986>应用
>
>
>
> ________________________________
> 发件人: 林琰文 <ly...@gmail.com>
> 发送时间: Tuesday, March 19, 2019 1:04:51 AM
> 收件人: dev@kylin.apache.org
> 主题: Build buildSupportsSnappy Error When Doing Integration Testing
>
> [...]
>

Re: Build buildSupportsSnappy Error When Doing Integration Testing

Posted by Na Zhai <na...@kyligence.io>.
Hi, 林琰文.



You should add native dependencies. Hope this can help you.

https://stackoverflow.com/questions/22150417/hadoop-mapreduce-java-lang-unsatisfiedlinkerror-org-apache-hadoop-util-nativec.
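A quick way to verify what that answer describes is Hadoop's built-in native-library check (a general diagnostic, run on the node where the container executes):

```
# Prints each native library Hadoop tries to load (zlib, snappy, lz4, ...)
# with true/false depending on whether it resolved in this JVM.
# "snappy: false" here is consistent with the UnsatisfiedLinkError above.
hadoop checknative -a
```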





Sent from Mail<https://go.microsoft.com/fwlink/?LinkId=550986> for Windows 10



________________________________
From: 林琰文 <ly...@gmail.com>
Sent: Tuesday, March 19, 2019 1:04:51 AM
To: dev@kylin.apache.org
Subject: Build buildSupportsSnappy Error When Doing Integration Testing

Hi all,
I am currently running the integration tests but hit the following
error. Could you please share some suggestions on this?
*1. Command*:
mvn verify -fae -Dhdp.version=3.0.1.0-187 -P sandbox
*2. Error message from Yarn Container Attempt:*

2019-03-18 16:43:25,583 INFO [main] org.apache.kylin.engine.mr.KylinMapper:
Accepting Mapper Key with ordinal: 1

2019-03-18 16:43:25,583 INFO [main] org.apache.kylin.engine.mr.KylinMapper:
Do map, available memory: 322m

2019-03-18 16:43:25,596 INFO [main] org.apache.kylin.common.KylinConfig:
Creating new manager instance of class
org.apache.kylin.cube.cuboid.CuboidManager

2019-03-18 16:43:25,599 INFO [main]
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File Output
Committer Algorithm version is 1

2019-03-18 16:43:25,599 INFO [main]
org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter:
FileOutputCommitter skip cleanup _temporary folders under output
directory:false, ignore cleanup failures: false

2019-03-18 16:43:25,795 ERROR [main] org.apache.kylin.engine.mr.KylinMapper:

java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
 at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
 at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
 at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
 at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
 at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:168)
 at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1304)
 at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1192)
 at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1552)
 at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:289)
 at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:542)
 at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:64)
 at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
 at org.apache.hadoop.mapreduce.lib.output.LazyOutputFormat$LazyRecordWriter.write(LazyOutputFormat.java:113)
 at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.write(MultipleOutputs.java:468)
 at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:85)
 at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:44)
 at org.apache.kylin.engine.mr.KylinMapper.map(KylinMapper.java:77)
 at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
 at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
 at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
 at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:422)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
 at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)

2019-03-18 16:43:25,797 INFO [main] org.apache.kylin.engine.mr.KylinMapper:
Do cleanup, available memory: 318m

2019-03-18 16:43:25,813 INFO [main] org.apache.kylin.engine.mr.KylinMapper:
Total rows: 1

2019-03-18 16:43:25,813 ERROR [main] org.apache.hadoop.mapred.YarnChild:
Error running child : java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
 at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
 at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
 at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:136)
 at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:150)
 at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:168)
 at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:1304)
 at org.apache.hadoop.io.SequenceFile$Writer.<init>(SequenceFile.java:1192)
 at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1552)
 at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:289)
 at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:542)
 at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getSequenceWriter(SequenceFileOutputFormat.java:64)
 at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat.getRecordWriter(SequenceFileOutputFormat.java:75)
 at org.apache.hadoop.mapreduce.lib.output.LazyOutputFormat$LazyRecordWriter.write(LazyOutputFormat.java:113)
 at org.apache.hadoop.mapreduce.lib.output.MultipleOutputs.write(MultipleOutputs.java:468)
 at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:85)
 at org.apache.kylin.engine.mr.steps.FilterRecommendCuboidDataMapper.doMap(FilterRecommendCuboidDataMapper.java:44)
 at org.apache.kylin.engine.mr.KylinMapper.map(KylinMapper.java:77)
 at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
 at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:799)
 at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
 at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:422)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
 at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)


2019-03-18 16:43:25,926 INFO [main]
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping MapTask metrics
system...

2019-03-18 16:43:25,927 INFO [main]
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system
stopped.

2019-03-18 16:43:25,927 INFO [main]
org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system
shutdown complete.
*3. What I have tried (but did not work):*
I have made sure the following files contain the following properties:
3.1 core-site.xml
File: {kylin_root}/examples/test_case_data/sandbox/core-site.xml
File: HDP HDFS core-site.xml (via Ambari Web UI)
<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
3.2 mapred-site.xml
File: {kylin_root}/examples/test_case_data/sandbox/mapred-site.xml
File: HDP MapReduce2 mapred-site.xml (via Ambari Web UI)
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapred.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/3.0.1.0-187/hadoop/lib/native</value>
</property>
3.3 libsnappy.so
I have checked that the file libsnappy.so is located at
/usr/hdp/3.0.1.0-187/hadoop/lib/native

Thanks!

Best,
Yanwen Lin