Posted to common-user@hadoop.apache.org by "Arthur.hk.chan@gmail.com" <ar...@gmail.com> on 2014/08/19 17:40:53 UTC

Hadoop 2.4.1 Snappy Smoke Test failed

Hi,

I am trying to enable Snappy in Hadoop 2.4.1; here are my steps:

(CentOS 64-bit)
1)
yum install snappy snappy-devel
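One thing worth checking after this step: the yum package installs libsnappy under the system library directories, not under Hadoop's native directory, so Hadoop's loader may never see it. A sketch for checking (the directories below are assumptions for a CentOS 64-bit layout, and `hadoop checknative` assumes the `hadoop` binary is on the PATH):

```shell
# Look for libsnappy in the common system dirs and in Hadoop's native dir.
for d in /usr/lib64 /usr/lib /usr/lib/hadoop/lib/native; do
  ls "$d"/libsnappy.so* 2>/dev/null || echo "no libsnappy in $d"
done
# If libsnappy exists only in /usr/lib64, one option is to symlink it into
# Hadoop's native dir so the JVM can resolve it:
#   ln -s /usr/lib64/libsnappy.so.1 /usr/lib/hadoop/lib/native/libsnappy.so
# Then verify native codec support:
#   hadoop checknative -a    # should list a "snappy" line with a path
```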

2)
added the following 
(core-site.xml)
   <property>
    <name>io.compression.codecs</name>
    <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
   </property>
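Registering SnappyCodec in io.compression.codecs only makes the codec loadable; it does not by itself make jobs compress anything. To have MapReduce actually use Snappy for intermediate map output, something like the following (standard Hadoop 2.x property names; setting this cluster-wide rather than per-job is a judgment call) would also go in mapred-site.xml:

```xml
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```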

3) 
mapred-site.xml
   <property>
    <name>mapreduce.admin.map.child.java.opts</name>
    <value>-server -XX:NewRatio=8 -Djava.library.path=/usr/lib/hadoop/lib/native/ -Djava.net.preferIPv4Stack=true</value>
    <final>true</final>
   </property>
   <property>
    <name>mapreduce.admin.reduce.child.java.opts</name>
    <value>-server -XX:NewRatio=8 -Djava.library.path=/usr/lib/hadoop/lib/native/ -Djava.net.preferIPv4Stack=true</value>
    <final>true</final>
   </property>

4) smoke test
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar  teragen 100000 /tmp/teragenout

I got the following warnings, and no test file was actually created in HDFS:

14/08/19 22:50:10 WARN mapred.YARNRunner: Usage of -Djava.library.path in mapreduce.admin.map.child.java.opts can cause programs to no longer function if hadoop native libraries are used. These values should be set as part of the LD_LIBRARY_PATH in the map JVM env using mapreduce.admin.user.env config settings.
14/08/19 22:50:10 WARN mapred.YARNRunner: Usage of -Djava.library.path in mapreduce.admin.reduce.child.java.opts can cause programs to no longer function if hadoop native libraries are used. These values should be set as part of the LD_LIBRARY_PATH in the reduce JVM env using mapreduce.admin.user.env config settings.
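The warnings themselves point at the recommended fix: drop -Djava.library.path from the child java.opts and pass the native-library directory through LD_LIBRARY_PATH via mapreduce.admin.user.env instead. A sketch for mapred-site.xml (the path is carried over from the configuration above and is an assumption about where the native libs live):

```xml
<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native</value>
</property>
```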

Can anyone please advise how to install and enable Snappy in Hadoop 2.4.1? What could be wrong, or is my change in mapred-site.xml incorrect?

Regards
Arthur






Re: Hadoop 2.4.1 How to clear usercache

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
I restarted the cluster, and the usercache was cleared automatically. No longer an issue. Thanks.

 
On 20 Aug, 2014, at 7:05 pm, Arthur.hk.chan@gmail.com <ar...@gmail.com> wrote:

> Hi, 
> 
>  i use Hadoop 2.4.1, in my cluster,  Non DFS Used: 2.09 TB
> 
> I found that these files are all under tmp/nm-local-dir/usercache
> 
> Is there any Hadoop command to remove these unused user cache files tmp/nm-local-dir/usercache ?
> 
> Regards
> Arthur
> 
> 



Hadoop 2.4.1 How to clear usercache

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
Hi, 

I use Hadoop 2.4.1; in my cluster, Non DFS Used is 2.09 TB.

I found that these files are all under tmp/nm-local-dir/usercache

Is there any Hadoop command to remove these unused user cache files under tmp/nm-local-dir/usercache?

Regards
Arthur



Re: Hadoop 2.4.1 Snappy Smoke Test failed

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
Thanks for your reply. However, I don't think it is a 32-bit version issue, because my Hadoop is 64-bit; I compiled it from source. I suspect my way of installing Snappy is wrong.

Arthur
On 19 Aug, 2014, at 11:53 pm, Andre Kelpe <ak...@concurrentinc.com> wrote:

> Could this be caused by the fact that hadoop no longer ships with 64bit libs? https://issues.apache.org/jira/browse/HADOOP-9911
> 
> - André
> 
> 
> On Tue, Aug 19, 2014 at 5:40 PM, Arthur.hk.chan@gmail.com <ar...@gmail.com> wrote:
> Hi,
> 
> I am trying Snappy in Hadoop 2.4.1, here are my steps: 
> 
> (CentOS 64-bit)
> 1)
> yum install snappy snappy-devel
> 
> 2)
> added the following 
> (core-site.xml)
>    <property>
>     <name>io.compression.codecs</name>
>     <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
>    </property>
> 
> 3) 
> mapred-site.xml
>    <property>
>     <name>mapreduce.admin.map.child.java.opts</name>
>     <value>-server -XX:NewRatio=8 -Djava.library.path=/usr/lib/hadoop/lib/native/ -Djava.net.preferIPv4Stack=true</value>
>     <final>true</final>
>    </property>
>    <property>
>     <name>mapreduce.admin.reduce.child.java.opts</name>
>     <value>-server -XX:NewRatio=8 -Djava.library.path=/usr/lib/hadoop/lib/native/ -Djava.net.preferIPv4Stack=true</value>
>     <final>true</final>
>    </property>
> 
> 4) smoke test
> bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar  teragen 100000 /tmp/teragenout
> 
> I got the following warning, actually there is no any test file created in hdfs:
> 
> 14/08/19 22:50:10 WARN mapred.YARNRunner: Usage of -Djava.library.path in mapreduce.admin.map.child.java.opts can cause programs to no longer function if hadoop native libraries are used. These values should be set as part of the LD_LIBRARY_PATH in the map JVM env using mapreduce.admin.user.env config settings.
> 14/08/19 22:50:10 WARN mapred.YARNRunner: Usage of -Djava.library.path in mapreduce.admin.reduce.child.java.opts can cause programs to no longer function if hadoop native libraries are used. These values should be set as part of the LD_LIBRARY_PATH in the reduce JVM env using mapreduce.admin.user.env config settings.
> 
> Can anyone please advise how to install and enable SNAPPY in Hadoop 2.4.1? or what would be wrong? or is my new change in mapred-site.xml incorrect?
> 
> Regards
> Arthur
> 
> 
> 
> 
> 
> 
> 
> 
> -- 
> André Kelpe
> andre@concurrentinc.com
> http://concurrentinc.com



Re: Hadoop 2.4.1 Snappy Smoke Test failed

Posted by Andre Kelpe <ak...@concurrentinc.com>.
Could this be caused by the fact that hadoop no longer ships with 64bit
libs? https://issues.apache.org/jira/browse/HADOOP-9911

- André


On Tue, Aug 19, 2014 at 5:40 PM, Arthur.hk.chan@gmail.com <
arthur.hk.chan@gmail.com> wrote:

> Hi,
>
> I am trying Snappy in Hadoop 2.4.1, here are my steps:
>
> (CentOS 64-bit)
> 1)
> yum install snappy snappy-devel
>
> 2)
> added the following
> (core-site.xml)
>    <property>
>     <name>io.compression.codecs</name>
>
> <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
>    </property>
>
> 3)
> mapred-site.xml
>    <property>
>     <name>mapreduce.admin.map.child.java.opts</name>
>     <value>-server -XX:NewRatio=8
> -Djava.library.path=/usr/lib/hadoop/lib/native/
> -Djava.net.preferIPv4Stack=true</value>
>     <final>true</final>
>    </property>
>    <property>
>     <name>mapreduce.admin.reduce.child.java.opts</name>
>     <value>-server -XX:NewRatio=8
> -Djava.library.path=/usr/lib/hadoop/lib/native/
> -Djava.net.preferIPv4Stack=true</value>
>     <final>true</final>
>    </property>
>
> 4) smoke test
> bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar  teragen
> 100000 /tmp/teragenout
>
> I got the following warning, actually there is no any test file created in
> hdfs:
>
> 14/08/19 22:50:10 WARN mapred.YARNRunner: Usage of -Djava.library.path in
> mapreduce.admin.map.child.java.opts can cause programs to no longer
> function if hadoop native libraries are used. These values should be set as
> part of the LD_LIBRARY_PATH in the map JVM env using
> mapreduce.admin.user.env config settings.
> 14/08/19 22:50:10 WARN mapred.YARNRunner: Usage of -Djava.library.path in
> mapreduce.admin.reduce.child.java.opts can cause programs to no longer
> function if hadoop native libraries are used. These values should be set as
> part of the LD_LIBRARY_PATH in the reduce JVM env using
> mapreduce.admin.user.env config settings.
>
> Can anyone please advise how to install and enable SNAPPY in Hadoop 2.4.1?
> or what would be wrong? or is my new change in mapred-site.xml incorrect?
>
> Regards
> Arthur
>
>
>
>
>
>


-- 
André Kelpe
andre@concurrentinc.com
http://concurrentinc.com
