Posted to dev@ambari.apache.org by "Rajesh Kartha (JIRA)" <ji...@apache.org> on 2015/04/17 20:41:00 UTC

[jira] [Updated] (AMBARI-10573) During installs set default value of the yarn property 'yarn.app.mapreduce.am.env' to be same as 'mapreduce.admin.user.env'

     [ https://issues.apache.org/jira/browse/AMBARI-10573?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Rajesh Kartha updated AMBARI-10573:
-----------------------------------
    Description: 
We recently ran into an issue where, after successfully creating HFile or SEQ/RC files with Snappy compression in MR, the job fails during job commit with:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
at org.apache.hadoop.io.compress.SnappyCodec.getDecompressorType(SnappyCodec.java:192)
at org.apache.hadoop.hive.ql.io.CodecPool.getDecompressor(CodecPool.java:122)
at org.apache.hadoop.hive.ql.io.RCFile$Reader.init(RCFile.java:1518)
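
A quick way to confirm whether the native Snappy bindings are visible to a given environment is Hadoop's checknative tool, which reports the load status of the native hadoop, zlib, and snappy libraries for the JVM it runs in:

$ hadoop checknative -a

If snappy reports false, the process's library path does not include the native libraries, which is the same condition the AM hits below.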

Incidentally, the job commit was run as part of the MRAppMaster, which did not have a default for 'yarn.app.mapreduce.am.env' set; without it, the AM process is launched without the native-library LD_LIBRARY_PATH that 'mapreduce.admin.user.env' provides to regular task containers, so the native Snappy code cannot load.

Here is a simple repro using an uberized task (an uberized job runs its tasks inside the MRAppMaster's JVM, so it hits the same unset environment):

$ hadoop jar hadoop-mapreduce-examples.jar wordcount  -Dmapreduce.job.ubertask.enable=true -Dmapred.output.compress=true -Dmapred.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec /tmp/word_count.csv /tmp/wc-error

which would fail with:

2015-04-17 09:40:10,075 FATAL [uber-SubtaskRunner] org.apache.hadoop.mapred.LocalContainerLauncher: Error running local (uberized) 'child' : java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
	at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
	at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
	at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:133)
	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
	at org.apache.hadoop.io.compress.CompressionCodec$Util.createOutputStreamWithCodecPool(CompressionCodec.java:131)
	at org.apache.hadoop.io.compress.SnappyCodec.createOutputStream(SnappyCodec.java:98)
	at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.getRecordWriter(TextOutputFormat.java:136)
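
As a per-job workaround, the AM environment can be set explicitly at submit time. The LD_LIBRARY_PATH below is only an example location for the native Hadoop libraries and varies by install; the output directory is just a fresh path for this run:

$ hadoop jar hadoop-mapreduce-examples.jar wordcount -Dmapreduce.job.ubertask.enable=true -Dyarn.app.mapreduce.am.env="LD_LIBRARY_PATH=/usr/hdp/current/hadoop-client/lib/native" -Dmapred.output.compress=true -Dmapred.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec /tmp/word_count.csv /tmp/wc-snappy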


It would be desirable to have the yarn property 'yarn.app.mapreduce.am.env' default to the same value as 'mapreduce.admin.user.env' during installs, so that such tasks do not fail.
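
As a sketch, the shipped mapred-site.xml defaults could then look like the following; the LD_LIBRARY_PATH value is only an example of a typical 'mapreduce.admin.user.env' setting, and the real value depends on where the install places the native libraries:

<!-- Sketch only: mirror whatever value the install writes for
     mapreduce.admin.user.env; the path below is an example. -->
<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/current/hadoop-client/lib/native</value>
</property>
<property>
  <name>yarn.app.mapreduce.am.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/current/hadoop-client/lib/native</value>
</property>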



> During installs set default value of the yarn property 'yarn.app.mapreduce.am.env' to be same as 'mapreduce.admin.user.env'
> ---------------------------------------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-10573
>                 URL: https://issues.apache.org/jira/browse/AMBARI-10573
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 1.7.0
>            Reporter: Rajesh Kartha
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)