Posted to dev@nifi.apache.org by shweta <sh...@gmail.com> on 2016/02/06 18:11:43 UTC

java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

Hi All,

I'm getting a java.lang.UnsatisfiedLinkError while adding data into the PutHDFS
processor with the compression codec set to snappy. The error message says "Failed
to write to HDFS due to
org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z".

In spite of this error, .snappy files are still being written to my HDFS.

Has anyone faced a similar issue before, or can anyone provide some pointers?

Thanks,
Shweta




Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

Posted by shweta <sh...@gmail.com>.
Hi Joe,

Please find below the contents of core-site.xml, which contains the following
property.

<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>

mapred-site.xml

<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>

<property>
  <name>mapred.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>

<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/2.2.0.0-1084/hadoop/lib/native</value>
</property>

A snapshot of the logs follows:

2016-02-09 15:03:14,713 INFO [pool-26-thread-8]
o.apache.nifi.processors.hadoop.PutHDFS
PutHDFS[id=1a396460-dca5-4adf-af4a-8bc828c2e395] Initialized a new HDFS File
System with working dir: hdfs://localhost:9000/user/shweta default block
size: 134217728 default replication: 1 config: Configuration:
core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml,
yarn-default.xml, yarn-site.xml, hdfs-default.xml, hdfs-site.xml,
/home/shweta/software/hadoop/hadoop-2.4.0/etc/hadoop/hdfs-site.xml,
/home/shweta/software/hadoop/hadoop-2.4.0/etc/hadoop/core-site.xml
2016-02-09 15:03:14,724 INFO [pool-26-thread-8]
o.a.n.c.s.TimerDrivenSchedulingAgent Scheduled
PutHDFS[id=1a396460-dca5-4adf-af4a-8bc828c2e395] to run with 1 threads
2016-02-09 15:03:15,999 INFO [Timer-Driven Process Thread-8]
o.a.nifi.processors.standard.ReplaceText
ReplaceText[id=59f04d7a-c230-4774-a78f-90f44bcc24d5] Transferred
StandardFlowFileRecord[uuid=cb84f64f-1bc1-4922-8380-b2692afaf1f5,claim=StandardContentClaim
[resourceClaim=StandardResourceClaim[id=1455009982642-1, container=default,
section=1], offset=724356, length=3710],offset=0,name=test2.csv,size=3710]
to 'success'
2016-02-09 15:03:16,015 INFO [Timer-Driven Process Thread-8]
o.a.nifi.processors.standard.ReplaceText
ReplaceText[id=59f04d7a-c230-4774-a78f-90f44bcc24d5] Transferred
StandardFlowFileRecord[uuid=f7156b6b-0a5d-4ea8-9494-0d7bcd2cf83d,claim=StandardContentClaim
[resourceClaim=StandardResourceClaim[id=1455009982642-1, container=default,
section=1], offset=728066, length=7],offset=0,name=test.txt,size=7] to
'success'
2016-02-09 15:03:16,282 ERROR [Timer-Driven Process Thread-8]
o.apache.nifi.processors.hadoop.PutHDFS
PutHDFS[id=1a396460-dca5-4adf-af4a-8bc828c2e395] Failed to write to HDFS due
to java.lang.UnsatisfiedLinkError:
org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
2016-02-09 15:03:17,508 ERROR [Timer-Driven Process Thread-6]
o.apache.nifi.processors.hadoop.PutHDFS
PutHDFS[id=1a396460-dca5-4adf-af4a-8bc828c2e395] Failed to write to HDFS due
to java.lang.UnsatisfiedLinkError:
org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z

Thanks,
Shweta





Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

Posted by Matthew Burgess <ma...@gmail.com>.
When you say you pointed LD_LIBRARY_PATH to the location of libsnappy.so, do you mean just setting the “mapreduce.admin.user.env” property in mapred-site.xml, or setting the actual environment variable before starting NiFi? The mapred-site settings won’t be used, as PutHDFS does not use MapReduce. If you do something like:

export LD_LIBRARY_PATH=/usr/hdp/2.2.0.0-1084/hadoop/lib/native
bin/nifi.sh start

That should let PutHDFS know about the appropriate libraries.
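
If in doubt, one quick way to confirm the variable actually reached the running NiFi JVM is to inspect the process environment. A minimal sketch for Linux; the <nifi pid> placeholder is illustrative and should be replaced with the actual NiFi process id:

tr '\0' '\n' < /proc/<nifi pid>/environ | grep LD_LIBRARY_PATH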




On 2/9/16, 4:38 AM, "shweta" <sh...@gmail.com> wrote:

>Hi Jeremy,
>
>Even after copying libsnappy.so to JAVA_HOME/jre/lib, it did not help much. I
>also pointed LD_LIBRARY_PATH to the location of libsnappy.so, and even went to
>the extent of modifying bootstrap.conf with the JVM parameter
> -Djava.library.path=//<path for libsnappy.so>.
>
>But I received the same error again. I have configured the following
>properties in the Hadoop files:
>
>core-site.xml
>
><property>
>  <name>io.compression.codecs</name>
>  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
></property>
>
>mapred-site.xml
>
><property>
>  <name>mapreduce.map.output.compress</name>
>  <value>true</value>
></property>
>
><property>
>  <name>mapred.map.output.compress.codec</name>
>  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
></property>
>
><property>
>  <name>mapreduce.admin.user.env</name>
>  <value>LD_LIBRARY_PATH=/usr/hdp/2.2.0.0-1084/hadoop/lib/native</value>
></property>
>
>Is there anything else I'm missing to get this issue fixed?
>
>Thanks,
>Shweta
>
>
>


Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

Posted by shweta <sh...@gmail.com>.
Hi Jeremy,

Even after copying libsnappy.so to JAVA_HOME/jre/lib, it did not help much. I
also pointed LD_LIBRARY_PATH to the location of libsnappy.so, and even went to
the extent of modifying bootstrap.conf with the JVM parameter
 -Djava.library.path=//<path for libsnappy.so>.

But I received the same error again. I have configured the following
properties in the Hadoop files:

core-site.xml

<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>

mapred-site.xml

<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>

<property>
  <name>mapred.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>

<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/2.2.0.0-1084/hadoop/lib/native</value>
</property>

Is there anything else I'm missing to get this issue fixed?

Thanks,
Shweta




Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

Posted by Matt Burgess <ma...@gmail.com>.
To add to Jeremy's last point, even after the library is present, the files must be greater than the HDFS block size (default is 64 MB I think?) or Hadoop-snappy will also not compress them.

Sent from my iPhone

> On Feb 6, 2016, at 5:41 PM, Jeremy Dyer <jd...@gmail.com> wrote:
> 
> Shweta,
> 
> Looks like you're missing the snappy native library. I have seen this several
> times before. Assuming you're on a Linux machine, you have two options. You can
> copy the libsnappy.so native library into your JAVA_HOME/jre/lib native
> directory, or you can set LD_LIBRARY_PATH to point to where the
> libsnappy.so native library is located on the machine.
> 
> I believe if you closely examine the files that are being written to HDFS
> with a .snappy extension, you will see that they are in fact not actually
> snappy compressed.
> 
> Jeremy Dyer
> 
>> On Sat, Feb 6, 2016 at 1:04 PM, Joe Witt <jo...@gmail.com> wrote:
>> 
>> Can you show what is in your core-site.xml and the processor properties?
>> Also, can you show the full log output?
>> 
>> Thanks
>> Joe
>> 
>>> On Sat, Feb 6, 2016 at 9:11 AM, shweta <sh...@gmail.com> wrote:
>>> Hi All,
>>> 
>>> I'm getting a java.lang.UnsatisfiedLinkError while adding data into the
>>> PutHDFS processor with the compression codec set to snappy. The error
>>> message says "Failed to write to HDFS due to
>>> org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z".
>>> 
>>> In spite of this error, .snappy files are still being written to my HDFS.
>>> 
>>> Has anyone faced a similar issue before, or can anyone provide some pointers?
>>> 
>>> Thanks,
>>> Shweta
>>> 
>>> 
>>> 
>> 

Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

Posted by Jeremy Dyer <jd...@gmail.com>.
Shweta,

Looks like you're missing the snappy native library. I have seen this several
times before. Assuming you're on a Linux machine, you have two options. You can
copy the libsnappy.so native library into your JAVA_HOME/jre/lib native
directory, or you can set LD_LIBRARY_PATH to point to where the
libsnappy.so native library is located on the machine.
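
A minimal sketch of both options (the paths are illustrative; adjust for your JAVA_HOME, JVM architecture, and wherever libsnappy.so actually lives):

# option 1: copy the native library into the JRE's library directory
# (on 64-bit Linux JVMs this is typically jre/lib/amd64)
cp /usr/hdp/2.2.0.0-1084/hadoop/lib/native/libsnappy.so* $JAVA_HOME/jre/lib/amd64/

# option 2: export LD_LIBRARY_PATH in the shell that launches NiFi
export LD_LIBRARY_PATH=/usr/hdp/2.2.0.0-1084/hadoop/lib/native
bin/nifi.sh start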

I believe if you closely examine the files that are being written to HDFS
with a .snappy extension, you will see that they are in fact not actually
snappy compressed.
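
One hedged way to check both points, assuming the hadoop CLI is on your PATH and your Hadoop version ships the checknative command (the file path below is illustrative):

# reports whether the native hadoop and snappy libraries could be loaded
hadoop checknative -a

# 'fs -text' picks the codec from the .snappy extension and decompresses;
# if the file was actually written uncompressed, this should error out
# instead of printing the original content
hadoop fs -text /user/shweta/test2.csv.snappy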

Jeremy Dyer

On Sat, Feb 6, 2016 at 1:04 PM, Joe Witt <jo...@gmail.com> wrote:

> Can you show what is in your core-site.xml and the processor properties?
> Also, can you show the full log output?
>
> Thanks
> Joe
>
> On Sat, Feb 6, 2016 at 9:11 AM, shweta <sh...@gmail.com> wrote:
> > Hi All,
> >
> > I'm getting a java.lang.UnsatisfiedLinkError while adding data into the
> > PutHDFS processor with the compression codec set to snappy. The error
> > message says "Failed to write to HDFS due to
> > org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z".
> >
> > In spite of this error, .snappy files are still being written to my HDFS.
> >
> > Has anyone faced a similar issue before, or can anyone provide some pointers?
> >
> > Thanks,
> > Shweta
> >
> >
> >
>

Re: java.lang.UnsatisfiedLinkError in PutHDFS with snappy compression.

Posted by Joe Witt <jo...@gmail.com>.
Can you show what is in your core-site.xml and the processor properties?
Also, can you show the full log output?

Thanks
Joe

On Sat, Feb 6, 2016 at 9:11 AM, shweta <sh...@gmail.com> wrote:
> Hi All,
>
> I'm getting a java.lang.UnsatisfiedLinkError while adding data into the PutHDFS
> processor with the compression codec set to snappy. The error message says "Failed
> to write to HDFS due to
> org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z".
>
> In spite of this error, .snappy files are still being written to my HDFS.
>
> Has anyone faced a similar issue before, or can anyone provide some pointers?
>
> Thanks,
> Shweta
>
>
>