Posted to users@nifi.apache.org by Thad Guidry <th...@gmail.com> on 2016/03/29 20:24:18 UTC

PutHDFS and LZ4 compression ERROR

I get an error:

13:04:51 CDT
ERROR
765efcb2-5ab0-4a72-a86f-71865dec264d

PutHDFS[id=765efcb2-5ab0-4a72-a86f-71865dec264d] Failed to write to HDFS
due to java.lang.RuntimeException: native lz4 library not available:
java.lang.RuntimeException: native lz4 library not available

even though I successfully built LZ4 (https://github.com/jpountz/lz4-java)
for my Windows 7 64-bit machine using Ant, Ivy, and Mingw-w64,
placed the built lz4-1.3-SNAPSHOT.jar into the nifi/lib folder,
and it is getting picked up by NiFi's bootstrap.Command just fine.

yet the error persists.

I'm wondering if
https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/PutHDFS.java#L279

might actually be using the JNI binding rather than the pure-Java port, as
described here:
https://github.com/jpountz/lz4-java#implementations ?

Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>
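A quick way to see why the codec reports the native library as unavailable is to check what the running JVM can actually load. A minimal standalone sketch; the base name "lz4-java" is an assumption matching the liblz4-java binary discussed later in the thread:

```java
// Minimal JNI loading check: prints the JVM's native search path and
// whether a given library (default "lz4-java") can be loaded from it.
public class NativeLibCheck {
    public static void main(String[] args) {
        String name = args.length > 0 ? args[0] : "lz4-java";
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        try {
            // Throws UnsatisfiedLinkError if no matching binary is found
            // in any directory on java.library.path.
            System.loadLibrary(name);
            System.out.println(name + ": loaded OK");
        } catch (UnsatisfiedLinkError e) {
            System.out.println(name + ": not loadable -> " + e.getMessage());
        }
    }
}
```

Run it with the same JVM and environment NiFi uses; if this prints "not loadable", Hadoop's Lz4Codec will fail the same way.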

Re: PutHDFS and LZ4 compression ERROR

Posted by Thad Guidry <th...@gmail.com>.
Oh gosh, you're right...lol...too much OS mixing on my fingertips.

Let me try the build of the LZ4 jar again.

Thanks Matt,

Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>

Re: PutHDFS and LZ4 compression ERROR

Posted by Matt Burgess <ma...@gmail.com>.
I don't think the .so will help you on Windows, you'd need a .dll instead.

> On Mar 30, 2016, at 11:39 AM, Thad Guidry <th...@gmail.com> wrote:
> 
> Evidently, on Windows the PATH environment variable should also have the path to your native libraries, so that java.library.path can find them.
> 
> I added the path to my .so file to my PATH environment variable....yet I still get the error
> 
> 2016-03-30 10:38:16,181 ERROR [Timer-Driven Process Thread-1] o.apache.nifi.processors.hadoop.PutHDFS 
> java.lang.RuntimeException: native lz4 library not available
>     at org.apache.hadoop.io.compress.Lz4Codec.getCompressorType(Lz4Codec.java:125) ~[hadoop-common-2.6.2.jar:na]
>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148) ~[hadoop-common-2.6.2.jar:na]
>     at org.apache.hadoop.io.compress.CompressionCodec$Util.createOutputStreamWithCodecPool(CompressionCodec.java:131) ~[hadoop-common-2.6.2.jar:na]
>     at org.apache.hadoop.io.compress.Lz4Codec.createOutputStream(Lz4Codec.java:87) ~[hadoop-common-2.6.2.jar:na]
>     at org.apache.nifi.processors.hadoop.PutHDFS$1.process(PutHDFS.java:279) ~[nifi-hdfs-processors-0.6.0.jar:0.6.0]
> 
> ​Don't just engage Mr. Sulu ...stay for breakfast. :)​
> 
> Thad
> +ThadGuidry
> 
> 
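Matt's point can be checked from the JDK itself: System.mapLibraryName reports the file name the JVM expects for a native library on the current platform. A small sketch:

```java
// Prints the platform-specific native library file name the JVM expects:
// "liblz4-java.so" on Linux, "lz4-java.dll" on Windows,
// "liblz4-java.dylib" on macOS.
public class MapLibName {
    public static void main(String[] args) {
        System.out.println(System.mapLibraryName("lz4-java"));
    }
}
```

So on Windows a .so on the search path is simply never considered; the JVM looks only for the .dll name printed above.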

Re: PutHDFS and LZ4 compression ERROR

Posted by Thad Guidry <th...@gmail.com>.
Evidently, on Windows the PATH environment variable should also have the
path to your native libraries, so that java.library.path can find them.

I added the path to my .so file to my PATH environment variable....yet I
still get the error

2016-03-30 10:38:16,181 ERROR [Timer-Driven Process Thread-1] o.apache.nifi.processors.hadoop.PutHDFS
java.lang.RuntimeException: native lz4 library not available
    at org.apache.hadoop.io.compress.Lz4Codec.getCompressorType(Lz4Codec.java:125) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.io.compress.CompressionCodec$Util.createOutputStreamWithCodecPool(CompressionCodec.java:131) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.io.compress.Lz4Codec.createOutputStream(Lz4Codec.java:87) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.nifi.processors.hadoop.PutHDFS$1.process(PutHDFS.java:279) ~[nifi-hdfs-processors-0.6.0.jar:0.6.0]

​Don't just engage Mr. Sulu ...stay for breakfast. :)​

Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>
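Worth noting: java.library.path is captured once at JVM startup, so NiFi must be restarted after changing PATH. A sketch for checking whether a directory is actually visible to the running JVM (the default directory below is the one from the message above):

```java
import java.io.File;
import java.util.Arrays;

// Checks whether a directory is on java.library.path as seen by this JVM.
// java.library.path is derived at startup from PATH (Windows) or
// LD_LIBRARY_PATH (Linux), plus any -Djava.library.path override.
public class LibPathContains {
    public static void main(String[] args) {
        String dir = args.length > 0 ? args[0]
                : "C:\\Program Files\\Java\\jdk1.8.0_74\\jre\\lib\\amd64";
        String[] entries = System.getProperty("java.library.path", "")
                .split(File.pathSeparator);
        System.out.println(Arrays.asList(entries).contains(dir)
                ? dir + " is on java.library.path"
                : dir + " is NOT on java.library.path");
    }
}
```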

Re: PutHDFS and LZ4 compression ERROR

Posted by Joe Witt <jo...@gmail.com>.
Chase,

It is a self-driven subscribe and unsubscribe process.  Please see
here https://nifi.apache.org/mailing_lists.html

Thanks
Joe

On Wed, Mar 30, 2016 at 9:18 AM, Chase Cunningham <ch...@thecynja.com> wrote:
> take me off this list...
>
> unsubscribe
>
>
> On 3/30/16 10:13 AM, Joe Witt wrote:
>>
>> Did you set the LD_LIBRARY_PATH as Burgess mentioned at the end?
>>
>> I am not in a good position to dig in at the moment so my apologies
>> for the half-help here.  The loading of native libs as I recall is a
>> pretty specific process.  I know a few folks are familiar with it in
>> the community so let's keep the thread alive and encourage them to
>> engage :-)
>>
>> On Wed, Mar 30, 2016 at 9:07 AM, Thad Guidry <th...@gmail.com> wrote:
>>>
>>> Ah, that's really helpful.
>>>
>>> Looks like I did have the native library in the built .jar file
>>>
>>> \lz4-1.3-SNAPSHOT\net\jpountz\util\win32\amd64\liblz4-java.so
>>>
>>> but placing that .so file in my C:\Program
>>> Files\Java\jdk1.8.0_74\jre\lib\amd64 folder results in the same missing
>>> lz4
>>> native NiFi errors
>>>
>>> Ideas ?
>>>
>>> Thad
>>> +ThadGuidry
>>>
>
> --
> Dr. Chase C Cunningham
> CTRC (SW) USN Ret.
> The Cynja LLC Proprietary Business and Technical Information
> CONFIDENTIAL TREATMENT REQUIRED
>

Re: PutHDFS and LZ4 compression ERROR

Posted by Chase Cunningham <ch...@thecynja.com>.
take me off this list...

unsubscribe

On 3/30/16 10:13 AM, Joe Witt wrote:
> Did you set the LD_LIBRARY_PATH as Burgess mentioned at the end?
>
> I am not in a good position to dig in at the moment so my apologies
> for the half-help here.  The loading of native libs as I recall is a
> pretty specific process.  I know a few folks are familiar with it in
> the community so let's keep the thread alive and encourage them to
> engage :-)
>
> On Wed, Mar 30, 2016 at 9:07 AM, Thad Guidry <th...@gmail.com> wrote:
>> Ah, that's really helpful.
>>
>> Looks like I did have the native library in the built .jar file
>>
>> \lz4-1.3-SNAPSHOT\net\jpountz\util\win32\amd64\liblz4-java.so
>>
>> but placing that .so file in my C:\Program
>> Files\Java\jdk1.8.0_74\jre\lib\amd64 folder results in the same missing lz4
>> native NiFi errors
>>
>> Ideas ?
>>
>> Thad
>> +ThadGuidry
>>

-- 
Dr. Chase C Cunningham
CTRC (SW) USN Ret.
The Cynja LLC Proprietary Business and Technical Information
CONFIDENTIAL TREATMENT REQUIRED


Re: PutHDFS and LZ4 compression ERROR

Posted by Thad Guidry <th...@gmail.com>.
Yes, of course. On Windows 7, in my User environment variables, I set
LD_LIBRARY_PATH=C:\Program Files\Java\jdk1.8.0_74\jre\lib\amd64

which has the .so file.

ENGAGE! Mr. Sulu :)

Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>

Re: PutHDFS and LZ4 compression ERROR

Posted by Joe Witt <jo...@gmail.com>.
Did you set the LD_LIBRARY_PATH as Burgess mentioned at the end?

I am not in a good position to dig in at the moment so my apologies
for the half-help here.  The loading of native libs as I recall is a
pretty specific process.  I know a few folks are familiar with it in
the community so let's keep the thread alive and encourage them to
engage :-)

On Wed, Mar 30, 2016 at 9:07 AM, Thad Guidry <th...@gmail.com> wrote:
> Ah, that's really helpful.
>
> Looks like I did have the native library in the built .jar file
>
> \lz4-1.3-SNAPSHOT\net\jpountz\util\win32\amd64\liblz4-java.so
>
> but placing that .so file in my C:\Program
> Files\Java\jdk1.8.0_74\jre\lib\amd64 folder results in the same missing lz4
> native NiFi errors
>
> Ideas ?
>
> Thad
> +ThadGuidry
>

Re: PutHDFS and LZ4 compression ERROR

Posted by Thad Guidry <th...@gmail.com>.
Ah, that's really helpful.

Looks like I did have the native library in the built .jar file

\lz4-1.3-SNAPSHOT\net\jpountz\util\win32\amd64\liblz4-java.so

but placing that .so file in my C:\Program
Files\Java\jdk1.8.0_74\jre\lib\amd64 folder results in the same missing lz4
native NiFi errors

Ideas ?

Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>

Re: PutHDFS and LZ4 compression ERROR

Posted by Joe Witt <jo...@gmail.com>.
Thad

This thread [1] seems related.  Take a look and see if that helps.
The basic gist as I understand it is we won't have access to that
native library unless it is pointed to somewhere or unless the Java
code that calls it knows how to set/find it for you.

[1] http://apache-nifi-developer-list.39713.n7.nabble.com/java-lang-UnsatisfiedLinkError-in-PutHDFS-with-snappy-compression-td7182.html
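One concrete way to "point to" the native library in NiFi is to pass java.library.path to the JVM via conf/bootstrap.conf. A sketch; the arg number and directory are placeholders, and the number must not collide with existing java.arg.N entries:

```
# conf/bootstrap.conf -- hypothetical addition; choose an unused arg number
java.arg.15=-Djava.library.path=C:\path\to\native\libs
```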

On Wed, Mar 30, 2016 at 8:42 AM, Thad Guidry <th...@gmail.com> wrote:
> My bad....there is... in the app log...
>
> [app log with repeated "native lz4 library not available" stack traces
> trimmed here; quoted in full in the message below]
>
> Thad
> +ThadGuidry

Re: PutHDFS and LZ4 compression ERROR

Posted by Thad Guidry <th...@gmail.com>.
My bad....there is... in the app log...

2016-03-30 09:39:27,709 INFO [Write-Ahead Local State Provider Maintenance] org.wali.MinimalLockingWriteAheadLog org.wali.MinimalLockingWriteAheadLog@7615666e checkpointed with 8 Records and 0 Swap Files in 69 milliseconds (Stop-the-world time = 6 milliseconds, Clear Edit Logs time = 4 millis), max Transaction ID 23
2016-03-30 09:39:31,979 INFO [pool-16-thread-1] o.a.n.c.r.WriteAheadFlowFileRepository Initiating checkpoint of FlowFile Repository
2016-03-30 09:39:32,380 INFO [pool-16-thread-1] org.wali.MinimalLockingWriteAheadLog org.wali.MinimalLockingWriteAheadLog@174f0d06 checkpointed with 3 Records and 0 Swap Files in 400 milliseconds (Stop-the-world time = 273 milliseconds, Clear Edit Logs time = 74 millis), max Transaction ID 9785
2016-03-30 09:39:32,380 INFO [pool-16-thread-1] o.a.n.c.r.WriteAheadFlowFileRepository Successfully checkpointed FlowFile Repository with 3 records in 400 milliseconds
2016-03-30 09:39:32,523 ERROR [Timer-Driven Process Thread-9] o.apache.nifi.processors.hadoop.PutHDFS PutHDFS[id=765efcb2-5ab0-4a72-a86f-71865dec264d] Failed to write to HDFS due to java.lang.RuntimeException: native lz4 library not available: java.lang.RuntimeException: native lz4 library not available
2016-03-30 09:39:32,525 ERROR [Timer-Driven Process Thread-9] o.apache.nifi.processors.hadoop.PutHDFS
java.lang.RuntimeException: native lz4 library not available
    at org.apache.hadoop.io.compress.Lz4Codec.getCompressorType(Lz4Codec.java:125) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.io.compress.CompressionCodec$Util.createOutputStreamWithCodecPool(CompressionCodec.java:131) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.io.compress.Lz4Codec.createOutputStream(Lz4Codec.java:87) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.nifi.processors.hadoop.PutHDFS$1.process(PutHDFS.java:279) ~[nifi-hdfs-processors-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:1807) ~[na:na]
    at org.apache.nifi.controller.repository.StandardProcessSession.read(StandardProcessSession.java:1778) ~[na:na]
    at org.apache.nifi.processors.hadoop.PutHDFS.onTrigger(PutHDFS.java:270) ~[nifi-hdfs-processors-0.6.0.jar:0.6.0]
    at org.apache.nifi.processor.AbstractProcessor.onTrigger(AbstractProcessor.java:27) [nifi-api-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.StandardProcessorNode.onTrigger(StandardProcessorNode.java:1057) [nifi-framework-core-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:136) [nifi-framework-core-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.tasks.ContinuallyRunProcessorTask.call(ContinuallyRunProcessorTask.java:47) [nifi-framework-core-0.6.0.jar:0.6.0]
    at org.apache.nifi.controller.scheduling.TimerDrivenSchedulingAgent$1.run(TimerDrivenSchedulingAgent.java:123) [nifi-framework-core-0.6.0.jar:0.6.0]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_74]
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) [na:1.8.0_74]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_74]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) [na:1.8.0_74]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_74]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_74]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_74]
2016-03-30 09:39:34,273 ERROR [Timer-Driven Process Thread-5] o.apache.nifi.processors.hadoop.PutHDFS PutHDFS[id=765efcb2-5ab0-4a72-a86f-71865dec264d] Failed to write to HDFS due to java.lang.RuntimeException: native lz4 library not available: java.lang.RuntimeException: native lz4 library not available
2016-03-30 09:39:35,295 ERROR [Timer-Driven Process Thread-9] o.apache.nifi.processors.hadoop.PutHDFS PutHDFS[id=765efcb2-5ab0-4a72-a86f-71865dec264d] Failed to write to HDFS due to java.lang.RuntimeException: native lz4 library not available: java.lang.RuntimeException: native lz4 library not available
    [identical stack traces repeated for each error; omitted]

Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>

Re: PutHDFS and LZ4 compression ERROR

Posted by Thad Guidry <th...@gmail.com>.
Joe,

There is no additional output, even when I set the log level to DEBUG.

09:36:32 CDT
ERROR
765efcb2-5ab0-4a72-a86f-71865dec264d

PutHDFS[id=765efcb2-5ab0-4a72-a86f-71865dec264d] Failed to write to
HDFS due to java.lang.RuntimeException: native lz4 library not
available: java.lang.RuntimeException: native lz4 library not
available

09:36:34 CDT
ERROR
765efcb2-5ab0-4a72-a86f-71865dec264d

PutHDFS[id=765efcb2-5ab0-4a72-a86f-71865dec264d] Failed to write to
HDFS due to java.lang.RuntimeException: native lz4 library not
available: java.lang.RuntimeException: native lz4 library not
available

09:36:35 CDT
ERROR
765efcb2-5ab0-4a72-a86f-71865dec264d

PutHDFS[id=765efcb2-5ab0-4a72-a86f-71865dec264d] Failed to write to
HDFS due to java.lang.RuntimeException: native lz4 library not
available: java.lang.RuntimeException: native lz4 library not
available


Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>
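For reference, the DEBUG setting mentioned above lives in conf/logback.xml; a sketch of raising the level for just the HDFS processors (logger name taken from the log output in this thread):

```xml
<!-- conf/logback.xml: hypothetical addition to surface processor detail -->
<logger name="org.apache.nifi.processors.hadoop" level="DEBUG"/>
```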

Re: PutHDFS and LZ4 compression ERROR

Posted by Thad Guidry <th...@gmail.com>.
Sure Joe,

I'll get that for you tomorrow morning.


Thad
+ThadGuidry <https://www.google.com/+ThadGuidry>

On Tue, Mar 29, 2016 at 10:00 PM, Joe Witt <jo...@gmail.com> wrote:

> Thad,
>
> Can you share the full stack trace that should be present in the log
> with that?  There is clearly a bit of Java code attempting to load the
> native library and unable to find it.  Placing the jar file in the
> classpath which contains the native library may well not be enough
> because loading the native libraries requires specific settings.
>
> Thanks
> Joe
>
> On Tue, Mar 29, 2016 at 12:24 PM, Thad Guidry <th...@gmail.com>
> wrote:
> > I get an error:
> >
> > 13:04:51 CDT
> > ERROR
> > 765efcb2-5ab0-4a72-a86f-71865dec264d
> >
> > PutHDFS[id=765efcb2-5ab0-4a72-a86f-71865dec264d] Failed to write to HDFS
> due
> > to java.lang.RuntimeException: native lz4 library not available:
> > java.lang.RuntimeException: native lz4 library not available
> >
> > even though I built successfully LZ4 https://github.com/jpountz/lz4-java
> > for my Windows 7 64bit using Ant, Ivy, and Mingw-w64
> > and placed that built lz4-1.3-SNAPSHOT.jar into the nifi/lib folder
> > and where it is getting picked up by NiFi bootstrap.Command just fine.
> >
> > yet the error persists.
> >
> > I'm wondering if
> >
> https://github.com/apache/nifi/blob/master/nifi-nar-bundles/nifi-hadoop-bundle/nifi-hdfs-processors/src/main/java/org/apache/nifi/processors/hadoop/PutHDFS.java#L279
> >
> > might actually not be using the Java port but instead the JNI binding or
> > some such as described here
> > https://github.com/jpountz/lz4-java#implementations ?
> >
> > Thad
> > +ThadGuidry
>

Re: PutHDFS and LZ4 compression ERROR

Posted by Joe Witt <jo...@gmail.com>.
Thad,

Can you share the full stack trace that should be present in the log
with that?  There is clearly a bit of Java code attempting to load the
native library and unable to find it.  Placing the jar file in the
classpath which contains the native library may well not be enough
because loading the native libraries requires specific settings.

Thanks
Joe

On Tue, Mar 29, 2016 at 12:24 PM, Thad Guidry <th...@gmail.com> wrote:
> [original question quoted at the top of the thread]