Posted to mapreduce-user@hadoop.apache.org by karthi keyan <ka...@gmail.com> on 2016/03/29 08:29:18 UTC

UnsatisfiedLinkError - Windows Environment

Hi,

I am frequently facing this issue while reading data from HDFS; each time I have
replaced (rebuilt) the jars. Can anyone suggest the right way to resolve this
issue, or tell me the root cause of this error?

JDK > 1.7
System env - win 64 bit


Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
    at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301) ~[hadoop-common-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216) ~[hadoop-hdfs-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146) ~[hadoop-hdfs-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693) ~[hadoop-hdfs-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749) ~[hadoop-hdfs-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807) ~[hadoop-hdfs-2.6.2.jar:na]
    at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848) ~[hadoop-hdfs-2.6.2.jar:na]
    at java.io.DataInputStream.read(DataInputStream.java:100) ~[na:1.7.0]

Regards,
Karthikeyan S
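
For context, the read that produces this trace is an ordinary HDFS client read. A minimal sketch of such a read, assuming the standard Hadoop 2.x FileSystem API (the namenode address and file path below are placeholders, not from the thread):

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsReadSample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder namenode URI and file path -- adjust to the real cluster.
            FileSystem fs = FileSystem.get(URI.create("hdfs://namenode-host:8020"), conf);
            try (FSDataInputStream in = fs.open(new Path("/tmp/sample.txt"));
                 BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    // The read below goes through DFSInputStream and checksum
                    // verification, the same path shown in the stack trace.
                    System.out.println(line);
                }
            }
        }
    }

Nothing in the client code itself is unusual; the failure happens inside the checksum verification step when the native library that was loaded does not provide the JNI function the 2.6.2 jars expect.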

RE: UnsatisfiedLinkError - Windows Environment

Posted by Brahma Reddy Battula <br...@huawei.com>.
An UnsatisfiedLinkError occurs because the JVM is not able to link a Java native method (JNI) to the corresponding native function in the available library.
This will not be caused by running a different version on the remote cluster.
Common causes would be:
1. Libraries built for a different architecture (32-bit?)
2. Corrupted libraries?

Are you able to read without the native libraries?
Don't put those libraries in bin, and try again.

--Brahma Reddy Battula

From: karthi keyan [mailto:karthi93.sankar@gmail.com]
Sent: 29 March 2016 16:47
To: Brahma Reddy Battula
Cc: user@hadoop.apache.org
Subject: Re: UnstaisfiedLinkError - Windows Environment

Only the client-side libraries are built with Hadoop 2.6.2, but the server (remote cluster) is built with Hadoop 2.5.2. I am not sure the version is the issue here; I don't know why it happens. Do you know the cause of this exception?
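
A quick way to check the points above from the client side is to ask NativeCodeLoader whether the native library was found by the client JVM at all. A minimal sketch, assuming only that hadoop-common is on the classpath:

    import org.apache.hadoop.util.NativeCodeLoader;

    public class NativeLoadCheck {
        public static void main(String[] args) {
            // True only if libhadoop / hadoop.dll was located and loaded by this JVM.
            System.out.println("native hadoop library loaded: "
                + NativeCodeLoader.isNativeCodeLoaded());
            // On Windows the DLL is resolved via java.library.path (typically
            // %HADOOP_HOME%\bin via PATH), so print it to see which directories
            // are actually searched.
            System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));
        }
    }

If this prints false, the client falls back to the pure-Java code paths; if it prints true and the read still fails with UnsatisfiedLinkError, the DLL that was loaded likely comes from a different Hadoop build than the 2.6.2 jars.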

Re: Re: UnsatisfiedLinkError - Windows Environment

Posted by karthi keyan <ka...@gmail.com>.
I too thought the version would be the problem, but the same package works fine
with another server (a remote cluster with the same Hadoop 2.5.2). Let me
clarify my use case:

1. A small sample (client) built with Hadoop 2.6.2 (built with .NET Framework
4.0 and JDK 1.8 in a 64-bit environment).

2. A remote cluster (server) built with Hadoop 2.5.2 (built with .NET Framework
4.0 and JDK 1.7 in a 64-bit environment).

3. When I try to access data from the server (the remote cluster with Hadoop
2.5.2) with my client API, the above exception is raised.

@hsdcloud@163.com <hs...@163.com>: As you said, the versions should be the
same, but the same client works fine when the server is running on another
machine with the same environment.

For now I have switched to another server (remote cluster) and it is working
fine!
I am a little confused about where the exception arises: is the problem with
the version/DLL, or with the system environment?




On Tue, Mar 29, 2016 at 2:33 PM, hsdcloud@163.com <hs...@163.com> wrote:

> i think the root cause is version ,  suggest your built jar use
> hadoop2.5.2 ,the same version
>
> ------------------------------
> hsdcloud@163.com
>
>
> *From:* karthi keyan <ka...@gmail.com>
> *Date:* 2016-03-29 17:16
> *To:* Brahma Reddy Battula <br...@huawei.com>
> *CC:* user@hadoop.apache.org
> *Subject:* Re: UnstaisfiedLinkError - Windows Environment
> Only side libraries are built with Hadoop 2.6.2 , But Server(Remote
> Cluster) is built with Hadoop 2.5.2? Am not sure that version not be the
> case here.. Don't know wy it happens ? does you knw the cause of this
> exception ?
>
>

Re: Re: UnsatisfiedLinkError - Windows Environment

Posted by "hsdcloud@163.com" <hs...@163.com>.
I think the root cause is the version; I suggest you build your jar with Hadoop 2.5.2, the same version as the cluster.

> Only the client-side libraries are built with Hadoop 2.6.2, but the server (remote
> cluster) is built with Hadoop 2.5.2. I am not sure the version is the issue here; I
> don't know why it happens. Do you know the cause of this exception?

Re: UnsatisfiedLinkError - Windows Environment

Posted by karthi keyan <ka...@gmail.com>.
Only the client-side libraries are built with Hadoop 2.6.2, but the server
(remote cluster) is built with Hadoop 2.5.2. I am not sure the version is the
issue here; I don't know why it happens. Do you know the cause of this
exception?

RE: UnsatisfiedLinkError - Windows Environment

Posted by Brahma Reddy Battula <br...@huawei.com>.
As you said, the server is Hadoop 2.5.2, but the client is 2.6.2, as seen in the exception.
Were the client-side libraries built with Hadoop 2.6.2?

From: karthi keyan [mailto:karthi93.sankar@gmail.com]
Sent: 29 March 2016 15:16
To: Brahma Reddy Battula
Cc: user@hadoop.apache.org
Subject: Re: UnstaisfiedLinkError - Windows Environment

Yes, built with right libraries.
In my case i have to connect with remote cluster which accommodate Hadoop (built for 64 bit windows and Hadoop 2.5.2).

On Tue, Mar 29, 2016 at 12:34 PM, Brahma Reddy Battula <br...@huawei.com>> wrote:
Are you using the right libraries ( built for 64-bit windows and Hadoop 2.6.2) ?

From: karthi keyan [mailto:karthi93.sankar@gmail.com<ma...@gmail.com>]
Sent: 29 March 2016 14:51
To: Brahma Reddy Battula
Cc: user@hadoop.apache.org<ma...@hadoop.apache.org>
Subject: Re: UnstaisfiedLinkError - Windows Environment

Hi Brahma,

I have added those libraries to the bin path. Every time  when i communicate with other cluster(hadoop) am facing this issue.
Is there any Backward compatibility  ?? or some thing else ?

On Tue, Mar 29, 2016 at 12:09 PM, Brahma Reddy Battula <br...@huawei.com>> wrote:
Hadoop Cluster installed in Windows or only client is in Windows?

Whether Hadoop distribution contains windows library files and
<HADOOP_HOME>/bin is added to PATH ?


From: karthi keyan [mailto:karthi93.sankar@gmail.com<ma...@gmail.com>]
Sent: 29 March 2016 14:29
To: user@hadoop.apache.org<ma...@hadoop.apache.org>
Subject: UnstaisfiedLinkError - Windows Environment

Hi,

Frequently am facing this issue while reading the Data from HDFS, Every time i have replaced (rebuid) the jars. Does any one suggest me the right way to resolve this issue? or can any one tell me the root cause for this error ?

JDK > 1.7
System env - win 64 bit


Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
            at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848) ~[hadoop-hdfs-2.6.2.jar:na]
            at java.io.DataInputStream.read(DataInputStream.java:100) ~[na:1.7.0]

Regards,
Karthikeyan S
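
The question above about the client-side libraries points at the usual Windows setup: the hadoop.dll the client JVM loads must come from the same Hadoop line as the client jars (2.6.2 here), regardless of what the remote cluster runs. A minimal sketch of pinning the client to a matching native build (the install path is a placeholder; hadoop.home.dir is the system property Hadoop's Shell utility consults on Windows):

    public class NativeLibSetup {
        public static void main(String[] args) {
            // Placeholder path: a Hadoop 2.6.2 Windows build whose bin\ directory
            // contains hadoop.dll and winutils.exe matching the client jars.
            System.setProperty("hadoop.home.dir", "C:\\hadoop-2.6.2");
            // hadoop.dll itself is located through java.library.path, which the JVM
            // reads at startup, so launch the client with
            //   -Djava.library.path=C:\hadoop-2.6.2\bin
            // (or put that directory on PATH) rather than setting it here at runtime.
            // ... then create the FileSystem and read as usual.
        }
    }

This is only a sketch of the common workaround; the essential point is simply that the native library and the client jars should come from the same build.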



RE: UnsatisfiedLinkError - Windows Environment

Posted by "Zheng, Kai" <ka...@intel.com>.
Something to try:

1. Run 'hadoop checknative' to see if anything is wrong.

2. Find the Hadoop native DLL and inspect its symbols with some tool, to see whether it contains the required CRC32-related function.

This is a rather old function; I'm not sure it changed between these versions.

Regards,
Kai

From: karthi keyan [mailto:karthi93.sankar@gmail.com]
Sent: Tuesday, March 29, 2016 3:16 PM
To: Brahma Reddy Battula <br...@huawei.com>
Cc: user@hadoop.apache.org
Subject: Re: UnstaisfiedLinkError - Windows Environment

Yes, built with right libraries.
In my case i have to connect with remote cluster which accommodate Hadoop (built for 64 bit windows and Hadoop 2.5.2).

On Tue, Mar 29, 2016 at 12:34 PM, Brahma Reddy Battula <br...@huawei.com>> wrote:
Are you using the right libraries ( built for 64-bit windows and Hadoop 2.6.2) ?

From: karthi keyan [mailto:karthi93.sankar@gmail.com<ma...@gmail.com>]
Sent: 29 March 2016 14:51
To: Brahma Reddy Battula
Cc: user@hadoop.apache.org<ma...@hadoop.apache.org>
Subject: Re: UnstaisfiedLinkError - Windows Environment

Hi Brahma,

I have added those libraries to the bin path. Every time  when i communicate with other cluster(hadoop) am facing this issue.
Is there any Backward compatibility  ?? or some thing else ?

On Tue, Mar 29, 2016 at 12:09 PM, Brahma Reddy Battula <br...@huawei.com>> wrote:
Hadoop Cluster installed in Windows or only client is in Windows?

Whether Hadoop distribution contains windows library files and
<HADOOP_HOME>/bin is added to PATH ?


From: karthi keyan [mailto:karthi93.sankar@gmail.com<ma...@gmail.com>]
Sent: 29 March 2016 14:29
To: user@hadoop.apache.org<ma...@hadoop.apache.org>
Subject: UnstaisfiedLinkError - Windows Environment

Hi,

Frequently am facing this issue while reading the Data from HDFS, Every time i have replaced (rebuid) the jars. Does any one suggest me the right way to resolve this issue? or can any one tell me the root cause for this error ?

JDK > 1.7
System env - win 64 bit


Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
            at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848) ~[hadoop-hdfs-2.6.2.jar:na]
            at java.io.DataInputStream.read(DataInputStream.java:100) ~[na:1.7.0]

Regards,
Karthikeyan S
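
The first suggestion can also be run from inside the client JVM itself, which helps when the client does not launch through the hadoop script. A sketch, assuming hadoop-common 2.6.x is on the classpath (NativeLibraryChecker is the class behind the 'hadoop checknative' command):

    import org.apache.hadoop.util.NativeLibraryChecker;

    public class CheckNativeInClientJvm {
        public static void main(String[] args) {
            // Equivalent to running "hadoop checknative -a", but executed with the
            // same classpath and PATH as the failing client, so it reports the
            // hadoop.dll this process would actually load.
            NativeLibraryChecker.main(new String[] { "-a" });
        }
    }

For the second suggestion, the exported symbol to look for in hadoop.dll (for example with dumpbin /exports) would follow standard JNI naming, i.e. Java_org_apache_hadoop_util_NativeCrc32_nativeComputeChunkedSums; a DLL built from an older Hadoop line may simply not export it.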



Re: UnsatisfiedLinkError - Windows Environment

Posted by karthi keyan <ka...@gmail.com>.
Yes, it is built with the right libraries.

In my case I have to connect to a remote cluster that hosts Hadoop (built for
64-bit Windows, Hadoop 2.5.2).

On Tue, Mar 29, 2016 at 12:34 PM, Brahma Reddy Battula <
brahmareddy.battula@huawei.com> wrote:

> Are you using the right libraries ( built for 64-bit windows and
> Hadoop 2.6.2) ?
>
>
>
> *From:* karthi keyan [mailto:karthi93.sankar@gmail.com]
> *Sent:* 29 March 2016 14:51
> *To:* Brahma Reddy Battula
> *Cc:* user@hadoop.apache.org
> *Subject:* Re: UnstaisfiedLinkError - Windows Environment
>
>
>
> Hi Brahma,
>
>
> I have added those libraries to the bin path. Every time  when
> i communicate with other cluster(hadoop) am facing this issue.
>
> Is there any Backward compatibility  ?? or some thing else ?
>
>
>
> On Tue, Mar 29, 2016 at 12:09 PM, Brahma Reddy Battula <
> brahmareddy.battula@huawei.com> wrote:
>
> Hadoop Cluster installed in Windows or only client is in Windows?
>
>
>
> Whether Hadoop distribution contains windows library files and
> <HADOOP_HOME>/bin is added to PATH ?
>
>
>
>
>
> *From:* karthi keyan [mailto:karthi93.sankar@gmail.com]
> *Sent:* 29 March 2016 14:29
> *To:* user@hadoop.apache.org
> *Subject:* UnstaisfiedLinkError - Windows Environment
>
>
>
> Hi,
>
> Frequently am facing this issue while reading the Data from HDFS, Every
> time i have replaced (rebuid) the jars. Does any one suggest me the right
> way to resolve this issue? or can any one tell me the root cause for this
> error ?
>
> JDK > 1.7
> System env - win 64 bit
>
>
> Caused by: java.lang.UnsatisfiedLinkError:
> org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
>
>             at
> org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method)
> ~[hadoop-common-2.6.2.jar:na]
>
>             at
> org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59)
> ~[hadoop-common-2.6.2.jar:na]
>
>             at
> org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301)
> ~[hadoop-common-2.6.2.jar:na]
>
>             at
> org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216)
> ~[hadoop-hdfs-2.6.2.jar:na]
>
>             at
> org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146)
> ~[hadoop-hdfs-2.6.2.jar:na]
>
>             at
> org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693)
> ~[hadoop-hdfs-2.6.2.jar:na]
>
>             at
> org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749)
> ~[hadoop-hdfs-2.6.2.jar:na]
>
>             at
> org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807)
> ~[hadoop-hdfs-2.6.2.jar:na]
>
>             at
> org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848)
> ~[hadoop-hdfs-2.6.2.jar:na]
>
>             at java.io.DataInputStream.read(DataInputStream.java:100)
> ~[na:1.7.0]
>
> Regards,
>
> Karthikeyan S
>
>
>

RE: UnsatisfiedLinkError - Windows Environment

Posted by Brahma Reddy Battula <br...@huawei.com>.
Are you using the right libraries (built for 64-bit Windows and Hadoop 2.6.2)?

From: karthi keyan [mailto:karthi93.sankar@gmail.com]
Sent: 29 March 2016 14:51
To: Brahma Reddy Battula
Cc: user@hadoop.apache.org
Subject: Re: UnstaisfiedLinkError - Windows Environment

Hi Brahma,

I have added those libraries to the bin path. Every time  when i communicate with other cluster(hadoop) am facing this issue.
Is there any Backward compatibility  ?? or some thing else ?

On Tue, Mar 29, 2016 at 12:09 PM, Brahma Reddy Battula <br...@huawei.com>> wrote:
Hadoop Cluster installed in Windows or only client is in Windows?

Whether Hadoop distribution contains windows library files and
<HADOOP_HOME>/bin is added to PATH ?


From: karthi keyan [mailto:karthi93.sankar@gmail.com<ma...@gmail.com>]
Sent: 29 March 2016 14:29
To: user@hadoop.apache.org<ma...@hadoop.apache.org>
Subject: UnstaisfiedLinkError - Windows Environment

Hi,

Frequently am facing this issue while reading the Data from HDFS, Every time i have replaced (rebuid) the jars. Does any one suggest me the right way to resolve this issue? or can any one tell me the root cause for this error ?

JDK > 1.7
System env - win 64 bit


Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
            at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848) ~[hadoop-hdfs-2.6.2.jar:na]
            at java.io.DataInputStream.read(DataInputStream.java:100) ~[na:1.7.0]

Regards,
Karthikeyan S
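
The two environment questions raised above (does the distribution contain the Windows library files, and is <HADOOP_HOME>/bin on PATH) can be checked with a small sketch like the following; the file names are the usual Windows native artifacts, and no cluster access is needed:

    import java.io.File;

    public class WinNativeFilesCheck {
        public static void main(String[] args) {
            // HADOOP_HOME is read from the environment; hadoop.dll and winutils.exe
            // are the standard Windows native artifacts expected in <HADOOP_HOME>\bin.
            String hadoopHome = System.getenv("HADOOP_HOME");
            System.out.println("HADOOP_HOME = " + hadoopHome);
            System.out.println("PATH = " + System.getenv("PATH"));
            if (hadoopHome != null) {
                for (String name : new String[] { "hadoop.dll", "winutils.exe" }) {
                    File f = new File(hadoopHome + File.separator + "bin", name);
                    System.out.println(f.getPath() + " exists: " + f.exists());
                }
            }
        }
    }

If the files exist but PATH does not include that bin directory, the JVM will not find hadoop.dll and the native code paths cannot be linked.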


Re: UnstaisfiedLinkError - Windows Environment

Posted by karthi keyan <ka...@gmail.com>.
Hi Brahma,

I have added those libraries to the bin path, but every time I communicate
with the other (Hadoop) cluster I hit this issue.
Could this be a backward-compatibility problem, or something else?
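
One way to see whether the copied hadoop.dll is actually usable, independent of any HDFS call, is to load it explicitly; a small sketch, assuming the library follows the usual naming so that System.loadLibrary("hadoop") resolves it:

    public class ExplicitNativeLoad {
        public static void main(String[] args) {
            try {
                // Resolves hadoop.dll via java.library.path / PATH, as the Hadoop client does.
                System.loadLibrary("hadoop");
                System.out.println("hadoop.dll loaded successfully");
            } catch (UnsatisfiedLinkError e) {
                // Typical causes: a 32-bit DLL on a 64-bit JVM, a corrupted file,
                // or a DLL built against a different Hadoop release than the jars in use.
                System.err.println("Failed to load hadoop.dll: " + e.getMessage());
            }
        }
    }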


On Tue, Mar 29, 2016 at 12:09 PM, Brahma Reddy Battula <
brahmareddy.battula@huawei.com> wrote:

> Hadoop Cluster installed in Windows or only client is in Windows?
>
>
>
> Whether Hadoop distribution contains windows library files and
> <HADOOP_HOME>/bin is added to PATH ?
>
>
>
>
>
> *From:* karthi keyan [mailto:karthi93.sankar@gmail.com]
> *Sent:* 29 March 2016 14:29
> *To:* user@hadoop.apache.org
> *Subject:* UnstaisfiedLinkError - Windows Environment
>
>
>
> Hi,
>
> Frequently am facing this issue while reading the Data from HDFS, Every
> time i have replaced (rebuid) the jars. Does any one suggest me the right
> way to resolve this issue? or can any one tell me the root cause for this
> error ?
>
> JDK > 1.7
> System env - win 64 bit
>
>
> Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
>             at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method) ~[hadoop-common-2.6.2.jar:na]
>             at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59) ~[hadoop-common-2.6.2.jar:na]
>             at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301) ~[hadoop-common-2.6.2.jar:na]
>             at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216) ~[hadoop-hdfs-2.6.2.jar:na]
>             at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146) ~[hadoop-hdfs-2.6.2.jar:na]
>             at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693) ~[hadoop-hdfs-2.6.2.jar:na]
>             at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749) ~[hadoop-hdfs-2.6.2.jar:na]
>             at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807) ~[hadoop-hdfs-2.6.2.jar:na]
>             at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848) ~[hadoop-hdfs-2.6.2.jar:na]
>             at java.io.DataInputStream.read(DataInputStream.java:100) ~[na:1.7.0]
>
> Regards,
>
> Karthikeyan S
>

RE: UnstaisfiedLinkError - Windows Environment

Posted by Brahma Reddy Battula <br...@huawei.com>.
Is the Hadoop cluster installed on Windows, or is only the client on Windows?

Does the Hadoop distribution contain the Windows library files, and is
<HADOOP_HOME>/bin added to PATH?
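
For the client-only case, those two checks can be run from the same JVM that performs the HDFS read; a hedged sketch (the class name is invented, winutils.exe and hadoop.dll are the usual Windows build artifacts):

    import java.io.File;

    public class WindowsClientSetupCheck {
        public static void main(String[] args) {
            String hadoopHome = System.getenv("HADOOP_HOME");   // e.g. C:\hadoop-2.6.2 (example path)
            System.out.println("HADOOP_HOME = " + hadoopHome);
            if (hadoopHome == null) {
                return;
            }
            File bin = new File(hadoopHome, "bin");
            // Both files must exist and be built for the same Hadoop version and for 64-bit Windows.
            System.out.println("winutils.exe present: " + new File(bin, "winutils.exe").exists());
            System.out.println("hadoop.dll   present: " + new File(bin, "hadoop.dll").exists());
            // %HADOOP_HOME%\bin must also be on PATH so the JVM can locate hadoop.dll at runtime.
            String path = System.getenv("PATH");
            boolean onPath = path != null
                    && path.toLowerCase().contains(bin.getAbsolutePath().toLowerCase());
            System.out.println("bin on PATH: " + onPath);
        }
    }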


From: karthi keyan [mailto:karthi93.sankar@gmail.com]
Sent: 29 March 2016 14:29
To: user@hadoop.apache.org
Subject: UnstaisfiedLinkError - Windows Environment

Hi,

Frequently am facing this issue while reading the Data from HDFS, Every time i have replaced (rebuid) the jars. Does any one suggest me the right way to resolve this issue? or can any one tell me the root cause for this error ?

JDK > 1.7
System env - win 64 bit


Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(IILjava/nio/ByteBuffer;ILjava/nio/ByteBuffer;IILjava/lang/String;JZ)V
            at org.apache.hadoop.util.NativeCrc32.nativeComputeChunkedSums(Native Method) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.NativeCrc32.verifyChunkedSums(NativeCrc32.java:59) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.util.DataChecksum.verifyChunkedSums(DataChecksum.java:301) ~[hadoop-common-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:216) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:146) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:693) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:749) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:807) ~[hadoop-hdfs-2.6.2.jar:na]
            at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:848) ~[hadoop-hdfs-2.6.2.jar:na]
            at java.io.DataInputStream.read(DataInputStream.java:100) ~[na:1.7.0]

Regards,
Karthikeyan S
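
For context, the failing call sits on the ordinary HDFS read path shown in the trace above; a minimal read that exercises it could look like the sketch below (the namenode address and file path are placeholders, not values from this thread):

    import java.io.InputStream;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsReadExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            conf.set("fs.defaultFS", "hdfs://namenode-host:8020");      // placeholder cluster address
            try (FileSystem fs = FileSystem.get(conf);
                 InputStream in = fs.open(new Path("/data/sample.txt"))) {  // placeholder file
                byte[] buf = new byte[4096];
                int n;
                while ((n = in.read(buf)) != -1) {
                    // Each read verifies block checksums; that is where DataChecksum
                    // delegates to NativeCrc32 and the UnsatisfiedLinkError surfaces.
                    System.out.write(buf, 0, n);
                }
                System.out.flush();
            }
        }
    }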
