Posted to common-dev@hadoop.apache.org by Neeraj Vaidya <ne...@yahoo.co.in> on 2015/07/04 04:56:27 UTC

Hadoop 2.7.0, Solaris 10 Sparc 64-bit, Native Code build error

Hi All,
While building the native code on Solaris 10 SPARC 64-bit, I have encountered an error where the linker cannot find the definition of the pipelined_crc32c function in the bulk_crc32.c.o object file.
I have defined the USE_PIPELINED macro, but the function's definition is still not being compiled into the object file.
The error occurs when linking the test executable test_bulk_crc32.

Regards,
Neeraj

Re: Hadoop 2.7.0, Solaris 10 Sparc 64-bit, Native Code build error

Posted by Alan Burlison <Al...@oracle.com>.
On 04/07/2015 03:56, Neeraj Vaidya wrote:

> Hi All,
> While building the native code on Solaris 10 SPARC 64-bit, I have encountered an error where the linker cannot find the definition of the pipelined_crc32c function in the bulk_crc32.c.o object file.
> I have defined the USE_PIPELINED macro, but the function's definition is still not being compiled into the object file.
> The error occurs when linking the test executable test_bulk_crc32.

HADOOP-12008 Investigate providing SPARC hardware-accelerated CRC32 code
https://issues.apache.org/jira/browse/HADOOP-12008

Even if you get past that point, there are other things that will stop
you building successfully on SPARC; see the following bugs:

https://issues.apache.org/jira/browse/HADOOP-11985
https://issues.apache.org/jira/browse/YARN-3719
https://issues.apache.org/jira/browse/HDFS-8478
https://issues.apache.org/jira/browse/MAPREDUCE-6390

-- 
Alan Burlison
--