Posted to user@spark.apache.org by Aureliano Buendia <bu...@gmail.com> on 2014/01/13 18:28:16 UTC
Unable to load native-hadoop library
Hi,
I'm using the spark-ec2 scripts, and my Spark applications do not load the
native Hadoop libraries. I've set the native lib path like this:
export SPARK_LIBRARY_PATH='/root/ephemeral-hdfs/lib/native/'
But I get these warnings in the log:
WARN NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
WARN LoadSnappy: Snappy native library not loaded
Is SPARK_LIBRARY_PATH the right variable for this? Does Spark use this
variable, or does my application have to set up the native libraries itself?
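One thing worth checking (a sketch on my part, assuming the standard spark-ec2 layout with Spark installed under /root/spark): the variable usually needs to be set in conf/spark-env.sh on every node, so the workers inherit it too, not only the shell that launches the job:

```shell
# Sketch: append the export to spark-env.sh so worker JVMs inherit it as
# well; /root/spark is the spark-ec2 install path (an assumption here).
echo "export SPARK_LIBRARY_PATH=/root/ephemeral-hdfs/lib/native/" >> /root/spark/conf/spark-env.sh
```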
Re: Unable to load native-hadoop library
Posted by Aureliano Buendia <bu...@gmail.com>.
I had to explicitly use -Djava.library.path for this to work.
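For reference, in Spark of that era the flag could be pushed to both the driver and executor JVMs via SPARK_JAVA_OPTS — a sketch, assuming the same native-library directory as above:

```shell
# Sketch: put the native directory on java.library.path for every Spark JVM.
# SPARK_JAVA_OPTS was the pre-1.0 mechanism for extra JVM flags; the
# ephemeral-hdfs path is assumed to match the cluster's layout.
export SPARK_JAVA_OPTS="-Djava.library.path=/root/ephemeral-hdfs/lib/native/ $SPARK_JAVA_OPTS"
```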
Re: Unable to load native-hadoop library
Posted by Aureliano Buendia <bu...@gmail.com>.
I'm compiling my application against the same Hadoop version as the spark-ec2
AMI:
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>0.23.7</version>
</dependency>
I do not include this library in my shaded fat jar, though, which shouldn't
cause this problem.
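One related note (my own suggestion, not something stated in the thread): if the goal is to compile against hadoop-client but keep it out of the shaded jar, Maven's provided scope is the usual way to express that:

```xml
<!-- Sketch: 'provided' keeps hadoop-client off the shaded jar while still
     compiling against it; the cluster supplies the real jars at runtime. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>0.23.7</version>
  <scope>provided</scope>
</dependency>
```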