Posted to user@spark.apache.org by xiiik <xi...@qq.com> on 2014/03/02 09:40:47 UTC

Unable to load realm info from SCDynamicStore

hi all,

I have built spark-0.9.0-incubating-bin-hadoop2.tgz on my MacBook, and pyspark works well, but I get the messages below.
(I don't have Hadoop installed on my MacBook.)


…...
14/03/02 15:31:59 INFO HttpServer: Starting HTTP Server
14/03/02 15:31:59 INFO SparkUI: Started Spark Web UI at http://192.168.1.106:4040
2014-03-02 15:32:00.151 java[4814:e903] Unable to load realm info from SCDynamicStore
14/03/02 15:32:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable


How can I fix this?

thanks

Re: Unable to load realm info from SCDynamicStore

Posted by Sean Owen <so...@cloudera.com>.
This is completely normal for Hadoop. Unless you specifically install
native libraries such as Snappy you will get this warning, but it does not hurt anything.
--
Sean Owen | Director, Data Science | London
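
If you want to silence the SCDynamicStore message itself (rather than just ignore it), a commonly cited workaround, tracked upstream as HADOOP-7489, is to pass empty Kerberos realm/KDC properties to the JVM so it skips the macOS SCDynamicStore lookup. The use of SPARK_JAVA_OPTS below is an assumption based on how Spark 0.9.x picks up JVM options; adjust for your own launch script:

```shell
# Assumed workaround (see HADOOP-7489): give the JVM empty Kerberos
# realm/KDC values so it does not query SCDynamicStore on macOS.
# Add this to conf/spark-env.sh, or export it before running pyspark.
export SPARK_JAVA_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc= $SPARK_JAVA_OPTS"
```

This only suppresses the cosmetic message; the NativeCodeLoader warning about the native-hadoop library is separate and, as noted above, harmless.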


On Sun, Mar 2, 2014 at 8:40 AM, xiiik <xi...@qq.com> wrote:
> hi all,
>
> I have built spark-0.9.0-incubating-bin-hadoop2.tgz on my MacBook, and pyspark works well, but I get the messages below.
> (I don't have Hadoop installed on my MacBook.)
>
>
> …...
> 14/03/02 15:31:59 INFO HttpServer: Starting HTTP Server
> 14/03/02 15:31:59 INFO SparkUI: Started Spark Web UI at http://192.168.1.106:4040
> 2014-03-02 15:32:00.151 java[4814:e903] Unable to load realm info from SCDynamicStore
> 14/03/02 15:32:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>
>
> How can I fix this?
>
> thanks