Posted to common-user@hadoop.apache.org by Satoshi YAMADA <sa...@ale.csce.kyushu-u.ac.jp> on 2008/06/17 08:12:23 UTC

Re: hadoop on Solaris

> From the Hadoop docs, only Linux and Windows are supported platforms. Is it
> possible to run Hadoop on Solaris? Is Hadoop implemented in pure Java? What
> kinds of problems are there in porting to Solaris? Thanks in advance.

hi,

no one seems to have replied to the previous "hadoop on Solaris" thread.

I just tried running Hadoop on Solaris 5.10 and got an error message.
If you can give any advice, I would appreciate it. (Standalone operation
seems to
work.)


I tried pseudo-distributed operation.

First, I had to add /usr/ucb to the PATH so that "whoami" can be found.

Then I set JAVA_HOME in conf/hadoop-env.sh.
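For reference, the relevant conf/hadoop-env.sh lines looked roughly like
the sketch below (the JAVA_HOME value shown is only an illustration for
my machine; adjust it for yours):

```shell
# conf/hadoop-env.sh (excerpt) - the JAVA_HOME path is machine-specific
# and shown here only as an example.
export JAVA_HOME=/usr/java/jdk1.5.0

# Solaris keeps "whoami" in /usr/ucb, which is not on the default PATH;
# Hadoop's login code shells out to it, so add /usr/ucb here as well so
# the daemons inherit it.
export PATH=$PATH:/usr/ucb
```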

I set conf/hadoop-site.xml as below:
*****************************************
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>localhost:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/home/hadoop/hadoop-dist/filesystem/name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/home/hadoop/hadoop-dist/filesystem/data</value>
  </property>
  <property>
    <name>mapred.system.dir</name>
    <value>/home/hadoop/hadoop-dist/filesystem/mapred/system</value>
  </property>
  <property>
    <name>mapred.local.dir</name>
    <value>/home/hadoop/hadoop-dist/filesystem/mapred/local</value>
  </property>
</configuration>
*****************************************

I think the configuration is conventional.
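One aside: the plain "localhost:9000" value for fs.default.name is what
triggers the deprecation warnings in the log further down. If I read the
warning correctly, the scheme-qualified form would be:

```xml
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000/</value>
</property>
```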

Then I start the daemons, and all of them seem to start without
problems.
Also, I can copy the example "conf" directory into HDFS.

I tried the "grep" example program, but I got the errors below.

*****************errors*******************
> ./bin/hadoop jar hadoop-0.17.0-examples.jar grep test_input test_output 'dfs[a-z.]+'
08/06/17 14:51:06 WARN fs.FileSystem: "localhost:9000" is a deprecated filesystem name. Use "hdfs://localhost:9000/" instead.
08/06/17 14:51:07 WARN fs.FileSystem: "localhost:9000" is a deprecated filesystem name. Use "hdfs://localhost:9000/" instead.
08/06/17 14:51:09 WARN fs.FileSystem: "localhost:9000" is a deprecated filesystem name. Use "hdfs://localhost:9000/" instead.
08/06/17 14:51:09 INFO mapred.FileInputFormat: Total input paths to process : 10
08/06/17 14:51:11 INFO mapred.JobClient: Running job: job_200806171448_0001
08/06/17 14:51:12 INFO mapred.JobClient:  map 0% reduce 0%
08/06/17 14:51:17 INFO mapred.JobClient: Task Id : task_200806171448_0001_m_000000_0, Status : FAILED
Error initializing task_200806171448_0001_m_000000_0:
java.io.IOException
        at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:175)
        at org.apache.hadoop.dfs.DistributedFileSystem.initialize(DistributedFileSystem.java:68)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1280)
        at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:56)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1291)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:203)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:108)
        at org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:632)
        at org.apache.hadoop.mapred.TaskTracker.startNewTask(TaskTracker.java:1274)
        at org.apache.hadoop.mapred.TaskTracker.offerService(TaskTracker.java:915)
        at org.apache.hadoop.mapred.TaskTracker.run(TaskTracker.java:1310)
        at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:2251)
Caused by: javax.security.auth.login.LoginException: Login failed: whoami: not found
        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:250)
        at org.apache.hadoop.security.UnixUserGroupInformation.login(UnixUserGroupInformation.java:275)
        at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:173)
        ... 11 more

08/06/17 14:51:17 WARN mapred.JobClient: Error reading task output http://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_0&filter=stdout
08/06/17 14:51:17 WARN mapred.JobClient: Error reading task output http://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_0&filter=stderr
08/06/17 14:51:22 INFO mapred.JobClient: Task Id : task_200806171448_0001_m_000000_1, Status : FAILED
Error initializing task_200806171448_0001_m_000000_1:
java.io.IOException
        [identical stack trace to the first failed attempt]
Caused by: javax.security.auth.login.LoginException: Login failed: whoami: not found
        [identical stack trace to the first failed attempt]
        ... 11 more

08/06/17 14:51:22 WARN mapred.JobClient: Error reading task output http://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_1&filter=stdout
08/06/17 14:51:22 WARN mapred.JobClient: Error reading task output http://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_1&filter=stderr
08/06/17 14:51:22 INFO mapred.JobClient: Task Id : task_200806171448_0001_m_000000_2, Status : FAILED
Error initializing task_200806171448_0001_m_000000_2:
java.io.IOException
        [identical stack trace to the first failed attempt]
Caused by: javax.security.auth.login.LoginException: Login failed: whoami: not found
        [identical stack trace to the first failed attempt]
        ... 11 more

08/06/17 14:51:22 WARN mapred.JobClient: Error reading task output http://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_2&filter=stdout
08/06/17 14:51:22 WARN mapred.JobClient: Error reading task output http://MyPC:50060/tasklog?plaintext=true&taskid=task_200806171448_0001_m_000000_2&filter=stderr
08/06/17 14:51:27 INFO mapred.JobClient:  map 100% reduce 100%
08/06/17 14:51:28 WARN fs.FileSystem: "localhost:9000" is a deprecated filesystem name. Use "hdfs://localhost:9000/" instead.
java.io.IOException: Job failed!
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1062)
        at org.apache.hadoop.examples.Grep.run(Grep.java:69)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.examples.Grep.main(Grep.java:93)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:53)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:194)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:220)
*****************errors*******************

I appreciate your help.

satoshi


--------------------------------------------------------------------------------
Satoshi YAMADA <sa...@ale.csce.kyushu-u.ac.jp>
Department of Computer Science and Communication
Engineering, Graduate School of Information Science and
Electrical Engineering, Kyushu University


Re: hadoop on Solaris

Posted by Satoshi YAMADA <sa...@ale.csce.kyushu-u.ac.jp>.
Steve, Tom,

>> I'd recommend you check out the trunk and try building it and running the
>> tests on solaris.
> I've also managed to test and build Hadoop on Solaris. From 0.17
> there's support for building the native libraries on Solaris, which
> are useful for performance (see
> https://issues.apache.org/jira/browse/HADOOP-3123).

Thanks for your advice. I have not built Hadoop myself yet; I have only
tried the binary distribution. I will try that now.

Tom,
I missed that error message and I do not know why it appeared.
I set ~/.profile and exported the PATH, so it should work now...
Well, let me figure it out and report back later.

Thanks,
satoshi

On 2008/06/17, at 21:42, Tom White wrote:

> I've successfully run Hadoop on Solaris 5.10 (on Intel). The path
> included /usr/ucb so whoami was picked up correctly.
>
> Satoshi, you say you added /usr/ucb to your path too, so I'm puzzled
> why you get a LoginException saying "whoami: not found" - did you
> export your changes to PATH?
>
> I've also managed to test and build Hadoop on Solaris. From 0.17
> there's support for building the native libraries on Solaris, which
> are useful for performance (see
> https://issues.apache.org/jira/browse/HADOOP-3123).
>
> Tom
>
> On Tue, Jun 17, 2008 at 11:47 AM, Steve Loughran <st...@apache.org> wrote:
>> Satoshi YAMADA wrote:
>>>>
>>>>> From hadoop doc, only Linux and Windows are supported platforms.  
>>>>> Is it
>>>> possible to run
>>>> hadoop on Solaris? Is hadoop implemented in pure java? What kinds of
>>>> problems are there in
>>>> order to port to Solaris? Thanks in advance.
>>>
>>> hi,
>>>
>>> no one seems to reply to the previous "hadoop on Solaris" Thread.
>>>
>>> I just tried running hadoop on Solaris 5.10 and somehow got error  
>>> message.
>>> If you can give some advices, I would appreciate it. (single  
>>> operation
>>> seems to
>>> work).
>>>
>>
>> You are probably the first person trying this. This means you have  
>> more
>> work, but it gives you an opportunity to contribute code back into  
>> the next
>> release.
>>
>> I'd recommend you check out the trunk and try building it and running  
>> the
>> tests on solaris. Then when the tests fail, you can file bug reports  
>> (with
>> stack traces) against specific tests. Then -possibly- other people  
>> might
>> pick up and fix the problems, or you can fix them one by one,  
>> submitting
>> patches to the bugreps as you go.
>>
>> I'm sure the Hadoop team would be happy to have Solaris support, its  
>> just a
>> matter of whoever has the need sitting down to do it.
>>
>> -steve
>>
>
>


Re: hadoop on Solaris

Posted by Tom White <to...@gmail.com>.
I've successfully run Hadoop on Solaris 5.10 (on Intel). The path
included /usr/ucb so whoami was picked up correctly.

Satoshi, you say you added /usr/ucb to your path too, so I'm puzzled
why you get a LoginException saying "whoami: not found" - did you
export your changes to PATH?
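(For what it's worth, the export matters because child processes inherit
only exported variables; the JVM that Hadoop's scripts launch is such a
child. A minimal sketch of the difference:)

```shell
# Appending /usr/ucb changes PATH only for the current shell so far:
PATH="$PATH:/usr/ucb"

# Exporting makes the change visible to child processes (such as the
# JVM started by bin/hadoop), so they can find "whoami" too:
export PATH
```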

I've also managed to test and build Hadoop on Solaris. From 0.17
there's support for building the native libraries on Solaris, which
are useful for performance (see
https://issues.apache.org/jira/browse/HADOOP-3123).

Tom

On Tue, Jun 17, 2008 at 11:47 AM, Steve Loughran <st...@apache.org> wrote:
> Satoshi YAMADA wrote:
>>>
>>> From hadoop doc, only Linux and Windows are supported platforms. Is it
>>> possible to run
>>> hadoop on Solaris? Is hadoop implemented in pure java? What kinds of
>>> problems are there in
>>> order to port to Solaris? Thanks in advance.
>>
>> hi,
>>
>> no one seems to reply to the previous "hadoop on Solaris" Thread.
>>
>> I just tried running hadoop on Solaris 5.10 and somehow got error message.
>> If you can give some advices, I would appreciate it. (single operation
>> seems to
>> work).
>>
>
> You are probably the first person trying this. This means you have more
> work, but it gives you an opportunity to contribute code back into the next
> release.
>
> I'd recommend you check out the trunk and try building it and running the
> tests on solaris. Then when the tests fail, you can file bug reports (with
> stack traces) against specific tests. Then -possibly- other people might
> pick up and fix the problems, or you can fix them one by one, submitting
> patches to the bugreps as you go.
>
> I'm sure the Hadoop team would be happy to have Solaris support, its just a
> matter of whoever has the need sitting down to do it.
>
> -steve
>

Re: hadoop on Solaris

Posted by Steve Loughran <st...@apache.org>.
Satoshi YAMADA wrote:
>> From hadoop doc, only Linux and Windows are supported platforms. Is 
>> it possible to run
>> hadoop on Solaris? Is hadoop implemented in pure java? What kinds of 
>> problems are there in
>> order to port to Solaris? Thanks in advance.
> 
> hi,
> 
> no one seems to reply to the previous "hadoop on Solaris" Thread.
> 
> I just tried running hadoop on Solaris 5.10 and somehow got error message.
> If you can give some advices, I would appreciate it. (single operation 
> seems to
> work).
> 

You are probably the first person trying this. This means you have more 
work, but it gives you an opportunity to contribute code back into the 
next release.

I'd recommend you check out the trunk and try building it and running 
the tests on solaris. Then when the tests fail, you can file bug reports 
(with stack traces) against specific tests. Then -possibly- other people 
might pick up and fix the problems, or you can fix them one by one, 
submitting patches to the bugreps as you go.

I'm sure the Hadoop team would be happy to have Solaris support; it's 
just a matter of whoever has the need sitting down to do it.

-steve