Posted to user@hadoop.apache.org by Emile Kao <em...@gmx.net> on 2012/12/11 15:23:15 UTC

using hadoop on zLinux (Linux on S390)

Hello community,
I am trying to use Hadoop 1.1.0 on SLES 11 (zLinux) running on IBM S390.
The Java provided is "java-s390x-60" (64-bit).
While trying to format the namenode I got the following error:

$:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = xxxxxxxxx
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 1.1.0
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49 UTC 2012
************************************************************/
Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find JAAS classes:com.ibm.security.auth.LinuxPrincipal
12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure to login
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
Caused by: javax.security.auth.login.LoginException: java.lang.NullPointerException: invalid null Class provided
        at javax.security.auth.Subject.getPrincipals(Subject.java:809)
        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:600)
        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
        at java.security.AccessController.doPrivileged(AccessController.java:284)
        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)

        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
        at java.security.AccessController.doPrivileged(AccessController.java:284)
        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
        ... 6 more

12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
************************************************************/
$:/opt/flume_hadoop/hadoop-1.1.0>

Questions:

1) @developers: Are you aware of this behavior?
2) Is there a way to overcome this problem with a workaround?
3) Is it a security issue? --> I was able to ssh to localhost without error.

Re: using hadoop on zLinux (Linux on S390)

Posted by Michael Segel <mi...@hotmail.com>.
Well...

I didn't think that the generally available version contained IBM-specific security classes.

Your error (the first ERROR line):

    Unable to find JAAS classes: com.ibm.security.auth.LinuxPrincipal

is saying that Hadoop can't find that class. This is the Apache release, and you're trying to run it on an IBM JVM, which needs IBM-specific security classes.

Now I could be wrong, but that's my first take on it.
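One quick way to check this theory is to probe the running JRE for the vendor-specific JAAS principal classes. The "Unable to find JAAS classes" message suggests Hadoop 1.x resolves these class names reflectively, so a missing class fails exactly like the log above. A minimal sketch (the second class name is the usual Sun/Oracle counterpart, listed here as an assumption):

```java
// Sketch: check which vendor-specific JAAS principal classes this JRE provides.
public class PrincipalProbe {
    static boolean hasClass(String name) {
        try {
            Class.forName(name);   // same reflective lookup the login code appears to do
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String[] candidates = {
            "com.ibm.security.auth.LinuxPrincipal", // IBM JDK on Linux (from the error above)
            "com.sun.security.auth.UnixPrincipal"   // Sun/Oracle JDK equivalent
        };
        for (String c : candidates) {
            System.out.println(c + " -> " + (hasClass(c) ? "present" : "MISSING"));
        }
    }
}
```

Running this under the "java-s390x-60" JVM would show which principal class that JRE actually ships.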


On Dec 11, 2012, at 8:50 AM, "Emile Kao" <em...@gmx.net> wrote:

> No, this is the generally available version...
> 
> -------- Original Message --------
>> Date: Tue, 11 Dec 2012 08:31:57 -0600
>> From: Michael Segel <mi...@hotmail.com>
>> To: user@hadoop.apache.org
>> Subject: Re: using hadoop on zLinux (Linux on S390)
> 
>> Well, on the surface...
>> 
>> It looks like it's either a missing class, or you don't have your
>> classpath set up right.
>> 
>> I'm assuming you got this version of Hadoop from IBM, so I would suggest
>> contacting their support and opening up a ticket. 
>> 
>> 
>> On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
>>> [original message snipped; quoted in full at the top of this thread]


Re: User: is not allowed to impersonate hduser

Posted by Harsh J <ha...@cloudera.com>.
Great. This is described in more detail at
http://hadoop.apache.org/common/docs/stable/Secure_Impersonation.html.
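For reference, the setup that page describes boils down to proxyuser entries in core-site.xml on the cluster side (NameNode/JobTracker), followed by a restart. A sketch, where "oleg" stands in for whichever user actually submits the job, and host/group values are placeholders:

```xml
<!-- core-site.xml on the cluster: allow user "oleg" to impersonate others -->
<property>
  <name>hadoop.proxyuser.oleg.hosts</name>
  <!-- hosts oleg may proxy from (example value) -->
  <value>client-host.example.com</value>
</property>
<property>
  <name>hadoop.proxyuser.oleg.groups</name>
  <!-- groups whose members oleg may impersonate (example value) -->
  <value>hadoop</value>
</property>
```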

On Tue, Dec 11, 2012 at 11:24 PM, Oleg Zhurakousky
<ol...@gmail.com> wrote:
> Harsh, thanx for replying but I just figured it out
> UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hduser");
> ugi.doAs(new PrivilegedAction<Object>() {
> . . .
> }
> On Dec 11, 2012, at 12:40 PM, Harsh J <ha...@cloudera.com> wrote:
>
> Are you attempting to specify a user.name=hduser in your configs while
> submitting the job?
>
> On Tue, Dec 11, 2012 at 10:19 PM, Oleg Zhurakousky
> <ol...@gmail.com> wrote:
>
> Trying to submit a MR job from the local machine and getting the above error
>
> Any idea
>
> Thanks
> Oleg
>
>
>
>
> --
> Harsh J
>
>



-- 
Harsh J


Re: User: is not allowed to impersonate hduser

Posted by Oleg Zhurakousky <ol...@gmail.com>.
Harsh, thanks for replying, but I just figured it out:

UserGroupInformation ugi = UserGroupInformation.createRemoteUser("hduser");
ugi.doAs(new PrivilegedAction<Object>() {
    // . . .
});
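For context, Hadoop's UserGroupInformation.doAs follows the shape of the JDK's own JAAS Subject.doAs. The pure-JDK pattern underneath looks roughly like this (a sketch, not the Hadoop implementation; the lambda principal is a stand-in for Hadoop's user principal):

```java
import java.security.Principal;
import java.security.PrivilegedAction;
import javax.security.auth.Subject;

public class DoAsSketch {
    // Run an action under the given user name; analogous in shape to
    // UserGroupInformation.createRemoteUser(name).doAs(action).
    public static String runAs(String name) {
        Subject subject = new Subject();
        // Attach a minimal Principal carrying just the user name
        subject.getPrincipals().add((Principal) () -> name);
        // The cast picks the PrivilegedAction overload of doAs
        return Subject.doAs(subject, (PrivilegedAction<String>) () -> {
            Principal p = subject.getPrincipals().iterator().next();
            return "running as " + p.getName();
        });
    }

    public static void main(String[] args) {
        System.out.println(runAs("hduser"));
    }
}
```

Note that createRemoteUser sidesteps the server-side proxyuser check only because the submission then happens as "hduser" itself, rather than as one user impersonating another.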
On Dec 11, 2012, at 12:40 PM, Harsh J <ha...@cloudera.com> wrote:

> Are you attempting to specify a user.name=hduser in your configs while
> submitting the job?
> 
> On Tue, Dec 11, 2012 at 10:19 PM, Oleg Zhurakousky
> <ol...@gmail.com> wrote:
>> Trying to submit a MR job from the local machine and getting the above error
>> 
>> Any idea
>> 
>> Thanks
>> Oleg
> 
> 
> 
> -- 
> Harsh J



Re: User: is not allowed to impersonate hduser

Posted by Harsh J <ha...@cloudera.com>.
Are you attempting to specify a user.name=hduser in your configs while
submitting the job?

On Tue, Dec 11, 2012 at 10:19 PM, Oleg Zhurakousky
<ol...@gmail.com> wrote:
> Trying to submit a MR job from the local machine and getting the above error
>
> Any idea
>
> Thanks
> Oleg



-- 
Harsh J


User: is not allowed to impersonate hduser

Posted by Oleg Zhurakousky <ol...@gmail.com>.
Trying to submit an MR job from the local machine and getting the above error.

Any idea?

Thanks
Oleg


Re: compile hadoop-1.1.1 on zLinux using apache maven

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
FYI, I compiled 1.0.3 successfully using Ant last week, so the steps still seem to be good.

JM
On 13 Dec 2012 05:28, "Nicolas Liochon" <nk...@gmail.com> wrote:

> branch1 does not use maven but ant.
> There are some docs here:
> http://wiki.apache.org/hadoop/BuildingHadoopFromSVN, not sure it's
> totally up to date.
>
> On Thu, Dec 13, 2012 at 11:08 AM, Emile Kao <em...@gmx.net> wrote:
>
>>
>>
>> 3) Can I compile the package in a simpler way other then maven?
>>
>
>

Re: compile hadoop-1.1.1 on zLinux using apache maven

Posted by Nicolas Liochon <nk...@gmail.com>.
branch-1 does not use Maven but Ant.
There are some docs here:
http://wiki.apache.org/hadoop/BuildingHadoopFromSVN; not sure it's totally
up to date.

On Thu, Dec 13, 2012 at 11:08 AM, Emile Kao <em...@gmx.net> wrote:

>
>
> 3) Can I compile the package in a simpler way other then maven?
>


compile hadoop-1.1.1 on zLinux using apache maven

Posted by Emile Kao <em...@gmx.net>.
Hello Guys,
Now that I have downloaded the Hadoop 1.1.1 source tarball, I am trying to compile it for my platform (s390) running SLES 11.
I am encountering a couple of problems, for which I have some questions:

1) Is there an official guide from the Hadoop project showing how to build a binary for a custom target platform? I didn't find anything similar in the tarball documentation.

2) When using Maven to try to compile the Hadoop sources, I got the following error:

----------------------------------------------------------------------------------------
localhost:/opt/flume_hadoop/hadoop-1.1.1 # mvn package
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.093s
[INFO] Finished at: Thu Dec 13 10:23:05 CET 2012
[INFO] Final Memory: 4M/11M
[INFO] ------------------------------------------------------------------------
[ERROR] The goal you specified requires a project to execute but there is no POM in this directory (/opt/flume_hadoop/hadoop-1.1.1). Please verify you invoked Maven from the correct directory. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MissingProjectException
localhost:/opt/flume_hadoop/hadoop-1.1.1 #
------------------------------------------------------------------------------------

Maven is complaining about a missing POM in this directory (/opt/flume_hadoop/hadoop-1.1.1), but there is no "pom.xml" in the official source tarball.
Any idea?

3) Can I compile the package in a simpler way, other than with Maven?

Thank you!

Cheers, Emile
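As the replies above note, hadoop-1.x (branch-1) builds with Ant rather than Maven: the 1.1.1 source tarball ships a build.xml, not a pom.xml, which is exactly why mvn reports a missing POM. A small sketch for picking the right tool from the source root (the Ant target names in the comments are the usual branch-1 ones, cited from memory):

```shell
# Sketch: choose the build tool for a Hadoop source tree.
# hadoop-1.x (branch-1) ships build.xml (Ant); hadoop-2.x+ ships pom.xml (Maven).
build_cmd() {
  dir=$1
  if [ -f "$dir/build.xml" ]; then
    echo "ant"          # run e.g. 'ant compile' (or 'ant binary') from $dir
  elif [ -f "$dir/pom.xml" ]; then
    echo "mvn"          # run e.g. 'mvn package' from $dir
  else
    echo "unknown"      # probably not the source root
  fi
}

build_cmd /opt/flume_hadoop/hadoop-1.1.1
```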


compile hadoop-1.1.1 on zLinux using apache maven

Posted by Emile Kao <em...@gmx.net>.
Hello Guys,
Now that I have downloaded the hadoop 1.1.1 source tar ball, I am trying to compile it for my platform (s390) running SLES 11.
I am encountering a couple of problem for which I have some questions:

1) Is there an official guide from the hadoop project showing how to build a binary for a custom target platform? I didn't find anything similar in the tar ball documentation

2) When using maven and trying to compile the hadoop sources, I got the following error:

----------------------------------------------------------------------------------------
localhost:/opt/flume_hadoop/hadoop-1.1.1 # mvn package
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.093s
[INFO] Finished at: Thu Dec 13 10:23:05 CET 2012
[INFO] Final Memory: 4M/11M
[INFO] ------------------------------------------------------------------------
[ERROR] The goal you specified requires a project to execute but there is no POM in this directory (/opt/flume_hadoop/hadoop-1.1.1). Please verify you invoked Maven from the correct directory. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MissingProjectException
localhost:/opt/flume_hadoop/hadoop-1.1.1 #
------------------------------------------------------------------------------------

The compiler is complaining about the missing of POM in this directory (/opt/flume_hadoop/hadoop-1.1.1). But there is no "pom.xml" in the official tar ball.
Any idea?

3) Can I compile the package in a simpler way other then maven?

Thank you!

Cheers, Emile



compile hadoop-1.1.1 on zLinux using apache maven

Posted by Emile Kao <em...@gmx.net>.
Hello Guys,
Now that I have downloaded the hadoop 1.1.1 source tar ball, I am trying to compile it for my platform (s390) running SLES 11.
I am encountering a couple of problem for which I have some questions:

1) Is there an official guide from the hadoop project showing how to build a binary for a custom target platform? I didn't find anything similar in the tar ball documentation

2) When using maven and trying to compile the hadoop sources, I got the following error:

----------------------------------------------------------------------------------------
localhost:/opt/flume_hadoop/hadoop-1.1.1 # mvn package
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.093s
[INFO] Finished at: Thu Dec 13 10:23:05 CET 2012
[INFO] Final Memory: 4M/11M
[INFO] ------------------------------------------------------------------------
[ERROR] The goal you specified requires a project to execute but there is no POM in this directory (/opt/flume_hadoop/hadoop-1.1.1). Please verify you invoked Maven from the correct directory. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MissingProjectException
localhost:/opt/flume_hadoop/hadoop-1.1.1 #
------------------------------------------------------------------------------------

The compiler is complaining about the missing of POM in this directory (/opt/flume_hadoop/hadoop-1.1.1). But there is no "pom.xml" in the official tar ball.
Any idea?

3) Can I compile the package in a simpler way other then maven?

Thank you!

Cheers, Emile



compile hadoop-1.1.1 on zLinux using apache maven

Posted by Emile Kao <em...@gmx.net>.
Hello Guys,
Now that I have downloaded the hadoop 1.1.1 source tar ball, I am trying to compile it for my platform (s390) running SLES 11.
I am encountering a couple of problem for which I have some questions:

1) Is there an official guide from the hadoop project showing how to build a binary for a custom target platform? I didn't find anything similar in the tar ball documentation

2) When using maven and trying to compile the hadoop sources, I got the following error:

----------------------------------------------------------------------------------------
localhost:/opt/flume_hadoop/hadoop-1.1.1 # mvn package
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 0.093s
[INFO] Finished at: Thu Dec 13 10:23:05 CET 2012
[INFO] Final Memory: 4M/11M
[INFO] ------------------------------------------------------------------------
[ERROR] The goal you specified requires a project to execute but there is no POM in this directory (/opt/flume_hadoop/hadoop-1.1.1). Please verify you invoked Maven from the correct directory. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MissingProjectException
localhost:/opt/flume_hadoop/hadoop-1.1.1 #
------------------------------------------------------------------------------------

The compiler is complaining about the missing of POM in this directory (/opt/flume_hadoop/hadoop-1.1.1). But there is no "pom.xml" in the official tar ball.
Any idea?

3) Can I compile the package in a simpler way other then maven?

Thank you!

Cheers, Emile



Re: using hadoop on zLinux (Linux on S390)

Posted by Kumar Ravi <ku...@us.ibm.com>.
Emile,

 You need an s390 build of the hadoop-core binary. Since s390 is not a 
supported platform yet, you'll need to build it yourself.

Hope this helps.

Regards,
Kumar

Kumar Ravi
IBM Linux Technology Center 
IBM Master Inventor

11501 Burnet Road,
Austin, TX 78758

Tel.: (512)286-8179



From: "Emile Kao" <em...@gmx.net>
To: user@hadoop.apache.org
Date: 12/11/2012 09:30 AM
Subject: Re: using hadoop on zLinux (Linux on S390)


Hello Kumar,
here are the answers to your questions:

> 1. What version and vendor of JDK did you use to compile and package hadoop?

Answer:
I didn't compile the package; I followed the instructions in the official documentation (http://hadoop.apache.org/docs/r1.1.0/single_node_setup.html). There was no mention of compiling the code first.
By the way, I am using the binary version I downloaded from the official download site. I guess this one is already compiled.

> 2. What version and vendor of JVM are you running? You can type java -version from the console to see this.

Answer:
This is the java version I am using:

java version "1.6.0"
Java(TM) SE Runtime Environment (build pxz6460sr10fp1-20120321_01(SR10 FP1))
IBM J9 VM (build 2.4, JRE 1.6.0 IBM J9 2.4 Linux s390x-64 jvmxz6460sr10fp1-20120202_101568 (JIT enabled, AOT enabled)
J9VM - 20120202_101568
JIT  - r9_20111107_21307ifx1
GC   - 20120202_AA)
JCL  - 20120320_01

Thank you in advance.

Cheers, Emile


-------- Original Message --------
> Date: Tue, 11 Dec 2012 08:56:24 -0600
> From: Kumar Ravi <ku...@us.ibm.com>
> To: user@hadoop.apache.org
> Subject: Re: using hadoop on zLinux (Linux on S390)

> Hi Emile,
> 
>  I have a couple of questions for you:
> 
> 1. What version and vendor of JDK did you use to compile and package hadoop?
> 
> 2. What version and vendor of JVM are you running? You can type java -version from the console to see this.
> 
> Thanks,
> Kumar
> 
> Kumar Ravi
> IBM Linux Technology Center 
> 
> 
> 
> 
> From: "Emile Kao" <em...@gmx.net>
> To: user@hadoop.apache.org
> Date: 12/11/2012 08:51 AM
> Subject: Re: using hadoop on zLinux (Linux on S390)
> 
> 
> No, this is the general available version...
> 
> -------- Original Message --------
> > Date: Tue, 11 Dec 2012 08:31:57 -0600
> > From: Michael Segel <mi...@hotmail.com>
> > To: user@hadoop.apache.org
> > Subject: Re: using hadoop on zLinux (Linux on S390)
> 
> > Well, on the surface.... 
> > 
> > It looks like it's either a missing class, or you don't have your classpath set up right. 
> > 
> > I'm assuming you got this version of Hadoop from IBM, so I would suggest contacting their support and opening up a ticket. 
> > 
> > 
> > On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
> > 
> > > Hello community,
> > > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM S390.
> > > The java provided is "java-s390x-60" 64Bit.
> > > While trying to format the namenode I got the following error:
> > > 
> > > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > > /************************************************************
> > > STARTUP_MSG: Starting NameNode
> > > STARTUP_MSG:   host = xxxxxxxxx
> > > STARTUP_MSG:   args = [-format]
> > > STARTUP_MSG:   version = 1.1.0
> > > STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49 UTC 2012
> > > ************************************************************/
> > > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
> > > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> > > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure to login
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
> > >        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > Caused by: javax.security.auth.login.LoginException: java.lang.NullPointerException: invalid null Class provided
> > >        at javax.security.auth.Subject.getPrincipals(Subject.java:809)
> > >        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
> > >        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
> > >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >        at java.lang.reflect.Method.invoke(Method.java:600)
> > >        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
> > >        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > 
> > >        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
> > >        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        ... 6 more
> > > 
> > > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > > /************************************************************
> > > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > > ************************************************************/
> > > $:/opt/flume_hadoop/hadoop-1.1.0>
> > > 
> > > Question:
> > > 
> > > 1) @developer
> > > Are you aware of this behavior?
> > > 2) Is there a way to overcome this problem with a workaround?
> > > 3) Is it a security issue? --> I was able to ssh to localhost without error.
> > > 
> > 
> 
> 
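The "Unable to find JAAS classes" error in the quoted trace comes from Hadoop 1.x's UserGroupInformation, which picks a vendor-specific JAAS principal class at runtime (an IBM class name on J9 VMs, a Sun class name otherwise) and fails to log in when that class cannot be loaded. A small probe of which candidate names a given JVM can actually resolve; the class name JaasProbe is hypothetical, the two principal class names are the ones that appear in this thread's error and in Sun/Oracle JDKs:

```java
public class JaasProbe {
    public static void main(String[] args) {
        // Candidate JAAS principal classes; which one exists depends on
        // the JVM vendor, which is why a binary built and tested against
        // one JDK can fail to log in on another.
        String[] candidates = {
            "com.ibm.security.auth.LinuxPrincipal",  // IBM J9 (as on zLinux)
            "com.sun.security.auth.UnixPrincipal"    // Sun/Oracle/OpenJDK
        };
        for (String name : candidates) {
            try {
                Class.forName(name);
                System.out.println("found:   " + name);
            } catch (ClassNotFoundException e) {
                System.out.println("missing: " + name);
            }
        }
    }
}
```

On the failing s390 system this would show whether com.ibm.security.auth.LinuxPrincipal is really visible to the JVM running the NameNode, which is the first thing to rule out before rebuilding.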




User: is not allowed to impersonate hduser

Posted by Oleg Zhurakousky <ol...@gmail.com>.
Trying to submit a MR job from the local machine and getting the above error.

Any idea?

Thanks
Oleg
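An "is not allowed to impersonate" error usually points at the cluster-side impersonation (proxy-user) checks: the user opening the connection must be explicitly allowed to act on behalf of hduser in the cluster's core-site.xml. A hedged sketch of those settings, assuming the submitting user is named client (substitute the user name from your own error message, and narrow the hosts/groups values for production):

<property>
  <name>hadoop.proxyuser.client.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.client.groups</name>
  <value>*</value>
</property>

The services have to be restarted (or the proxy-user config refreshed) for the change to take effect.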

Re: using hadoop on zLinux (Linux on S390)

Posted by Kumar Ravi <ku...@us.ibm.com>.
Emile,

 You need a s390 version of hadoop-core binary. Since s390 is not a 
supported binary yet, you'll need to build it

Hope this helps.

Regards,
Kumar

Kumar Ravi
IBM Linux Technology Center 
IBM Master Inventor

11501 Burnet Road,
Austin, TX 78758

Tel.: (512)286-8179



From:
"Emile Kao" <em...@gmx.net>
To:
user@hadoop.apache.org, 
Date:
12/11/2012 09:30 AM
Subject:
Re: using hadoop on zLinux (Linux on S390)


Hello Kumar,
here are the answers to your questions:

> 1. What version and vendor of JDK did you use to compile and package 
hadoop? 

Answer:
I didn't compile the package since I followed the instructions in the 
official documentation (
http://hadoop.apache.org/docs/r1.1.0/single_node_setup.html). They were no 
talk about compiling the code first.
By the way I am using the binary version I downloaded from the official 
download site. I guess this one is already compiled.

> 2. What version and vendor of JVM are you running? You can type java 
-version from the console to see this.

Answer:
This is the java version I am using:

java version "1.6.0"
Java(TM) SE Runtime Environment (build pxz6460sr10fp1-20120321_01(SR10 
FP1))
IBM J9 VM (build 2.4, JRE 1.6.0 IBM J9 2.4 Linux s390x-64 
jvmxz6460sr10fp1-20120202_101568 (JIT enabled, AOT enabled)
J9VM - 20120202_101568
JIT  - r9_20111107_21307ifx1
GC   - 20120202_AA)
JCL  - 20120320_01

Thank you in advance.

Cheers, Emile


-------- Original-Nachricht --------
> Datum: Tue, 11 Dec 2012 08:56:24 -0600
> Von: Kumar Ravi <ku...@us.ibm.com>
> An: user@hadoop.apache.org
> Betreff: Re: using hadoop on zLinux (Linux on S390)

> Hi Emile,
> 
>  I have a couple of questions for you:
> 
> 1. What version and vendor of JDK did you use to compile and package 
> hadoop? 
> 
> 2. What version and vendor of JVM are you running? You can type java 
> -version from the console to see this.
> 
> Thanks,
> Kumar
> 
> Kumar Ravi
> IBM Linux Technology Center 
> 
> 
> 
> 
> From:
> "Emile Kao" <em...@gmx.net>
> To:
> user@hadoop.apache.org, 
> Date:
> 12/11/2012 08:51 AM
> Subject:
> Re: using hadoop on zLinux (Linux on S390)
> 
> 
> No, this is the general available version...
> 
> -------- Original-Nachricht --------
> > Datum: Tue, 11 Dec 2012 08:31:57 -0600
> > Von: Michael Segel <mi...@hotmail.com>
> > An: user@hadoop.apache.org
> > Betreff: Re: using hadoop on zLinux (Linux on S390)
> 
> > Well, on the surface.... 
> > 
> > It looks like its either a missing class, or you don't have your class
> > path set up right. 
> > 
> > I'm assuming you got this version of Hadoop from IBM, so I would 
suggest
> > contacting their support and opening up a ticket. 
> > 
> > 
> > On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
> > 
> > > Hello community,
> > > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM
> > S390.
> > > The java provided is "java-s390x-60" 64Bit.
> > > While trying to format the namenode I got the following error:
> > > 
> > > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > > /************************************************************
> > > STARTUP_MSG: Starting NameNode
> > > STARTUP_MSG:   host = xxxxxxxxx
> > > STARTUP_MSG:   args = [-format]
> > > STARTUP_MSG:   version = 1.1.0
> > > STARTUP_MSG:   build =
> > https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 
> 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49
> > UTC 2012
> > > ************************************************************/
> > > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 
> entries
> > > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, 
actual=2097152
> > > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to 
find
> > JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: 
> failure
> > to login
> > >        at
> > 
> 
org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
> > >        at
> > 
> 
org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at
> > 
org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > Caused by: javax.security.auth.login.LoginException:
> > java.lang.NullPointerException: invalid null Class provided
> > >        at javax.security.auth.Subject.getPrincipals(Subject.java:809)
> > >        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
> > >        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
> > >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >        at java.lang.reflect.Method.invoke(Method.java:600)
> > >        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
> > >        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > 
> > >        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
> > >        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        ... 6 more
> > > 
> > > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > > /************************************************************
> > > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > > ************************************************************/
> > > $:/opt/flume_hadoop/hadoop-1.1.0>
> > > 
> > > Questions:
> > > 
> > > 1) @developers: Are you aware of this behavior?
> > > 2) Is there a way to overcome this problem with a workaround?
> > > 3) Is it a security issue? --> I was able to issue ssh on localhost without error.
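Hadoop 1.x's UserGroupInformation resolves an OS- and vendor-specific JAAS principal class by name at runtime, and the "Unable to find JAAS classes:com.ibm.security.auth.LinuxPrincipal" line in the log above means that lookup came back empty on this JVM. The following is an illustrative sketch of such a reflective probe (not Hadoop's actual code); the candidate class names are the vendor-specific ones implied by the error:

```java
// Illustrative sketch only: probe which vendor-specific JAAS principal
// class is visible to the running JVM, mirroring the kind of reflective
// lookup Hadoop 1.x performs in UserGroupInformation.
public class PrincipalProbe {

    // Return the first class name that can be loaded, or null if none can
    // (null here corresponds to the "Unable to find JAAS classes" error).
    static String firstAvailable(String[] candidates) {
        for (String name : candidates) {
            try {
                Class.forName(name);
                return name;
            } catch (ClassNotFoundException ignored) {
                // try the next candidate
            }
        }
        return null;
    }

    public static void main(String[] args) {
        String[] candidates = {
            "com.sun.security.auth.UnixPrincipal",  // Sun/Oracle JVMs
            "com.ibm.security.auth.LinuxPrincipal"  // IBM JVMs on Linux
        };
        System.out.println("usable principal class: " + firstAvailable(candidates));
    }
}
```

Running this on the failing JVM would show whether either class is loadable there; if both print as missing, the failure is in the JVM's security libraries rather than in the Hadoop configuration.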

User: is not allowed to impersonate hduser

Posted by Oleg Zhurakousky <ol...@gmail.com>.
Trying to submit an MR job from the local machine and getting the above error.

Any ideas?

Thanks
Oleg
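The "is not allowed to impersonate" message usually means the submitting user is not configured as a proxy user on the cluster. A minimal sketch of the relevant `core-site.xml` entries on the cluster side follows — the user name `oleg` is hypothetical (substitute the actual submitting user), and the wildcard values should be narrowed for any real deployment:

```xml
<!-- core-site.xml on the NameNode/JobTracker side: allow the (hypothetical)
     user "oleg" to impersonate other users such as hduser.
     Restart the daemons after changing these properties. -->
<property>
  <name>hadoop.proxyuser.oleg.hosts</name>
  <value>*</value>  <!-- better: the specific host(s) jobs are submitted from -->
</property>
<property>
  <name>hadoop.proxyuser.oleg.groups</name>
  <value>*</value>  <!-- better: the specific group(s), e.g. hadoop -->
</property>
```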

Re: using hadoop on zLinux (Linux on S390)

Posted by Kumar Ravi <ku...@us.ibm.com>.
Emile,

 You need an s390 build of the hadoop-core binary. Since s390 is not a supported platform yet, you'll need to build it yourself.
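The branch-1 Hadoop tree builds with Apache Ant. A rough sketch of building the core jar from source on the s390x box — the checkout URL is the one from the startup banner in this thread, while the target name and output path are the usual branch-1 ones and should be checked against the tree's build.xml:

```shell
# Sketch: build hadoop-core from the branch-1.1 source.
# Requires Subversion, Apache Ant, and a JDK on the PATH.
svn checkout \
    https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 \
    hadoop-branch-1.1
cd hadoop-branch-1.1
ant jar                      # compiles and packages the core jar
ls build/hadoop-core-*.jar   # verify the freshly built jar exists
```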

Hope this helps.

Regards,
Kumar

Kumar Ravi
IBM Linux Technology Center 
IBM Master Inventor

11501 Burnet Road,
Austin, TX 78758

Tel.: (512)286-8179



From:
"Emile Kao" <em...@gmx.net>
To:
user@hadoop.apache.org, 
Date:
12/11/2012 09:30 AM
Subject:
Re: using hadoop on zLinux (Linux on S390)


Hello Kumar,
here are the answers to your questions:

> 1. What version and vendor of JDK did you use to compile and package hadoop?

Answer:
I didn't compile the package, since I followed the instructions in the official documentation (http://hadoop.apache.org/docs/r1.1.0/single_node_setup.html). There was no talk of compiling the code first.
By the way, I am using the binary version I downloaded from the official download site. I guess this one is already compiled.

> 2. What version and vendor of JVM are you running? You can type java -version from the console to see this.

Answer:
This is the java version I am using:

java version "1.6.0"
Java(TM) SE Runtime Environment (build pxz6460sr10fp1-20120321_01(SR10 FP1))
IBM J9 VM (build 2.4, JRE 1.6.0 IBM J9 2.4 Linux s390x-64 jvmxz6460sr10fp1-20120202_101568 (JIT enabled, AOT enabled)
J9VM - 20120202_101568
JIT  - r9_20111107_21307ifx1
GC   - 20120202_AA)
JCL  - 20120320_01

Thank you in advance.

Cheers, Emile


-------- Original Message --------
> Date: Tue, 11 Dec 2012 08:56:24 -0600
> From: Kumar Ravi <ku...@us.ibm.com>
> To: user@hadoop.apache.org
> Subject: Re: using hadoop on zLinux (Linux on S390)

> Hi Emile,
> 
>  I have a couple of questions for you:
> 
> 1. What version and vendor of JDK did you use to compile and package 
> hadoop? 
> 
> 2. What version and vendor of JVM are you running? You can type java 
> -version from the console to see this.
> 
> Thanks,
> Kumar
> 
> Kumar Ravi
> IBM Linux Technology Center 
> 
> 
> 
> 
> From:
> "Emile Kao" <em...@gmx.net>
> To:
> user@hadoop.apache.org, 
> Date:
> 12/11/2012 08:51 AM
> Subject:
> Re: using hadoop on zLinux (Linux on S390)
> 
> 
> No, this is the generally available version...
> 
> -------- Original Message --------
> > Date: Tue, 11 Dec 2012 08:31:57 -0600
> > From: Michael Segel <mi...@hotmail.com>
> > To: user@hadoop.apache.org
> > Subject: Re: using hadoop on zLinux (Linux on S390)
> 
> > Well, on the surface.... 
> > 
> > It looks like it's either a missing class, or you don't have your class
> > path set up right. 
> > 
> > I'm assuming you got this version of Hadoop from IBM, so I would suggest contacting their support and opening up a ticket. 
> > 
> > 
> > On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
> > 
> > > Hello community,
> > > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM
> > S390.
> > > The java provided is "java-s390x-60" 64Bit.
> > > While trying to format the namenode I got the following error:
> > > 
> > > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > > /************************************************************
> > > STARTUP_MSG: Starting NameNode
> > > STARTUP_MSG:   host = xxxxxxxxx
> > > STARTUP_MSG:   args = [-format]
> > > STARTUP_MSG:   version = 1.1.0
> > > STARTUP_MSG:   build =
> > https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 
> 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49
> > UTC 2012
> > > ************************************************************/
> > > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 
> entries
> > > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, 
actual=2097152
> > > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to 
find
> > JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: 
> failure
> > to login
> > >        at
> > 
> 
org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
> > >        at
> > 
> 
org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at
> > 
org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > Caused by: javax.security.auth.login.LoginException:
> > java.lang.NullPointerException: invalid null Class provided
> > >        at 
javax.security.auth.Subject.getPrincipals(Subject.java:809)
> > >        at
> > 
> 
org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
> > >        at
> > 
> 
org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native 
Method)
> > >        at
> > 
> 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
> > >        at
> > 
> 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >        at java.lang.reflect.Method.invoke(Method.java:600)
> > >        at
> > javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
> > >        at
> > 
javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at
> > javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at
> > java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at
> > 
> 
javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at
> > javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at
> > 
> 
org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        at
> > 
> 
org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at
> > 
> 
org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at
> > 
org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > 
> > >        at
> > javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
> > >        at
> > 
javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at
> > javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at
> > java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at
> > 
> 
javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at
> > javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at
> > 
> 
org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        ... 6 more
> > > 
> > > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > > /************************************************************
> > > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > > ************************************************************/
> > > $:/opt/flume_hadoop/hadoop-1.1.0>
> > > 
> > > Questions:
> > > 
> > > 1) @developers: Are you aware of this behavior?
> > > 2) Is there a way to overcome this problem with a workaround?
> > > 3) Is it a security issue? --> I was able to issue ssh on localhost without error.
> > 
> 
> 



Re: using hadoop on zLinux (Linux on S390)

Posted by Emile Kao <em...@gmx.net>.
Hello Kumar,
here are the answers to your questions:

> 1. What version and vendor of JDK did you use to compile and package hadoop? 

Answer:
I didn't compile the package, since I followed the instructions in the official documentation (http://hadoop.apache.org/docs/r1.1.0/single_node_setup.html). There was no talk of compiling the code first.
By the way, I am using the binary version I downloaded from the official download site. I guess this one is already compiled.

> 2. What version and vendor of JVM are you running? You can type java -version from the console to see this.

Answer:
This is the java version I am using:

java version "1.6.0"
Java(TM) SE Runtime Environment (build pxz6460sr10fp1-20120321_01(SR10 FP1))
IBM J9 VM (build 2.4, JRE 1.6.0 IBM J9 2.4 Linux s390x-64 jvmxz6460sr10fp1-20120202_101568 (JIT enabled, AOT enabled)
J9VM - 20120202_101568
JIT  - r9_20111107_21307ifx1
GC   - 20120202_AA)
JCL  - 20120320_01

Thank you in advance.

Cheers, Emile


-------- Original Message --------
> Date: Tue, 11 Dec 2012 08:56:24 -0600
> From: Kumar Ravi <ku...@us.ibm.com>
> To: user@hadoop.apache.org
> Subject: Re: using hadoop on zLinux (Linux on S390)

> Hi Emile,
> 
>  I have a couple of questions for you:
> 
> 1. What version and vendor of JDK did you use to compile and package 
> hadoop? 
> 
> 2. What version and vendor of JVM are you running? You can type java 
> -version from the console to see this.
> 
> Thanks,
> Kumar
> 
> Kumar Ravi
> IBM Linux Technology Center 
> 
> 
> 
> 
> From:
> "Emile Kao" <em...@gmx.net>
> To:
> user@hadoop.apache.org, 
> Date:
> 12/11/2012 08:51 AM
> Subject:
> Re: using hadoop on zLinux (Linux on S390)
> 
> 
> No, this is the generally available version...
> 
> -------- Original Message --------
> > Date: Tue, 11 Dec 2012 08:31:57 -0600
> > From: Michael Segel <mi...@hotmail.com>
> > To: user@hadoop.apache.org
> > Subject: Re: using hadoop on zLinux (Linux on S390)
> 
> > Well, on the surface.... 
> > 
> > It looks like it's either a missing class, or you don't have your class
> > path set up right. 
> > 
> > I'm assuming you got this version of Hadoop from IBM, so I would suggest
> > contacting their support and opening up a ticket. 
> > 
> > 
> > On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
> > 
> > > Hello community,
> > > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM
> > S390.
> > > The java provided is "java-s390x-60" 64Bit.
> > > While trying to format the namenode I got the following error:
> > > 
> > > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > > /************************************************************
> > > STARTUP_MSG: Starting NameNode
> > > STARTUP_MSG:   host = xxxxxxxxx
> > > STARTUP_MSG:   args = [-format]
> > > STARTUP_MSG:   version = 1.1.0
> > > STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49 UTC 2012
> > > ************************************************************/
> > > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
> > > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> > > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure to login
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
> > >        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > Caused by: javax.security.auth.login.LoginException: java.lang.NullPointerException: invalid null Class provided
> > >        at javax.security.auth.Subject.getPrincipals(Subject.java:809)
> > >        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
> > >        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
> > >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >        at java.lang.reflect.Method.invoke(Method.java:600)
> > >        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
> > >        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > 
> > >        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
> > >        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        ... 6 more
> > > 
> > > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > > /************************************************************
> > > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > > ************************************************************/
> > > $:/opt/flume_hadoop/hadoop-1.1.0>
> > > 
> > > Questions:
> > > 
> > > 1) @developers: Are you aware of this behavior?
> > > 2) Is there a way to overcome this problem with a workaround?
> > > 3) Is it a security issue? --> I was able to issue ssh on localhost without error.
> > > 
> > 
> 
> 

Re: using hadoop on zLinux (Linux on S390)

Posted by Emile Kao <em...@gmx.net>.
Hello Kumar,
here are the answers to your questions:

> 1. What version and vendor of JDK did you use to compile and package hadoop? 

Answer:
I didn't compile the package since I followed the instructions in the official documentation (http://hadoop.apache.org/docs/r1.1.0/single_node_setup.html). They were no talk about compiling the code first.
By the way I am using the binary version I downloaded from the official download site. I guess this one is already compiled.

> 2. What version and vendor of JVM are you running? You can type java -version from the console to see this.

Answer:
This is the java version I am using:

java version "1.6.0"
Java(TM) SE Runtime Environment (build pxz6460sr10fp1-20120321_01(SR10 FP1))
IBM J9 VM (build 2.4, JRE 1.6.0 IBM J9 2.4 Linux s390x-64 jvmxz6460sr10fp1-20120202_101568 (JIT enabled, AOT enabled)
J9VM - 20120202_101568
JIT  - r9_20111107_21307ifx1
GC   - 20120202_AA)
JCL  - 20120320_01

Thank you in advance.

Cheers, Emile


-------- Original-Nachricht --------
> Datum: Tue, 11 Dec 2012 08:56:24 -0600
> Von: Kumar Ravi <ku...@us.ibm.com>
> An: user@hadoop.apache.org
> Betreff: Re: using hadoop on zLinux (Linux on S390)

> Hi Emile,
> 
>  I have a couple of questions for you:
> 
> 1. What version and vendor of JDK did you use to compile and package 
> hadoop? 
> 
> 2. What version and vendor of JVM are you running? You can type java 
> -version from the console to see this.
> 
> Thanks,
> Kumar
> 
> Kumar Ravi
> IBM Linux Technology Center 
> 
> 
> 
> 
> From:
> "Emile Kao" <em...@gmx.net>
> To:
> user@hadoop.apache.org, 
> Date:
> 12/11/2012 08:51 AM
> Subject:
> Re: using hadoop on zLinux (Linux on S390)
> 
> 
> No, this is the general available version...
> 
> -------- Original-Nachricht --------
> > Datum: Tue, 11 Dec 2012 08:31:57 -0600
> > Von: Michael Segel <mi...@hotmail.com>
> > An: user@hadoop.apache.org
> > Betreff: Re: using hadoop on zLinux (Linux on S390)
> 
> > Well, on the surface.... 
> > 
> > It looks like its either a missing class, or you don't have your class
> > path set up right. 
> > 
> > I'm assuming you got this version of Hadoop from IBM, so I would suggest
> > contacting their support and opening up a ticket. 
> > 
> > 
> > On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
> > 
> > > Hello community,
> > > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM
> > S390.
> > > The java provided is "java-s390x-60" 64Bit.
> > > While trying to format the namenode I got the following error:
> > > 
> > > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > > /************************************************************
> > > STARTUP_MSG: Starting NameNode
> > > STARTUP_MSG:   host = xxxxxxxxx
> > > STARTUP_MSG:   args = [-format]
> > > STARTUP_MSG:   version = 1.1.0
> > > STARTUP_MSG:   build =
> > https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 
> 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49
> > UTC 2012
> > > ************************************************************/
> > > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 
> entries
> > > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> > > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find
> > JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: 
> failure
> > to login
> > >        at
> > 
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
> > >        at
> > 
> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at
> > 
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at
> > 
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at
> > 
> org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at
> > 
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at
> > org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > Caused by: javax.security.auth.login.LoginException:
> > java.lang.NullPointerException: invalid null Class provided
> > >        at javax.security.auth.Subject.getPrincipals(Subject.java:809)
> > >        at
> > 
> org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
> > >        at
> > 
> org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >        at
> > 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
> > >        at
> > 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >        at java.lang.reflect.Method.invoke(Method.java:600)
> > >        at
> > javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
> > >        at
> > javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at
> > javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at
> > java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at
> > 
> javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at
> > javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at
> > 
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        at
> > 
> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at
> > 
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at
> > 
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at
> > 
> org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at
> > 
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at
> > org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > 
> > >        at
> > javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
> > >        at
> > javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at
> > javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at
> > java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at
> > 
> javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at
> > javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at
> > 
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        ... 6 more
> > > 
> > > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > > /************************************************************
> > > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > > ************************************************************/
> > > $:/opt/flume_hadoop/hadoop-1.1.0>
> > > 
> > > Question:
> > > 
> > > 1)@developer
> > > Are you aware of this behavior?
> > > 2)It there a way to overcome this problem with a workaround?
> > > 3)IS it a security issue? --> I was able to issue ssh on localhost
> > without error.
> > > 
> > 
> 
> 

Re: using hadoop on zLinux (Linux on S390)

Posted by Emile Kao <em...@gmx.net>.
Hello Kumar,
here are the answers to your questions:

> 1. What version and vendor of JDK did you use to compile and package hadoop? 

Answer:
I didn't compile the package since I followed the instructions in the official documentation (http://hadoop.apache.org/docs/r1.1.0/single_node_setup.html). They were no talk about compiling the code first.
By the way I am using the binary version I downloaded from the official download site. I guess this one is already compiled.

> 2. What version and vendor of JVM are you running? You can type java -version from the console to see this.

Answer:
This is the java version I am using:

java version "1.6.0"
Java(TM) SE Runtime Environment (build pxz6460sr10fp1-20120321_01(SR10 FP1))
IBM J9 VM (build 2.4, JRE 1.6.0 IBM J9 2.4 Linux s390x-64 jvmxz6460sr10fp1-20120202_101568 (JIT enabled, AOT enabled)
J9VM - 20120202_101568
JIT  - r9_20111107_21307ifx1
GC   - 20120202_AA)
JCL  - 20120320_01

Thank you in advance.

Cheers, Emile


-------- Original-Nachricht --------
> Datum: Tue, 11 Dec 2012 08:56:24 -0600
> Von: Kumar Ravi <ku...@us.ibm.com>
> An: user@hadoop.apache.org
> Betreff: Re: using hadoop on zLinux (Linux on S390)

> Hi Emile,
> 
>  I have a couple of questions for you:
> 
> 1. What version and vendor of JDK did you use to compile and package 
> hadoop? 
> 
> 2. What version and vendor of JVM are you running? You can type java 
> -version from the console to see this.
> 
> Thanks,
> Kumar
> 
> Kumar Ravi
> IBM Linux Technology Center 
> 
> 
> 
> 
> From:
> "Emile Kao" <em...@gmx.net>
> To:
> user@hadoop.apache.org, 
> Date:
> 12/11/2012 08:51 AM
> Subject:
> Re: using hadoop on zLinux (Linux on S390)
> 
> 
> No, this is the general available version...
> 
> -------- Original-Nachricht --------
> > Datum: Tue, 11 Dec 2012 08:31:57 -0600
> > Von: Michael Segel <mi...@hotmail.com>
> > An: user@hadoop.apache.org
> > Betreff: Re: using hadoop on zLinux (Linux on S390)
> 
> > Well, on the surface.... 
> > 
> > It looks like its either a missing class, or you don't have your class
> > path set up right. 
> > 
> > I'm assuming you got this version of Hadoop from IBM, so I would suggest
> > contacting their support and opening up a ticket. 
> > 
> > 
> > On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
> > 
> > > Hello community,
> > > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM
> > S390.
> > > The java provided is "java-s390x-60" 64Bit.
> > > While trying to format the namenode I got the following error:
> > > 
> > > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > > /************************************************************
> > > STARTUP_MSG: Starting NameNode
> > > STARTUP_MSG:   host = xxxxxxxxx
> > > STARTUP_MSG:   args = [-format]
> > > STARTUP_MSG:   version = 1.1.0
> > > STARTUP_MSG:   build =
> > https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 
> 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49
> > UTC 2012
> > > ************************************************************/
> > > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 
> entries
> > > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> > > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find
> > JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: 
> failure
> > to login
> > >        at
> > 
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
> > >        at
> > 
> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at
> > 
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at
> > 
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at
> > 
> org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at
> > 
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > Caused by: javax.security.auth.login.LoginException: java.lang.NullPointerException: invalid null Class provided
> > >        at javax.security.auth.Subject.getPrincipals(Subject.java:809)
> > >        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
> > >        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
> > >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >        at java.lang.reflect.Method.invoke(Method.java:600)
> > >        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
> > >        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> > >        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> > >        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > > 
> > >        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
> > >        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> > >        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> > >        at java.security.AccessController.doPrivileged(AccessController.java:284)
> > >        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> > >        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> > >        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> > >        ... 6 more
> > > 
> > > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > > /************************************************************
> > > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > > ************************************************************/
> > > $:/opt/flume_hadoop/hadoop-1.1.0>
> > > 
> > > Questions:
> > > 
> > > 1) @developers: Are you aware of this behavior?
> > > 2) Is there a way to overcome this problem with a workaround?
> > > 3) Is it a security issue? I was able to ssh to localhost without error.

Re: using hadoop on zLinux (Linux on S390)

Posted by Emile Kao <em...@gmx.net>.
Hello Kumar,
here are the answers to your questions:

> 1. What version and vendor of JDK did you use to compile and package hadoop? 

Answer:
I didn't compile the package; I followed the instructions in the official documentation (http://hadoop.apache.org/docs/r1.1.0/single_node_setup.html), which say nothing about compiling the code first.
I am using the binary distribution downloaded from the official download site, so it is already compiled.

> 2. What version and vendor of JVM are you running? You can type java -version from the console to see this.

Answer:
This is the java version I am using:

java version "1.6.0"
Java(TM) SE Runtime Environment (build pxz6460sr10fp1-20120321_01(SR10 FP1))
IBM J9 VM (build 2.4, JRE 1.6.0 IBM J9 2.4 Linux s390x-64 jvmxz6460sr10fp1-20120202_101568 (JIT enabled, AOT enabled)
J9VM - 20120202_101568
JIT  - r9_20111107_21307ifx1
GC   - 20120202_AA)
JCL  - 20120320_01

Thank you in advance.

Cheers, Emile


-------- Original-Nachricht --------
> Datum: Tue, 11 Dec 2012 08:56:24 -0600
> Von: Kumar Ravi <ku...@us.ibm.com>
> An: user@hadoop.apache.org
> Betreff: Re: using hadoop on zLinux (Linux on S390)

> Hi Emile,
> 
>  I have a couple of questions for you:
> 
> 1. What version and vendor of JDK did you use to compile and package 
> hadoop? 
> 
> 2. What version and vendor of JVM are you running? You can type java 
> -version from the console to see this.
> 
> Thanks,
> Kumar
> 
> Kumar Ravi
> IBM Linux Technology Center 
> 

Re: using hadoop on zLinux (Linux on S390)

Posted by Kumar Ravi <ku...@us.ibm.com>.
Hi Emile,

 I have a couple of questions for you:

1. What version and vendor of JDK did you use to compile and package 
hadoop? 

2. What version and vendor of JVM are you running? You can type java 
-version from the console to see this.

Thanks,
Kumar

Kumar Ravi
IBM Linux Technology Center 




From:
"Emile Kao" <em...@gmx.net>
To:
user@hadoop.apache.org, 
Date:
12/11/2012 08:51 AM
Subject:
Re: using hadoop on zLinux (Linux on S390)


No, this is the general available version...

Re: using hadoop on zLinux (Linux on S390)

Posted by Michael Segel <mi...@hotmail.com>.
Well...

I didn't think the generally available version contained IBM-specific security classes.

Your error, "Unable to find JAAS classes:com.ibm.security.auth.LinuxPrincipal" (the first ERROR line), says the JVM can't find that class. This is the Apache release, and you're running it on an IBM JVM, where IBM-specific security classes are needed.

Now, I could be wrong, but that's my first take on it.
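That hunch can be sanity-checked outside Hadoop. The sketch below mimics, in simplified form, the vendor-based JAAS principal selection that Hadoop 1.x performs at login time (the real logic lives in `UserGroupInformation`); the class name `PrincipalCheck` and the helper `principalClassFor` are illustrative, not Hadoop APIs:

```java
public class PrincipalCheck {

    // Simplified version of Hadoop 1.x's vendor check: on an IBM JVM it
    // expects com.ibm.security.auth.LinuxPrincipal, otherwise the Sun class.
    static String principalClassFor(String javaVendor) {
        return javaVendor.toLowerCase().contains("ibm")
                ? "com.ibm.security.auth.LinuxPrincipal"
                : "com.sun.security.auth.UnixPrincipal";
    }

    public static void main(String[] args) {
        String cls = principalClassFor(System.getProperty("java.vendor"));
        System.out.println("Expected JAAS principal class: " + cls);
        try {
            // If this throws, Hadoop's login fails the same way ("failure to login").
            Class.forName(cls);
            System.out.println("OK: " + cls + " is loadable");
        } catch (ClassNotFoundException e) {
            System.out.println("MISSING: " + cls);
        }
    }
}
```

Running this with the same JVM that launches Hadoop tells you whether the principal class the login module asks for is actually on that JVM's boot classpath.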


On Dec 11, 2012, at 8:50 AM, "Emile Kao" <em...@gmx.net> wrote:

> No, this is the general available version...
> 




> >        at
> javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> >        at
> java.security.AccessController.doPrivileged(AccessController.java:284)
> >        at
> 
javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> >        at
> javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> >        at
> 
org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> >        ... 6 more
> > 
> > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > /************************************************************
> > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > ************************************************************/
> > $:/opt/flume_hadoop/hadoop-1.1.0>
> > 
> > Questions:
> > 
> > 1) @developer: Are you aware of this behavior?
> > 2) Is there a way to overcome this problem with a workaround?
> > 3) Is it a security issue? --> I was able to ssh to localhost without error.
> > 
> 



Re: using hadoop on zLinux (Linux on S390)

Posted by Michael Segel <mi...@hotmail.com>.
Well...

I didn't think the general Apache release contained IBM-specific security classes.

Your error, "Unable to find JAAS classes:com.ibm.security.auth.LinuxPrincipal" (the first ERROR line), says that this class can't be found. That fits: this is the Apache release, and you're running it on an IBM JVM, which needs IBM-specific JAAS support.

Now I could be wrong, but that's my first take on it.
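The class-not-found diagnosis is easy to verify directly. As an illustrative sketch (this code is not part of Hadoop or the thread), a tiny probe can ask the running JVM whether it can load the vendor-specific JAAS principal classes that Hadoop 1.x resolves by name at login time:

```java
// Illustrative probe: check which vendor-specific JAAS principal classes
// the running JVM can actually load. Hadoop 1.x's UserGroupInformation
// resolves one of these by name, so a mismatch between what Hadoop expects
// and what the JVM ships produces the "Unable to find JAAS classes" error.
public class JaasProbe {

    // True if the named class is loadable in this JVM.
    static boolean hasClass(String name) {
        try {
            Class.forName(name);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String[] candidates = {
            "com.ibm.security.auth.LinuxPrincipal",  // expected on an IBM JDK for Linux
            "com.sun.security.auth.UnixPrincipal"    // expected on a Sun/Oracle/OpenJDK JVM
        };
        for (String name : candidates) {
            System.out.println((hasClass(name) ? "found   " : "missing ") + name);
        }
    }
}
```

Run it with the same `java` binary Hadoop uses; if the IBM class prints as missing there too, the failure is a JVM/classpath mismatch rather than a Hadoop configuration problem.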


On Dec 11, 2012, at 8:50 AM, "Emile Kao" <em...@gmx.net> wrote:

> No, this is the general available version...
> 
> -------- Original-Nachricht --------
>> Datum: Tue, 11 Dec 2012 08:31:57 -0600
>> Von: Michael Segel <mi...@hotmail.com>
>> An: user@hadoop.apache.org
>> Betreff: Re: using hadoop on zLinux (Linux on S390)
> 
>> Well, on the surface.... 
>> 
>> It looks like its either a missing class, or you don't have your class
>> path set up right. 
>> 
>> I'm assuming you got this version of Hadoop from IBM, so I would suggest
>> contacting their support and opening up a ticket. 
>> 
>> 
>> On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
>> 
>>> Hello community,
>>> I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM
>> S390.
>>> The java provided is "java-s390x-60" 64Bit.
>>> While trying to format the namenode I got the following error:
>>> 
>>> $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
>>> 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
>>> /************************************************************
>>> STARTUP_MSG: Starting NameNode
>>> STARTUP_MSG:   host = xxxxxxxxx
>>> STARTUP_MSG:   args = [-format]
>>> STARTUP_MSG:   version = 1.1.0
>>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49 UTC 2012
>>> ************************************************************/
>>> Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
>>> 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
>>> 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
>>> 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
>>> 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
>>> 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find JAAS classes:com.ibm.security.auth.LinuxPrincipal
>>> 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure to login
>>> [stack trace omitted; identical to the trace in the original message]
>>> 
>>> 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
>>> /************************************************************
>>> SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
>>> ************************************************************/
>>> $:/opt/flume_hadoop/hadoop-1.1.0>
>>> 
>>> Questions:
>>> 
>>> 1) @developer: Are you aware of this behavior?
>>> 2) Is there a way to overcome this problem with a workaround?
>>> 3) Is it a security issue? --> I was able to ssh to localhost without error.
>>> 
>> 
> 


Re: using hadoop on zLinux (Linux on S390)

Posted by Kumar Ravi <ku...@us.ibm.com>.
Hi Emile,

 I have a couple of questions for you:

1. What version and vendor of JDK did you use to compile and package 
hadoop? 

2. What version and vendor of JVM are you running? You can type java 
-version from the console to see this.
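Besides `java -version`, both facts can be read programmatically from the JVM's own system properties. A minimal sketch (illustrative, not from the thread) that prints the vendor, version, and architecture — the three things that determine which JAAS principal class Hadoop 1.x will look for:

```java
// Illustrative: report the running JVM's vendor, version, and architecture
// via standard system properties. Running this with the same java binary
// Hadoop uses answers both questions in one step.
public class JvmInfo {

    // One-line summary of the running JVM.
    static String describe() {
        return System.getProperty("java.vendor")
             + " " + System.getProperty("java.version")
             + " on " + System.getProperty("os.arch");
    }

    public static void main(String[] args) {
        System.out.println(describe());
    }
}
```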

Thanks,
Kumar

Kumar Ravi
IBM Linux Technology Center 




From:
"Emile Kao" <em...@gmx.net>
To:
user@hadoop.apache.org, 
Date:
12/11/2012 08:51 AM
Subject:
Re: using hadoop on zLinux (Linux on S390)


No, this is the general available version...

-------- Original-Nachricht --------
> Datum: Tue, 11 Dec 2012 08:31:57 -0600
> Von: Michael Segel <mi...@hotmail.com>
> An: user@hadoop.apache.org
> Betreff: Re: using hadoop on zLinux (Linux on S390)

> Well, on the surface.... 
> 
> It looks like its either a missing class, or you don't have your class
> path set up right. 
> 
> I'm assuming you got this version of Hadoop from IBM, so I would suggest
> contacting their support and opening up a ticket. 
> 
> 
> On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
> 
> > Hello community,
> > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM
> S390.
> > The java provided is "java-s390x-60" 64Bit.
> > While trying to format the namenode I got the following error:
> > 
> > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > /************************************************************
> > STARTUP_MSG: Starting NameNode
> > STARTUP_MSG:   host = xxxxxxxxx
> > STARTUP_MSG:   args = [-format]
> > STARTUP_MSG:   version = 1.1.0
> > STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49 UTC 2012
> > ************************************************************/
> > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
> > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure to login
> > [stack trace omitted; identical to the trace in the original message]
> > 
> > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > /************************************************************
> > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > ************************************************************/
> > $:/opt/flume_hadoop/hadoop-1.1.0>
> > 
> > Questions:
> > 
> > 1) @developer: Are you aware of this behavior?
> > 2) Is there a way to overcome this problem with a workaround?
> > 3) Is it a security issue? --> I was able to ssh to localhost without error.
> > 
> 



Re: using hadoop on zLinux (Linux on S390)

Posted by Emile Kao <em...@gmx.net>.
No, this is the general available version...

-------- Original-Nachricht --------
> Datum: Tue, 11 Dec 2012 08:31:57 -0600
> Von: Michael Segel <mi...@hotmail.com>
> An: user@hadoop.apache.org
> Betreff: Re: using hadoop on zLinux (Linux on S390)

> Well, on the surface.... 
> 
> It looks like its either a missing class, or you don't have your class
> path set up right. 
> 
> I'm assuming you got this version of Hadoop from IBM, so I would suggest
> contacting their support and opening up a ticket. 
> 
> 
> On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
> 
> > Hello community,
> > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM
> S390.
> > The java provided is "java-s390x-60" 64Bit.
> > While trying to format the namenode I got the following error:
> > 
> > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > /************************************************************
> > STARTUP_MSG: Starting NameNode
> > STARTUP_MSG:   host = xxxxxxxxx
> > STARTUP_MSG:   args = [-format]
> > STARTUP_MSG:   version = 1.1.0
> > STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49 UTC 2012
> > ************************************************************/
> > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
> > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure to login
> > [stack trace omitted; identical to the trace in the original message]
> > 
> > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > /************************************************************
> > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > ************************************************************/
> > $:/opt/flume_hadoop/hadoop-1.1.0>
> > 
> > Questions:
> > 
> > 1) @developer: Are you aware of this behavior?
> > 2) Is there a way to overcome this problem with a workaround?
> > 3) Is it a security issue? --> I was able to ssh to localhost without error.
> > 
> 

Re: using hadoop on zLinux (Linux on S390)

Posted by Emile Kao <em...@gmx.net>.
No, this is the general available version...

-------- Original-Nachricht --------
> Datum: Tue, 11 Dec 2012 08:31:57 -0600
> Von: Michael Segel <mi...@hotmail.com>
> An: user@hadoop.apache.org
> Betreff: Re: using hadoop on zLinux (Linux on S390)

> Well, on the surface.... 
> 
> It looks like its either a missing class, or you don't have your class
> path set up right. 
> 
> I'm assuming you got this version of Hadoop from IBM, so I would suggest
> contacting their support and opening up a ticket. 
> 
> 
> On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
> 
> > Hello community,
> > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM
> S390.
> > The java provided is "java-s390x-60" 64Bit.
> > While trying to format the namenode I got the following error:
> > 
> > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > /************************************************************
> > STARTUP_MSG: Starting NameNode
> > STARTUP_MSG:   host = xxxxxxxxx
> > STARTUP_MSG:   args = [-format]
> > STARTUP_MSG:   version = 1.1.0
> > STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49
> UTC 2012
> > ************************************************************/
> > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
> > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find
> JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure
> to login
> >        at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > Caused by: javax.security.auth.login.LoginException:
> java.lang.NullPointerException: invalid null Class provided
> >        at javax.security.auth.Subject.getPrincipals(Subject.java:809)
> >        at
> org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
> >        at
> org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
> >        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:600)
> >        at
> javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
> >        at
> javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> >        at
> javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> >        at
> java.security.AccessController.doPrivileged(AccessController.java:284)
> >        at
> javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> >        at
> javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > 
> >        at
> javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
> >        at
> javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> >        at
> javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> >        at
> java.security.AccessController.doPrivileged(AccessController.java:284)
> >        at
> javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> >        at
> javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> >        ... 6 more
> > 
> > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > /************************************************************
> > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > ************************************************************/
> > $:/opt/flume_hadoop/hadoop-1.1.0>
> > 
> > Question:
> > 
> > 1) @developer: Are you aware of this behavior?
> > 2) Is there a way to overcome this problem with a workaround?
> > 3) Is it a security issue? --> I was able to issue ssh to localhost
> without error.
> > 
> 

Re: using hadoop on zLinux (Linux on S390)

Posted by Emile Kao <em...@gmx.net>.
No, this is the general available version...

-------- Original-Nachricht --------
> Datum: Tue, 11 Dec 2012 08:31:57 -0600
> Von: Michael Segel <mi...@hotmail.com>
> An: user@hadoop.apache.org
> Betreff: Re: using hadoop on zLinux (Linux on S390)

> Well, on the surface.... 
> 
> It looks like it's either a missing class, or your classpath isn't set
> up right. 
> 
> I'm assuming you got this version of Hadoop from IBM, so I would suggest
> contacting their support and opening up a ticket. 
> 
> 
> On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:
> 
> > Hello community,
> > I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM
> S390.
> > The java provided is "java-s390x-60" 64Bit.
> > While trying to format the namenode I got the following error:
> > 
> > $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> > 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> > /************************************************************
> > STARTUP_MSG: Starting NameNode
> > STARTUP_MSG:   host = xxxxxxxxx
> > STARTUP_MSG:   args = [-format]
> > STARTUP_MSG:   version = 1.1.0
> > STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49
> UTC 2012
> > ************************************************************/
> > Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> > 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> > 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> > 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
> > 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> > 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find
> JAAS classes:com.ibm.security.auth.LinuxPrincipal
> > 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure
> to login
> >        at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > Caused by: javax.security.auth.login.LoginException:
> java.lang.NullPointerException: invalid null Class provided
> >        at javax.security.auth.Subject.getPrincipals(Subject.java:809)
> >        at
> org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
> >        at
> org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
> >        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:600)
> >        at
> javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
> >        at
> javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> >        at
> javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> >        at
> java.security.AccessController.doPrivileged(AccessController.java:284)
> >        at
> javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> >        at
> javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
> >        at
> org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
> >        at
> org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> > 
> >        at
> javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
> >        at
> javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
> >        at
> javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
> >        at
> java.security.AccessController.doPrivileged(AccessController.java:284)
> >        at
> javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
> >        at
> javax.security.auth.login.LoginContext.login(LoginContext.java:600)
> >        at
> org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
> >        ... 6 more
> > 
> > 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> > /************************************************************
> > SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> > ************************************************************/
> > $:/opt/flume_hadoop/hadoop-1.1.0>
> > 
> > Question:
> > 
> > 1) @developer: Are you aware of this behavior?
> > 2) Is there a way to overcome this problem with a workaround?
> > 3) Is it a security issue? --> I was able to issue ssh to localhost
> without error.
> > 
> 

Re: using hadoop on zLinux (Linux on S390)

Posted by Michael Segel <mi...@hotmail.com>.
Well, on the surface.... 

It looks like it's either a missing class, or your classpath isn't set up right. 

I'm assuming you got this version of Hadoop from IBM, so I would suggest contacting their support and opening a ticket. 
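One quick way to test the missing-class theory is to probe for the JAAS principal class named in the "Unable to find JAAS classes" error under the same JVM Hadoop uses. This is only a sketch to narrow down the diagnosis, not a fix; the class name comes straight from the error log, while the Sun/Oracle counterpart listed for comparison is an assumption about what a non-IBM JVM would provide instead.

```java
// Diagnostic sketch (not from the thread): probe whether the JAAS principal
// class Hadoop 1.1.0 expects on an IBM JVM is actually loadable. If it is
// MISSING when run with the same java binary and classpath that bin/hadoop
// uses, the class is not visible to Hadoop either, which matches the
// "Unable to find JAAS classes" error above the stack trace.
public class JaasClassCheck {

    // Returns true when the named class is loadable by this JVM.
    static boolean classExists(String name) {
        try {
            Class.forName(name);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String[] candidates = {
            "com.ibm.security.auth.LinuxPrincipal",  // named in the error log
            "com.sun.security.auth.UnixPrincipal"    // Sun/Oracle-JVM counterpart (assumption)
        };
        for (String name : candidates) {
            System.out.println(name + " -> " + (classExists(name) ? "present" : "MISSING"));
        }
    }
}
```

Compile and run it with the exact JVM from JAVA_HOME that Hadoop picks up (e.g. `$JAVA_HOME/bin/javac JaasClassCheck.java && $JAVA_HOME/bin/java JaasClassCheck`); if the IBM class is present there but Hadoop still fails, the problem is more likely the classpath than the JDK build.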


On Dec 11, 2012, at 8:23 AM, Emile Kao <em...@gmx.net> wrote:

> Hello community,
> I am trying to use hadoop 1.1.0 on a SLES 11 (zLinux) running on IBM S390.
> The java provided is "java-s390x-60" 64Bit.
> While trying to format the namenode I got the following error:
> 
> $:/opt/flume_hadoop/hadoop-1.1.0> bin/hadoop namenode -format
> 12/12/11 14:16:31 INFO namenode.NameNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting NameNode
> STARTUP_MSG:   host = xxxxxxxxx
> STARTUP_MSG:   args = [-format]
> STARTUP_MSG:   version = 1.1.0
> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1394289; compiled by 'hortonfo' on Thu Oct  4 22:06:49 UTC 2012
> ************************************************************/
> Re-format filesystem in /opt/hadoop_data/name ? (Y or N) Y
> 12/12/11 14:16:34 INFO util.GSet: VM type       = 64-bit
> 12/12/11 14:16:34 INFO util.GSet: 2% max memory = 20.0 MB
> 12/12/11 14:16:34 INFO util.GSet: capacity      = 2^21 = 2097152 entries
> 12/12/11 14:16:34 INFO util.GSet: recommended=2097152, actual=2097152
> 12/12/11 14:16:34 ERROR security.UserGroupInformation: Unable to find JAAS classes:com.ibm.security.auth.LinuxPrincipal
> 12/12/11 14:16:35 ERROR namenode.NameNode: java.io.IOException: failure to login
>        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:501)
>        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
>        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
>        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
>        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
>        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
>        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> Caused by: javax.security.auth.login.LoginException: java.lang.NullPointerException: invalid null Class provided
>        at javax.security.auth.Subject.getPrincipals(Subject.java:809)
>        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.getCanonicalUser(UserGroupInformation.java:86)
>        at org.apache.hadoop.security.UserGroupInformation$HadoopLoginModule.commit(UserGroupInformation.java:123)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:48)
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:600)
>        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:795)
>        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
>        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
>        at java.security.AccessController.doPrivileged(AccessController.java:284)
>        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
>        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
>        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
>        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:463)
>        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:491)
>        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
>        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
>        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
>        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> 
>        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
>        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
>        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
>        at java.security.AccessController.doPrivileged(AccessController.java:284)
>        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
>        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
>        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
>        ... 6 more
> 
> 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> ************************************************************/
> $:/opt/flume_hadoop/hadoop-1.1.0>
> 
> Question:
> 
> 1) @developer: Are you aware of this behavior?
> 2) Is there a way to overcome this problem with a workaround?
> 3) Is it a security issue? --> I was able to issue ssh to localhost without error.
> 


>        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:480)
>        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1198)
>        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1391)
>        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1412)
> 
>        at javax.security.auth.login.LoginContext.invoke(LoginContext.java:898)
>        at javax.security.auth.login.LoginContext.access$000(LoginContext.java:209)
>        at javax.security.auth.login.LoginContext$5.run(LoginContext.java:732)
>        at java.security.AccessController.doPrivileged(AccessController.java:284)
>        at javax.security.auth.login.LoginContext.invokeCreatorPriv(LoginContext.java:729)
>        at javax.security.auth.login.LoginContext.login(LoginContext.java:600)
>        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:482)
>        ... 6 more
> 
> 12/12/11 14:16:35 INFO namenode.NameNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down NameNode at xxxxxxxxxxxxxxxx
> ************************************************************/
> $:/opt/flume_hadoop/hadoop-1.1.0>
> 
> Question:
> 
> 1)@developer
> Are you aware of this behavior?
> 2)Is there a way to overcome this problem with a workaround?
> 3)Is it a security issue? --> I was able to issue ssh on localhost without error.
>