Posted to hdfs-user@hadoop.apache.org by Björn-Elmar Macek <ma...@cs.uni-kassel.de> on 2012/04/27 12:01:18 UTC

Hadoop Configuration Issues

Hello,

I have recently installed Hadoop on my machine and on a second one in order
to test the setup and develop small programs locally before deploying them
to the cluster. I stumbled over several difficulties, which I could fix
with some internet research. But once again I am stuck, and I think I can
nail the problem down:

When Hadoop evaluates the config files in /etc/hadoop, it does not have
default values for the variables used within:

\________ First Error:
hadoop namenode -format
Warning: $HADOOP_HOME is deprecated.

12/04/27 11:31:41 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = ubuntu/127.0.1.1
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 1.0.1
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1243785; compiled by 'hortonfo' on Tue Feb 14 08:13:52 UTC 2012
************************************************************/
12/04/27 11:31:41 INFO util.GSet: VM type       = 32-bit
12/04/27 11:31:41 INFO util.GSet: 2% max memory = 2.475 MB
12/04/27 11:31:41 INFO util.GSet: capacity      = 2^19 = 524288 entries
12/04/27 11:31:41 INFO util.GSet: recommended=524288, actual=524288
12/04/27 11:31:41 ERROR namenode.NameNode: java.lang.IllegalArgumentException: Invalid attribute value for hadoop.security.authentication of ${SECURITY_TYPE}
     at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:202)
     at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:187)
     at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:239)
     at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:438)
     at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:424)
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:473)
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:462)
     at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1162)
     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1271)
     at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1288)

12/04/27 11:31:41 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
************************************************************/


\_________ Solution
I removed the variable and replaced it with the value "simple".
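For reference, a minimal sketch of the resulting core-site.xml entry
(assuming no other security settings are in play):

  <property>
    <name>hadoop.security.authentication</name>
    <value>simple</value>
  </property>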
Then the next error occurred:


\_________ Error 2
hadoop namenode -format
Warning: $HADOOP_HOME is deprecated.

12/04/27 11:46:33 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = ubuntu/127.0.1.1
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 1.0.1
STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r 1243785; compiled by 'hortonfo' on Tue Feb 14 08:13:52 UTC 2012
************************************************************/
12/04/27 11:46:33 INFO util.GSet: VM type       = 32-bit
12/04/27 11:46:33 INFO util.GSet: 2% max memory = 2.475 MB
12/04/27 11:46:33 INFO util.GSet: capacity      = 2^19 = 524288 entries
12/04/27 11:46:33 INFO util.GSet: recommended=524288, actual=524288
12/04/27 11:46:33 ERROR namenode.NameNode: java.lang.ExceptionInInitializerError
     at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:212)
     at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:187)
     at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:239)
     at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:438)
     at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:424)
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.setConfigurationParameters(FSNamesystem.java:473)
     at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:462)
     at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1162)
     at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1271)
     at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1288)
Caused by: java.util.regex.PatternSyntaxException: Illegal repetition near index 8
[jt]t@.*${KERBEROS_REALM}
        ^
     at java.util.regex.Pattern.error(Pattern.java:1730)
     at java.util.regex.Pattern.closure(Pattern.java:2792)
     at java.util.regex.Pattern.sequence(Pattern.java:1906)
     at java.util.regex.Pattern.expr(Pattern.java:1769)
     at java.util.regex.Pattern.compile(Pattern.java:1477)
     at java.util.regex.Pattern.<init>(Pattern.java:1150)
     at java.util.regex.Pattern.compile(Pattern.java:840)
     at org.apache.hadoop.security.KerberosName$Rule.<init>(KerberosName.java:188)
     at org.apache.hadoop.security.KerberosName.parseRules(KerberosName.java:324)
     at org.apache.hadoop.security.KerberosName.setConfiguration(KerberosName.java:343)
     at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:212)
     at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:187)
     at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:239)
     at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:83)
     ... 10 more

12/04/27 11:46:33 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
************************************************************/

\______ And once again...
... a variable seems to be undefined. And I guess that if I found a
suitable value for this property, the next one would turn out to be
undefined as well. I hope there are default values present somewhere,
because I have no idea what to fill into the value slots. None of the books
and instructions I have read on Hadoop ever discussed these issues.
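If the property behind this pattern is hadoop.security.auth_to_local (which
the KerberosName frames in the trace suggest), then one entry that at least
contains no unexpanded placeholder would be (a sketch, assuming Kerberos is
not used at all):

  <property>
    <name>hadoop.security.auth_to_local</name>
    <value>DEFAULT</value>
  </property>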
BTW: HADOOP_HOME is defined, although the log suggests otherwise.

I hope you can assist me.

Best regards,
Björn-Elmar Macek


Re: Hadoop Configuration Issues

Posted by alo alt <wg...@googlemail.com>.
Yes, as Leo said. Be sure that you deploy all config files around your cluster, and be sure that you installed the same Hadoop version everywhere.
On my blog you'll find an older page, "running an hadoop cluster in 20 minutes"; it is based on a distribution, but the main steps are the same.
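
A minimal sketch of one way to push the configs around, with the hostname
and tarball path assumed from this thread:

  # Copy the master's Hadoop config to the slave (hostname and path assumed).
  rsync -av ~/Programs/hadoop-1.0.2/conf/ slave:~/Programs/hadoop-1.0.2/conf/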

best,
 Alex

--
Alexander Lorenz
http://mapredit.blogspot.com
German Hadoop LinkedIn Group: http://goo.gl/N8pCF

On Apr 27, 2012, at 7:53 PM, Leo Leung wrote:

> I would suggest changing the Hadoop configuration on the slave(s).
> You'll have to maintain the delta config amongst the nodes.
> [...]


RE: Hadoop Configuration Issues

Posted by Leo Leung <ll...@ddn.com>.
I would suggest changing the Hadoop configuration on the slave(s).
You'll have to maintain the delta config amongst the nodes.
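
In Hadoop 1.x the start scripts ssh into each slave and reuse the master's
install path, which is exactly what the error output below shows. The usual
fix is to install at an identical path on every node; failing that, a rough
workaround sketch, with the expected path taken from the error output and
the slave's actual install location assumed:

  # On the slave: make the path the master expects resolve to the real install.
  # (Assumes the slave's tarball actually lives under /opt/hadoop-1.0.2.)
  mkdir -p /home/ema/Programs
  ln -s /opt/hadoop-1.0.2 /home/ema/Programs/hadoop-1.0.2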


-----Original Message-----
From: Björn-Elmar Macek [mailto:macek@cs.uni-kassel.de] 
Sent: Friday, April 27, 2012 7:28 AM
To: hdfs-user@hadoop.apache.org
Subject: Re: Hadoop Configuration Issues

Hi Alex,

thank you for the tip: it pushed me in the right direction. I was using the deb package to install Hadoop, which did not work out because of the problems I had. Now I use the tarball archive and have unpacked it to a subfolder of my home directory.

BUT I just ran into another problem: after successfully executing "hadoop namenode -format", the "start-all.sh" script (executed on the master server) runs into errors when trying to access files on the slave system: it seems to expect the Hadoop files to lie in the very same directory as on the master:

ema@ubuntu:~/Programs/hadoop-1.0.2/bin$ ./start-all.sh
Warning: $HADOOP_HOME is deprecated.

namenode running as process 19393. Stop it first.
slave: bash: line 0: cd: /home/ema/Programs/hadoop-1.0.2/libexec/..: No such file or directory
slave: bash: /home/ema/Programs/hadoop-1.0.2/bin/hadoop-daemon.sh: No such file or directory
ema@master's password:
master: Connection closed by UNKNOWN
starting jobtracker, logging to /var/log/hadoop/ema/hadoop-ema-jobtracker-ubuntu.out
slave: bash: line 0: cd: /home/ema/Programs/hadoop-1.0.2/libexec/..: No such file or directory
slave: bash: /home/ema/Programs/hadoop-1.0.2/bin/hadoop-daemon.sh: No such file or directory

How can I tell the slave that the files lie somewhere else?

Best regards,
Björn-Elmar

On 27.04.2012 13:11, alo alt wrote:
> [...]


Re: Hadoop Configuration Issues

Posted by Björn-Elmar Macek <ma...@cs.uni-kassel.de>.
Hi Alex,

thank you for the tip: it pushed me in the right direction. I was using
the deb package to install Hadoop, which did not work out because of the
problems I had. Now I use the tarball archive and have unpacked it to a
subfolder of my home directory.

BUT I just ran into another problem: after successfully executing
"hadoop namenode -format", the "start-all.sh" script (executed on the
master server) runs into errors when trying to access files on the slave
system: it seems to expect the Hadoop files to lie in the very same
directory as on the master:

ema@ubuntu:~/Programs/hadoop-1.0.2/bin$ ./start-all.sh
Warning: $HADOOP_HOME is deprecated.

namenode running as process 19393. Stop it first.
slave: bash: line 0: cd: /home/ema/Programs/hadoop-1.0.2/libexec/..: No such file or directory
slave: bash: /home/ema/Programs/hadoop-1.0.2/bin/hadoop-daemon.sh: No such file or directory
ema@master's password:
master: Connection closed by UNKNOWN
starting jobtracker, logging to /var/log/hadoop/ema/hadoop-ema-jobtracker-ubuntu.out
slave: bash: line 0: cd: /home/ema/Programs/hadoop-1.0.2/libexec/..: No such file or directory
slave: bash: /home/ema/Programs/hadoop-1.0.2/bin/hadoop-daemon.sh: No such file or directory

How can I tell the slave that the files lie somewhere else?

Best regards,
Björn-Elmar

On 27.04.2012 13:11, alo alt wrote:
> Hi,
>
> yes, sorry - I saw that after I hit the send button.
> Looks like you mixed up some configs with wrong templates. I would suggest you use the default configs:
> http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/conf/
>
> - Alex
>
> --
> Alexander Lorenz
> http://mapredit.blogspot.com
>
> On Apr 27, 2012, at 12:39 PM, Björn-Elmar Macek wrote:
>
>> [...]


Re: Hadoop Configuration Issues

Posted by alo alt <wg...@googlemail.com>.
Hi,

yes, sorry - I saw that after I hit the send button.
Looks like you mixed up some configs with wrong templates. I would suggest you use the default configs:
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/conf/
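
A rough sketch of falling back to the stock files, assuming the tarball was
unpacked to ~/Programs/hadoop-1.0.2 and /etc/hadoop is the config directory
being read:

  # Overwrite the broken templates with the stock config files from the tarball.
  cp ~/Programs/hadoop-1.0.2/conf/*.xml /etc/hadoop/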

- Alex

--
Alexander Lorenz
http://mapredit.blogspot.com

On Apr 27, 2012, at 12:39 PM, Björn-Elmar Macek wrote:

> Hi Alex,
> 
> as I have written, I already did so! The problem, as already stated in my previous mail, is that all the variables ${bla} seem to be UNSET - not only SECURITY_TYPE. As I don't really understand those parameters, I would like to use the default ones, which AFAIK should be configured in hadoop-env.sh. But obviously they are not.
> 
> Best,
> Björn
> 
> On 27.04.2012 12:12, alo alt wrote:
>> [...]


Re: Hadoop Configuration Issues

Posted by Björn-Elmar Macek <ma...@cs.uni-kassel.de>.
Hi Alex,

as I have written, I already did so! The problem, as already stated in my
previous mail, is that all the variables ${bla} seem to be UNSET - not
only SECURITY_TYPE. As I don't really understand those parameters, I
would like to use the default ones, which AFAIK should be configured in
hadoop-env.sh. But obviously they are not.
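
A quick check (assuming /etc/hadoop is the config directory being read)
lists every placeholder that is still unexpanded:

  # Show all remaining ${...} placeholders in the config files.
  grep -nF '${' /etc/hadoop/*.xml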

Best,
Björn

On 27.04.2012 12:12, alo alt wrote:
> Hi,
>
> Invalid attribute value for hadoop.security.authentication of ${SECURITY_TYPE}
> Set it to simple and it should work (default is kerberos).
>
> - Alex
>
> --
> Alexander Lorenz
> http://mapredit.blogspot.com
>
> On Apr 27, 2012, at 12:01 PM, Björn-Elmar Macek wrote:
>
>> [...]


Re: Hadoop Configuration Issues

Posted by alo alt <wg...@googlemail.com>.
Hi,

Invalid attribute value for hadoop.security.authentication of ${SECURITY_TYPE}
Set it to simple and it should work (default is kerberos).

- Alex

--
Alexander Lorenz
http://mapredit.blogspot.com

On Apr 27, 2012, at 12:01 PM, Björn-Elmar Macek wrote:

> [...]