Posted to user@hbase.apache.org by Malte Maltesmann <ma...@gmail.com> on 2014/08/25 14:48:12 UTC

NoSuchElementException while executing MapReduce-job

Hi all,

when I try to execute a MapReduce job via the HBase Java API, I get an
exception with the following stack trace:

java.util.NoSuchElementException
    at java.util.StringTokenizer.nextToken(StringTokenizer.java:349)
    at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:565)
    at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:534)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.checkPermissionOfOther(ClientDistributedCacheManager.java:276)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.isPublic(ClientDistributedCacheManager.java:240)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineCacheVisibilities(ClientDistributedCacheManager.java:162)
    at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:58)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:389)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
...

In RawLocalFileSystem.java I can see that it is trying to read the file
permissions here, but the StringTokenizer receives an empty string, so the
first t.nextToken() call fails with the exception above:

/// loads permissions, owner, and group from `ls -ld`
    private void loadPermissionInfo() {
      IOException e = null;
      try {
        String output = FileUtil.execCommand(new File(getPath().toUri()),
            Shell.getGetPermissionCommand());
        StringTokenizer t =
            new StringTokenizer(output, Shell.TOKEN_SEPARATOR_REGEX);
        //expected format
        //-rw-------    1 username groupname ...
        String permission = t.nextToken();
        if (permission.length() > FsPermission.MAX_PERMISSION_LENGTH) {
          //files with ACLs might have a '+'
          permission = permission.substring(0,
            FsPermission.MAX_PERMISSION_LENGTH);
        }
        setPermission(FsPermission.valueOf(permission));
        t.nextToken();

        String owner = t.nextToken();
        // If on windows domain, token format is DOMAIN\\user and we want to
        // extract only the user name
        if (Shell.WINDOWS) {
          int i = owner.indexOf('\\');
          if (i != -1)
            owner = owner.substring(i + 1);
        }
        setOwner(owner);

        setGroup(t.nextToken());
      } catch (Shell.ExitCodeException ioe) {
        if (ioe.getExitCode() != 1) {
          e = ioe;
        } else {
          setPermission(null);
          setOwner(null);
          setGroup(null);
        }
      } catch (IOException ioe) {
        e = ioe;
      } finally {
        if (e != null) {
          throw new RuntimeException("Error while running command to get " +
                                     "file permissions : " +
                                     StringUtils.stringifyException(e));
        }
      }
    }
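
To make the failure mode concrete, here is a minimal sketch that reproduces
the exception, assuming FileUtil.execCommand returned an empty string (the
delimiter set below is my stand-in for Shell.TOKEN_SEPARATOR_REGEX):

import java.util.StringTokenizer;

public class EmptyOutputRepro {
    public static void main(String[] args) {
        // Stand-in for empty `ls -ld` output from FileUtil.execCommand
        String output = "";
        // Whitespace delimiters, mimicking what loadPermissionInfo() splits on
        StringTokenizer t = new StringTokenizer(output, " \t\n\r\f");
        // No tokens are available, so the first nextToken() throws
        // java.util.NoSuchElementException, matching the stack trace above
        String permission = t.nextToken();
        System.out.println(permission);
    }
}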

So my question is: what could have caused this problem? My setup is the
following:

Two nodes running on FreeBSD 9.0
Hadoop 2.4.1
HBase 0.98.4

The shell command "ls -ld" delivers the same output on FreeBSD as on any
Linux distribution, so that shouldn't be the problem.
I can also use simple operations of the HBase API, like get or put, without
any problems.

I also posted this on stackoverflow
<http://stackoverflow.com/questions/25364802/hbase-mapreduce-nosuchelementexception>
a week ago, but didn't get any answers.

Thanks in advance,
Malte

Re: NoSuchElementException while executing MapReduce-job

Posted by Malte Maltesmann <ma...@gmail.com>.
Hi again,

after a bit more work, the job finally runs. The problem was that I had
always tried to run the MapReduce jobs directly from Eclipse. This worked
with Hadoop 1.2.1 but not with Hadoop 2.4.1; running the job with ./hadoop
jar mapredtest.jar works fine with both versions.
So I have an additional question: is there a way to run MapReduce jobs
directly from code without creating a jar first?
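
One thing I still want to try (just a sketch, untested, and the class name
is my own placeholder) is forcing Hadoop's local job runner from the driver,
since local mode should not need a jar:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class LocalRunnerSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // The LocalJobRunner executes the job in-process, so no jar is required
        conf.set("mapreduce.framework.name", "local");
        // Use the local filesystem instead of HDFS for this experiment
        conf.set("fs.defaultFS", "file:///");
        Job job = Job.getInstance(conf, "ide-test");
        // ... set mapper, reducer, input and output paths as usual ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}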

Best Regards
Malte


Re: NoSuchElementException while executing MapReduce-job

Posted by Ted Yu <yu...@gmail.com>.
Maybe mapred-site.xml was not on the classpath? It should contain something
like this:

  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
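
If you cannot put the file on the classpath, the same property can also be
set in code; a minimal sketch (driver class and job name are placeholders):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class DriverSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Same effect as the mapred-site.xml entry above
        conf.set("mapreduce.framework.name", "yarn");
        Job job = Job.getInstance(conf, "my-job");
        // ... configure mapper/reducer and submit as usual ...
    }
}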

Cheers


Re: NoSuchElementException while executing MapReduce-job

Posted by Malte Maltesmann <ma...@gmail.com>.
I did not use Maven for this earlier, but if I try it with Maven and the
following pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>de.test.malte</groupId>
  <artifactId>MapRedMaven</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
      <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-common</artifactId>
          <version>2.4.1</version>
      </dependency>
      <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-hdfs</artifactId>
          <version>2.4.1</version>
      </dependency>
      <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-mapreduce-client-core</artifactId>
          <version>2.4.1</version>
      </dependency>
  </dependencies>
</project>

I get the following exception on both Linux and FreeBSD:

14/08/28 14:45:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
    at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:120)
    at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
    at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
    at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1255)
    at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1251)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
    at org.apache.hadoop.mapreduce.Job.connect(Job.java:1250)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1279)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
    at NewMaxTemperature$NewMaxTemperatureMapper.main(NewMaxTemperature.java:39)
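
One guess on my side: hadoop-mapreduce-client-core alone may not bring a job
client along, so I would also try adding this dependency (untested
assumption):

      <dependency>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
          <version>2.4.1</version>
      </dependency>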

By the way, I am now trying to run this
<http://gerrymcnicol.com/index.php/2014/01/02/hadoop-and-cassandra-part-4-writing-your-first-mapreduce-job/>
example on a single node. It runs fine on Linux without Maven, but with
Maven it crashes on both Linux and FreeBSD with the exception above, and
without Maven it crashes on FreeBSD with the NoSuchElementException.

Regards
Malte


Re: NoSuchElementException while executing MapReduce-job

Posted by Artem Ervits <ar...@nyp.org>.
Check your pom.xml; some artifacts changed from earlier releases.


Artem Ervits
Data Analyst
New York Presbyterian Hospital

Re: NoSuchElementException while executing MapReduce-job

Posted by Malte Otten <ma...@gmail.com>.
Hi Ted,

thanks for your quick answer.
I don't think I'm running a secure setup: I didn't do any of the things
specified in the reference guide chapter from your link, and as far as I
can see, the secure setup is not the default. My hbase-site.xml looks
like this:

<configuration>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.master.wait.on.regionservers.mintostart</name>
    <value>1</value>
  </property>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://my.hbase.server.de:8020/hbase</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/my/path/hbase-data/zookeeper</value>
  </property>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>my.hbase.server.de</value>
  </property>
</configuration>
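
As far as I can tell, a secure setup would require an explicit entry like
the following (shown just for illustration), which my file does not contain:

  <property>
    <name>hbase.security.authentication</name>
    <value>kerberos</value>
  </property>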

Do you have any other suggestions?

Regards
Malte


Re: NoSuchElementException while executing MapReduce-job

Posted by Ted Yu <yu...@gmail.com>.
Looks like you were running in a secure setup.

If that's the case, you should read:
http://hbase.apache.org/book.html#hbase.secure.configuration

Cheers

