Posted to user@hadoop.apache.org by Malte Maltesmann <ma...@gmail.com> on 2014/08/27 14:05:27 UTC

NoSuchElementException while running local MapReduce-Job on FreeBSD

Hi all,

I tried to run a MapReduce job on my two-node FreeBSD cluster with Hadoop
2.4.1 and HBase 0.98.4 and ran into the exception below. I then tried the
example provided here
<http://gerrymcnicol.azurewebsites.net/index.php/2014/01/02/hadoop-and-cassandra-part-4-writing-your-first-mapreduce-job/>,
which runs a MapReduce job without involving HBase or HDFS, on a single
FreeBSD machine, and got the same exception:

Exception in thread "main" java.util.NoSuchElementException
    at java.util.StringTokenizer.nextToken(StringTokenizer.java:349)
    at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:565)
    at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:534)
    at org.apache.hadoop.fs.LocatedFileStatus.<init>(LocatedFileStatus.java:42)
    at org.apache.hadoop.fs.FileSystem$4.next(FileSystem.java:1697)
    at org.apache.hadoop.fs.FileSystem$4.next(FileSystem.java:1679)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:302)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:263)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:375)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1556)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
    at test.NewMaxTemperature$NewMaxTemperatureMapper.main(NewMaxTemperature.java:39)

I ran the exact same project on a Linux machine and it worked just fine,
so this seems to be a FreeBSD-related issue.

I looked into the Hadoop code and found the part where the exception
happens:

/// loads permissions, owner, and group from `ls -ld`
private void loadPermissionInfo() {
  IOException e = null;
  try {
    String output = FileUtil.execCommand(new File(getPath().toUri()),
        Shell.getGetPermissionCommand());
    StringTokenizer t =
        new StringTokenizer(output, Shell.TOKEN_SEPARATOR_REGEX);
    //expected format
    //-rw-------    1 username groupname ...
    String permission = t.nextToken();


So here the StringTokenizer calls nextToken() on what is apparently an
empty string.
How can this happen?
I checked that "ls -ld" provides the same output on FreeBSD and Linux.
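
To make sure I understood the failure mode, I wrote a tiny stand-alone
reproduction (just a sketch: I pass a plain whitespace delimiter instead of
Hadoop's Shell.TOKEN_SEPARATOR_REGEX): if the command output arrives empty,
the very first nextToken() throws exactly this NoSuchElementException:

import java.util.StringTokenizer;

public class TokenizerRepro {
    public static void main(String[] args) {
        // Simulate loadPermissionInfo() receiving an empty command output.
        String output = "";
        // Plain whitespace delimiters here; the real Hadoop code passes
        // Shell.TOKEN_SEPARATOR_REGEX.
        StringTokenizer t = new StringTokenizer(output, " \t\n\r\f");
        // With no tokens available, this throws
        // java.util.NoSuchElementException, matching the first frame of the
        // stack trace above.
        String permission = t.nextToken();
        System.out.println(permission);
    }
}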

Does anybody have a suggestion as to what is happening here?
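
In case it helps, the next thing I was going to try is running the
permission command from Java on the FreeBSD box and dumping the raw output
and exit code, to see whether the JVM gets something different from the
interactive shell. This is only a rough sketch, not the exact Hadoop call
path, and /tmp is just an example path:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class LsOutputCheck {
    public static void main(String[] args) throws Exception {
        // Run the same kind of command Hadoop's loadPermissionInfo() relies on.
        ProcessBuilder pb = new ProcessBuilder("ls", "-ld", "/tmp");
        pb.redirectErrorStream(true);
        Process p = pb.start();
        BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = r.readLine()) != null) {
            System.out.println("raw output: [" + line + "]");
        }
        System.out.println("exit code: " + p.waitFor());
    }
}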

Thanks and regards,
Malte