Posted to common-dev@hadoop.apache.org by "dhruba borthakur (JIRA)" <ji...@apache.org> on 2007/05/02 01:40:15 UTC
[jira] Resolved: (HADOOP-612) DFS copyFromLocal fails with NullPointerException for a single file
[ https://issues.apache.org/jira/browse/HADOOP-612?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
dhruba borthakur resolved HADOOP-612.
-------------------------------------
Resolution: Fixed
I am closing this one because the failure no longer occurs.
> DFS copyFromLocal fails with NullPointerException for a single file
> -------------------------------------------------------------------
>
> Key: HADOOP-612
> URL: https://issues.apache.org/jira/browse/HADOOP-612
> Project: Hadoop
> Issue Type: Bug
> Components: dfs
> Affects Versions: 0.7.1
> Reporter: Sanjay Dahiya
> Assigned To: Sameer Paranjpye
> Priority: Blocker
>
> DFS copyFromLocal fails with NullPointerException when copying a single file to DFS. Copying a directory works.
> public File getFile(String dirsProp, String path)
>     throws IOException {
>   String[] dirs = getStrings(dirsProp); // <===== returns null for a single file
>   int hashCode = path.hashCode();
>   for (int i = 0; i < dirs.length; i++) { // try each local dir <==== throws NullPointerException
>     int index = (hashCode + i & Integer.MAX_VALUE) % dirs.length;
>     File file = new File(dirs[index], path);
>     File dir = file.getParentFile();
>     if (dir.exists() || dir.mkdirs()) {
>       return file;
>     }
>   }
>   throw new IOException("No valid local directories in property: " + dirsProp);
> }
> Exception in thread "main" java.lang.NullPointerException
> at org.apache.hadoop.conf.Configuration.getFile(Configuration.java:397)
> at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.newBackupFile(DFSClient.java:913)
> at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:903)
> at org.apache.hadoop.dfs.DFSClient.create(DFSClient.java:276)
> at org.apache.hadoop.dfs.DistributedFileSystem.createRaw(DistributedFileSystem.java:104)
> at org.apache.hadoop.fs.FSDataOutputStream$Summer.<init>(FSDataOutputStream.java:56)
> at org.apache.hadoop.fs.FSDataOutputStream$Summer.<init>(FSDataOutputStream.java:45)
> at org.apache.hadoop.fs.FSDataOutputStream.<init>(FSDataOutputStream.java:146)
> at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:271)
> at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:178)
> at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:77)
> at org.apache.hadoop.dfs.DistributedFileSystem.copyFromLocalFile(DistributedFileSystem.java:186)
> at org.apache.hadoop.dfs.DFSShell.copyFromLocal(DFSShell.java:45)
> at org.apache.hadoop.dfs.DFSShell.run(DFSShell.java:516)
> at org.apache.hadoop.util.ToolBase.doMain(ToolBase.java:187)
> at org.apache.hadoop.dfs.DFSShell.main(DFSShell.java:570)
--
This message is automatically generated by JIRA.