Posted to hdfs-dev@hadoop.apache.org by "Ravi Phulari (JIRA)" <ji...@apache.org> on 2009/09/26 00:24:16 UTC

[jira] Resolved: (HDFS-50) NullPointerException when reading deleted file

     [ https://issues.apache.org/jira/browse/HDFS-50?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ravi Phulari resolved HDFS-50.
------------------------------

    Resolution: Cannot Reproduce

Koji, I could not reproduce this issue.
Closing this JIRA as Cannot Reproduce.


{code}
[rphulari@host hadoop]$ hadoop fs -ls
Found 1 items
drwx------   - rphulari hdfs          0 2009-09-25 22:06 /user/rphulari/test
[rphulari@host hadoop]$ hadoop fs -ls test
Found 2 items
-rw-------   3 rphulari hdfs     405943 2009-09-25 22:05 /user/rphulari/test/foo.txt
-rw-------   3 rphulari hdfs          4 2009-09-25 22:06 /user/rphulari/test/test.py
[rphulari@host hadoop]$ hadoop fs -rmr test
Moved to trash: hdfs://host.some.com/user/rphulari/test
[rphulari@host hadoop]$ hadoop fs -ls
Found 1 items
drwx------   - rphulari hdfs          0 2009-09-25 22:17 /user/rphulari/.Trash
[rphulari@host hadoop]$ hadoop fs -cat test/foo.txt
cat: File does not exist: test/foo.txt
[rphulari@host hadoop]$ hadoop fs -cat test/test.py
cat: File does not exist: test/test.py
[rphulari@host hadoop]$ 
{code}
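
For anyone trying to reproduce this programmatically rather than from the shell, the scenario in the original report amounts to reading from a stream whose file was removed after it was opened (distcp reading out of .Trash). Below is a rough sketch using the public FileSystem API; the path is made up and the code is not taken from the report, it just exercises the same open-then-delete-then-read sequence.

{code}
// Hypothetical reproduction sketch (not from the report): open a file,
// delete it while the stream is still open, then keep reading from it.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadAfterDelete {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    Path p = new Path("/user/rphulari/test/foo.txt"); // made-up path

    // Open first, so the client already holds block locations for the file.
    FSDataInputStream in = fs.open(p);

    // Remove the file out from under the open stream.
    fs.delete(p, false);

    // In the 0.x client, getBlockAt() on this read path is where the reported
    // NPE came from; depending on version and timing the read here may
    // succeed, fail with an IOException, or (per this report) throw the NPE.
    byte[] buf = new byte[4096];
    int n = in.read(buf);
    System.out.println("read " + n + " bytes after delete");

    in.close();
    fs.close();
  }
}
{code}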




> NullPointerException when reading deleted file
> ----------------------------------------------
>
>                 Key: HDFS-50
>                 URL: https://issues.apache.org/jira/browse/HDFS-50
>             Project: Hadoop HDFS
>          Issue Type: Bug
>            Reporter: Koji Noguchi
>            Priority: Minor
>
> hdfs://AAA:9999/distcp/destdir/Trash/0803050600/data/part-00018
> : java.lang.NullPointerException
>   at org.apache.hadoop.dfs.DFSClient$DFSInputStream.getBlockAt(DFSClient.java:919)
>   at org.apache.hadoop.dfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:992)
>   at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1112)
>   at java.io.DataInputStream.read(DataInputStream.java:83)
>   at org.apache.hadoop.util.CopyFiles$FSCopyFilesMapper.copy(CopyFiles.java:303)
>   at org.apache.hadoop.util.CopyFiles$FSCopyFilesMapper.map(CopyFiles.java:364)
>   at org.apache.hadoop.util.CopyFiles$FSCopyFilesMapper.map(CopyFiles.java:219)
>   at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
>   at org.apache.hadoop.mapred.MapTask.run(MapTask.java:192)
>   at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1804)
> (Line number for CopyFiles.java is a little off since I'm using my modified version)
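
One thing worth noting about the trace: the NullPointerException is unchecked, so it sails past any IOException handling a caller such as the distcp mapper might place around the read. A hypothetical sketch follows (names made up, not the actual CopyFiles code) of a copy loop that tolerates a source file disappearing, and where the unchecked NPE would still escape.

{code}
// Hypothetical sketch, not the actual CopyFiles/distcp code: a copy loop
// that skips a source file that disappears between listing and copying.
import java.io.IOException;
import java.io.OutputStream;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TolerantCopy {
  /** Returns true if the file was copied, false if the source vanished. */
  static boolean copyIfPresent(FileSystem fs, Path src, OutputStream out)
      throws IOException {
    FSDataInputStream in;
    try {
      in = fs.open(src);          // fails cleanly if the file is already gone
    } catch (IOException gone) {
      return false;               // listed earlier, deleted since: skip it
    }
    try {
      byte[] buf = new byte[64 * 1024];
      int n;
      while ((n = in.read(buf)) != -1) {
        out.write(buf, 0, n);
      }
      return true;
    } catch (IOException midRead) {
      // A clean IOException here can be logged and skipped, but the NPE in
      // the trace above is unchecked and would escape this handler, which
      // is what makes it a bug rather than an expected failure mode.
      return false;
    } finally {
      in.close();
    }
  }
}
{code}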

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.