Posted to user@hadoop.apache.org by xeonmailinglist <xe...@gmail.com> on 2015/02/05 17:13:08 UTC

How do I list files in HDFS?

Hi,

I want to list files in HDFS using |FileUtil.listFiles|, but all 
I get are IOException errors. The code, the error, and the output are below. 
How do I list files in HDFS?

|Exception in thread "main" java.io.IOException: Invalid directory or I/O error occurred for dir: /outputmp
|

I have this code

|         try {
             File[] mapOutputFiles = FileUtil.listFiles(new File("webhdfs://hadoop-coc-1/outputmp/"));
             System.out.println("1 success: " + mapOutputFiles.length);
         } catch (Exception e) {
             System.out.println("1 failed");
         }

         try {
             File[] mapOutputFiles2 = FileUtil.listFiles(new File("webhdfs://hadoop-coc-1/outputmp"));
             System.out.println("2 success: " + mapOutputFiles2.length);
         } catch (Exception e) {
             System.out.println("2 failed");
         }

         try {
             File[] mapOutputFiles3 = FileUtil.listFiles(new File("/outputmp"));
             System.out.println("3 success: " + mapOutputFiles3.length);
         } catch (Exception e) {
             System.out.println("3 failed");
         }
         try {
             File[] mapOutputFiles4 = FileUtil.listFiles(new File("/outputmp/"));
             System.out.println("4 success: " + mapOutputFiles4.length);
         } catch (Exception e) {
             System.out.println("4 failed");
         }
|

The output

|1 failed
2 failed
3 failed
4 failed
|

The output directory exists:

|vagrant@hadoop-coc-1:~/Programs/hadoop$ hdfs dfs -ls /outputmp
Found 2 items
-rw-r--r--   2 vagrant supergroup          0 2015-02-05 15:50 /outputmp/_SUCCESS
-rw-r--r--   2 vagrant supergroup         12 2015-02-05 15:50 /outputmp/part-m-00000
vagrant@hadoop-coc-1:~/Programs/hadoop$ hdfs dfs -ls webhdfs://hadoop-coc-1/outputmp
Found 2 items
-rw-r--r--   2 vagrant supergroup          0 2015-02-05 15:50 webhdfs://hadoop-coc-1/outputmp/_SUCCESS
-rw-r--r--   2 vagrant supergroup         12 2015-02-05 15:50 webhdfs://hadoop-coc-1/outputmp/part-m-00000
vagrant@hadoop-coc-1:~/Programs/hadoop$
|


Re: How do I list files in HDFS?

Posted by Azuryy Yu <az...@gmail.com>.
Hi,

You cannot use new File(".......") as the parameter; java.io.File paths only
refer to the local filesystem, not HDFS. It should be a Hadoop Path, e.g.
new Path("/outputmp"), used with the FileSystem API.

On Fri, Feb 6, 2015 at 3:51 AM, Ravi Prakash <ra...@ymail.com> wrote:

> Hi Xeon!
>
> Can you try using the FileContext or FileSystem API?
>
> HTH
> Ravi

Re: How do I list files in HDFS?

Posted by Ravi Prakash <ra...@ymail.com>.
Hi Xeon!
Can you try using the FileContext or FileSystem API?
HTH
Ravi 
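
A minimal FileContext sketch (my assumptions: Hadoop 2.x, fs.defaultFS pointing
at your cluster, and the /outputmp directory from your mail) would be roughly:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileContext;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.RemoteIterator;

    public class ListWithFileContext {
        public static void main(String[] args) throws Exception {
            // FileContext resolves paths against the default filesystem
            // configured in core-site.xml (fs.defaultFS)
            FileContext fc = FileContext.getFileContext(new Configuration());
            RemoteIterator<FileStatus> it = fc.listStatus(new Path("/outputmp"));
            while (it.hasNext()) {
                System.out.println(it.next().getPath().getName());
            }
        }
    }

ListWithFileContext is just an illustrative name; the point is that the listing
goes through a Hadoop Path rather than java.io.File.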


Re: How do I list files in HDFS?

Posted by Susheel Kumar Gadalay <sk...@gmail.com>.
You can do it like this:

      // Needed imports:
      //   org.apache.hadoop.conf.Configuration
      //   org.apache.hadoop.fs.FileSystem, FileStatus, Path

      // getConf() is available when the class extends Configured (e.g. via Tool);
      // otherwise use new Configuration()
      Configuration conf = getConf();

      FileSystem fs = FileSystem.get(conf);

      // e.g. new Path("/outputmp") for the directory from your mail
      FileStatus[] fstatus = fs.listStatus(new Path(...));

      String generatedFile;

      for (int i = 0; i < fstatus.length; i++) {
         generatedFile = fstatus[i].getPath().getName();
         System.out.println("File Name : " + generatedFile);
      }
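
If you also need the files inside subdirectories, one option (reusing the fs
handle above, assuming Hadoop 2.x; it needs the LocatedFileStatus and
RemoteIterator imports) is FileSystem#listFiles with recursive listing:

      // Recursive listing of /outputmp (the directory from your mail);
      // returns files only, not directories
      RemoteIterator<LocatedFileStatus> files = fs.listFiles(new Path("/outputmp"), true);
      while (files.hasNext()) {
         System.out.println("File Name : " + files.next().getPath().getName());
      }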
