Posted to common-user@hadoop.apache.org by ztesoft <nj...@hotmail.com> on 2010/01/19 06:48:20 UTC

Exception occurs when I try to read a file

Hi, I wrote a program to read a file from HDFS.
The code is as follows:
import java.io.IOException;
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class FileSystemCat {
    public static void main(String[] args) throws IOException {
        String uri = "hdfs://10.45.11.247:9000/user/root/test3.xml";
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        FSDataInputStream in = null;
        try {
            // Open the file and copy it to stdout without closing the streams.
            in = fs.open(new Path(uri));
            IOUtils.copyBytes(in, System.out, 4096, false);
            // Seek back to byte offset 10 and print from there again.
            in.seek(10);
            IOUtils.copyBytes(in, System.out, 4096, false);
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            IOUtils.closeStream(in);
        }
    }
}

The following error occurs:
Exception in thread "main" java.io.IOException: Call to /10.45.11.247:9000 failed on local exception: java.io.EOFException
        at org.apache.hadoop.ipc.Client.wrapException(Client.java:774)
        at org.apache.hadoop.ipc.Client.call(Client.java:742)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
        at $Proxy0.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
        at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:105)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:177)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1373)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1385)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:191)
        at hadoopstudy.FileSystemCat.main(FileSystemCat.java:26)
Caused by: java.io.EOFException
        at java.io.DataInputStream.readInt(DataInputStream.java:375)
        at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
Java Result: 1

Does anyone know how to solve it?




Re: Exception occurs when I try to read a file

Posted by Mafish Liu <ma...@gmail.com>.
The node on which this program was running failed to connect to the
namenode. Check your Hadoop cluster configuration first, e.g. the
firewall.
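
A quick way to test this suggestion is to check whether the namenode's RPC port accepts connections at all. The following is a minimal sketch (not from the thread), hard-coding the namenode host and port from the original post; if the connect times out or is refused, a firewall or configuration problem is likely:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class NamenodeReachabilityCheck {
    public static void main(String[] args) throws IOException {
        // Namenode host and RPC port taken from the original post.
        Socket socket = new Socket();
        try {
            // A 5-second timeout fails fast when a firewall silently
            // drops packets instead of refusing the connection.
            socket.connect(new InetSocketAddress("10.45.11.247", 9000), 5000);
            System.out.println("Namenode RPC port is reachable.");
        } finally {
            socket.close();
        }
    }
}

Note that in this thread the connection itself appears to succeed (the EOFException is thrown while reading the RPC response in Client$Connection.receiveResponse), so a reachable port does not rule out the version mismatch discussed below.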

-- 
Mafish@gmail.com

Re: Exception occurs when I try to read a file

Posted by ztesoft <nj...@hotmail.com>.
I found this warning message in hadoop-hadoop-namenode-hpc1.log when I tried to
read the file:

WARN org.apache.hadoop.ipc.Server: Incorrect header or version mismatch from 10.45.12.172:1577 got version 3 expected version 2

Does this mean I should install the latest version of Hadoop?


Re: Exception occurs when I try to read a file

Posted by Jeff Zhang <zj...@gmail.com>.
I guess the reason is that the version of Hadoop on the server side is not
compatible with the version you are using on the client side.
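
One way to check this (a sketch, not from the thread) is to print the Hadoop version on the client's classpath with org.apache.hadoop.util.VersionInfo and compare it with the output of bin/hadoop version on the namenode host; the two should match:

import org.apache.hadoop.util.VersionInfo;

public class PrintClientVersion {
    public static void main(String[] args) {
        // Version of the Hadoop jars this client program runs against.
        System.out.println("Hadoop version: " + VersionInfo.getVersion());
        System.out.println("Build details:  " + VersionInfo.getBuildVersion());
    }
}

The namenode's warning ("got version 3 expected version 2") points the same way: the client and server are speaking different versions of the Hadoop IPC protocol, which changes between releases, so aligning the two installations should resolve the EOFException.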




-- 
Best Regards

Jeff Zhang