Posted to dev@pig.apache.org by "John Lium (JIRA)" <ji...@apache.org> on 2011/07/20 22:35:57 UTC
[jira] [Created] (PIG-2183) Pig 0.8.1 not working with Hadoop 0.20.203.0
Pig 0.8.1 not working with Hadoop 0.20.203.0
--------------------------------------------
Key: PIG-2183
URL: https://issues.apache.org/jira/browse/PIG-2183
Project: Pig
Issue Type: Bug
Components: grunt, site
Affects Versions: 0.8.1
Environment: Gentoo Linux Kernel: 2.6.38-gentoo-r6
java version "1.6.0_26"
Ant version 1.8.1
Reporter: John Lium
When running pig, I get the following error.
Error before Pig is launched
----------------------------
ERROR 2999: Unexpected internal error. Failed to create DataStorage
java.lang.RuntimeException: Failed to create DataStorage
at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:214)
at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:134)
at org.apache.pig.impl.PigContext.connect(PigContext.java:183)
at org.apache.pig.PigServer.<init>(PigServer.java:226)
at org.apache.pig.PigServer.<init>(PigServer.java:215)
at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:55)
at org.apache.pig.Main.run(Main.java:452)
at org.apache.pig.Main.main(Main.java:107)
Caused by: java.io.IOException: Call to rasputin/192.168.1.3:9000 failed on local exception: java.io.EOFException
at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
at org.apache.hadoop.ipc.Client.call(Client.java:743)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
at $Proxy0.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
... 9 more
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:375)
at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
================================================================================
My env vars are defined in bin/pig as follows:
export JAVA_HOME="/etc/java-config-2/current-system-vm"
export PIG_CLASSPATH="/var/hadoop/pig/pig-withouthadoop.jar:$HADOOP_HOME/hadoop-core-0.20.203.0.jar:$HADOOP_HOME/lib:$HADOOP_CONF_DIR"
--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
[jira] [Commented] (PIG-2183) Pig 0.8.1 not working with Hadoop 0.20.203.0
Posted by "John Lium (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13069331#comment-13069331 ]
John Lium commented on PIG-2183:
--------------------------------
Finally got it to work.
I built with the above-mentioned "ant jar-withouthadoop" target.
Hacked up the wrapper script to this -> http://pastebin.com/5WuLRUAN
Used this conf/pig-env.sh -> http://pastebin.com/vs5fAHpu
Replaced the hadoop jars in build/ivy/lib/Pig with the 0.20.203 core and test ones
Ran the wrapper script
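One way to sanity-check that the pieces are in place before running the wrapper (a sketch only: the real wrapper and conf/pig-env.sh contents are behind the pastebin links above, and the default paths here are assumptions based on the reporter's environment):

```shell
#!/bin/sh
# Report whether each file the workaround depends on actually exists.
# PIG_HOME/HADOOP_HOME defaults below are illustrative, not from the patch.
check() {
    for f in "$@"; do
        if [ -e "$f" ]; then
            echo "ok: $f"
        else
            echo "MISSING: $f"
        fi
    done
}

check "${PIG_HOME:-/var/hadoop/pig}/pig-withouthadoop.jar" \
      "${HADOOP_HOME:-/usr/lib/hadoop}/hadoop-core-0.20.203.0.jar"
```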
[jira] [Commented] (PIG-2183) Pig 0.8.1 not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13079480#comment-13079480 ]
Daniel Dai commented on PIG-2183:
---------------------------------
Here are the changes I made in PIG-2183-1.patch:
1. When releasing Pig, include pig-withouthadoop.jar instead of the fat pig.jar.
2. For Hadoop 0.20.203+, use HADOOP_HOME to find the hadoop binary and run "hadoop classpath" to get all the Hadoop libraries.
3. For Hadoop 0.20.2, assume the Hadoop libraries are inside HADOOP_HOME.
4. If no HADOOP_HOME is defined, Pig links the Hadoop 0.20.2 jars inside $PIG_HOME/lib.
5. Still pick up PIG_CLASSPATH first, so users can override the Hadoop conf dir.
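The lookup order in items 2 through 5 could be sketched as a shell function along these lines (a sketch under stated assumptions, not the actual bin/pig code from the patch; the function name and fallback paths are illustrative):

```shell
#!/bin/sh
# Sketch of the Hadoop-library lookup order described above.
# Assumes "hadoop classpath" is available on 0.20.203+ and that a
# 0.20.2 install keeps its jars under HADOOP_HOME/lib.
resolve_hadoop_classpath() {
    # 5. PIG_CLASSPATH always comes first, so users can override the conf dir.
    cp="${PIG_CLASSPATH:-}"

    if [ -n "${HADOOP_HOME:-}" ]; then
        if "$HADOOP_HOME/bin/hadoop" classpath >/dev/null 2>&1; then
            # 2. Hadoop 0.20.203+: ask hadoop itself for its libraries.
            cp="${cp:+$cp:}$("$HADOOP_HOME/bin/hadoop" classpath)"
        else
            # 3. Hadoop 0.20.2: assume the jars sit inside HADOOP_HOME.
            cp="${cp:+$cp:}$HADOOP_HOME/lib"
        fi
    else
        # 4. No HADOOP_HOME: fall back to the 0.20.2 jars bundled in PIG_HOME.
        cp="${cp:+$cp:}$PIG_HOME/lib/*"
    fi

    # 1. Ship pig-withouthadoop.jar rather than the fat pig.jar.
    echo "$PIG_HOME/pig-withouthadoop.jar:$cp"
}
```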
[jira] [Updated] (PIG-2183) Pig not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Dai updated PIG-2183:
----------------------------
Attachment: PIG-2183-2.patch
PIG-2183-2.patch includes some bug fixes and also changes undesired behavior introduced by PIG-1857.
[jira] [Updated] (PIG-2183) Pig not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Dai updated PIG-2183:
----------------------------
Resolution: Fixed
Fix Version/s: 0.10
0.9.1
Hadoop Flags: [Reviewed]
Status: Resolved (was: Patch Available)
Manually tested with the release build, the rpm/deb packages, and an svn checkout; all work fine.
Committed to both trunk and 0.9 branch.
> Pig not working with Hadoop 0.20.203.0
> --------------------------------------
>
> Key: PIG-2183
> URL: https://issues.apache.org/jira/browse/PIG-2183
> Project: Pig
> Issue Type: Bug
> Components: grunt, site
> Affects Versions: 0.8.1, 0.9.1, 0.10
> Environment: Gentoo Linux Kernel: 2.6.38-gentoo-r6
> java version "1.6.0_26"
> Ant version 1.8.1
> Reporter: John Lium
> Assignee: Daniel Dai
> Fix For: 0.9.1, 0.10
>
> Attachments: PIG-2183-1.patch, PIG-2183-2.patch
[jira] [Updated] (PIG-2183) Pig 0.8.1 not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Dai updated PIG-2183:
----------------------------
Attachment: PIG-2183-1.patch
Restructured the pig script to use pig-withouthadoop.jar and link the Hadoop jars dynamically.
[jira] [Updated] (PIG-2183) Pig not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Dai updated PIG-2183:
----------------------------
Attachment: PIG-2183-2.patch
[jira] [Updated] (PIG-2183) Pig not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Dai updated PIG-2183:
----------------------------
Attachment: PIG-2183-2.patch
[jira] [Updated] (PIG-2183) Pig not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Dai updated PIG-2183:
----------------------------
Attachment: (was: PIG-2183-2.patch)
> Pig not working with Hadoop 0.20.203.0
> --------------------------------------
>
> Key: PIG-2183
> URL: https://issues.apache.org/jira/browse/PIG-2183
> Project: Pig
> Issue Type: Bug
> Components: grunt, site
> Affects Versions: 0.8.1, 0.9.1, 0.10
> Environment: Gentoo Linux Kernel: 2.6.38-gentoo-r6
> java version "1.6.0_26"
> Ant version 1.8.1
> Reporter: John Lium
> Assignee: Daniel Dai
> Attachments: PIG-2183-1.patch
>
>
> When running pig, I get the following error.
> Error before Pig is launched
> ----------------------------
> ERROR 2999: Unexpected internal error. Failed to create DataStorage
> java.lang.RuntimeException: Failed to create DataStorage
> at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
> at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
> at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:214)
> at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:134)
> at org.apache.pig.impl.PigContext.connect(PigContext.java:183)
> at org.apache.pig.PigServer.<init>(PigServer.java:226)
> at org.apache.pig.PigServer.<init>(PigServer.java:215)
> at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:55)
> at org.apache.pig.Main.run(Main.java:452)
> at org.apache.pig.Main.main(Main.java:107)
> Caused by: java.io.IOException: Call to rasputin/192.168.1.3:9000 failed on local exception: java.io.EOFException
> at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
> at org.apache.hadoop.ipc.Client.call(Client.java:743)
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> at $Proxy0.getProtocolVersion(Unknown Source)
> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
> at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
> at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
> at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
> at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
> ... 9 more
> Caused by: java.io.EOFException
> at java.io.DataInputStream.readInt(DataInputStream.java:375)
> at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
> ================================================================================
> My env vars are defined in bin/pig as the following
> export JAVA_HOME="/etc/java-config-2/current-system-vm"
> export PIG_CLASSPATH="/var/hadoop/pig/pig-withouthadoop.jar:$HADOOP_HOME/hadoop-core-0.20.203.0.jar:$HADOOP_HOME/lib:$HADOOP_CONF_DIR"
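The export above hard-codes the Hadoop jar version into PIG_CLASSPATH. A hypothetical helper (illustrative only — this is not the actual bin/pig code, and the pig-withouthadoop.jar path is an assumption carried over from the report) could glob the Hadoop jars instead, so the classpath survives a Hadoop upgrade such as 0.20.2 to 0.20.203.0:

```shell
# Hypothetical sketch: assemble a Pig classpath from a Hadoop install
# by globbing the jars rather than hard-coding a version number.
# Paths are illustrative, not taken from the real bin/pig script.
build_pig_classpath() {
    hadoop_home="$1"; conf_dir="$2"
    cp="/var/hadoop/pig/pig-withouthadoop.jar"
    # Pick up whatever hadoop-core version is installed, plus its libs.
    for jar in "$hadoop_home"/hadoop-core-*.jar "$hadoop_home"/lib/*.jar; do
        # Skip the literal pattern when the glob matches nothing.
        [ -e "$jar" ] && cp="$cp:$jar"
    done
    # The Hadoop conf dir goes last so cluster config is on the classpath.
    printf '%s\n' "$cp:$conf_dir"
}
```

Usage would then be `export PIG_CLASSPATH=$(build_pig_classpath "$HADOOP_HOME" "$HADOOP_CONF_DIR")`, which avoids editing bin/pig on every Hadoop version bump.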
--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
[jira] [Commented] (PIG-2183) Pig 0.8.1 not working with Hadoop 0.20.203.0
Posted by "John Lium (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13068623#comment-13068623 ]
John Lium commented on PIG-2183:
--------------------------------
Google waited till after I posted this issue to show me this.
https://issues.apache.org/jira/browse/PIG-2148
[jira] [Commented] (PIG-2183) Pig 0.8.1 not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13069339#comment-13069339 ]
Daniel Dai commented on PIG-2183:
---------------------------------
Thanks John. That is a first step toward a patch. To make it a formal patch, we need to add everything you did into bin/pig. Are you interested in making a patch?
[jira] [Updated] (PIG-2183) Pig not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Dai updated PIG-2183:
----------------------------
Status: Patch Available (was: Open)
[jira] [Commented] (PIG-2183) Pig not working with Hadoop 0.20.203.0
Posted by "Thejas M Nair (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13081229#comment-13081229 ]
Thejas M Nair commented on PIG-2183:
------------------------------------
+1
[jira] [Commented] (PIG-2183) Pig 0.8.1 not working with Hadoop 0.20.203.0
Posted by "John Lium (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13073363#comment-13073363 ]
John Lium commented on PIG-2183:
--------------------------------
My bash script is pretty clunky; would anyone be able to spruce it up a bit?
Also, I wanted to add that for this fix you need to delete the old Hadoop jars.
[jira] [Commented] (PIG-2183) Pig 0.8.1 not working with Hadoop 0.20.203.0
Posted by "John Lium (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13068616#comment-13068616 ]
John Lium commented on PIG-2183:
--------------------------------
Also, I forgot to mention that I built Pig with "ant jar-withouthadoop".
[jira] [Updated] (PIG-2183) Pig not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Dai updated PIG-2183:
----------------------------
Attachment: (was: PIG-2183-2.patch)
[jira] [Updated] (PIG-2183) Pig not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Dai updated PIG-2183:
----------------------------
Affects Version/s: 0.9.1, 0.10
Summary: Pig not working with Hadoop 0.20.203.0 (was: Pig 0.8.1 not working with Hadoop 0.20.203.0)
[jira] [Assigned] (PIG-2183) Pig 0.8.1 not working with Hadoop 0.20.203.0
Posted by "Daniel Dai (JIRA)" <ji...@apache.org>.
[ https://issues.apache.org/jira/browse/PIG-2183?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Dai reassigned PIG-2183:
-------------------------------
Assignee: Daniel Dai