Posted to common-user@hadoop.apache.org by andrew touchet <ad...@latech.edu> on 2014/07/18 01:49:51 UTC

HDFS input/output error - fuse mount

Hello,

Hadoop package installed:
hadoop-0.20-0.20.2+737-33.osg.el5.noarch

Operating System:
CentOS release 5.8 (Final)

I am mounting HDFS from my namenode onto another node with fuse.  After
mounting it at /hdfs, any attempt to 'ls', 'cd', or use 'hadoop fs' leads to
the output below.


$ls /hdfs
ls: /hdfs: Input/output error
$hadoop fs -ls
Exception in thread "main" java.lang.UnsupportedClassVersionError:
org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.fs.FsShell.  Program
will exit.


I have attempted to mount /hdfs manually in debug mode and then attempted
to access /hdfs from a different terminal. This is the output. The namenode
is glados. The server where /hdfs is being mounted is glados2.


$hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
fuse-dfs ignoring option allow_other
ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2

fuse-dfs ignoring option -d
unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
INIT: 7.10
flags=0x0000000b
max_readahead=0x00020000
INFO fuse_init.c:115 Mounting glados:9000
Exception in thread "main" java.lang.UnsupportedClassVersionError:
org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR fuse_init.c:127 Unable to establish test connection to server
   INIT: 7.8
   flags=0x00000001
   max_readahead=0x00020000
   max_write=0x00020000
   unique: 1, error: 0 (Success), outsize: 40
unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError:
org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
   unique: 2, error: -5 (Input/output error), outsize: 16
unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56

I adopted this system after it was already set up, so I do not know which
java version was used during install. Currently I'm using:

$java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)

Is my java version really the cause of this issue?  What is the correct
java version to be used for this version of hadoop?  I have also tried
1.6.0_31, but no changes were seen.

If java isn't my issue, then what is?
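One way to settle the question is to read the class-file version straight out of the bytecode header (major 50 = Java 6, major 51 = Java 7). A minimal sketch; in a real check you would first extract e.g. org/apache/hadoop/fs/FsShell.class from the Hadoop jar (the synthetic demo file below is just an illustration):

```shell
class_major() {
  # Bytes 7-8 of a .class file hold the big-endian major version,
  # after the 4-byte 0xCAFEBABE magic and the 2-byte minor version.
  od -An -j6 -N2 -tu1 "$1" | awk '{ print $1 * 256 + $2 }'
}

# Demo with a synthetic header: CA FE BA BE 00 00 00 33 (0x33 = 51, Java 7).
printf '\312\376\272\276\000\000\000\063' > /tmp/Fake.class
class_major /tmp/Fake.class   # prints 51
```

A major version of 51 on the real FsShell class would confirm the jars were built for Java 7, regardless of which JVM the launcher picks up.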

Best regards,

Andrew

Re: Re: HDFS input/output error - fuse mount

Posted by Harsh J <ha...@cloudera.com>.
Apache Bigtop detects the JVM by checking for the most recent version
first, if you use packages based on its framework:
https://github.com/apache/bigtop/blob/master/bigtop-packages/src/common/bigtop-utils/bigtop-detect-javahome
(switch branch/tag appropriately to a specific release you may be
using/based off)

Within Apache Hadoop, we rely on the definition of JAVA_HOME
explicitly. In a packaged environment, the above script typically
supplies it.
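A minimal sketch of that preference-ordered detection approach (this is not the actual bigtop-detect-javahome script; the candidate paths and demo directories are made up):

```shell
detect_javahome() {
  # Echo the first candidate directory that actually contains bin/java;
  # callers pass candidates in preference order (newest JDK first).
  local d
  for d in "$@"; do
    if [ -x "$d/bin/java" ]; then
      echo "$d"
      return 0
    fi
  done
  return 1
}

# Demo with throwaway directories; a real candidate list would be
# something like /usr/java/jdk1.7* /usr/java/jdk1.6* /usr/lib/jvm/java-*
mkdir -p /tmp/jvms/jdk1.7.0_21/bin /tmp/jvms/jdk1.6.0_45/bin
touch /tmp/jvms/jdk1.7.0_21/bin/java /tmp/jvms/jdk1.6.0_45/bin/java
chmod +x /tmp/jvms/jdk1.7.0_21/bin/java /tmp/jvms/jdk1.6.0_45/bin/java
JAVA_HOME=$(detect_javahome /tmp/jvms/jdk1.7.0_21 /tmp/jvms/jdk1.6.0_45)
echo "$JAVA_HOME"   # /tmp/jvms/jdk1.7.0_21
```

The point of the ordering is exactly the failure mode in this thread: if the 1.6 pattern is listed before 1.7, a leftover 1.6 install wins even when 1.7 is the system default.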

On Sat, Jul 19, 2014 at 6:35 AM, Chris Mawata <ch...@gmail.com> wrote:
> Great that you got it sorted out. I'm afraid I don't know if there is a
> configuration that would automatically check the versions -- maybe someone
> who knows might chime in.
> Cheers
> Chris
>
> On Jul 18, 2014 3:06 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>
>> Thanks Chris!
>>
>> The issue was that even though I set jdk-7u21 as my default,
>> hadoop-config.sh checked for /usr/java/jdk-1.6* first, even though Hadoop
>> was compiled with 1.7.
>>
>> Is there any way to generate a proper hadoop-config.sh that reflects the
>> minor version Hadoop was built with, so that in my case it would check for
>> /usr/java/jdk-1.7* instead?  I appreciate the help!
>>
>>
>> On Thu, Jul 17, 2014 at 11:11 PM, Chris Mawata <ch...@gmail.com>
>> wrote:
>>>
>>> Yet another place to check -- in the hadoop-env.sh file there is also a
>>> JAVA_HOME setting.
>>> Chris
>>>
>>> On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>>
>>>> Hi Fireflyhoo,
>>>>
>>>> Below I follow the symbolic links for jdk-7u21. These links are
>>>> changed accordingly as I change between versions. Also, I have 8 datanodes
>>>> and 2 other various servers that are capable of mounting /hdfs, so it is
>>>> just this server that is the issue.
>>>>
>>>> $ java -version
>>>> java version "1.7.0_21"
>>>> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
>>>> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>>>>
>>>> java
>>>> $ ls -l `which java`
>>>> lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
>>>> /usr/java/default/bin/java
>>>> $ ls -l /usr/java/default
>>>> lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
>>>> /usr/java/latest
>>>> $ ls -l /usr/java/latest
>>>> lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
>>>> /usr/java/jdk1.7.0_21
>>>>
>>>> jar
>>>> $ ls -l `which jar`
>>>> lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
>>>> /etc/alternatives/jar
>>>> $ ls -l /etc/alternatives/jar
>>>> lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
>>>> /usr/java/jdk1.7.0_21/bin/jar
>>>>
>>>> javac
>>>> $ ls -l `which javac`
>>>> lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
>>>> /etc/alternatives/javac
>>>> $ ls -l /etc/alternatives/javac
>>>> lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
>>>> /usr/java/jdk1.7.0_21/bin/javac
>>>>
>>>> Now that I've tried versions from both 6 & 7, I'm really not sure what is
>>>> causing this issue.
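The symlink chain traced above can also be resolved in a single step. A small sketch using GNU coreutils' readlink -f (the demo paths are throwaway stand-ins, not the real /usr/java layout):

```shell
# Build a throwaway chain like /usr/bin/java -> default -> latest -> jdk
mkdir -p /tmp/demo/jdk1.7.0_21/bin
touch /tmp/demo/jdk1.7.0_21/bin/java
ln -sfn /tmp/demo/jdk1.7.0_21 /tmp/demo/latest
ln -sfn /tmp/demo/latest /tmp/demo/default
ln -sfn /tmp/demo/default/bin/java /tmp/demo/java

# readlink -f follows every link to the final target in one call,
# instead of several separate `ls -l` invocations.
readlink -f /tmp/demo/java   # prints /tmp/demo/jdk1.7.0_21/bin/java
```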
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com
>>>> <fi...@gmail.com> wrote:
>>>>>
>>>>> I think you should first confirm your local java version.
>>>>> Some Linux distributions come with java pre-installed, and that version can be very old.
>>>>>
>>>>> ________________________________
>>>>> fireflyhoo@gmail.com
>>>>>
>>>>>
>>>>> From: andrew touchet
>>>>> Date: 2014-07-18 09:06
>>>>> To: user
>>>>> Subject: Re: HDFS input/output error - fuse mount
>>>>> Hi Chris,
>>>>>
>>>>> I tried to mount /hdfs with java versions below but there was no change
>>>>> in output.
>>>>> jre-7u21
>>>>> jdk-7u21
>>>>> jdk-7u55
>>>>> jdk1.6.0_31
>>>>> jdk1.6.0_45
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
>>>>> wrote:
>>>>>>
>>>>>> Version 51 is Java 7
>>>>>> Chris
>>>>>>
>>>>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>>>>>
>>>>>>> Hello,
>>>>>>>
>>>>>>> Hadoop package installed:
>>>>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>>>>
>>>>>>> Operating System:
>>>>>>> CentOS release 5.8 (Final)
>>>>>>>
>>>>>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>>>>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>>>>>> the below output.
>>>>>>>
>>>>>>>
>>>>>>> $ls /hdfs
>>>>>>> ls: /hdfs: Input/output error
>>>>>>> $hadoop fs -ls
>>>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>>     at
>>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program
>>>>>>> will exit.
>>>>>>>
>>>>>>>
>>>>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>>>>> attempted to access /hdfs from a different terminal. This is the output. The
>>>>>>> namenode is glados. The server where /hdfs is being mounted is glados2.
>>>>>>>
>>>>>>>
>>>>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>>>> fuse-dfs ignoring option allow_other
>>>>>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>>>>>>
>>>>>>> fuse-dfs ignoring option -d
>>>>>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>>>>>> INIT: 7.10
>>>>>>> flags=0x0000000b
>>>>>>> max_readahead=0x00020000
>>>>>>> INFO fuse_init.c:115 Mounting glados:9000
>>>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>>     at
>>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>>> Can't construct instance of class
>>>>>>> org.apache.hadoop.conf.Configuration
>>>>>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>>>>>    INIT: 7.8
>>>>>>>    flags=0x00000001
>>>>>>>    max_readahead=0x00020000
>>>>>>>    max_write=0x00020000
>>>>>>>    unique: 1, error: 0 (Success), outsize: 40
>>>>>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>>> Exception in thread "Thread-0"
>>>>>>> java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration
>>>>>>> : Unsupported major.minor version 51.0
>>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>>     at
>>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>>> Can't construct instance of class
>>>>>>> org.apache.hadoop.conf.Configuration
>>>>>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for
>>>>>>> user027
>>>>>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>>>>>    unique: 2, error: -5 (Input/output error), outsize: 16
>>>>>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>>>
>>>>>>> I adopted this system after this was already setup, so I do not know
>>>>>>> which java version was used during install. Currently I'm using:
>>>>>>>
>>>>>>> $java -version
>>>>>>> java version "1.6.0_45"
>>>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>>>>
>>>>>>> $java -version
>>>>>>> java version "1.6.0_45"
>>>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>>>>
>>>>>>>
>>>>>>> Is my java version really the cause of this issue?  What is the
>>>>>>> correct java version to be used for this version of hadoop.  I have also
>>>>>>> tried 1.6.0_31 but no changes were seen.
>>>>>>>
>>>>>>> If java isn't my issue, then what is?
>>>>>>>
>>>>>>> Best regards,
>>>>>>>
>>>>>>> Andrew
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>
>>>>
>>
>



-- 
Harsh J


Re: what exactly does data in HDFS look like?

Posted by Bertrand Dechoux <de...@gmail.com>.
But basically you are right: it is the same concept as with a classical
file system. A file is seen as a sequence of bytes. For various efficiency
reasons, the whole sequence is not stored as one piece but is first split
into blocks (subsequences). With a local file system, these blocks live on
the local drives. With HDFS, they are somewhere within the cluster
(and replicated, of course).

So really, the filesystem doesn't care about what is inside the file; the
format is something it is entirely oblivious to.

Bertrand
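The "sequence of bytes split into blocks" model described above can be illustrated with a toy sketch (4-byte blocks standing in for HDFS's default 64 MB blocks in 0.20; the paths and block naming are only illustrative):

```shell
# Write a small "file" and cut it into fixed-size block files, the way
# HDFS stores a file as blk_* files on datanodes (toy scale).
mkdir -p /tmp/hdfsdemo
printf 'abcdefghij' > /tmp/hdfsdemo/bigfile
split -b 4 /tmp/hdfsdemo/bigfile /tmp/hdfsdemo/blk_

cat /tmp/hdfsdemo/blk_aa   # abcd -- a block is just raw bytes, format-agnostic
cat /tmp/hdfsdemo/blk_*    # abcdefghij -- concatenating blocks restores the file
```

Whether the ten bytes were JSON, XML, or raw tweets never enters into it; the block layer only sees byte offsets.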


On Sat, Jul 19, 2014 at 7:02 AM, Shahab Yunus <sh...@gmail.com>
wrote:

> The data itself is eventually stored in the form of files. Each block of a
> file, and its replicas, are stored in files and directories on different
> nodes. The Namenode keeps and maintains the information about each file
> and where its blocks (and replicated blocks) exist in the cluster.
>
> As for the format, it is stored as bytes. In the normal case you use the
> DFS or FileOutputStream classes to write data, and in those instances it is
> written in byte form (conversion to bytes, i.e. serializing the data). When
> you read the data, you use the counterpart classes like InputStream, and
> those convert the data back from bytes (i.e. deserialization). Point
> being, HDFS is oblivious to whether it was JSON or XML.
>
> This would be more evident if you see the code to read/write from HDFS
> (writing example below):
>
> https://sites.google.com/site/hadoopandhive/home/how-to-write-a-file-in-hdfs-using-hadoop
>
> Now on the other hand, if you were using compression or other storage
> formats like Avro or Parquet then those formats come with their own classes
> which take care of serialization and deserialization.
>
> For basic cases, this should be helpful:
>
> https://www.inkling.com/read/hadoop-definitive-guide-tom-white-3rd/chapter-3/data-flow
>
> More here on data storage:
>
> http://stackoverflow.com/questions/2358402/where-hdfs-stores-files-locally-by-default
> http://hadoop.apache.org/docs/r1.2.1/hdfs_design.html#Data+Organization
> https://developer.yahoo.com/hadoop/tutorial/module1.html#data
>
> Regards,
> Shahab
>
>
> On Sat, Jul 19, 2014 at 12:12 AM, Adaryl "Bob" Wakefield, MBA <
> adaryl.wakefield@hotmail.com> wrote:
>
>>   And by that I mean is there an HDFS file type? I feel like I’m missing
>> something. Let’s say I have a HUGE json file that I import into HDFS. Does
>> it retain it’s JSON format in HDFS? What if it’s just random tweets I’m
>> streaming. Is it kind of like a normal disk where there are all kinds of
>> files sitting on disk in their own format it’s just that in HDFS they are
>> spread out over nodes?
>>
>> B.
>>
>
>

Re: what exactly does data in HDFS look like?

Posted by Bertrand Dechoux <de...@gmail.com>.
But basically you are right : it is the same concept as with a classical
file system. A file is seen as a sequence of bytes. For various efficiency
reasons, the whole sequence is not stored like that but first splits into
blocks (subsequences). With a local file system, these blocks will be
within the local drives. With HDFS, they are somewhere within the cluster
(and replicated, of course).

So really, the filesystem doesn't care about what is inside the file and
the format is something it is really oblivious to.

Bertrand


On Sat, Jul 19, 2014 at 7:02 AM, Shahab Yunus <sh...@gmail.com>
wrote:

> The data itself is eventually store in a form of file. Each blocks of the
> file and it replicas are stored in files and directories on different
> nodes. The Namenode that keep the information and maintains it about each
> file and where its blocks (and replicated blocks exist in the cluster.)
>
> As for the format, it is stored as bytes. In the normal cases you use the
> DFS or FileOutputStream classes to  write data and in those instances it is
> written in byte form (conversion to bytes i.e. serialize data.) When you
> read the data, you use the same counterpart classes like InputStream and
> those convert the data from byte to text (i.e. deserialization). Point
> being, HDFS is oblivious to the fact whether it was JSON of XML.
>
> This would be more evident if you see the code to read/write from HDFS
> (writing example below):
>
> https://sites.google.com/site/hadoopandhive/home/how-to-write-a-file-in-hdfs-using-hadoop
>
> Now on the other hand, if you were using compression or other storage
> formats like Avro or Parquet then those formats come with their own classes
> which take care of serialization and deserialization.
>
> For basic cases, this should be helpful:
>
> https://www.inkling.com/read/hadoop-definitive-guide-tom-white-3rd/chapter-3/data-flow
>
> More here on data storage:
>
> http://stackoverflow.com/questions/2358402/where-hdfs-stores-files-locally-by-default
> http://hadoop.apache.org/docs/r1.2.1/hdfs_design.html#Data+Organization
> https://developer.yahoo.com/hadoop/tutorial/module1.html#data
>
> Regards,
> Shahab
>
>
> On Sat, Jul 19, 2014 at 12:12 AM, Adaryl "Bob" Wakefield, MBA <
> adaryl.wakefield@hotmail.com> wrote:
>
>>   And by that I mean is there an HDFS file type? I feel like I’m missing
>> something. Let’s say I have a HUGE json file that I import into HDFS. Does
>> it retain it’s JSON format in HDFS? What if it’s just random tweets I’m
>> streaming. Is it kind of like a normal disk where there are all kinds of
>> files sitting on disk in their own format it’s just that in HDFS they are
>> spread out over nodes?
>>
>> B.
>>
>
>

Re: what exactly does data in HDFS look like?

Posted by Bertrand Dechoux <de...@gmail.com>.
But basically you are right : it is the same concept as with a classical
file system. A file is seen as a sequence of bytes. For various efficiency
reasons, the whole sequence is not stored like that but first splits into
blocks (subsequences). With a local file system, these blocks will be
within the local drives. With HDFS, they are somewhere within the cluster
(and replicated, of course).

So really, the filesystem doesn't care about what is inside the file and
the format is something it is really oblivious to.

Bertrand


On Sat, Jul 19, 2014 at 7:02 AM, Shahab Yunus <sh...@gmail.com>
wrote:

> The data itself is eventually stored in the form of files. Each block of
> the file and its replicas are stored in files and directories on different
> nodes. The NameNode keeps and maintains the information about each file
> and where its blocks (and replica blocks) exist in the cluster.
>
> As for the format, it is stored as bytes. In the normal case you use the
> DFS or FileOutputStream classes to write data, and in those instances it is
> written in byte form (conversion to bytes, i.e. serialized data). When you
> read the data, you use the counterpart classes like InputStream, which
> convert the data back from bytes (i.e. deserialization). The point being,
> HDFS is oblivious to whether the content was JSON or XML.
>
> This would be more evident if you see the code to read/write from HDFS
> (writing example below):
>
> https://sites.google.com/site/hadoopandhive/home/how-to-write-a-file-in-hdfs-using-hadoop
>
> Now on the other hand, if you were using compression or other storage
> formats like Avro or Parquet then those formats come with their own classes
> which take care of serialization and deserialization.
>
> For basic cases, this should be helpful:
>
> https://www.inkling.com/read/hadoop-definitive-guide-tom-white-3rd/chapter-3/data-flow
>
> More here on data storage:
>
> http://stackoverflow.com/questions/2358402/where-hdfs-stores-files-locally-by-default
> http://hadoop.apache.org/docs/r1.2.1/hdfs_design.html#Data+Organization
> https://developer.yahoo.com/hadoop/tutorial/module1.html#data
>
> Regards,
> Shahab
>
>
> On Sat, Jul 19, 2014 at 12:12 AM, Adaryl "Bob" Wakefield, MBA <
> adaryl.wakefield@hotmail.com> wrote:
>
>>   And by that I mean is there an HDFS file type? I feel like I’m missing
>> something. Let’s say I have a HUGE JSON file that I import into HDFS. Does
>> it retain its JSON format in HDFS? What if it’s just random tweets I’m
>> streaming? Is it kind of like a normal disk, where there are all kinds of
>> files sitting on disk in their own format, and it’s just that in HDFS they
>> are spread out over nodes?
>>
>> B.
>>
>
>

Re: what exactly does data in HDFS look like?

Posted by Shahab Yunus <sh...@gmail.com>.
The data itself is eventually stored in the form of files. Each block of
the file and its replicas are stored in files and directories on different
nodes. The NameNode keeps and maintains the information about each file
and where its blocks (and replica blocks) exist in the cluster.

As for the format, it is stored as bytes. In the normal case you use the
DFS or FileOutputStream classes to write data, and in those instances it is
written in byte form (conversion to bytes, i.e. serialized data). When you
read the data, you use the counterpart classes like InputStream, which
convert the data back from bytes (i.e. deserialization). The point being,
HDFS is oblivious to whether the content was JSON or XML.

This would be more evident if you see the code to read/write from HDFS
(writing example below):
https://sites.google.com/site/hadoopandhive/home/how-to-write-a-file-in-hdfs-using-hadoop

Now on the other hand, if you were using compression or other storage
formats like Avro or Parquet then those formats come with their own classes
which take care of serialization and deserialization.
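To make the byte-level point concrete, here is a small Python sketch; an in-memory buffer stands in for an HDFS file, since this is not Hadoop API code:

```python
import io
import json

# The writer serializes structured data to bytes; the storage layer
# (here a BytesIO buffer standing in for an HDFS file) holds only those
# opaque bytes; the reader deserializes them back. The storage layer
# never needs to know that the content was JSON.
record = {"user": "bob", "tweet": "hello"}

storage = io.BytesIO()                               # stand-in for an HDFS file
storage.write(json.dumps(record).encode("utf-8"))    # serialize: object -> bytes

storage.seek(0)
raw = storage.read()                                 # storage hands back raw bytes
restored = json.loads(raw.decode("utf-8"))           # deserialize: bytes -> object
assert restored == record
```

Swap the buffer for an HDFS output/input stream and the flow is the same: only bytes ever cross the storage boundary.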

For basic cases, this should be helpful:
https://www.inkling.com/read/hadoop-definitive-guide-tom-white-3rd/chapter-3/data-flow

More here on data storage:
http://stackoverflow.com/questions/2358402/where-hdfs-stores-files-locally-by-default
http://hadoop.apache.org/docs/r1.2.1/hdfs_design.html#Data+Organization
https://developer.yahoo.com/hadoop/tutorial/module1.html#data

Regards,
Shahab


On Sat, Jul 19, 2014 at 12:12 AM, Adaryl "Bob" Wakefield, MBA <
adaryl.wakefield@hotmail.com> wrote:

>   And by that I mean is there an HDFS file type? I feel like I’m missing
> something. Let’s say I have a HUGE JSON file that I import into HDFS. Does
> it retain its JSON format in HDFS? What if it’s just random tweets I’m
> streaming? Is it kind of like a normal disk, where there are all kinds of
> files sitting on disk in their own format, and it’s just that in HDFS they
> are spread out over nodes?
>
> B.
>

what exactly does data in HDFS look like?

Posted by "Adaryl \"Bob\" Wakefield, MBA" <ad...@hotmail.com>.
And by that I mean is there an HDFS file type? I feel like I’m missing something. Let’s say I have a HUGE JSON file that I import into HDFS. Does it retain its JSON format in HDFS? What if it’s just random tweets I’m streaming? Is it kind of like a normal disk, where there are all kinds of files sitting on disk in their own format, and it’s just that in HDFS they are spread out over nodes?

B.

Re: Re: HDFS input/output error - fuse mount

Posted by Harsh J <ha...@cloudera.com>.
Apache Bigtop detects the JVM based on the most recent version first, if
you use packages based on its framework:
https://github.com/apache/bigtop/blob/master/bigtop-packages/src/common/bigtop-utils/bigtop-detect-javahome
(switch the branch/tag appropriately to the specific release you are
using or based off of)

Within Apache Hadoop, we rely on JAVA_HOME being defined explicitly. In
a packaged environment, the above script typically supplies it.
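The version-ordered detection can be sketched in a few lines of shell (a simplified illustration of the idea, not the actual bigtop-detect-javahome script; the function name and the single scan directory are made up for the example):

```shell
#!/bin/sh
# Simplified sketch: scan one directory for jdk* installs and keep the
# highest version string, similar in spirit to Bigtop's detection.
detect_javahome() {
  base="$1"
  # `sort -V` orders version strings numerically; `tail -1` keeps the newest.
  candidate=$(ls -d "$base"/jdk* 2>/dev/null | sort -V | tail -1)
  if [ -n "$candidate" ]; then
    JAVA_HOME="$candidate"
  fi
  echo "$JAVA_HOME"
}
```

Andrew's problem earlier in the thread is the mirror image of this: a fixed search order that finds /usr/java/jdk-1.6* before 1.7, so exporting JAVA_HOME explicitly (as Apache Hadoop expects) sidesteps the guessing entirely.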

On Sat, Jul 19, 2014 at 6:35 AM, Chris Mawata <ch...@gmail.com> wrote:
> Great that you got it sorted out. I'm afraid I don't know if there is a
> configuration that would automatically check the versions -- maybe someone
> who knows might chime in.
> Cheers
> Chris
>
> On Jul 18, 2014 3:06 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>
>> Thanks Chris!
>>
>> The issue was that even though I set jdk-7u21 as my default, it checked
>> for /usr/java/jdk-1.6* first, even though Hadoop was compiled with 1.7.
>>
>> Is there any way to generate a proper hadoop-config.sh that reflects the
>> minor version Hadoop was built with? So that in my case, it would check for
>> /usr/java/jdk-1.7* instead?  I appreciate the help!
>>
>>
>> On Thu, Jul 17, 2014 at 11:11 PM, Chris Mawata <ch...@gmail.com>
>> wrote:
>>>
>>> Yet another place to check -- in the hadoop-env.sh file there is also a
>>> JAVA_HOME setting.
>>> Chris
>>>
>>> On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>>
>>>> Hi Fireflyhoo,
>>>>
>>>> Below I follow the symbolic links for the jdk-7u21. These links are
>>>> changed accordingly as I change between versions. Also, I have 8 datanodes
>>>> and 2 other servers that are capable of mounting /hdfs.  So it is
>>>> just this server that has the issue.
>>>>
>>>> $ java -version
>>>> java version "1.7.0_21"
>>>> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
>>>> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>>>>
>>>> java
>>>> $ ls -l `which java`
>>>> lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
>>>> /usr/java/default/bin/java
>>>> $ ls -l /usr/java/default
>>>> lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
>>>> /usr/java/latest
>>>> $ ls -l /usr/java/latest
>>>> lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
>>>> /usr/java/jdk1.7.0_21
>>>>
>>>> jar
>>>> $ ls -l `which jar`
>>>> lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
>>>> /etc/alternatives/jar
>>>> $ ls -l /etc/alternatives/jar
>>>> lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
>>>> /usr/java/jdk1.7.0_21/bin/jar
>>>>
>>>> javac
>>>> $ ls -l `which javac`
>>>> lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
>>>> /etc/alternatives/javac
>>>> $ ls -l /etc/alternatives/javac
>>>> lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
>>>> /usr/java/jdk1.7.0_21/bin/javac
>>>>
>>>> Now that I've tried versions from Java 6 & 7, I'm really not sure what is
>>>> causing this issue.
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com
>>>> <fi...@gmail.com> wrote:
>>>>>
>>>>> I think you should first confirm your local Java version.
>>>>> Some Linux distributions come with Java pre-installed, and that version can be very old.
>>>>>
>>>>> ________________________________
>>>>> fireflyhoo@gmail.com
>>>>>
>>>>>
>>>>> From: andrew touchet
>>>>> Date: 2014-07-18 09:06
>>>>> To: user
>>>>> Subject: Re: HDFS input/output error - fuse mount
>>>>> Hi Chris,
>>>>>
>>>>> I tried to mount /hdfs with java versions below but there was no change
>>>>> in output.
>>>>> jre-7u21
>>>>> jdk-7u21
>>>>> jdk-7u55
>>>>> jdk1.6.0_31
>>>>> jdk1.6.0_45
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
>>>>> wrote:
>>>>>>
>>>>>> Version 51 is Java 7
>>>>>> Chris
>>>>>>
>>>>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>>>>>
>>>>>>> Hello,
>>>>>>>
>>>>>>> Hadoop package installed:
>>>>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>>>>
>>>>>>> Operating System:
>>>>>>> CentOS release 5.8 (Final)
>>>>>>>
>>>>>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>>>>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>>>>>> the below output.
>>>>>>>
>>>>>>>
>>>>>>> $ls /hdfs
>>>>>>> ls: /hdfs: Input/output error
>>>>>>> $hadoop fs -ls
>>>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>>     at
>>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program
>>>>>>> will exit.
>>>>>>>
>>>>>>>
>>>>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>>>>> attempted to access /hdfs from a different terminal. This is the output. The
>>>>>>> namenode is glados. The server where /hdfs is being mounted is glados2.
>>>>>>>
>>>>>>>
>>>>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>>>> fuse-dfs ignoring option allow_other
>>>>>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>>>>>>
>>>>>>> fuse-dfs ignoring option -d
>>>>>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>>>>>> INIT: 7.10
>>>>>>> flags=0x0000000b
>>>>>>> max_readahead=0x00020000
>>>>>>> INFO fuse_init.c:115 Mounting glados:9000
>>>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>>     at
>>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>>> Can't construct instance of class
>>>>>>> org.apache.hadoop.conf.Configuration
>>>>>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>>>>>    INIT: 7.8
>>>>>>>    flags=0x00000001
>>>>>>>    max_readahead=0x00020000
>>>>>>>    max_write=0x00020000
>>>>>>>    unique: 1, error: 0 (Success), outsize: 40
>>>>>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>>> Exception in thread "Thread-0"
>>>>>>> java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration
>>>>>>> : Unsupported major.minor version 51.0
>>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>>     at
>>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>>> Can't construct instance of class
>>>>>>> org.apache.hadoop.conf.Configuration
>>>>>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for
>>>>>>> user027
>>>>>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>>>>>    unique: 2, error: -5 (Input/output error), outsize: 16
>>>>>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>>>
>>>>>>> I adopted this system after this was already setup, so I do not know
>>>>>>> which java version was used during install. Currently I'm using:
>>>>>>>
>>>>>>> $java -version
>>>>>>> java version "1.6.0_45"
>>>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>>>>
>>>>>>>
>>>>>>> Is my java version really the cause of this issue?  What is the
>>>>>>> correct java version to be used for this version of Hadoop?  I have also
>>>>>>> tried 1.6.0_31 but no changes were seen.
>>>>>>>
>>>>>>> If java isn't my issue, then what is?
>>>>>>>
>>>>>>> Best regards,
>>>>>>>
>>>>>>> Andrew
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>
>>>>
>>
>



-- 
Harsh J

>>>>>>>
>>>>>>> I adopted this system after it was already set up, so I do not know
>>>>>>> which java version was used during the install. Currently I'm using:
>>>>>>>
>>>>>>> $java -version
>>>>>>> java version "1.6.0_45"
>>>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>>>>
>>>>>>> $java -version
>>>>>>> java version "1.6.0_45"
>>>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>>>>
>>>>>>>
>>>>>>> Is my java version really the cause of this issue?  What is the
>>>>>>> correct java version to use for this version of Hadoop?  I have also
>>>>>>> tried 1.6.0_31, but saw no change.
>>>>>>>
>>>>>>> If java isn't my issue, then what is?
>>>>>>>
>>>>>>> Best regards,
>>>>>>>
>>>>>>> Andrew
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>
>>>>
>>
>



-- 
Harsh J
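For readers hitting the same `Unsupported major.minor version 51.0` error: bytes 6-7 of any compiled `.class` file record the class-file major version (50 = Java 6, 51 = Java 7), so you can check which JDK built the Hadoop classes instead of guessing. A minimal sketch of the decoding step; the Hadoop jar path in the comment is an assumption and will vary by install:

```shell
# Decode the class-file major version (bytes 6-7): 50 = Java 6, 51 = Java 7.
# On a real node you would extract a class from the installed Hadoop jar, e.g.
#   unzip -p /usr/lib/hadoop/hadoop-core-*.jar org/apache/hadoop/fs/FsShell.class > /tmp/Check.class
# (jar path is an assumption; adjust to your layout). Here we fabricate a
# minimal header with major version 51 just to demonstrate the decoding.
printf '\312\376\272\276\000\000\000\063' > /tmp/Check.class  # CA FE BA BE 00 00 00 33
od -An -j6 -N2 -tu1 /tmp/Check.class | awk '{print "class-file major version:", $1*256+$2}'
# -> class-file major version: 51
```

A class compiled for Java 7 (major version 51) cannot load on a Java 6 JVM, which is exactly the failure mode in the traces above.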

Re: Re: HDFS input/output error - fuse mount

Posted by Chris Mawata <ch...@gmail.com>.
Great that you got it sorted out. I'm afraid I don't know if there is a
configuration that would automatically check the versions -- maybe someone
who knows might chime in.
Cheers
Chris
On Jul 18, 2014 3:06 PM, "andrew touchet" <ad...@latech.edu> wrote:

> Thanks Chris!
>
> The issue was that even though I set jdk-7u21 as my default, the startup
> script checked for /usr/java/jdk-1.6* first, even though Hadoop was compiled with 1.7.
>
> Is there any way to generate a proper hadoop-config.sh that reflects the minor
> version Hadoop was built with, so that in my case it would check for
> /usr/java/jdk-1.7* instead?  I appreciate the help!
>
>
> On Thu, Jul 17, 2014 at 11:11 PM, Chris Mawata <ch...@gmail.com>
> wrote:
>
>> Yet another place to check -- in the hadoop-env.sh file there is also a
>> JAVA_HOME setting.
>> Chris
>> On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>
>>> Hi Fireflyhoo,
>>>
>>> Below I follow the symbolic links for the jdk-7u21. These links are
>>> changed accordingly as I change between versions. Also, I have 8 datanodes
>>> and 2 other servers that are capable of mounting /hdfs.  So it is
>>> just this server that has the issue.
>>>
>>> $ java -version
>>> java version "1.7.0_21"
>>> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
>>> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>>>
>>> java
>>> $ ls -l `which java`
>>> lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java -> /usr/java/default/bin/java
>>> $ ls -l /usr/java/default
>>> lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default -> /usr/java/latest
>>> $ ls -l /usr/java/latest
>>> lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest -> /usr/java/jdk1.7.0_21
>>>
>>> jar
>>> $ ls -l `which jar`
>>> lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar -> /etc/alternatives/jar
>>> $ ls -l /etc/alternatives/jar
>>> lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar -> /usr/java/jdk1.7.0_21/bin/jar
>>>
>>> javac
>>> $ ls -l `which javac`
>>> lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac -> /etc/alternatives/javac
>>> $ ls -l /etc/alternatives/javac
>>> lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac -> /usr/java/jdk1.7.0_21/bin/javac
>>>
>>> Now that I've tried versions from 6 & 7, I'm really not sure what is
>>> causing this issue.
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <
>>> fireflyhoo@gmail.com> wrote:
>>>
>>>>  I think you should first confirm your local java version.
>>>> Some Linux distributions ship with java pre-installed, and that version can be very old.
>>>>
>>>> ------------------------------
>>>> fireflyhoo@gmail.com
>>>>
>>>>
>>>> *From:* andrew touchet <ad...@latech.edu>
>>>> *Date:* 2014-07-18 09:06
>>>> *To:* user <us...@hadoop.apache.org>
>>>> *Subject:* Re: HDFS input/output error - fuse mount
>>>> Hi Chris,
>>>>
>>>> I tried to mount /hdfs with java versions below but there was no change
>>>> in output.
>>>> jre-7u21
>>>> jdk-7u21
>>>> jdk-7u55
>>>> jdk1.6.0_31
>>>> jdk1.6.0_45
>>>>
>>>>
>>>>
>>>>
>>>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
>>>> wrote:
>>>>
>>>>> Version 51 is Java 7
>>>>> Chris
>>>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>>>
>>>>>> Hello,
>>>>>>
>>>>>> Hadoop package installed:
>>>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>>>
>>>>>> Operating System:
>>>>>> CentOS release 5.8 (Final)
>>>>>>
>>>>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>>>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>>>>> the below output.
>>>>>>
>>>>>>
>>>>>> $ls /hdfs
>>>>>> *ls: /hdfs: Input/output error*
>>>>>> $hadoop fs -ls
>>>>>>
>>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>>>>>>
>>>>>>
>>>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>>>> attempted to access /hdfs from a different terminal. This is the output.
>>>>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>>>>> *glados2*.
>>>>>>
>>>>>>
>>>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>>>
>>>>>> fuse-dfs ignoring option allow_other
>>>>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>>>>> fuse-dfs ignoring option -d
>>>>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>>>>> INIT: 7.10
>>>>>> flags=0x0000000b
>>>>>> max_readahead=0x00020000
>>>>>> INFO fuse_init.c:115 Mounting glados:9000
>>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>>>>    INIT: 7.8
>>>>>>    flags=0x00000001
>>>>>>    max_readahead=0x00020000
>>>>>>    max_write=0x00020000
>>>>>>    unique: 1, error: 0 (Success), outsize: 40
>>>>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>>>>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>>>>    unique: 2, error: -5 (Input/output error), outsize: 16
>>>>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>>
>>>>>> I adopted this system after it was already set up, so I do not know
>>>>>> which java version was used during the install. Currently I'm using:
>>>>>>
>>>>>> $java -version
>>>>>> java version "1.6.0_45"
>>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>>>
>>>>>> $java -version
>>>>>> java version "1.6.0_45"
>>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>>>
>>>>>> Is my java version really the cause of this issue?  What is the
>>>>>> correct java version to use for this version of Hadoop?  I have also
>>>>>> tried 1.6.0_31, but saw no change.
>>>>>>
>>>>>> If java isn't my issue, then what is?
>>>>>>
>>>>>> Best regards,
>>>>>>
>>>>>> Andrew
>>>>>>
>>>>
>>>
>
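Since the thread's fix was pinning the JDK, here is a sketch of doing that explicitly so the launcher scripts never fall back to probing for /usr/java/jdk1.6*: set JAVA_HOME in hadoop-env.sh, which takes precedence over hadoop-config.sh's guessing. The JDK path and config directory below are assumptions based on the versions discussed in the thread; adjust both to your install:

```shell
# Sketch: pin JAVA_HOME in hadoop-env.sh so hadoop-config.sh never falls back
# to probing /usr/java/jdk1.6*. Paths are assumptions; adjust to your layout.
CONF_DIR=${HADOOP_CONF_DIR:-/etc/hadoop/conf}   # hypothetical default location
echo 'export JAVA_HOME=/usr/java/jdk1.7.0_21' >> "$CONF_DIR/hadoop-env.sh"
grep '^export JAVA_HOME=' "$CONF_DIR/hadoop-env.sh"   # confirm the pin took
```

If the fuse mount is started from a wrapper script rather than through the Hadoop launchers, export the same JAVA_HOME in that script's environment as well, since it was the fuse-dfs process that picked up the old JVM here.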

Re: Re: HDFS input/output error - fuse mount

Posted by Chris Mawata <ch...@gmail.com>.
Great that you got it sorted out. I'm afraid I don't know if there is a
configuration that would automatically check the versions -- maybe someone
who knows might chime in.
Cheers
Chris
On Jul 18, 2014 3:06 PM, "andrew touchet" <ad...@latech.edu> wrote:

> Thanks Chris!
>
> The issue was that even though I set jdk-7u21 as my default, it checked
> for /usr/java/jdk-1.6* first.  Even though it was compiled with 1.7.
>
> Is there anyway to generate a proper hadoop-config.sh to reflect the minor
> version hadoop was built with? So that in my case, it would check for
> /usr/java/jdk-1.7* instead?  I appreciate the help!
>
>
> On Thu, Jul 17, 2014 at 11:11 PM, Chris Mawata <ch...@gmail.com>
> wrote:
>
>> Yet another place to check -- in the hadoop-env.sh file there is also a
>> JAVA_HOME setting.
>> Chris
>> On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>
>>> Hi Fireflyhoo,
>>>
>>> Below I follow the symbolic links for the jdk-7u21. These links are
>>> changed accordingly as I change between versions. Also, I have 8 datanodes
>>> and 2 other various servers that are capable of mounting /hdfs.  So it is
>>> just this server is an issue.
>>>
>>> $ java -version
>>> java version "1.7.0_21"
>>> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
>>> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>>>
>>> java
>>> $ ls -l `which java`
>>> *lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
>>> /usr/java/default/bin/java*
>>> $ ls -l /usr/java/default
>>> *lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
>>> /usr/java/latest*
>>> $ ls -l /usr/java/latest
>>> *lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
>>> /usr/java/jdk1.7.0_21*
>>>
>>> jar
>>> $ ls -l `which jar`
>>> *lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
>>> /etc/alternatives/jar*
>>> $ ls -l /etc/alternatives/jar
>>> *lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
>>> /usr/java/jdk1.7.0_21/bin/jar*
>>>
>>> javac
>>> $ ls -l `which javac`
>>> *lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
>>> /etc/alternatives/javac*
>>> $ ls -l /etc/alternatives/javac
>>> *lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
>>> /usr/java/jdk1.7.0_21/bin/javac*
>>>
>>> Now that I've tried version from  6 & 7, I'm really not sure what is
>>> causing this issue.
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <
>>> fireflyhoo@gmail.com> wrote:
>>>
>>>>  I think  you first confirm you local java version ,
>>>> Some  liux will  pre-installed java ,that version is very low
>>>>
>>>> ------------------------------
>>>> fireflyhoo@gmail.com
>>>>
>>>>
>>>> *From:* andrew touchet <ad...@latech.edu>
>>>> *Date:* 2014-07-18 09:06
>>>> *To:* user <us...@hadoop.apache.org>
>>>> *Subject:* Re: HDFS input/output error - fuse mount
>>>> Hi Chris,
>>>>
>>>> I tried to mount /hdfs with java versions below but there was no change
>>>> in output.
>>>> jre-7u21
>>>> jdk-7u21
>>>> jdk-7u55
>>>> jdk1.6.0_31
>>>> jdk1.6.0_45
>>>>
>>>>
>>>>
>>>>
>>>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
>>>> wrote:
>>>>
>>>>> Version 51 ia Java 7
>>>>> Chris
>>>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>>>
>>>>>> Hello,
>>>>>>
>>>>>> Hadoop package installed:
>>>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>>>
>>>>>> Operating System:
>>>>>> CentOS release 5.8 (Final)
>>>>>>
>>>>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>>>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>>>>> the below output.
>>>>>>
>>>>>>
>>>>>> $ls /hdfs
>>>>>> *ls: /hdfs: Input/output error*
>>>>>> $hadoop fs -ls
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> *Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0     at
>>>>>> java.lang.ClassLoader.defineClass1(Native Method)    at
>>>>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)    at
>>>>>> java.lang.ClassLoader.defineClass(ClassLoader.java:615)    at
>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)    at
>>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)    at
>>>>>> java.net.URLClassLoader$1.run(URLClassLoader.java:197)    at
>>>>>> java.security.AccessController.doPrivileged(Native Method)     at
>>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:306)    at
>>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:247) Could not find the
>>>>>> main class: org.apache.hadoop.fs.FsShell.  Program will exit.*
>>>>>>
>>>>>>
>>>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>>>> attempted to access /hdfs from a different terminal. This is the output.
>>>>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>>>>> *glados2*.
>>>>>>
>>>>>>
>>>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> *fuse-dfs ignoring option allow_otherERROR fuse_options.c:162
>>>>>> fuse-dfs didn't recognize /hdfs,-2fuse-dfs ignoring option -d unique: 1,
>>>>>> opcode: INIT (26), nodeid: 0, insize: 56INIT:
>>>>>> 7.10flags=0x0000000bmax_readahead=0x00020000INFO fuse_init.c:115 Mounting
>>>>>> glados:9000Exception in thread "main"
>>>>>> java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)    at
>>>>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)    at
>>>>>> java.lang.ClassLoader.defineClass(ClassLoader.java:615)    at
>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)    at
>>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)    at
>>>>>> java.net.URLClassLoader$1.run(URLClassLoader.java:197)    at
>>>>>> java.security.AccessController.doPrivileged(Native Method)     at
>>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:306)    at
>>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:247) Can't construct
>>>>>> instance of class org.apache.hadoop.conf.ConfigurationERROR fuse_init.c:127
>>>>>> Unable to establish test connection to server   INIT: 7.8
>>>>>> flags=0x00000001   max_readahead=0x00020000   max_write=0x00020000
>>>>>> unique: 1, error: 0 (Success), outsize: 40unique: 2, opcode: GETATTR (3),
>>>>>> nodeid: 1, insize: 56Exception in thread "Thread-0"
>>>>>> java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)    at
>>>>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)    at
>>>>>> java.lang.ClassLoader.defineClass(ClassLoader.java:615)    at
>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)    at
>>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)    at
>>>>>> java.net.URLClassLoader$1.run(URLClassLoader.java:197)    at
>>>>>> java.security.AccessController.doPrivileged(Native Method)     at
>>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:306)    at
>>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:247) Can't construct
>>>>>> instance of class org.apache.hadoop.conf.ConfigurationERROR
>>>>>> fuse_connect.c:83 Unable to instantiate a filesystem for user027ERROR
>>>>>> fuse_impls_getattr.c:40 Could not connect to glados:9000   unique: 2,
>>>>>> error: -5 (Input/output error), outsize: 16 unique: 3, opcode: GETATTR (3),
>>>>>> nodeid: 1, insize: 56*
>>>>>>
>>>>>> I adopted this system after this was already setup, so I do not know
>>>>>> which java version was used during install. Currently I'm using:
>>>>>>
>>>>>> $java -version
>>>>>>
>>>>>>
>>>>>> *java version "1.6.0_45"Java(TM) SE Runtime Environment (build
>>>>>> 1.6.0_45-b06)Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed
>>>>>> mode)*
>>>>>>
>>>>>> $java -version
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> *java version "1.6.0_45" Java(TM) SE Runtime Environment (build
>>>>>> 1.6.0_45-b06)Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed
>>>>>> mode)*
>>>>>> Is my java version really the cause of this issue?  What is the
>>>>>> correct java version to be used for this version of hadoop.  I have also
>>>>>> tried 1.6.0_31 but no changes were seen.
>>>>>>
>>>>>> If java isn't my issue, then what is?
>>>>>>
>>>>>> Best regards,
>>>>>>
>>>>>> Andrew
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>
>>>
>

Re: Re: HDFS input/output error - fuse mount

Posted by Chris Mawata <ch...@gmail.com>.
Great that you got it sorted out. I'm afraid I don't know if there is a
configuration that would automatically check the versions -- maybe someone
who knows might chime in.
Cheers
Chris
On Jul 18, 2014 3:06 PM, "andrew touchet" <ad...@latech.edu> wrote:

> Thanks Chris!
>
> The issue was that even though I set jdk-7u21 as my default, it checked
> for /usr/java/jdk-1.6* first.  Even though it was compiled with 1.7.
>
> Is there anyway to generate a proper hadoop-config.sh to reflect the minor
> version hadoop was built with? So that in my case, it would check for
> /usr/java/jdk-1.7* instead?  I appreciate the help!
>
>
> On Thu, Jul 17, 2014 at 11:11 PM, Chris Mawata <ch...@gmail.com>
> wrote:
>
>> Yet another place to check -- in the hadoop-env.sh file there is also a
>> JAVA_HOME setting.
>> Chris
>> On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>
>>> Hi Fireflyhoo,
>>>
>>> Below I follow the symbolic links for the jdk-7u21. These links are
>>> changed accordingly as I change between versions. Also, I have 8 datanodes
>>> and 2 other various servers that are capable of mounting /hdfs.  So it is
>>> just this server is an issue.
>>>
>>> $ java -version
>>> java version "1.7.0_21"
>>> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
>>> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>>>
>>> java
>>> $ ls -l `which java`
>>> *lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
>>> /usr/java/default/bin/java*
>>> $ ls -l /usr/java/default
>>> *lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
>>> /usr/java/latest*
>>> $ ls -l /usr/java/latest
>>> *lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
>>> /usr/java/jdk1.7.0_21*
>>>
>>> jar
>>> $ ls -l `which jar`
>>> *lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
>>> /etc/alternatives/jar*
>>> $ ls -l /etc/alternatives/jar
>>> *lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
>>> /usr/java/jdk1.7.0_21/bin/jar*
>>>
>>> javac
>>> $ ls -l `which javac`
>>> *lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
>>> /etc/alternatives/javac*
>>> $ ls -l /etc/alternatives/javac
>>> *lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
>>> /usr/java/jdk1.7.0_21/bin/javac*
>>>
>>> Now that I've tried version from  6 & 7, I'm really not sure what is
>>> causing this issue.
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <
>>> fireflyhoo@gmail.com> wrote:
>>>
>>>>  I think  you first confirm you local java version ,
>>>> Some  liux will  pre-installed java ,that version is very low
>>>>
>>>> ------------------------------
>>>> fireflyhoo@gmail.com
>>>>
>>>>
>>>> *From:* andrew touchet <ad...@latech.edu>
>>>> *Date:* 2014-07-18 09:06
>>>> *To:* user <us...@hadoop.apache.org>
>>>> *Subject:* Re: HDFS input/output error - fuse mount
>>>> Hi Chris,
>>>>
>>>> I tried to mount /hdfs with java versions below but there was no change
>>>> in output.
>>>> jre-7u21
>>>> jdk-7u21
>>>> jdk-7u55
>>>> jdk1.6.0_31
>>>> jdk1.6.0_45
>>>>
>>>>
>>>>
>>>>
>>>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
>>>> wrote:
>>>>
>>>>> Version 51 ia Java 7
>>>>> Chris
>>>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>>>
>>>>>> Hello,
>>>>>>
>>>>>> Hadoop package installed:
>>>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>>>
>>>>>> Operating System:
>>>>>> CentOS release 5.8 (Final)
>>>>>>
>>>>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>>>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>>>>> the below output.
>>>>>>
>>>>>>
>>>>>> $ls /hdfs
>>>>>> *ls: /hdfs: Input/output error*
>>>>>> $hadoop fs -ls
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> *Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0     at
>>>>>> java.lang.ClassLoader.defineClass1(Native Method)    at
>>>>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)    at
>>>>>> java.lang.ClassLoader.defineClass(ClassLoader.java:615)    at
>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)    at
>>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)    at
>>>>>> java.net.URLClassLoader$1.run(URLClassLoader.java:197)    at
>>>>>> java.security.AccessController.doPrivileged(Native Method)     at
>>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:306)    at
>>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:247) Could not find the
>>>>>> main class: org.apache.hadoop.fs.FsShell.  Program will exit.*
>>>>>>
>>>>>>
>>>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>>>> attempted to access /hdfs from a different terminal. This is the output.
>>>>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>>>>> *glados2*.
>>>>>>
>>>>>>
>>>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> *fuse-dfs ignoring option allow_otherERROR fuse_options.c:162
>>>>>> fuse-dfs didn't recognize /hdfs,-2fuse-dfs ignoring option -d unique: 1,
>>>>>> opcode: INIT (26), nodeid: 0, insize: 56INIT:
>>>>>> 7.10flags=0x0000000bmax_readahead=0x00020000INFO fuse_init.c:115 Mounting
>>>>>> glados:9000Exception in thread "main"
>>>>>> java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)    at
>>>>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)    at
>>>>>> java.lang.ClassLoader.defineClass(ClassLoader.java:615)    at
>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)    at
>>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)    at
>>>>>> java.net.URLClassLoader$1.run(URLClassLoader.java:197)    at
>>>>>> java.security.AccessController.doPrivileged(Native Method)     at
>>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:306)    at
>>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:247) Can't construct
>>>>>> instance of class org.apache.hadoop.conf.ConfigurationERROR fuse_init.c:127
>>>>>> Unable to establish test connection to server   INIT: 7.8
>>>>>> flags=0x00000001   max_readahead=0x00020000   max_write=0x00020000
>>>>>> unique: 1, error: 0 (Success), outsize: 40unique: 2, opcode: GETATTR (3),
>>>>>> nodeid: 1, insize: 56Exception in thread "Thread-0"
>>>>>> java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)    at
>>>>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)    at
>>>>>> java.lang.ClassLoader.defineClass(ClassLoader.java:615)    at
>>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)    at
>>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)    at
>>>>>> java.net.URLClassLoader$1.run(URLClassLoader.java:197)    at
>>>>>> java.security.AccessController.doPrivileged(Native Method)     at
>>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:306)    at
>>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)    at
>>>>>> java.lang.ClassLoader.loadClass(ClassLoader.java:247) Can't construct
>>>>>> instance of class org.apache.hadoop.conf.ConfigurationERROR
>>>>>> fuse_connect.c:83 Unable to instantiate a filesystem for user027ERROR
>>>>>> fuse_impls_getattr.c:40 Could not connect to glados:9000   unique: 2,
>>>>>> error: -5 (Input/output error), outsize: 16 unique: 3, opcode: GETATTR (3),
>>>>>> nodeid: 1, insize: 56*
>>>>>>
>>>>>> I adopted this system after this was already setup, so I do not know
>>>>>> which java version was used during install. Currently I'm using:
>>>>>>
>>>>>> $java -version
>>>>>>
>>>>>>
>>>>>> *java version "1.6.0_45"Java(TM) SE Runtime Environment (build
>>>>>> 1.6.0_45-b06)Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed
>>>>>> mode)*
>>>>>>
>>>>>> $java -version
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> *java version "1.6.0_45" Java(TM) SE Runtime Environment (build
>>>>>> 1.6.0_45-b06)Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed
>>>>>> mode)*
>>>>>> Is my java version really the cause of this issue?  What is the
>>>>>> correct java version to be used for this version of hadoop.  I have also
>>>>>> tried 1.6.0_31 but no changes were seen.
>>>>>>
>>>>>> If java isn't my issue, then what is?
>>>>>>
>>>>>> Best regards,
>>>>>>
>>>>>> Andrew
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>
>>>
>

Re: Re: HDFS input/output error - fuse mount

Posted by Chris Mawata <ch...@gmail.com>.
Great that you got it sorted out. I'm afraid I don't know if there is a
configuration that would automatically check the versions -- maybe someone
who knows might chime in.
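One manual check that does work, though: the class-file major version sits in
bytes 6 and 7 of every .class file, so you can read what a jar was compiled
with and compare it against the JVM that gets selected at runtime. A rough
sketch (the jar path below is only an example, adjust it to your install):

```shell
#!/bin/sh
# Print the class-file major version of a compiled .class file.
# Bytes 6-7 (big-endian) hold it: 50 = Java 6, 51 = Java 7.
class_major() {
    od -An -j6 -N2 -t u1 "$1" | awk 'NF { print $1 * 256 + $2; exit }'
}

# Example usage -- the jar path is an assumption, not a fixed location:
# unzip -p /usr/lib/hadoop/hadoop-core.jar org/apache/hadoop/fs/FsShell.class \
#     > /tmp/FsShell.class
# class_major /tmp/FsShell.class
```

If that prints 51 while `java -version` reports 1.6, you get exactly the
UnsupportedClassVersionError seen in this thread.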
Cheers
Chris
On Jul 18, 2014 3:06 PM, "andrew touchet" <ad...@latech.edu> wrote:

> Thanks Chris!
>
> The issue was that even though I set jdk-7u21 as my default,
> hadoop-config.sh checked for /usr/java/jdk-1.6* first, even though Hadoop
> was compiled with 1.7.
>
> Is there any way to generate a proper hadoop-config.sh that reflects the
> JDK version Hadoop was built with, so that in my case it would check for
> /usr/java/jdk-1.7* instead?  I appreciate the help!
>
>
> On Thu, Jul 17, 2014 at 11:11 PM, Chris Mawata <ch...@gmail.com>
> wrote:
>
>> Yet another place to check -- in the hadoop-env.sh file there is also a
>> JAVA_HOME setting.
>> Chris
>> On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>
>>> Hi Fireflyhoo,
>>>
>>> Below I follow the symbolic links for the jdk-7u21. These links are
>>> changed accordingly as I change between versions. Also, I have 8 datanodes
>>> and 2 other servers that are capable of mounting /hdfs, so it is
>>> just this server that has an issue.
>>>
>>> $ java -version
>>> java version "1.7.0_21"
>>> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
>>> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>>>
>>> java
>>> $ ls -l `which java`
>>> lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
>>> /usr/java/default/bin/java
>>> $ ls -l /usr/java/default
>>> lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
>>> /usr/java/latest
>>> $ ls -l /usr/java/latest
>>> lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
>>> /usr/java/jdk1.7.0_21
>>>
>>> jar
>>> $ ls -l `which jar`
>>> lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
>>> /etc/alternatives/jar
>>> $ ls -l /etc/alternatives/jar
>>> lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
>>> /usr/java/jdk1.7.0_21/bin/jar
>>>
>>> javac
>>> $ ls -l `which javac`
>>> lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
>>> /etc/alternatives/javac
>>> $ ls -l /etc/alternatives/javac
>>> lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
>>> /usr/java/jdk1.7.0_21/bin/javac
>>>
>>> Now that I've tried versions from both 6 & 7, I'm really not sure what is
>>> causing this issue.
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <
>>> fireflyhoo@gmail.com> wrote:
>>>
>>>>  I think you should first confirm your local Java version.
>>>> Some Linux distributions come with Java pre-installed, and that version
>>>> can be very old.
>>>>
>>>> ------------------------------
>>>> fireflyhoo@gmail.com
>>>>
>>>>
>>>> *From:* andrew touchet <ad...@latech.edu>
>>>> *Date:* 2014-07-18 09:06
>>>> *To:* user <us...@hadoop.apache.org>
>>>> *Subject:* Re: HDFS input/output error - fuse mount
>>>> Hi Chris,
>>>>
>>>> I tried to mount /hdfs with java versions below but there was no change
>>>> in output.
>>>> jre-7u21
>>>> jdk-7u21
>>>> jdk-7u55
>>>> jdk1.6.0_31
>>>> jdk1.6.0_45
>>>>
>>>>
>>>>
>>>>
>>>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
>>>> wrote:
>>>>
>>>>> Version 51 is Java 7
>>>>> Chris
>>>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>>>
>>>>>> Hello,
>>>>>>
>>>>>> Hadoop package installed:
>>>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>>>
>>>>>> Operating System:
>>>>>> CentOS release 5.8 (Final)
>>>>>>
>>>>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>>>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>>>>> the below output.
>>>>>>
>>>>>>
>>>>>> $ls /hdfs
>>>>>> ls: /hdfs: Input/output error
>>>>>> $hadoop fs -ls
>>>>>>
>>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>>>>>>
>>>>>>
>>>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>>>> attempted to access /hdfs from a different terminal. This is the output.
>>>>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>>>>> *glados2*.
>>>>>>
>>>>>>
>>>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>>>
>>>>>> fuse-dfs ignoring option allow_other
>>>>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>>>>> fuse-dfs ignoring option -d
>>>>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>>>>> INIT: 7.10
>>>>>> flags=0x0000000b
>>>>>> max_readahead=0x00020000
>>>>>> INFO fuse_init.c:115 Mounting glados:9000
>>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>>>> INIT: 7.8
>>>>>> flags=0x00000001
>>>>>> max_readahead=0x00020000
>>>>>> max_write=0x00020000
>>>>>> unique: 1, error: 0 (Success), outsize: 40
>>>>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError:
>>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>>>>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>>>> unique: 2, error: -5 (Input/output error), outsize: 16
>>>>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>>
>>>>>> I adopted this system after this was already setup, so I do not know
>>>>>> which java version was used during install. Currently I'm using:
>>>>>>
>>>>>> $java -version
>>>>>>
>>>>>> java version "1.6.0_45"
>>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>>> Is my java version really the cause of this issue?  What is the
>>>>>> correct java version to be used for this version of Hadoop?  I have
>>>>>> also tried 1.6.0_31 but saw no change.
>>>>>>
>>>>>> If java isn't my issue, then what is?
>>>>>>
>>>>>> Best regards,
>>>>>>
>>>>>> Andrew
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>
>>>
>

Re: Re: HDFS input/output error - fuse mount

Posted by andrew touchet <ad...@latech.edu>.
Thanks Chris!

The issue was that even though I set jdk-7u21 as my default,
hadoop-config.sh checked for /usr/java/jdk-1.6* first, even though Hadoop
was compiled with 1.7.

Is there any way to generate a proper hadoop-config.sh that reflects the
JDK version Hadoop was built with, so that in my case it would check for
/usr/java/jdk-1.7* instead?  I appreciate the help!
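In the meantime, the workaround I can see is to pin JAVA_HOME explicitly in
conf/hadoop-env.sh so the jdk-1.6* fallback probe is never reached, plus a
small sanity check. A sketch of what I mean (the path and the helper name
are illustrative, not part of the Hadoop distribution):

```shell
#!/bin/sh
# Sketch for conf/hadoop-env.sh: pin JAVA_HOME and fail fast if the
# selected JDK is older than the one Hadoop was compiled with.
# The path is an example -- point it at your actual 1.7 JDK.
export JAVA_HOME=${JAVA_HOME:-/usr/java/jdk1.7.0_21}

# Pull the feature version ("7") out of a line like: java version "1.7.0_21"
java_feature_version() {
    sed -n 's/.*version "1\.\([0-9][0-9]*\)\..*/\1/p'
}

# Usage (assumes $JAVA_HOME/bin/java exists):
# v=$("$JAVA_HOME/bin/java" -version 2>&1 | java_feature_version)
# [ "$v" -ge 7 ] || echo "Hadoop needs Java 1.7+, found 1.$v" >&2
```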


On Thu, Jul 17, 2014 at 11:11 PM, Chris Mawata <ch...@gmail.com>
wrote:

> Yet another place to check -- in the hadoop-env.sh file there is also a
> JAVA_HOME setting.
> Chris
> On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:
>
>> Hi Fireflyhoo,
>>
>> Below I follow the symbolic links for the jdk-7u21. These links are
>> changed accordingly as I change between versions. Also, I have 8 datanodes
>> and 2 other servers that are capable of mounting /hdfs, so it is
>> just this server that has an issue.
>>
>> $ java -version
>> java version "1.7.0_21"
>> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
>> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>>
>> java
>> $ ls -l `which java`
>> lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
>> /usr/java/default/bin/java
>> $ ls -l /usr/java/default
>> lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
>> /usr/java/latest
>> $ ls -l /usr/java/latest
>> lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
>> /usr/java/jdk1.7.0_21
>>
>> jar
>> $ ls -l `which jar`
>> lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
>> /etc/alternatives/jar
>> $ ls -l /etc/alternatives/jar
>> lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
>> /usr/java/jdk1.7.0_21/bin/jar
>>
>> javac
>> $ ls -l `which javac`
>> lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
>> /etc/alternatives/javac
>> $ ls -l /etc/alternatives/javac
>> lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
>> /usr/java/jdk1.7.0_21/bin/javac
>>
>> Now that I've tried versions from both 6 & 7, I'm really not sure what is
>> causing this issue.
>>
>>
>>
>>
>>
>>
>> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <
>> fireflyhoo@gmail.com> wrote:
>>
>>>  I think you should first confirm your local Java version.
>>> Some Linux distributions come with Java pre-installed, and that version
>>> can be very old.
>>>
>>> ------------------------------
>>> fireflyhoo@gmail.com
>>>
>>>
>>> *From:* andrew touchet <ad...@latech.edu>
>>> *Date:* 2014-07-18 09:06
>>> *To:* user <us...@hadoop.apache.org>
>>> *Subject:* Re: HDFS input/output error - fuse mount
>>> Hi Chris,
>>>
>>> I tried to mount /hdfs with java versions below but there was no change
>>> in output.
>>> jre-7u21
>>> jdk-7u21
>>> jdk-7u55
>>> jdk1.6.0_31
>>> jdk1.6.0_45
>>>
>>>
>>>
>>>
>>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
>>> wrote:
>>>
>>>> Version 51 is Java 7
>>>> Chris
>>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>>
>>>>> Hello,
>>>>>
>>>>> Hadoop package installed:
>>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>>
>>>>> Operating System:
>>>>> CentOS release 5.8 (Final)
>>>>>
>>>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>>>> the below output.
>>>>>
>>>>>
>>>>> $ls /hdfs
>>>>> ls: /hdfs: Input/output error
>>>>> $hadoop fs -ls
>>>>>
>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>>>>>
>>>>>
>>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>>> attempted to access /hdfs from a different terminal. This is the output.
>>>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>>>> *glados2*.
>>>>>
>>>>>
>>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>>
>>>>> fuse-dfs ignoring option allow_other
>>>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>>>> fuse-dfs ignoring option -d
>>>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>>>> INIT: 7.10
>>>>> flags=0x0000000b
>>>>> max_readahead=0x00020000
>>>>> INFO fuse_init.c:115 Mounting glados:9000
>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>>> INIT: 7.8
>>>>> flags=0x00000001
>>>>> max_readahead=0x00020000
>>>>> max_write=0x00020000
>>>>> unique: 1, error: 0 (Success), outsize: 40
>>>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError:
>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>>>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>>> unique: 2, error: -5 (Input/output error), outsize: 16
>>>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>
>>>>> I adopted this system after this was already setup, so I do not know
>>>>> which java version was used during install. Currently I'm using:
>>>>>
>>>>> $java -version
>>>>>
>>>>> java version "1.6.0_45"
>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>> Is my java version really the cause of this issue?  What is the
>>>>> correct java version to be used for this version of Hadoop?  I have
>>>>> also tried 1.6.0_31 but saw no change.
>>>>>
>>>>> If java isn't my issue, then what is?
>>>>>
>>>>> Best regards,
>>>>>
>>>>> Andrew
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>
>>

Re: Re: HDFS input/output error - fuse mount

Posted by andrew touchet <ad...@latech.edu>.
Thanks Chris!

The issue was that even though I set jdk-7u21 as my default, it checked for
/usr/java/jdk-1.6* first.  Even though it was compiled with 1.7.

Is there anyway to generate a proper hadoop-config.sh to reflect the minor
version hadoop was built with? So that in my case, it would check for
/usr/java/jdk-1.7* instead?  I appreciate the help!


On Thu, Jul 17, 2014 at 11:11 PM, Chris Mawata <ch...@gmail.com>
wrote:

> Yet another place to check -- in the hadoop-env.sh file there is also a
> JAVA_HOME setting.
> Chris
> On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:
>
>> Hi Fireflyhoo,
>>
>> Below I follow the symbolic links for the jdk-7u21. These links are
>> changed accordingly as I change between versions. Also, I have 8 datanodes
>> and 2 other various servers that are capable of mounting /hdfs.  So it is
>> just this server is an issue.
>>
>> $ java -version
>> java version "1.7.0_21"
>> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
>> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>>
>> java
>> $ ls -l `which java`
>> *lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
>> /usr/java/default/bin/java*
>> $ ls -l /usr/java/default
>> *lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
>> /usr/java/latest*
>> $ ls -l /usr/java/latest
>> *lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
>> /usr/java/jdk1.7.0_21*
>>
>> jar
>> $ ls -l `which jar`
>> *lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
>> /etc/alternatives/jar*
>> $ ls -l /etc/alternatives/jar
>> *lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
>> /usr/java/jdk1.7.0_21/bin/jar*
>>
>> javac
>> $ ls -l `which javac`
>> *lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
>> /etc/alternatives/javac*
>> $ ls -l /etc/alternatives/javac
>> *lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
>> /usr/java/jdk1.7.0_21/bin/javac*
>>
>> Now that I've tried version from  6 & 7, I'm really not sure what is
>> causing this issue.
>>
>>
>>
>>
>>
>>
>> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <
>> fireflyhoo@gmail.com> wrote:
>>
>>>  I think  you first confirm you local java version ,
>>> Some  liux will  pre-installed java ,that version is very low
>>>
>>> ------------------------------
>>> fireflyhoo@gmail.com
>>>
>>>
>>> *From:* andrew touchet <ad...@latech.edu>
>>> *Date:* 2014-07-18 09:06
>>> *To:* user <us...@hadoop.apache.org>
>>> *Subject:* Re: HDFS input/output error - fuse mount
>>> Hi Chris,
>>>
>>> I tried to mount /hdfs with java versions below but there was no change
>>> in output.
>>> jre-7u21
>>> jdk-7u21
>>> jdk-7u55
>>> jdk1.6.0_31
>>> jdk1.6.0_45
>>>
>>>
>>>
>>>
>>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
>>> wrote:
>>>
>>>> Version 51 ia Java 7
>>>> Chris
>>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>>
>>>>> Hello,
>>>>>
>>>>> Hadoop package installed:
>>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>>
>>>>> Operating System:
>>>>> CentOS release 5.8 (Final)
>>>>>
>>>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>>>> the below output.
>>>>>
>>>>>
>>>>> $ls /hdfs
>>>>> *ls: /hdfs: Input/output error*
>>>>> $hadoop fs -ls
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>>>>>
>>>>>
>>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>>> attempted to access /hdfs from a different terminal. This is the output.
>>>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>>>> *glados2*.
>>>>>
>>>>>
>>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>>
>>>>> fuse-dfs ignoring option allow_other
>>>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>>>> fuse-dfs ignoring option -d
>>>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>>>> INIT: 7.10
>>>>> flags=0x0000000b
>>>>> max_readahead=0x00020000
>>>>> INFO fuse_init.c:115 Mounting glados:9000
>>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>>> INIT: 7.8
>>>>> flags=0x00000001
>>>>> max_readahead=0x00020000
>>>>> max_write=0x00020000
>>>>> unique: 1, error: 0 (Success), outsize: 40
>>>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError:
>>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>>>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>>> unique: 2, error: -5 (Input/output error), outsize: 16
>>>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>>
>>>>> I adopted this system after it was already set up, so I do not know
>>>>> which Java version was used during the install. Currently I'm using:
>>>>>
>>>>> $java -version
>>>>> java version "1.6.0_45"
>>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>>
>>>>> Is my Java version really the cause of this issue?  What is the
>>>>> correct Java version to use with this version of Hadoop?  I have also
>>>>> tried 1.6.0_31 but saw no change.
>>>>>
>>>>> If java isn't my issue, then what is?
>>>>>
>>>>> Best regards,
>>>>>
>>>>> Andrew
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>
>>

Re: Re: HDFS input/output error - fuse mount

Posted by andrew touchet <ad...@latech.edu>.
Thanks Chris!

The issue was that even though I set jdk-7u21 as my default,
hadoop-config.sh checked for /usr/java/jdk-1.6* first, even though Hadoop
was compiled with 1.7.

Is there any way to generate a proper hadoop-config.sh that reflects the
minor version Hadoop was built with, so that in my case it would check for
/usr/java/jdk-1.7* instead?  I appreciate the help!
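
Until then, a workaround is to pin JAVA_HOME explicitly so the wrapper
scripts never reach the autodetection loop. A minimal sketch — the JDK path
is this host's, and the hadoop-env.sh location is the common default and may
differ on other installs:

```shell
# Sketch: append an explicit JAVA_HOME to hadoop-env.sh so the hadoop
# scripts stop falling back to scanning /usr/java/jdk-1.6*.
# HADOOP_ENV would normally be /etc/hadoop/conf/hadoop-env.sh; it defaults
# to a local file here so the snippet can be tried anywhere.
HADOOP_ENV=${HADOOP_ENV:-./hadoop-env.sh}
echo 'export JAVA_HOME=/usr/java/jdk1.7.0_21' >> "$HADOOP_ENV"
# Confirm the setting the hadoop scripts will pick up:
grep '^export JAVA_HOME' "$HADOOP_ENV"
```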


On Thu, Jul 17, 2014 at 11:11 PM, Chris Mawata <ch...@gmail.com>
wrote:

> Yet another place to check -- in the hadoop-env.sh file there is also a
> JAVA_HOME setting.
> Chris

Re: Re: HDFS input/output error - fuse mount

Posted by Chris Mawata <ch...@gmail.com>.
Yet another place to check -- in the hadoop-env.sh file there is also a
JAVA_HOME setting.
Chris
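
A quick sketch of the places worth checking — the file and symlink paths
below are the usual defaults, not necessarily this install's:

```shell
# Sketch: list the places a stale JDK path tends to hide.
echo "shell JAVA_HOME: ${JAVA_HOME:-<unset>}"
# hadoop-env.sh can silently override the shell's JAVA_HOME:
grep -n 'JAVA_HOME' /etc/hadoop/conf/hadoop-env.sh 2>/dev/null || true
# The symlink chain behind /usr/bin/java:
ls -l /usr/java/default /usr/java/latest 2>/dev/null || true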

Re: Re: HDFS input/output error - fuse mount

Posted by Chris Mawata <ch...@gmail.com>.
Check the JAVA_HOME environment variable as well ...
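
The "major.minor version 51.0" in the trace can also be read straight out of
any class file in the Hadoop jars: bytes 6–7 of a .class file hold the major
version, big-endian (50 = Java 6, 51 = Java 7). A sketch against a synthetic
header, since the jar and class paths vary per install — against a real
install you would first extract a class from the hadoop jar:

```shell
# Sketch: byte 7 of a .class file carries the class-format major version
# (byte 6 is the high byte, zero for any realistic JDK).
# Fabricate the 8-byte header of a Java 7 class: magic CAFEBABE,
# minor 0, major 51 (octal 063):
printf '\312\376\272\276\000\000\000\063' > Demo.class
# Read the major version back; 51 means the class needs Java 7:
od -An -j7 -N1 -tu1 Demo.class
```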

Re: Re: HDFS input/output error - fuse mount

Posted by Chris Mawata <ch...@gmail.com>.
Check the JAVA_HOME environment variable as well ...
 On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:

> Hi Fireflyhoo,
>
> Below I follow the symbolic links for the jdk-7u21. These links are
> changed accordingly as I change between versions. Also, I have 8 datanodes
> and 2 other various servers that are capable of mounting /hdfs.  So it is
> just this server is an issue.
>
> $ java -version
> java version "1.7.0_21"
> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>
> java
> $ ls -l `which java`
> *lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
> /usr/java/default/bin/java*
> $ ls -l /usr/java/default
> *lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
> /usr/java/latest*
> $ ls -l /usr/java/latest
> *lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
> /usr/java/jdk1.7.0_21*
>
> jar
> $ ls -l `which jar`
> *lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
> /etc/alternatives/jar*
> $ ls -l /etc/alternatives/jar
> *lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
> /usr/java/jdk1.7.0_21/bin/jar*
>
> javac
> $ ls -l `which javac`
> *lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
> /etc/alternatives/javac*
> $ ls -l /etc/alternatives/javac
> *lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
> /usr/java/jdk1.7.0_21/bin/javac*
>
> Now that I've tried versions from both Java 6 and 7, I'm really not sure
> what is causing this issue.
>
> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <
> fireflyhoo@gmail.com> wrote:
>
>> I think you should first confirm your local Java version.
>> Some Linux distributions come with a pre-installed Java that is very old.
>>
>> ------------------------------
>> fireflyhoo@gmail.com
>>
>>
>> *From:* andrew touchet <ad...@latech.edu>
>> *Date:* 2014-07-18 09:06
>> *To:* user <us...@hadoop.apache.org>
>> *Subject:* Re: HDFS input/output error - fuse mount
>> Hi Chris,
>>
>> I tried to mount /hdfs with java versions below but there was no change
>> in output.
>> jre-7u21
>> jdk-7u21
>> jdk-7u55
>> jdk1.6.0_31
>> jdk1.6.0_45
>>
>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
>> wrote:
>>
>>> Version 51 is Java 7.
>>> Chris
>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>
>>>> Hello,
>>>>
>>>> Hadoop package installed:
>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>
>>>> Operating System:
>>>> CentOS release 5.8 (Final)
>>>>
>>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>>> the below output.
>>>>
>>>>
>>>> $ls /hdfs
>>>> *ls: /hdfs: Input/output error*
>>>> $hadoop fs -ls
>>>>
>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>>>>
>>>>
>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>> attempted to access /hdfs from a different terminal. This is the output.
>>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>>> *glados2*.
>>>>
>>>>
>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>
>>>> fuse-dfs ignoring option allow_other
>>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>>> fuse-dfs ignoring option -d
>>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>>> INIT: 7.10
>>>> flags=0x0000000b
>>>> max_readahead=0x00020000
>>>> INFO fuse_init.c:115 Mounting glados:9000
>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>>    INIT: 7.8
>>>>    flags=0x00000001
>>>>    max_readahead=0x00020000
>>>>    max_write=0x00020000
>>>> unique: 1, error: 0 (Success), outsize: 40
>>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError:
>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>>    unique: 2, error: -5 (Input/output error), outsize: 16
>>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>
>>>> I adopted this system after it was already set up, so I do not know
>>>> which java version was used during the install. Currently I'm using:
>>>>
>>>> $java -version
>>>>
>>>>
>>>> java version "1.6.0_45"
>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>
>>>> Is my java version really the cause of this issue?  What is the correct
>>>> java version to use with this version of hadoop?  I have also tried
>>>> 1.6.0_31, but saw no change.
>>>>
>>>> If java isn't my issue, then what is?
>>>>
>>>> Best regards,
>>>>
>>>> Andrew
>>
>

Re: Re: HDFS input/output error - fuse mount

Posted by Chris Mawata <ch...@gmail.com>.
Yet another place to check -- in the hadoop-env.sh file there is also a
JAVA_HOME setting.
Chris
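A minimal sketch of that check (the conf path is an assumption; adjust it to
wherever this install actually keeps hadoop-env.sh):

```shell
# Minimal sketch: print the JAVA_HOME that hadoop-env.sh exports, so it can
# be compared with the JVM on PATH. HADOOP_CONF is an assumption -- point it
# at the conf directory this install actually uses.
HADOOP_CONF=${HADOOP_CONF:-/usr/lib/hadoop-0.20/conf}
if [ -r "$HADOOP_CONF/hadoop-env.sh" ]; then
    grep -E '^[[:space:]]*(export[[:space:]]+)?JAVA_HOME=' "$HADOOP_CONF/hadoop-env.sh"
else
    echo "no hadoop-env.sh under $HADOOP_CONF"
fi
```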
On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:

> [...]

Re: Re: HDFS input/output error - fuse mount

Posted by Chris Mawata <ch...@gmail.com>.
Check the JAVA_HOME environment variable as well ...
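A quick sketch of that check (the JAVA_HOME value below is a stand-in for the demo; substitute whatever your shell actually exports). fuse-dfs and the hadoop wrapper scripts can end up running a different JVM than the one on your PATH, so compare both; and class-file major versions decode as release = major - 44, so major 51 means the jars were built for Java 7:

```shell
# Compare the JVM that JAVA_HOME names against the one found on PATH.
JAVA_HOME=/usr/java/jdk1.6.0_45          # stand-in value, not your real one
path_java=$(command -v java || true)
echo "JAVA_HOME java: $JAVA_HOME/bin/java"
echo "PATH java:      ${path_java:-none found}"

# Class-file major version -> Java release: release = major - 44.
major=51
echo "major $major => Java $((major - 44))"
```

If the two paths disagree, the fuse mount and your interactive shell are not using the same Java, which would explain the mount failing while `java -version` looks fine.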
 On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:

> Hi Fireflyhoo,
>
> Below I follow the symbolic links for the jdk-7u21. These links are
> changed accordingly as I change between versions. Also, I have 8 datanodes
> and 2 other servers that are capable of mounting /hdfs.  So it is
> just this server that has an issue.
>
> $ java -version
> java version "1.7.0_21"
> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>
> java
> $ ls -l `which java`
> *lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
> /usr/java/default/bin/java*
> $ ls -l /usr/java/default
> *lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
> /usr/java/latest*
> $ ls -l /usr/java/latest
> *lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
> /usr/java/jdk1.7.0_21*
>
> jar
> $ ls -l `which jar`
> *lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
> /etc/alternatives/jar*
> $ ls -l /etc/alternatives/jar
> *lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
> /usr/java/jdk1.7.0_21/bin/jar*
>
> javac
> $ ls -l `which javac`
> *lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
> /etc/alternatives/javac*
> $ ls -l /etc/alternatives/javac
> *lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
> /usr/java/jdk1.7.0_21/bin/javac*
>
> Now that I've tried versions from both 6 and 7, I'm really not sure what is
> causing this issue.
>
>
>
>
>
>
> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <
> fireflyhoo@gmail.com> wrote:
>
>> I think you should first confirm your local Java version.
>> Some Linux distributions come with Java pre-installed, and that version can be very old.
>>
>> ------------------------------
>> fireflyhoo@gmail.com
>>
>>
>> *From:* andrew touchet <ad...@latech.edu>
>> *Date:* 2014-07-18 09:06
>> *To:* user <us...@hadoop.apache.org>
>> *Subject:* Re: HDFS input/output error - fuse mount
>> Hi Chris,
>>
>> I tried to mount /hdfs with java versions below but there was no change
>> in output.
>> jre-7u21
>> jdk-7u21
>> jdk-7u55
>> jdk1.6.0_31
>> jdk1.6.0_45
>>
>>
>>
>>
>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
>> wrote:
>>
>>> Version 51 is Java 7
>>> Chris
>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>
>>>> Hello,
>>>>
>>>> Hadoop package installed:
>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>
>>>> Operating System:
>>>> CentOS release 5.8 (Final)
>>>>
>>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>>> the below output.
>>>>
>>>>
>>>> $ls /hdfs
>>>> *ls: /hdfs: Input/output error*
>>>> $hadoop fs -ls
>>>>
>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>>>>
>>>>
>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>> attempted to access /hdfs from a different terminal. This is the output.
>>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>>> *glados2*.
>>>>
>>>>
>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>
>>>> fuse-dfs ignoring option allow_other
>>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>>> fuse-dfs ignoring option -d
>>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>>> INIT: 7.10
>>>> flags=0x0000000b
>>>> max_readahead=0x00020000
>>>> INFO fuse_init.c:115 Mounting glados:9000
>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>> INIT: 7.8
>>>> flags=0x00000001
>>>> max_readahead=0x00020000
>>>> max_write=0x00020000
>>>> unique: 1, error: 0 (Success), outsize: 40
>>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError:
>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>> unique: 2, error: -5 (Input/output error), outsize: 16
>>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>
>>>> I adopted this system after it was already set up, so I do not know
>>>> which Java version was used during the install. Currently I'm using:
>>>>
>>>> $ java -version
>>>> java version "1.6.0_45"
>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>
>>>> Is my Java version really the cause of this issue?  What is the correct
>>>> Java version to use with this version of Hadoop?  I have also tried
>>>> 1.6.0_31, but saw no change.
>>>>
>>>> If java isn't my issue, then what is?
>>>>
>>>> Best regards,
>>>>
>>>> Andrew
>>>>
>>>>
>>>>
>>>>
>>>>
>>
>

Re: Re: HDFS input/output error - fuse mount

Posted by Chris Mawata <ch...@gmail.com>.
Yet another place to check -- in the hadoop-env.sh file there is also a
JAVA_HOME setting.
Chris
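A minimal sketch of that check, assuming the file lives at /etc/hadoop/conf/hadoop-env.sh (the usual spot for packaged installs; adjust the path if yours differs). The demo writes a throwaway file so it runs anywhere; on the real node you would just run the grep against the actual file:

```shell
# Show which JAVA_HOME hadoop-env.sh will export to the Hadoop daemons.
conf=hadoop-env.sh        # stand-in for /etc/hadoop/conf/hadoop-env.sh
printf 'export JAVA_HOME=/usr/java/jdk1.6.0_45\n' > "$conf"   # demo file only
grep '^export JAVA_HOME=' "$conf"
rm -f "$conf"
```

If this prints a different JDK than the symlinks on the broken node point at, the daemons and fuse-dfs are not using the Java you think they are.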
On Jul 17, 2014 9:46 PM, "andrew touchet" <ad...@latech.edu> wrote:

> Hi Fireflyhoo,
>
> Below I follow the symbolic links for the jdk-7u21. These links are
> changed accordingly as I change between versions. Also, I have 8 datanodes
> and 2 other servers that are capable of mounting /hdfs.  So it is
> just this server that has an issue.
>
> $ java -version
> java version "1.7.0_21"
> Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
> Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)
>
> java
> $ ls -l `which java`
> *lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
> /usr/java/default/bin/java*
> $ ls -l /usr/java/default
> *lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
> /usr/java/latest*
> $ ls -l /usr/java/latest
> *lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
> /usr/java/jdk1.7.0_21*
>
> jar
> $ ls -l `which jar`
> *lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
> /etc/alternatives/jar*
> $ ls -l /etc/alternatives/jar
> *lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
> /usr/java/jdk1.7.0_21/bin/jar*
>
> javac
> $ ls -l `which javac`
> *lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
> /etc/alternatives/javac*
> $ ls -l /etc/alternatives/javac
> *lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
> /usr/java/jdk1.7.0_21/bin/javac*
>
> Now that I've tried versions from both 6 and 7, I'm really not sure what is
> causing this issue.
>
>
>
>
>
>
> On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <
> fireflyhoo@gmail.com> wrote:
>
>> I think you should first confirm your local Java version.
>> Some Linux distributions come with Java pre-installed, and that version can be very old.
>>
>> ------------------------------
>> fireflyhoo@gmail.com
>>
>>
>> *From:* andrew touchet <ad...@latech.edu>
>> *Date:* 2014-07-18 09:06
>> *To:* user <us...@hadoop.apache.org>
>> *Subject:* Re: HDFS input/output error - fuse mount
>> Hi Chris,
>>
>> I tried to mount /hdfs with java versions below but there was no change
>> in output.
>> jre-7u21
>> jdk-7u21
>> jdk-7u55
>> jdk1.6.0_31
>> jdk1.6.0_45
>>
>>
>>
>>
>> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
>> wrote:
>>
>>> Version 51 is Java 7
>>> Chris
>>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>>
>>>> Hello,
>>>>
>>>> Hadoop package installed:
>>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>>
>>>> Operating System:
>>>> CentOS release 5.8 (Final)
>>>>
>>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>>> the below output.
>>>>
>>>>
>>>> $ls /hdfs
>>>> *ls: /hdfs: Input/output error*
>>>> $hadoop fs -ls
>>>>
>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>>>>
>>>>
>>>> I have attempted to mount /hdfs manually in debug mode and then
>>>> attempted to access /hdfs from a different terminal. This is the output.
>>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>>> *glados2*.
>>>>
>>>>
>>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>>
>>>> fuse-dfs ignoring option allow_other
>>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>>> fuse-dfs ignoring option -d
>>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>>> INIT: 7.10
>>>> flags=0x0000000b
>>>> max_readahead=0x00020000
>>>> INFO fuse_init.c:115 Mounting glados:9000
>>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>> INIT: 7.8
>>>> flags=0x00000001
>>>> max_readahead=0x00020000
>>>> max_write=0x00020000
>>>> unique: 1, error: 0 (Success), outsize: 40
>>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError:
>>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>> unique: 2, error: -5 (Input/output error), outsize: 16
>>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>>
>>>> I adopted this system after it was already set up, so I do not know
>>>> which Java version was used during the install. Currently I'm using:
>>>>
>>>> $ java -version
>>>> java version "1.6.0_45"
>>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>>
>>>> Is my Java version really the cause of this issue?  What is the correct
>>>> Java version to use with this version of Hadoop?  I have also tried
>>>> 1.6.0_31, but saw no change.
>>>>
>>>> If java isn't my issue, then what is?
>>>>
>>>> Best regards,
>>>>
>>>> Andrew
>>>>
>>>>
>>>>
>>>>
>>>>
>>
>

Re: Re: HDFS input/output error - fuse mount

Posted by andrew touchet <ad...@latech.edu>.
Hi Fireflyhoo,

Below I follow the symbolic links for the jdk-7u21. These links are changed
accordingly as I change between versions. Also, I have 8 datanodes and 2
other servers that are capable of mounting /hdfs.  So it is just
this server that has an issue.

$ java -version
java version "1.7.0_21"
Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)

java
$ ls -l `which java`
*lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
/usr/java/default/bin/java*
$ ls -l /usr/java/default
*lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
/usr/java/latest*
$ ls -l /usr/java/latest
*lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
/usr/java/jdk1.7.0_21*

jar
$ ls -l `which jar`
*lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
/etc/alternatives/jar*
$ ls -l /etc/alternatives/jar
*lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
/usr/java/jdk1.7.0_21/bin/jar*

javac
$ ls -l `which javac`
*lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
/etc/alternatives/javac*
$ ls -l /etc/alternatives/javac
*lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
/usr/java/jdk1.7.0_21/bin/javac*

Now that I've tried versions from both 6 and 7, I'm really not sure what is
causing this issue.
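One way to take the guesswork out of which Java level the installed jars target is to read the class-file version directly: bytes 6-7 of any .class file hold the major version (big-endian), and release = major - 44. The header below is synthesized so the sketch is self-contained; on the real node you could extract a class from the hadoop-core jar (e.g. with `unzip -p`) and pipe its first bytes through the same `od`:

```shell
# A .class file begins CA FE BA BE, then 2 bytes minor, 2 bytes major.
# Major 0x33 = 51 -> Java 7; 0x32 = 50 -> Java 6.
printf '\312\376\272\276\000\000\000\063' > header.bin   # synthetic header, major 51
major=$(od -An -j6 -N2 -t u1 header.bin | awk '{print $1 * 256 + $2}')
echo "class file major $major => compiled for Java $((major - 44))"
rm -f header.bin
```

If the jars on this node report 51 while the mount runs under a Java 6 JVM, that exactly reproduces the "Unsupported major.minor version 51.0" error above.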






On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <fi...@gmail.com>
wrote:

> I think you should first confirm your local Java version.
> Some Linux distributions come with Java pre-installed, and that version can be very old.
>
> ------------------------------
> fireflyhoo@gmail.com
>
>
> *From:* andrew touchet <ad...@latech.edu>
> *Date:* 2014-07-18 09:06
> *To:* user <us...@hadoop.apache.org>
> *Subject:* Re: HDFS input/output error - fuse mount
> Hi Chris,
>
> I tried to mount /hdfs with java versions below but there was no change in
> output.
> jre-7u21
> jdk-7u21
> jdk-7u55
> jdk1.6.0_31
> jdk1.6.0_45
>
>
>
>
> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
> wrote:
>
>> Version 51 is Java 7
>> Chris
>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>
>>> Hello,
>>>
>>> Hadoop package installed:
>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>
>>> Operating System:
>>> CentOS release 5.8 (Final)
>>>
>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>> the below output.
>>>
>>>
>>> $ls /hdfs
>>> *ls: /hdfs: Input/output error*
>>> $hadoop fs -ls
>>>
>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>>>
>>>
>>> I have attempted to mount /hdfs manually in debug mode and then
>>> attempted to access /hdfs from a different terminal. This is the output.
>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>> *glados2*.
>>>
>>>
>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>
>>> fuse-dfs ignoring option allow_other
>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>> fuse-dfs ignoring option -d
>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>> INIT: 7.10
>>> flags=0x0000000b
>>> max_readahead=0x00020000
>>> INFO fuse_init.c:115 Mounting glados:9000
>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>> INIT: 7.8
>>> flags=0x00000001
>>> max_readahead=0x00020000
>>> max_write=0x00020000
>>> unique: 1, error: 0 (Success), outsize: 40
>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError:
>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>> unique: 2, error: -5 (Input/output error), outsize: 16
>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>
>>> I adopted this system after it was already set up, so I do not know
>>> which Java version was used during the install. Currently I'm using:
>>>
>>> $ java -version
>>> java version "1.6.0_45"
>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>
>>> Is my Java version really the cause of this issue?  What is the correct
>>> Java version to use with this version of Hadoop?  I have also tried
>>> 1.6.0_31, but saw no change.
>>>
>>> If java isn't my issue, then what is?
>>>
>>> Best regards,
>>>
>>> Andrew
>>>
>>>
>>>
>>>
>>>
>

Re: Re: HDFS input/output error - fuse mount

Posted by andrew touchet <ad...@latech.edu>.
Hi Fireflyhoo,

Below I follow the symbolic links for the jdk-7u21. These links are changed
accordingly as I change between versions. Also, I have 8 datanodes and 2
other various servers that are capable of mounting /hdfs.  So it is just
this server is an issue.

$ java -version
java version "1.7.0_21"
Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)

java
$ ls -l `which java`
*lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java ->
/usr/java/default/bin/java*
$ ls -l /usr/java/default
*lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default ->
/usr/java/latest*
$ ls -l /usr/java/latest
*lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest ->
/usr/java/jdk1.7.0_21*

jar
$ ls -l `which jar`
*lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar ->
/etc/alternatives/jar*
$ ls -l /etc/alternatives/jar
*lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar ->
/usr/java/jdk1.7.0_21/bin/jar*

javac
$ ls -l `which javac`
*lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac ->
/etc/alternatives/javac*
$ ls -l /etc/alternatives/javac
*lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac ->
/usr/java/jdk1.7.0_21/bin/javac*

Now that I've tried version from  6 & 7, I'm really not sure what is
causing this issue.






On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <fi...@gmail.com>
wrote:

> I think  you first confirm you local java version ,
> Some  liux will  pre-installed java ,that version is very low
>
> ------------------------------
> fireflyhoo@gmail.com
>
>
> *From:* andrew touchet <ad...@latech.edu>
> *Date:* 2014-07-18 09:06
> *To:* user <us...@hadoop.apache.org>
> *Subject:* Re: HDFS input/output error - fuse mount
> Hi Chris,
>
> I tried to mount /hdfs with java versions below but there was no change in
> output.
> jre-7u21
> jdk-7u21
> jdk-7u55
> jdk1.6.0_31
> jdk1.6.0_45
>
>
>
>
> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
> wrote:
>
>> Version 51 ia Java 7
>> Chris
>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>
>>> Hello,
>>>
>>> Hadoop package installed:
>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>
>>> Operating System:
>>> CentOS release 5.8 (Final)
>>>
>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>> the below output.
>>>
>>>
>>> $ls /hdfs
>>> *ls: /hdfs: Input/output error*
>>> $hadoop fs -ls
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> *Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>> org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0     at
>>> java.lang.ClassLoader.defineClass1(Native Method)    at
>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)    at
>>> java.lang.ClassLoader.defineClass(ClassLoader.java:615)    at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)    at
>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)    at
>>> java.net.URLClassLoader$1.run(URLClassLoader.java:197)    at
>>> java.security.AccessController.doPrivileged(Native Method)     at
>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)    at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:306)    at
>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)    at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:247) Could not find the
>>> main class: org.apache.hadoop.fs.FsShell.  Program will exit.*
>>>
>>>
>>> I have attempted to mount /hdfs manually in debug mode and then
>>> attempted to access /hdfs from a different terminal. This is the output.
>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>> *glados2*.
>>>
>>>
>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>> *fuse-dfs ignoring option allow_otherERROR fuse_options.c:162 fuse-dfs
>>> didn't recognize /hdfs,-2fuse-dfs ignoring option -d unique: 1, opcode:
>>> INIT (26), nodeid: 0, insize: 56INIT:
>>> 7.10flags=0x0000000bmax_readahead=0x00020000INFO fuse_init.c:115 Mounting
>>> glados:9000Exception in thread "main"
>>> java.lang.UnsupportedClassVersionError:
>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>     at java.lang.ClassLoader.defineClass1(Native Method)    at
>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)    at
>>> java.lang.ClassLoader.defineClass(ClassLoader.java:615)    at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)    at
>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)    at
>>> java.net.URLClassLoader$1.run(URLClassLoader.java:197)    at
>>> java.security.AccessController.doPrivileged(Native Method)     at
>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)    at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:306)    at
>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)    at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:247) Can't construct
>>> instance of class org.apache.hadoop.conf.ConfigurationERROR fuse_init.c:127
>>> Unable to establish test connection to server   INIT: 7.8
>>> flags=0x00000001   max_readahead=0x00020000   max_write=0x00020000
>>> unique: 1, error: 0 (Success), outsize: 40unique: 2, opcode: GETATTR (3),
>>> nodeid: 1, insize: 56Exception in thread "Thread-0"
>>> java.lang.UnsupportedClassVersionError:
>>> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>     at java.lang.ClassLoader.defineClass1(Native Method)    at
>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)    at
>>> java.lang.ClassLoader.defineClass(ClassLoader.java:615)    at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)    at
>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)    at
>>> java.net.URLClassLoader$1.run(URLClassLoader.java:197)    at
>>> java.security.AccessController.doPrivileged(Native Method)     at
>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)    at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:306)    at
>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)    at
>>> java.lang.ClassLoader.loadClass(ClassLoader.java:247) Can't construct
>>> instance of class org.apache.hadoop.conf.ConfigurationERROR
>>> fuse_connect.c:83 Unable to instantiate a filesystem for user027ERROR
>>> fuse_impls_getattr.c:40 Could not connect to glados:9000   unique: 2,
>>> error: -5 (Input/output error), outsize: 16 unique: 3, opcode: GETATTR (3),
>>> nodeid: 1, insize: 56*
>>>
>>> I adopted this system after this was already setup, so I do not know
>>> which java version was used during install. Currently I'm using:
>>>
>>> $java -version
>>> java version "1.6.0_45"
>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>
>>> $java -version
>>> java version "1.6.0_45"
>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>> Is my Java version really the cause of this issue? What is the correct
>>> Java version for this version of Hadoop? I have also tried 1.6.0_31,
>>> but saw no change.
>>>
>>> If java isn't my issue, then what is?
>>>
>>> Best regards,
>>>
>>> Andrew
>>>
>>>
>>>
>>>
>>>
>

Re: Re: HDFS input/output error - fuse mount

Posted by andrew touchet <ad...@latech.edu>.
Hi Fireflyhoo,

Below I follow the symbolic links for jdk-7u21; these links are updated
accordingly as I switch between versions. Also, I have 8 datanodes and 2
other servers that can mount /hdfs without problems, so it is only this
server that has the issue.

$ java -version
java version "1.7.0_21"
Java(TM) SE Runtime Environment (build 1.7.0_21-b11)
Java HotSpot(TM) 64-Bit Server VM (build 23.21-b01, mixed mode)

java
$ ls -l `which java`
lrwxrwxrwx 1 root root 26 Jul 17 19:50 /usr/bin/java -> /usr/java/default/bin/java
$ ls -l /usr/java/default
lrwxrwxrwx 1 root root 16 Jul 17 19:50 /usr/java/default -> /usr/java/latest
$ ls -l /usr/java/latest
lrwxrwxrwx 1 root root 21 Jul 17 20:29 /usr/java/latest -> /usr/java/jdk1.7.0_21

jar
$ ls -l `which jar`
lrwxrwxrwx 1 root root 21 Jul 17 20:18 /usr/bin/jar -> /etc/alternatives/jar
$ ls -l /etc/alternatives/jar
lrwxrwxrwx 1 root root 29 Jul 17 20:26 /etc/alternatives/jar -> /usr/java/jdk1.7.0_21/bin/jar

javac
$ ls -l `which javac`
lrwxrwxrwx 1 root root 23 Jul 17 20:18 /usr/bin/javac -> /etc/alternatives/javac
$ ls -l /etc/alternatives/javac
lrwxrwxrwx 1 root root 31 Jul 17 20:26 /etc/alternatives/javac -> /usr/java/jdk1.7.0_21/bin/javac
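(Editorial note: the three symlink chains above can each be collapsed in one step with Python's `os.path.realpath`, which is convenient when auditing several hosts. A minimal sketch; the three paths are the ones from the listings above and may not exist on every machine:)

```python
import os

def resolve(path):
    """Follow every symlink in the chain and return the final target."""
    return os.path.realpath(path)

# On the host above, resolve("/usr/bin/java") would follow
# /usr/bin/java -> /usr/java/default -> /usr/java/latest -> jdk1.7.0_21.
for tool in ("/usr/bin/java", "/usr/bin/jar", "/usr/bin/javac"):
    print(tool, "->", resolve(tool))
```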

Now that I've tried versions from both Java 6 and 7, I'm really not sure
what is causing this issue.
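(Editorial note: when several runtimes have been tried with no change, it helps to script the version check so every node reports the same thing. `java -version` writes its banner to stderr, not stdout; a small sketch of pulling the version string out of that banner, not part of the thread's original setup:)

```python
import re

def parse_java_version(banner):
    """Pull the quoted version string out of a `java -version` banner."""
    m = re.search(r'(?:java|openjdk) version "([^"]+)"', banner)
    return m.group(1) if m else None

# The banner itself would come from something like:
#   subprocess.run(["java", "-version"], capture_output=True, text=True).stderr
print(parse_java_version('java version "1.6.0_45"'))  # prints 1.6.0_45
```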






On Thu, Jul 17, 2014 at 8:21 PM, fireflyhoo@gmail.com <fi...@gmail.com>
wrote:

> I think you should first confirm your local Java version.
> Some Linux distributions come with a pre-installed Java that is very old.
>
> ------------------------------
> fireflyhoo@gmail.com
>
>
> *From:* andrew touchet <ad...@latech.edu>
> *Date:* 2014-07-18 09:06
> *To:* user <us...@hadoop.apache.org>
> *Subject:* Re: HDFS input/output error - fuse mount
> Hi Chris,
>
> I tried to mount /hdfs with java versions below but there was no change in
> output.
> jre-7u21
> jdk-7u21
> jdk-7u55
> jdk1.6.0_31
> jdk1.6.0_45
>
>
>
>
> On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
> wrote:
>
>> Version 51 is Java 7
>> Chris
>> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>>
>>> Hello,
>>>
>>> Hadoop package installed:
>>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>>
>>> Operating System:
>>> CentOS release 5.8 (Final)
>>>
>>> I am mounting HDFS from my namenode to another node with fuse.  After
>>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>>> the below output.
>>>
>>>
>>> $ls /hdfs
>>> *ls: /hdfs: Input/output error*
>>> $hadoop fs -ls
>>> Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>>>
>>>
>>> I have attempted to mount /hdfs manually in debug mode and then
>>> attempted to access /hdfs from a different terminal. This is the output.
>>> The namenode is *glados*. The server where /hdfs is being mounted is
>>> *glados2*.
>>>
>>>
>>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>> fuse-dfs ignoring option allow_other
>>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>>> fuse-dfs ignoring option -d
>>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>>> INIT: 7.10
>>> flags=0x0000000b
>>> max_readahead=0x00020000
>>> INFO fuse_init.c:115 Mounting glados:9000
>>> Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>>    INIT: 7.8
>>>    flags=0x00000001
>>>    max_readahead=0x00020000
>>>    max_write=0x00020000
>>>    unique: 1, error: 0 (Success), outsize: 40
>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>>    unique: 2, error: -5 (Input/output error), outsize: 16
>>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>>
>>> I adopted this system after it was already set up, so I do not know
>>> which Java version was used during install. Currently I'm using:
>>>
>>> $java -version
>>> java version "1.6.0_45"
>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>>
>>> $java -version
>>> java version "1.6.0_45"
>>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>> Is my Java version really the cause of this issue? What is the correct
>>> Java version for this version of Hadoop? I have also tried 1.6.0_31,
>>> but saw no change.
>>>
>>> If java isn't my issue, then what is?
>>>
>>> Best regards,
>>>
>>> Andrew
>>>
>>>
>>>
>>>
>>>
>


Re: Re: HDFS input/output error - fuse mount

Posted by "fireflyhoo@gmail.com" <fi...@gmail.com>.
I think you should first confirm your local Java version.
Some Linux distributions come with a pre-installed Java that is very old.



fireflyhoo@gmail.com
 
From: andrew touchet
Date: 2014-07-18 09:06
To: user
Subject: Re: HDFS input/output error - fuse mount
Hi Chris, 

I tried to mount /hdfs with java versions below but there was no change in output.  
jre-7u21
jdk-7u21
jdk-7u55
jdk1.6.0_31
jdk1.6.0_45




On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com> wrote:
Version 51 is Java 7
Chris
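(Editorial note: Chris's point can be verified directly. A compiled class file stores its target release in the header: 4 bytes of magic (0xCAFEBABE), then 2 bytes of minor version, then 2 bytes of major version, where major 50 is Java 6, 51 is Java 7, and 52 is Java 8. A minimal sketch that reads the header of a .class file; the path you pass it is up to you:)

```python
import struct

# Class-file major version -> Java SE release
MAJOR_TO_JDK = {45: "1.1", 46: "1.2", 47: "1.3", 48: "1.4",
                49: "5", 50: "6", 51: "7", 52: "8"}

def class_file_jdk(path):
    """Read a .class header and return the Java release it targets."""
    with open(path, "rb") as f:
        magic, minor, major = struct.unpack(">IHH", f.read(8))
    if magic != 0xCAFEBABE:
        raise ValueError("not a class file: %s" % path)
    return MAJOR_TO_JDK.get(major, "unknown (major %d)" % major)
```

Running this against a class extracted from the Hadoop jars would show whether they were compiled for Java 7, which is what "Unsupported major.minor version 51.0" on a Java 6 runtime implies.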
On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
Hello,

Hadoop package installed:
hadoop-0.20-0.20.2+737-33.osg.el5.noarch

Operating System:
CentOS release 5.8 (Final)

I am mounting HDFS from my namenode to another node with fuse.  After mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to the below output.


$ls /hdfs
ls: /hdfs: Input/output error
$hadoop fs -ls
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.


I have attempted to mount /hdfs manually in debug mode and then attempted to access /hdfs from a different terminal. This is the output. The namenode is glados. The server where /hdfs is being mounted is glados2.


$hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
fuse-dfs ignoring option allow_other
ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2

fuse-dfs ignoring option -d
unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
INIT: 7.10
flags=0x0000000b
max_readahead=0x00020000
INFO fuse_init.c:115 Mounting glados:9000
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR fuse_init.c:127 Unable to establish test connection to server
   INIT: 7.8
   flags=0x00000001
   max_readahead=0x00020000
   max_write=0x00020000
   unique: 1, error: 0 (Success), outsize: 40
unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
   unique: 2, error: -5 (Input/output error), outsize: 16
unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56

I adopted this system after it was already set up, so I do not know which Java version was used during install. Currently I'm using:

$java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)

$java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)


Is my Java version really the cause of this issue? What is the correct Java version for this version of Hadoop? I have also tried 1.6.0_31, but saw no change.

If java isn't my issue, then what is?

Best regards,

Andrew 






Re: Re: HDFS input/output error - fuse mount

Posted by "fireflyhoo@gmail.com" <fi...@gmail.com>.
I think  you first confirm you local java version ,  
Some  liux will  pre-installed java ,that version is very low 



fireflyhoo@gmail.com
 
From: andrew touchet
Date: 2014-07-18 09:06
To: user
Subject: Re: HDFS input/output error - fuse mount
Hi Chris, 

I tried to mount /hdfs with java versions below but there was no change in output.  
jre-7u21
jdk-7u21
jdk-7u55
jdk1.6.0_31
jdk1.6.0_45




On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com> wrote:
Version 51 ia Java 7
Chris
On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
Hello,

Hadoop package installed:
hadoop-0.20-0.20.2+737-33.osg.el5.noarch

Operating System:
CentOS release 5.8 (Final)

I am mounting HDFS from my namenode to another node with fuse.  After mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to the below output.


$ls /hdfs
ls: /hdfs: Input/output error
$hadoop fs -ls
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.


I have attempted to mount /hdfs manually in debug mode and then attempted to access /hdfs from a different terminal. This is the output. The namenode is glados. The server where /hdfs is being mounted is glados2.


$hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
fuse-dfs ignoring option allow_other
ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2

fuse-dfs ignoring option -d
unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
INIT: 7.10
flags=0x0000000b
max_readahead=0x00020000
INFO fuse_init.c:115 Mounting glados:9000
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR fuse_init.c:127 Unable to establish test connection to server
   INIT: 7.8
   flags=0x00000001
   max_readahead=0x00020000
   max_write=0x00020000
   unique: 1, error: 0 (Success), outsize: 40
unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
   unique: 2, error: -5 (Input/output error), outsize: 16
unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56

I adopted this system after this was already setup, so I do not know which java version was used during install. Currently I'm using:

$java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)

$java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)


Is my java version really the cause of this issue?  What is the correct java version to be used for this version of hadoop.  I have also tried 1.6.0_31 but no changes were seen.

If java isn't my issue, then what is?

Best regards,

Andrew 






Re: Re: HDFS input/output error - fuse mount

Posted by "fireflyhoo@gmail.com" <fi...@gmail.com>.
I think  you first confirm you local java version ,  
Some  liux will  pre-installed java ,that version is very low 



fireflyhoo@gmail.com
 
From: andrew touchet
Date: 2014-07-18 09:06
To: user
Subject: Re: HDFS input/output error - fuse mount
Hi Chris, 

I tried to mount /hdfs with java versions below but there was no change in output.  
jre-7u21
jdk-7u21
jdk-7u55
jdk1.6.0_31
jdk1.6.0_45




On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com> wrote:
Version 51 ia Java 7
Chris
On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
Hello,

Hadoop package installed:
hadoop-0.20-0.20.2+737-33.osg.el5.noarch

Operating System:
CentOS release 5.8 (Final)

I am mounting HDFS from my namenode to another node with fuse.  After mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to the below output.


$ls /hdfs
ls: /hdfs: Input/output error
$hadoop fs -ls
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.


I have attempted to mount /hdfs manually in debug mode and then attempted to access /hdfs from a different terminal. This is the output. The namenode is glados. The server where /hdfs is being mounted is glados2.


$hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
fuse-dfs ignoring option allow_other
ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2

fuse-dfs ignoring option -d
unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
INIT: 7.10
flags=0x0000000b
max_readahead=0x00020000
INFO fuse_init.c:115 Mounting glados:9000
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR fuse_init.c:127 Unable to establish test connection to server
   INIT: 7.8
   flags=0x00000001
   max_readahead=0x00020000
   max_write=0x00020000
   unique: 1, error: 0 (Success), outsize: 40
unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
   unique: 2, error: -5 (Input/output error), outsize: 16
unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56

I adopted this system after it was already set up, so I do not know which Java version was used during the install. Currently I'm using:

$java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)



Is my Java version really the cause of this issue? What is the correct Java version for this release of Hadoop? I have also tried 1.6.0_31, but saw no change.

If java isn't my issue, then what is?
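One quick way to check whether the jars really require a newer runtime is to read the class-file header directly: every .class file encodes its compiler's target as a 2-byte major version (50 = Java 6, 51 = Java 7). This is a sketch — the Demo.class below is fabricated for illustration; on the real system you would instead extract a class from the Hadoop jar (e.g. `unzip -p hadoop-core-*.jar org/apache/hadoop/fs/FsShell.class`, path assumed, not confirmed for this install).

```shell
# A .class file starts with the magic number CAFEBABE (4 bytes), then a
# 2-byte minor version and a 2-byte big-endian major version.
# Fabricate an 8-byte header with major version 0x33 (= 51, i.e. Java 7):
printf '\312\376\272\276\000\000\000\063' > /tmp/Demo.class

# Read bytes 6-7 (the major version) as decimal; od skips 6 bytes, reads 2:
major=$(od -A n -t d1 -j 6 -N 2 /tmp/Demo.class | awk '{print $2}')
echo "class file major version: $major"
```

If this prints 51 for a class out of the Hadoop jar, the jars were compiled for Java 7 and a 1.6 JRE cannot load them.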

Best regards,

Andrew 






Re: Re: HDFS input/output error - fuse mount

Posted by "fireflyhoo@gmail.com" <fi...@gmail.com>.
I think you should first confirm your local Java version.
Some Linux distributions ship with a pre-installed Java, and that version can be very old.
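That check can be sketched as follows; the version string here is a hard-coded sample mirroring the output quoted later in this thread, not read from a live system:

```shell
# Extract the Java feature version ("6" from "1.6.0_45") and compare it to
# the Java 7 requirement implied by "Unsupported major.minor version 51.0".
ver_line='java version "1.6.0_45"'   # sample; normally: java -version 2>&1 | head -n 1
feature=$(printf '%s\n' "$ver_line" | sed 's/.*"1\.\([0-9][0-9]*\)\..*/\1/')
echo "feature version: $feature"
if [ "$feature" -lt 7 ]; then
  echo "this runtime is too old for class-file major version 51 (needs Java 7+)"
fi
```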



fireflyhoo@gmail.com
 
From: andrew touchet
Date: 2014-07-18 09:06
To: user
Subject: Re: HDFS input/output error - fuse mount
Hi Chris, 

I tried to mount /hdfs with java versions below but there was no change in output.  
jre-7u21
jdk-7u21
jdk-7u55
jdk1.6.0_31
jdk1.6.0_45




On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com> wrote:
Version 51 is Java 7
Chris
On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
Hello,

Hadoop package installed:
hadoop-0.20-0.20.2+737-33.osg.el5.noarch

Operating System:
CentOS release 5.8 (Final)

I am mounting HDFS from my namenode to another node with fuse.  After mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to the below output.


$ls /hdfs
ls: /hdfs: Input/output error
$hadoop fs -ls
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.


I have attempted to mount /hdfs manually in debug mode and then attempted to access /hdfs from a different terminal. This is the output. The namenode is glados. The server where /hdfs is being mounted is glados2.


$hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
fuse-dfs ignoring option allow_other
ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2

fuse-dfs ignoring option -d
unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
INIT: 7.10
flags=0x0000000b
max_readahead=0x00020000
INFO fuse_init.c:115 Mounting glados:9000
Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR fuse_init.c:127 Unable to establish test connection to server
   INIT: 7.8
   flags=0x00000001
   max_readahead=0x00020000
   max_write=0x00020000
   unique: 1, error: 0 (Success), outsize: 40
unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
    at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Can't construct instance of class org.apache.hadoop.conf.Configuration
ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
   unique: 2, error: -5 (Input/output error), outsize: 16
unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56

I adopted this system after this was already setup, so I do not know which java version was used during install. Currently I'm using:

$java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)

$java -version
java version "1.6.0_45"
Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)


Is my java version really the cause of this issue?  What is the correct java version to be used for this version of hadoop.  I have also tried 1.6.0_31 but no changes were seen.

If java isn't my issue, then what is?

Best regards,

Andrew 






Re: HDFS input/output error - fuse mount

Posted by andrew touchet <ad...@latech.edu>.
Hi Chris,

I tried to mount /hdfs with java versions below but there was no change in
output.
jre-7u21
jdk-7u21
jdk-7u55
jdk1.6.0_31
jdk1.6.0_45




On Thu, Jul 17, 2014 at 6:56 PM, Chris Mawata <ch...@gmail.com>
wrote:

> Version 51 ia Java 7
> Chris
> On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:
>
>> Hello,
>>
>> Hadoop package installed:
>> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>>
>> Operating System:
>> CentOS release 5.8 (Final)
>>
>> I am mounting HDFS from my namenode to another node with fuse.  After
>> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
>> the below output.
>>
>>
>> $ls /hdfs
>> ls: /hdfs: Input/output error
>> $hadoop fs -ls
>>
>> Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>>
>>
>> I have attempted to mount /hdfs manually in debug mode and then attempted
>> to access /hdfs from a different terminal. This is the output. The namenode
>> is *glados*. The server where /hdfs is being mounted is *glados2*.
>>
>>
>> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>>
>> fuse-dfs ignoring option allow_other
>> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
>> fuse-dfs ignoring option -d
>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>> INIT: 7.10
>> flags=0x0000000b
>> max_readahead=0x00020000
>> INFO fuse_init.c:115 Mounting glados:9000
>> Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>> ERROR fuse_init.c:127 Unable to establish test connection to server
>>    INIT: 7.8
>>    flags=0x00000001
>>    max_readahead=0x00020000
>>    max_write=0x00020000
>>    unique: 1, error: 0 (Success), outsize: 40
>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
>> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError: org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> Can't construct instance of class org.apache.hadoop.conf.Configuration
>> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
>> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
>>    unique: 2, error: -5 (Input/output error), outsize: 16
>> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>>
>> I adopted this system after this was already setup, so I do not know
>> which java version was used during install. Currently I'm using:
>>
>> $java -version
>>
>>
>> java version "1.6.0_45"
>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>>
>> $java -version
>>
>> java version "1.6.0_45"
>> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
>> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>> Is my java version really the cause of this issue?  What is the correct
>> java version to be used for this version of hadoop.  I have also tried
>> 1.6.0_31 but no changes were seen.
>>
>> If java isn't my issue, then what is?
>>
>> Best regards,
>>
>> Andrew
>>
>>
>>
>>
>>

Re: HDFS input/output error - fuse mount

Posted by Chris Mawata <ch...@gmail.com>.
Version 51 is Java 7
Chris
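That mapping can be made concrete with a small lookup; the table below covers the common releases and is a summary, not an exhaustive list:

```shell
# Map a class-file major version to the JDK release that targets it.
major_to_jdk() {
  case "$1" in
    49) echo "Java 5" ;;
    50) echo "Java 6" ;;
    51) echo "Java 7" ;;
    52) echo "Java 8" ;;
    *)  echo "unknown" ;;
  esac
}

# "Unsupported major.minor version 51.0" therefore means the classes need:
major_to_jdk 51   # prints "Java 7"
```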
On Jul 17, 2014 7:50 PM, "andrew touchet" <ad...@latech.edu> wrote:

> Hello,
>
> Hadoop package installed:
> hadoop-0.20-0.20.2+737-33.osg.el5.noarch
>
> Operating System:
> CentOS release 5.8 (Final)
>
> I am mounting HDFS from my namenode to another node with fuse.  After
> mounting to /hdfs, any attempts to 'ls', 'cd', or use 'hadoop fs' leads to
> the below output.
>
>
> $ls /hdfs
> ls: /hdfs: Input/output error
> $hadoop fs -ls
>
> Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/hadoop/fs/FsShell : Unsupported major.minor version 51.0
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> Could not find the main class: org.apache.hadoop.fs.FsShell.  Program will exit.
>
>
> I have attempted to mount /hdfs manually in debug mode and then attempted
> to access /hdfs from a different terminal. This is the output. The namenode
> is *glados*. The server where /hdfs is being mounted is *glados2*.
>
>
> $hdfs -oserver=glados,port=9000,rdbuffer=131072,allow_other /hdfs -d
>
> fuse-dfs ignoring option allow_other
> ERROR fuse_options.c:162 fuse-dfs didn't recognize /hdfs,-2
> fuse-dfs ignoring option -d
> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
> INIT: 7.10
> flags=0x0000000b
> max_readahead=0x00020000
> INFO fuse_init.c:115 Mounting glados:9000
> Exception in thread "main" java.lang.UnsupportedClassVersionError:
> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> Can't construct instance of class org.apache.hadoop.conf.Configuration
> ERROR fuse_init.c:127 Unable to establish test connection to server
> INIT: 7.8
> flags=0x00000001
> max_readahead=0x00020000
> max_write=0x00020000
> unique: 1, error: 0 (Success), outsize: 40
> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
> Exception in thread "Thread-0" java.lang.UnsupportedClassVersionError:
> org/apache/hadoop/conf/Configuration : Unsupported major.minor version 51.0
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>     at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> Can't construct instance of class org.apache.hadoop.conf.Configuration
> ERROR fuse_connect.c:83 Unable to instantiate a filesystem for user027
> ERROR fuse_impls_getattr.c:40 Could not connect to glados:9000
> unique: 2, error: -5 (Input/output error), outsize: 16
> unique: 3, opcode: GETATTR (3), nodeid: 1, insize: 56
>
> I adopted this system after this was already setup, so I do not know which
> java version was used during install. Currently I'm using:
>
> $java -version
>
> java version "1.6.0_45"
> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
>
> $java -version
>
> java version "1.6.0_45"
> Java(TM) SE Runtime Environment (build 1.6.0_45-b06)
> Java HotSpot(TM) 64-Bit Server VM (build 20.45-b01, mixed mode)
> Is my Java version really the cause of this issue? What is the correct
> Java version to use with this version of Hadoop? I have also tried
> 1.6.0_31, but saw no change.
>
> If java isn't my issue, then what is?
>
> Best regards,
>
> Andrew
>
