Posted to common-user@hadoop.apache.org by jayalakshmi sandhya <sa...@gmail.com> on 2010/01/15 09:49:26 UTC

Pl help me out..

Hi. I'm hitting a bug while setting up the nodes in Hadoop, and I'm not able to fix it. Please help me out; I've tried to explain the problem in detail.

When I type either of these commands in the terminal,

sandhya@sandhya-laptop:/usr/local/hadoop$ bin/hadoop jar
hadoop-*-examples.jar grep input output 'dfs[a-z.]+'

(or)

sandhya@sandhya-laptop:/usr/local/hadoop$ bin/hadoop namenode -format

I get this error:

Exception in thread "main" java.lang.UnsupportedClassVersionError: Bad version number in .class file
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
Exception in thread "main" java.lang.UnsupportedClassVersionError: Bad version number in .class file
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:620)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)

sandhya@sandhya-laptop:/usr/local/hadoop$





Actually, there was a step after installing Hadoop that said:



Unpack the downloaded Hadoop distribution. In the distribution, edit the
file conf/hadoop-env.sh to define at least JAVA_HOME to be the root of your
Java installation.



So this is my conf/hadoop-env.sh:



# Set Hadoop-specific environment variables here.

# The only required environment variable is JAVA_HOME.  All others are
# optional.  When running a distributed configuration it is best to
# set JAVA_HOME in this file, so that it is correctly defined on
# remote nodes.

# The java implementation to use.  Required.
export JAVA_HOME=/home/sandhya/jdk   # I CHANGED HERE: my Java installation is in this path

# Extra Java CLASSPATH elements.  Optional.
# export HADOOP_CLASSPATH=

# The maximum amount of heap to use, in MB. Default is 1000.
export HADOOP_HEAPSIZE=2000   # I CHANGED HERE

# Extra Java runtime options.  Empty by default.
export HADOOP_OPTS=-server

# Command specific options appended to HADOOP_OPTS when specified
export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_NAMENODE_OPTS"
export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_SECONDARYNAMENODE_OPTS"
export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_DATANODE_OPTS"
export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_BALANCER_OPTS"
export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_JOBTRACKER_OPTS"
# export HADOOP_TASKTRACKER_OPTS=
# The following applies to multiple commands (fs, dfs, fsck, distcp etc)
# export HADOOP_CLIENT_OPTS

# Extra ssh options.  Empty by default.
# export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"

# Where log files are stored.  $HADOOP_HOME/logs by default.
# export HADOOP_LOG_DIR=${HADOOP_HOME}/logs

# File naming remote slave hosts.  $HADOOP_HOME/conf/slaves by default.
# export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves

# host:path where hadoop code should be rsync'd from.  Unset by default.
# export HADOOP_MASTER=master:/home/$USER/src/hadoop

# Seconds to sleep between slave commands.  Unset by default.  This
# can be useful in large clusters, where, e.g., slave rsyncs can
# otherwise arrive faster than the master can service them.
# export HADOOP_SLAVE_SLEEP=0.1

# The directory where pid files are stored. /tmp by default.
# export HADOOP_PID_DIR=/var/hadoop/pids

# A string representing this instance of hadoop. $USER by default.
# export HADOOP_IDENT_STRING=$USER

# The scheduling priority for daemon processes.  See 'man nice'.
# export HADOOP_NICENESS=10
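
One quick way to sanity-check this JAVA_HOME (a suggested check, since bin/hadoop launches $JAVA_HOME/bin/java) is to ask that exact JVM for its version and compare it with whatever "java" resolves to on the PATH:

# Version of the JVM Hadoop will actually use
# (/home/sandhya/jdk is the JAVA_HOME set above):
/home/sandhya/jdk/bin/java -version

# For comparison, the JVM found first on the PATH:
which java
java -version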



When I googled, I found this explanation:



That's because you're using classes compiled with a different version of
Java. Typically, if you use a class compiled for Java 1.5 in a Java 1.4 JVM,
it's not going to work.

I do not know how I should verify the above statements. However, I did have
another installation of Java, which I have now removed.
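
One way to verify those statements (a sketch; the jar name below is an example and should match the actual hadoop-*-core.jar under /usr/local/hadoop) is to compare the JVM version with the class-file version inside the Hadoop jar:

# The version of the JVM in use:
java -version

# The class-file version of a class shipped in the Hadoop jar; a
# "major version" of 49 means compiled for Java 5, 50 means Java 6:
javap -verbose -classpath hadoop-0.20.1-core.jar org.apache.hadoop.util.VersionInfo | grep major

If the reported major version is higher than the JVM supports, this exact UnsupportedClassVersionError is the expected result.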

Re: Pl help me out..

Posted by Stephen Watt <sw...@us.ibm.com>.
I think it's likely that you're using Hadoop 0.20.0 or greater, which
requires Java 1.6, and I'd guess you have a Java 1.5 JRE or older specified in
your conf/hadoop-env.sh file. Change the file to point to your 1.6 JRE.
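
Concretely, the change would look something like this in conf/hadoop-env.sh (the path is only an example; use the root of your actual Java 1.6 installation):

# Point Hadoop at a Java 1.6 installation (example path; adjust to your system):
export JAVA_HOME=/usr/lib/jvm/java-6-sun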

Kind regards
Steve Watt


