Posted to dev@accumulo.apache.org by David Medinets <da...@gmail.com> on 2012/12/14 04:43:13 UTC
hadoop classpath causing an exception (sub-command not defined?)
I am at a loss to explain what I am seeing. I have installed Accumulo
many times without a hitch. But today, I am running into a problem
getting the hadoop classpath.
$ /usr/local/hadoop/bin/hadoop
Usage: hadoop [--config confdir] COMMAND
where COMMAND is one of:
  namenode -format     format the DFS filesystem
  secondarynamenode    run the DFS secondary namenode
  namenode             run the DFS namenode
  datanode             run a DFS datanode
  dfsadmin             run a DFS admin client
  mradmin              run a Map-Reduce admin client
  fsck                 run a DFS filesystem checking utility
  fs                   run a generic filesystem user client
  balancer             run a cluster balancing utility
  jobtracker           run the MapReduce job Tracker node
  pipes                run a Pipes job
  tasktracker          run a MapReduce task Tracker node
  job                  manipulate MapReduce jobs
  queue                get information regarding JobQueues
  version              print the version
  jar <jar>            run a jar file
  distcp <srcurl> <desturl> copy file or directories recursively
  archive -archiveName NAME <src>* <dest> create a hadoop archive
  daemonlog            get/set the log level for each daemon
 or
  CLASSNAME            run the class named CLASSNAME
Most commands print help when invoked w/o parameters.
I am using the following version of Hadoop:
$ hadoop version
Hadoop 0.20.2
Subversion https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
-r 911707
Compiled by chrisdo on Fri Feb 19 08:07:34 UTC 2010
Inside the accumulo script is the line:
HADOOP_CLASSPATH=`$HADOOP_HOME/bin/hadoop classpath`
This line results in the following exception:
$ $HADOOP_HOME/bin/hadoop classpath
Exception in thread "main" java.lang.NoClassDefFoundError: classpath
Caused by: java.lang.ClassNotFoundException: classpath
        at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
Could not find the main class: classpath. Program will exit.
Am I missing something basic? What?
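[Editor's note: one defensive way to write this line is to try `hadoop
classpath` and fall back to assembling the classpath from the installation
directory when the sub-command is missing. This is only a sketch, not
Accumulo's actual script; the function name and fallback jar layout are
invented, loosely based on a stock 0.20.2 tarball.]

```shell
# Hypothetical helper: use `hadoop classpath` when the sub-command exists
# (on 0.20.2 it exits non-zero with a NoClassDefFoundError), otherwise
# fall back to the conf dir plus the bundled jars.
get_hadoop_classpath() {
    hadoop_bin="$1"
    if cp=$("$hadoop_bin" classpath 2>/dev/null); then
        printf '%s\n' "$cp"
    else
        # Older Hadoop: derive the install root and glob the jars ourselves.
        hadoop_home=$(dirname "$(dirname "$hadoop_bin")")
        printf '%s\n' "$hadoop_home/conf:$hadoop_home/*.jar:$hadoop_home/lib/*.jar"
    fi
}
```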
Re: hadoop classpath causing an exception (sub-command not defined?)
Posted by Josh Elser <jo...@gmail.com>.
Given the work that Billie just committed to allow the user to set the
version of Hadoop (and ZooKeeper) being compiled against (ACCUMULO-876),
I think the best solution would be to substitute the Hadoop-version-specific
commands into the scripts at build time.
That is, of course, assuming that we can't come up with an
all-version-compatible set of scripts.
On 12/14/12 9:21 AM, David Medinets wrote:
> Should we add a Hadoop version check to the accumulo script?
>
Re: hadoop classpath causing an exception (sub-command not defined?)
Posted by David Medinets <da...@gmail.com>.
Should we add a Hadoop version check to the accumulo script?
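[Editor's note: such a version check might be sketched as below. The cutoff
logic is an assumption; this thread only establishes that 0.20.2 lacks the
sub-command and that 1.1.1 has it.]

```shell
# Hypothetical gate: succeed (return 0) only when the given `hadoop version`
# string is one known to support the `classpath` sub-command.
hadoop_has_classpath() {
    case "$1" in
        1.*)    return 0 ;;  # 1.1.1 confirmed working in this thread
        0.20.2) return 1 ;;  # confirmed broken in this thread
        *)      return 1 ;;  # unknown release: assume unsupported, to be safe
    esac
}
```

The script could then feed it the second field of `hadoop version | head -1`
and abort with a clear message instead of dying on a NoClassDefFoundError.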
On Fri, Dec 14, 2012 at 7:45 AM, Jason Trost <ja...@gmail.com> wrote:
> We saw the same issue recently. We upgraded our dev nodes to hadoop 1.1.1
> and it fixed this issue. I'm not sure when the classpath sub-command was
> added to the hadoop command, so a minor upgrade may work too.
>
> --Jason
>
> sent from my DROID
Re: hadoop classpath causing an exception (sub-command not defined?)
Posted by Jason Trost <ja...@gmail.com>.
We saw the same issue recently. We upgraded our dev nodes to hadoop 1.1.1
and it fixed this issue. I'm not sure when the classpath sub-command was
added to the hadoop command, so a minor upgrade may work too.
--Jason
sent from my DROID
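[Editor's note: rather than tying the script to particular version numbers,
it could simply probe whether the sub-command works. A sketch, assuming
$HADOOP_HOME is set; the fallback message is invented.]

```shell
# Probe once: `hadoop classpath` exits non-zero on releases that lack the
# sub-command, so the script can choose a fallback instead of crashing.
if "$HADOOP_HOME/bin/hadoop" classpath >/dev/null 2>&1; then
    HADOOP_CLASSPATH=$("$HADOOP_HOME/bin/hadoop" classpath)
else
    echo "this hadoop does not support 'classpath'; set HADOOP_CLASSPATH manually" >&2
fi
```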
On Dec 14, 2012 7:34 AM, "David Medinets" <da...@gmail.com> wrote:
> It looks to me like the change of Nov 21, 2012 added the 'hadoop
> classpath' call to the accumulo script.
>
> ACCUMULO-708 initial implementation of VFS class loader …
> git-svn-id: https://svn.apache.org/repos/asf/accumulo/trunk@1412398
> 13f79535-47bb-0310-9956-ffa450edef68
> Dave Marion authored 23 days ago
>
> Could the classpath sub-command be part of a newer version (>0.20.2) of
> hadoop?
Re: hadoop classpath causing an exception (sub-command not defined?)
Posted by David Medinets <da...@gmail.com>.
It looks to me like the change of Nov 21, 2012 added the 'hadoop
classpath' call to the accumulo script.
ACCUMULO-708 initial implementation of VFS class loader …
git-svn-id: https://svn.apache.org/repos/asf/accumulo/trunk@1412398
13f79535-47bb-0310-9956-ffa450edef68
Dave Marion authored 23 days ago
Could the classpath sub-command be part of a newer version (>0.20.2) of hadoop?
On Fri, Dec 14, 2012 at 12:18 AM, John Vines <jv...@gmail.com> wrote:
> I didn't think hadoop had a classpath argument, just Accumulo.
>
> Sent from my phone, please pardon the typos and brevity.
Re: hadoop classpath causing an exception (sub-command not defined?)
Posted by John Vines <jv...@gmail.com>.
I didn't think hadoop had a classpath argument, just Accumulo.
Sent from my phone, please pardon the typos and brevity.