Posted to mapreduce-user@hadoop.apache.org by Fatih Haltas <fa...@nyu.edu> on 2013/02/19 16:59:24 UTC

Trouble in running MapReduce application

Hi everyone,

I know it is a common mistake not to specify the fully qualified class name when
trying to run a jar; however, although I do specify it, I am still getting the
ClassNotFound exception.

What may be the reason for it? I have been struggling with this problem for more
than two days. I wrote a different MapReduce application for some analysis and
ran into this problem.

To check whether something is wrong with my system, I tried to run the
WordCount example. When I just run the stock hadoop-examples wordcount, it
works fine.

But as soon as I add "package org.myorg;" at the beginning of the file, it
does not work.

Here is what I have done so far
*************************************************************************
1. I copied the WordCount code from Apache's own examples source code
and changed only the package declaration to "package org.myorg;".
**************************************************************************
2. Then I ran this command:
 *************************************************************************
"hadoop jar wordcount_19_02.jar org.myorg.WordCount
/home/hadoop/project/hadoop-data/NetFlow 19_02_wordcount.output"
*************************************************************************
3. I got the following error:
**************************************************************************
[hadoop@ADUAE042-LAP-V project]$ hadoop jar wordcount_19_02.jar
org.myorg.WordCount /home/hadoop/project/hadoop-data/NetFlow
19_02_wordcount.output
Warning: $HADOOP_HOME is deprecated.

Exception in thread "main" java.lang.ClassNotFoundException:
org.myorg.WordCount
        at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:266)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)

**************************************************************************
4. This is the content of my .jar file:
****************************************************
[hadoop@ADUAE042-LAP-V project]$ jar tf wordcount_19_02.jar
META-INF/
META-INF/MANIFEST.MF
wordcount_classes/
wordcount_classes/org/
wordcount_classes/org/myorg/
wordcount_classes/org/myorg/WordCount.class
wordcount_classes/org/myorg/WordCount$TokenizerMapper.class
wordcount_classes/org/myorg/WordCount$IntSumReducer.class
**********************************************************
5. This is the 'ls' output of my working directory:
**********************************************************
[hadoop@ADUAE042-LAP-V project]$ ls
flowclasses_18_02  flowclasses_18_02.jar  hadoop-1.0.4  hadoop-1.0.4.tar.gz
 hadoop-data  MapReduce.java  sample  wordcount_19_02.jar
 wordcount_classes  WordCountClasses  WordCount.java
*************************************************************
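Worth noting in the `jar tf` listing of step 4: the class files sit under a
`wordcount_classes/` prefix rather than at the jar root, so the class loader
looks for `org/myorg/WordCount.class` and cannot find it, which would explain
the ClassNotFoundException. A minimal sketch of repackaging so the package path
starts at the jar root (the `touch` is a stand-in for the real compiled
classes):

```shell
# Stand-in class tree; in the real project this is the javac output directory.
mkdir -p wordcount_classes/org/myorg
touch wordcount_classes/org/myorg/WordCount.class

# Packaging the directory itself prefixes every entry with its name:
#   jar cf wordcount_19_02.jar wordcount_classes/
# The -C flag changes into the directory first, so entries start at org/:
jar cf wordcount_19_02.jar -C wordcount_classes .

# Entries should now begin with org/myorg/, not wordcount_classes/org/myorg/.
jar tf wordcount_19_02.jar
```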
So, as you can see, the package declaration is fine, but I am really stuck. I
googled, and the answers all say the same thing: specify the package hierarchy
of your main class. I knew that already and I am specifying it, but it still
doesn't work.

I would be much obliged to anyone who can help me.

Regards,

Re: Trouble in running MapReduce application

Posted by Fatih Haltas <fa...@nyu.edu>.
Thank you very much, but no: this is the file in HDFS, and it is the exact path
of the NetFlow data in HDFS. hadoop-data is the HDFS home directory. Before I
downgraded my JDK, this command worked well.
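Both claims can be checked quickly. The sketch below assumes a running cluster;
the `resources.jar` path comes from the traces later in the thread, and
`JAVA_HOME` must point at the downgraded JDK:

```shell
# Confirm the input path really exists in HDFS (not on the local disk):
hadoop fs -ls /home/hadoop/project/hadoop-data/NetFlow

# Sanity-check the downgraded JDK on each node; a healthy install has this jar:
ls "$JAVA_HOME/jre/lib/resources.jar"
java -version
```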

On Sunday, February 24, 2013, sudhakara st wrote:

> Hi,
>      You are specifying the input directory on the local file system, not in
> HDFS. Copy some text files to the HDFS user home directory using '-put' or
> '-copyFromLocal', then try to execute word count with that home directory as
> the input directory.
>
> On Sun, Feb 24, 2013 at 3:29 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>
>
>
> Hi Hemanth;
>
> Thanks for your great help,
>
> I am really much obliged to you.
>
> I solved this problem by changing my Java compiler version, but now, although
> I changed every node's configuration, I am getting this error even when I try
> to run the wordcount example without making any changes.
>
> What may be the reason? I believe I checked all the config files and changed
> the home variables, and /etc/hosts as well.
>
> Here is my problem:
> ************************************************************
> [hadoop@ADUAE042-LAP-V logs]$ hadoop jar ../hadoop-examples-1.0.4.jar
> wordcount /home/hadoop/project/hadoop-data/NetFlow test1353.out
>
> Warning: $HADOOP_HOME is deprecated.
>
> 13/02/24 13:32:28 INFO input.FileInputFormat: Total input paths to process
> : 1
> 13/02/24 13:32:28 INFO util.NativeCodeLoader: Loaded the native-hadoop
> library
> 13/02/24 13:32:28 WARN snappy.LoadSnappy: Snappy native library not loaded
> 13/02/24 13:32:29 INFO mapred.JobClient: Running job: job_201301141457_0034
> 13/02/24 13:32:30 INFO mapred.JobClient:  map 0% reduce 0%
> 13/02/24 13:32:37 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_0, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000002_0: execvp: No such file or directory
> 13/02/24 13:32:43 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000002_0, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000002_0: execvp: No such file or directory
> 13/02/24 13:32:50 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000002_1: execvp: No such file or directory
> 13/02/24 13:32:56 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000002_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000002_1: execvp: No such file or directory
> 13/02/24 13:33:02 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_2, Status : FAILED
> Error initializing attempt_201301141457_0034_m_000002_2:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native M
>
> --
>
> Regards,
> .....  Sudhakara.st
>
>


Re: Trouble in running MapReduce application

Posted by sudhakara st <su...@gmail.com>.
Hi,
     You are specifying the input directory on the local file system, not in
HDFS. Copy some text files to the HDFS user home directory using '-put' or
'-copyFromLocal', then try to execute word count with that home directory as
the input directory.
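A sketch of that suggestion (paths are illustrative, the commands assume a
running cluster, and the output directory must not already exist):

```shell
# Create an input directory in the HDFS user home and copy local data into it:
hadoop fs -mkdir /user/hadoop/input
hadoop fs -put /home/hadoop/project/hadoop-data/NetFlow /user/hadoop/input/

# Run wordcount against the HDFS path, then inspect the result:
hadoop jar hadoop-examples-1.0.4.jar wordcount /user/hadoop/input wc.out
hadoop fs -cat wc.out/part-r-00000 | head
```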

On Sun, Feb 24, 2013 at 3:29 PM, Fatih Haltas <fa...@nyu.edu> wrote:

>
>
> Hi Hemanth;
>
> Thanks for your great help,
>
> I am really much obliged to you.
>
> I solved this problem by changing my Java compiler version, but now, although
> I changed every node's configuration, I am getting this error even when I try
> to run the wordcount example without making any changes.
>
> What may be the reason? I believe I checked all the config files and changed
> the home variables, and /etc/hosts as well.
>
> Here is my problem:
> ************************************************************
> [hadoop@ADUAE042-LAP-V logs]$ hadoop jar ../hadoop-examples-1.0.4.jar
> wordcount /home/hadoop/project/hadoop-data/NetFlow test1353.out
>
> Warning: $HADOOP_HOME is deprecated.
>
> 13/02/24 13:32:28 INFO input.FileInputFormat: Total input paths to process
> : 1
> 13/02/24 13:32:28 INFO util.NativeCodeLoader: Loaded the native-hadoop
> library
> 13/02/24 13:32:28 WARN snappy.LoadSnappy: Snappy native library not loaded
> 13/02/24 13:32:29 INFO mapred.JobClient: Running job: job_201301141457_0034
> 13/02/24 13:32:30 INFO mapred.JobClient:  map 0% reduce 0%
> 13/02/24 13:32:37 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_0, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000002_0: execvp: No such file or directory
> 13/02/24 13:32:43 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000002_0, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000002_0: execvp: No such file or directory
> 13/02/24 13:32:50 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000002_1: execvp: No such file or directory
> 13/02/24 13:32:56 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000002_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000002_1: execvp: No such file or directory
> 13/02/24 13:33:02 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_2, Status : FAILED
> Error initializing attempt_201301141457_0034_m_000002_2:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stdout
> 13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stderr
> 13/02/24 13:33:08 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_0, Status : FAILED
> Error initializing attempt_201301141457_0034_m_000001_0:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stdout
> 13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stderr
> 13/02/24 13:33:11 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000001_0, Status : FAILED
> Error initializing attempt_201301141457_0034_r_000001_0:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stdout
> 13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stderr
> 13/02/24 13:33:19 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000001_1: execvp: No such file or directory
> 13/02/24 13:33:25 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000001_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000001_1: execvp: No such file or directory
> 13/02/24 13:33:31 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_2, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000001_2: execvp: No such file or directory
> 13/02/24 13:33:37 INFO mapred.JobClient: Job complete:
> job_201301141457_0034
> 13/02/24 13:33:37 INFO mapred.JobClient: Counters: 4
> 13/02/24 13:33:37 INFO mapred.JobClient:   Job Counters
> 13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=15112
> 13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all
> reduces waiting after reserving slots (ms)=0
> 13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all maps
> waiting after reserving slots (ms)=0
> 13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=9091
>
>
>
> On Sat, Feb 23, 2013 at 5:05 PM, Hemanth Yamijala <
> yhemanth@thoughtworks.com> wrote:
>
>> Can you try this ? Pick a class like WordCount from your package and
>> execute this command:
>>
>> javap -classpath <path to your jar> -verbose org.myorg.WordCount | grep
>> version
>>
>> For example, here's what I get for my class:
>>
>> $ javap -verbose WCMapper | grep version
>>   minor version: 0
>>   major version: 50
>>
>> Please paste the output of this - we can verify what the problem is.
>>
>> Thanks
>> Hemanth
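If javap is not available on a node, the same check can be done by reading the class-file header directly: the major version is the big-endian two-byte value at offset 6 (50 = Java 6, 51 = Java 7). A small sketch; the WordCount path in the comment is illustrative, not from the thread:

```shell
# Print the class-file major version: a big-endian u2 at byte offset 6.
# 50 -> compiled for Java 6 (1.6); 51 -> Java 7 (1.7). A Java 6 JVM
# rejects version-51 classes with UnsupportedClassVersionError.
class_major_version() {
    od -An -j6 -N2 -t u1 "$1" | awk '{print $1 * 256 + $2}'
}

# Example (path is illustrative):
# class_major_version org/myorg/WordCount.class
```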
>>
>>
>> On Sat, Feb 23, 2013 at 4:45 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>>
>>> Hi again,
>>>
>>> Thanks for your help, but now I am struggling with the same problem on a
>>> machine. As with the previous problem, I downgraded the Java version to Java
>>> 6, but this time I could not solve the problem.
>>>
>>> These are the outputs that may explain the situation:
>>>
>>> ---------------------------------------------------------------------------------------------------------------------------------------------
>>> 1. I could not run my own code; to check the system, I just tried to run the
>>> basic wordcount example without any modification except the package info.
>>> **************************************************
>>> COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow
>>> NetFlow.out
>>> Warning: $HADOOP_HOME is deprecated.
>>>
>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>> org/myorg/WordCount : Unsupported major.minor version 51.0
>>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>>         at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>         at java.lang.Class.forName0(Native Method)
>>>         at java.lang.Class.forName(Class.java:266)
>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>>
>>> **************************************************************************************
>>> 2. Java version:
>>> ********************************
>>> COMMAND EXECUTED: java -version
>>> java version "1.6.0_24"
>>> OpenJDK Runtime Environment (IcedTea6 1.11.6)
>>> (rhel-1.33.1.11.6.el5_9-x86_64)
>>> OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
>>> **********************************
>>> 3. JAVA_HOME variable:
>>> **********************************
>>> COMMAND EXECUTED: echo $JAVA_HOME
>>> /usr/lib/jvm/jre-1.6.0-openjdk.x86_64
>>> ********************************************
>>> 4. HADOOP version:
>>> *******************************************
>>> COMMAND EXECUTED: hadoop version
>>> Warning: $HADOOP_HOME is deprecated.
>>>
>>> Hadoop 1.0.4
>>> Subversion
>>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
>>> 1393290
>>> Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
>>> From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
>>> ********************************************************
>>>
>>> Are these still incompatible with each other (Hadoop version and Java
>>> version)?
>>>
>>>
>>> Thank you very much.
>>>
>>>
>>> On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>>>
>>>> Thank you all very much
>>>>
>>>> On Tuesday, 19 February 2013, Harsh J wrote:
>>>>
>>>>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>>>>> well. Lets drop this thread and carry on there :)
>>>>>
>>>>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>>>>> > Hi,
>>>>> >
>>>>> > The new error usually happens if you compile using Java 7 and try to
>>>>> > run via Java 6 (for example). That is, an incompatibility in the
>>>>> > runtimes for the binary artifact produced.
>>>>> >
>>>>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>>>>> wrote:
>>>>> >> Thank you very much Harsh,
>>>>> >>
>>>>> >> Now, as I promised earlier I am much obliged to you.
>>>>> >>
>>>>> >> But now I solved that problem by just changing the directories and
>>>>> >> then creating a jar file of org again, but I am getting this error:
>>>>> >>
>>>>> >> 1.) What I got
>>>>> >>
>>>>> ------------------------------------------------------------------------------
>>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar
>>>>> flow19028pm.jar
>>>>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow
>>>>> 19_02.out
>>>>> >> Warning: $HADOOP_HOME is deprecated.
>>>>> >>
>>>>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>>>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>>>>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>>>> >>         at
>>>>> >>
>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>>> >>         at
>>>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>>>> >>         at
>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>>>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>>>> >>         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> >>         at
>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>>> >>         at java.lang.Class.forName0(Native Method)
>>>>> >>         at java.lang.Class.forName(Class.java:266)
>>>>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>>>> >>
>>>>> >> 2.) How I create my jar
>>>>> >>
>>>>> -------------------------------------------------------------------------------------
>>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar
>>>>> org
>>>>> >> added manifest
>>>>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>>>>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>>>>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>>>>> >> 690)(deflated 58%)
>>>>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated
>>>>> 43%)
>>>>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>>>>> >> 823)(deflated 56%)
>>>>> >>
>>>>> >> 3.) Content of my jar file
>>>>> >>
>>>>> ---------------------------------------------------------------------------------------
>>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>>>>> >> META-INF/
>>>>> >> META-INF/MANIFEST.MF
>>>>> >> org/
>>>>> >> org/myorg/
>>>>> >> org/myorg/MapReduce$FlowPortReducer.class
>>>>> >> org/myorg/MapReduce.class
>>>>> >> org/myorg/MapReduce$FlowPortMapper.class
>>>>> >>
>>>>> -----------------------------------------------------------------------------------------
>>>>> >>
>>>>> >>
>>>>> >> Thank you very much.
>>>>> >>
>>>>> >>
>>>>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com>
>>>>> wrote:
>>>>> >>>
>>>>> >>> Your point (4) explains the problem. The jar packed structure
>>>>> should
>>>>> >>> look like the below, and not how it is presently (one extra top
>>>>> level
>>>>> >>> dir is present):
>>>>> >>>
>>>>> >>> META-INF/
>>>>> >>> META-INF/MANIFEST.MF
>>>>> >>> org/
>>>>> >>> org/myorg/
>>>>> >>> org/myorg/WordCount.class
>>>>> >>> org/myorg/WordCount$TokenizerMapper.class
>>>>> >>> org/myorg/WordCount$IntSumReducer.class
>>>>> >>>
>>>>> >>> --
>>>>> >>> Harsh J
>>>>>
>>>>
>>>
>>
>
>


-- 

Regards,
.....  Sudhakara.st
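For the "Unsupported major.minor version 51.0" errors quoted above: the jar was compiled with Java 7 (class version 51) while the cluster runs Java 6 (version 50). One fix, besides compiling with a Java 6 JDK, is to cross-compile on JDK 7 for the older runtime. A sketch under assumptions: the hadoop-core jar path and source file name are illustrative, not from the thread.

```shell
# Compile against the cluster's Hadoop jar, targeting the Java 6 class format.
# On JDK 7, -source/-target 1.6 produce version-50 class files that a Java 6
# JVM will load. The hadoop-core path is an assumption; adjust to the install.
javac -source 1.6 -target 1.6 \
      -classpath /usr/local/hadoop/hadoop-core-1.0.4.jar \
      -d classes WordCount.java

# Package so that org/ sits at the jar root, as Harsh noted earlier in the thread.
jar cvf wordcount.jar -C classes org
```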

Re: Trouble in running MapReduce application

Posted by sudhakara st <su...@gmail.com>.
Hi,
     You are specifying the input directory in the local file system, not in
HDFS. Copy some text files to HDFS using '-put' or '-copyFromLocal' into the
HDFS user home directory, then try to execute wordcount with that home
directory as the input.
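A sketch of that sequence, reusing the local NetFlow path quoted in this thread; the HDFS target directory and output name are illustrative assumptions:

```shell
# Put the local input into HDFS, then run the job against the HDFS path.
# All paths are illustrative; adjust to the actual cluster layout.
hadoop fs -mkdir /user/hadoop/netflow
hadoop fs -put /home/hadoop/project/hadoop-data/NetFlow /user/hadoop/netflow/
hadoop fs -ls /user/hadoop/netflow            # confirm the input is in HDFS

hadoop jar wordcount_19_02.jar org.myorg.WordCount \
    /user/hadoop/netflow/NetFlow wordcount.output
```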

On Sun, Feb 24, 2013 at 3:29 PM, Fatih Haltas <fa...@nyu.edu> wrote:

>
>
> Hi Hemanth;
>
> Thanks for your great help,
>
> I am really much obliged to you.
>
> I solved this problem by changing my Java compiler version, but now, even
> though I changed every node's configuration, I am getting this error when I
> try to run the wordcount example without making any changes.
>
> What may be the reason? I believe I checked all the config files, changed the
> home variables, and also /etc/hosts.
>
> Here is my problem:
> ************************************************************
> [hadoop@ADUAE042-LAP-V logs]$ hadoop jar ../hadoop-examples-1.0.4.jar
> wordcount /home/hadoop/project/hadoop-data/NetFlow test1353.out
>
> Warning: $HADOOP_HOME is deprecated.
>
> 13/02/24 13:32:28 INFO input.FileInputFormat: Total input paths to process
> : 1
> 13/02/24 13:32:28 INFO util.NativeCodeLoader: Loaded the native-hadoop
> library
> 13/02/24 13:32:28 WARN snappy.LoadSnappy: Snappy native library not loaded
> 13/02/24 13:32:29 INFO mapred.JobClient: Running job: job_201301141457_0034
> 13/02/24 13:32:30 INFO mapred.JobClient:  map 0% reduce 0%
> 13/02/24 13:32:37 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_0, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000002_0: execvp: No such file or directory
> 13/02/24 13:32:43 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000002_0, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000002_0: execvp: No such file or directory
> 13/02/24 13:32:50 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000002_1: execvp: No such file or directory
> 13/02/24 13:32:56 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000002_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000002_1: execvp: No such file or directory
> 13/02/24 13:33:02 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_2, Status : FAILED
> Error initializing attempt_201301141457_0034_m_000002_2:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stdout
> 13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stderr
> 13/02/24 13:33:08 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_0, Status : FAILED
> Error initializing attempt_201301141457_0034_m_000001_0:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stdout
> 13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stderr
> 13/02/24 13:33:11 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000001_0, Status : FAILED
> Error initializing attempt_201301141457_0034_r_000001_0:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stdout
> 13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stderr
> 13/02/24 13:33:19 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000001_1: execvp: No such file or directory
> 13/02/24 13:33:25 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000001_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000001_1: execvp: No such file or directory
> 13/02/24 13:33:31 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_2, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000001_2: execvp: No such file or directory
> 13/02/24 13:33:37 INFO mapred.JobClient: Job complete:
> job_201301141457_0034
> 13/02/24 13:33:37 INFO mapred.JobClient: Counters: 4
> 13/02/24 13:33:37 INFO mapred.JobClient:   Job Counters
> 13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=15112
> 13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all
> reduces waiting after reserving slots (ms)=0
> 13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all maps
> waiting after reserving slots (ms)=0
> 13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=9091
>
>
>
> On Sat, Feb 23, 2013 at 5:05 PM, Hemanth Yamijala <
> yhemanth@thoughtworks.com> wrote:
>
>> Can you try this ? Pick a class like WordCount from your package and
>> execute this command:
>>
>> javap -classpath <path to your jar> -verbose org.myorg.WordCount | grep
>> version
>>
>> For example, here's what I get for my class:
>>
>> $ javap -verbose WCMapper | grep version
>>   minor version: 0
>>   major version: 50
>>
>> Please paste the output of this - we can verify what the problem is.
>>
>> Thanks
>> Hemanth
>>
>>
>> On Sat, Feb 23, 2013 at 4:45 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>>
>>> Hi again,
>>>
>>> Thanks for your help, but now I am struggling with the same problem on a
>>> machine. As with the previous problem, I downgraded the Java version to Java
>>> 6, but this time I could not solve the problem.
>>>
>>> These are the outputs that may explain the situation:
>>>
>>> ---------------------------------------------------------------------------------------------------------------------------------------------
>>> 1. I could not run my own code; to check the system, I just tried to run the
>>> basic wordcount example without any modification except the package info.
>>> **************************************************
>>> COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow
>>> NetFlow.out
>>> Warning: $HADOOP_HOME is deprecated.
>>>
>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>> org/myorg/WordCount : Unsupported major.minor version 51.0
>>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>>         at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>         at java.lang.Class.forName0(Native Method)
>>>         at java.lang.Class.forName(Class.java:266)
>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>>
>>> **************************************************************************************
>>> 2. Java version:
>>> ********************************
>>> COMMAND EXECUTED: java -version
>>> java version "1.6.0_24"
>>> OpenJDK Runtime Environment (IcedTea6 1.11.6)
>>> (rhel-1.33.1.11.6.el5_9-x86_64)
>>> OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
>>> **********************************
>>> 3. JAVA_HOME variable:
>>> **********************************
>>> COMMAND EXECUTED: echo $JAVA_HOME
>>> /usr/lib/jvm/jre-1.6.0-openjdk.x86_64
>>> ********************************************
>>> 4. HADOOP version:
>>> *******************************************
>>> COMMAND EXECUTED: hadoop version
>>> Warning: $HADOOP_HOME is deprecated.
>>>
>>> Hadoop 1.0.4
>>> Subversion
>>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
>>> 1393290
>>> Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
>>> From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
>>> ********************************************************
>>>
>>> Are these still incompatible with each other? (Hadoop version and Java
>>> version)
>>>
>>>
>>> Thank you very much.
>>>
>>>
>>> On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>>>
>>>> Thank you all very much
>>>>
>>>> On Tuesday, 19 February 2013, Harsh J wrote:
>>>>
>>>>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>>>>> well. Lets drop this thread and carry on there :)
>>>>>
>>>>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>>>>> > Hi,
>>>>> >
>>>>> > The new error usually happens if you compile using Java 7 and try to
>>>>> > run via Java 6 (for example). That is, an incompatibility in the
>>>>> > runtimes for the binary artifact produced.
>>>>> >
>>>>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>>>>> wrote:
>>>>> >> Thank you very much Harsh,
>>>>> >>
>>>>> >> Now, as I promised earlier I am much obliged to you.
>>>>> >>
>>>>> >> But, now I solved that problem by just changing the directories
>>>>> then again
>>>>> >> creating a jar file of org. but I am getting this error:
>>>>> >>
>>>>> >> 1.) What I got
>>>>> >>
>>>>> ------------------------------------------------------------------------------
>>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar
>>>>> flow19028pm.jar
>>>>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow
>>>>> 19_02.out
>>>>> >> Warning: $HADOOP_HOME is deprecated.
>>>>> >>
>>>>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>>>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>>>>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>>>> >>         at
>>>>> >>
>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>>> >>         at
>>>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>>>> >>         at
>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>>>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>>>> >>         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> >>         at
>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>>> >>         at java.lang.Class.forName0(Native Method)
>>>>> >>         at java.lang.Class.forName(Class.java:266)
>>>>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>>>> >>
>>>>> >> 2.) How I create my jar
>>>>> >>
>>>>> -------------------------------------------------------------------------------------
>>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar
>>>>> org
>>>>> >> added manifest
>>>>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>>>>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>>>>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>>>>> >> 690)(deflated 58%)
>>>>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated
>>>>> 43%)
>>>>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>>>>> >> 823)(deflated 56%)
>>>>> >>
>>>>> >> 3.) Content of my jar file
>>>>> >>
>>>>> ---------------------------------------------------------------------------------------
>>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>>>>> >> META-INF/
>>>>> >> META-INF/MANIFEST.MF
>>>>> >> org/
>>>>> >> org/myorg/
>>>>> >> org/myorg/MapReduce$FlowPortReducer.class
>>>>> >> org/myorg/MapReduce.class
>>>>> >> org/myorg/MapReduce$FlowPortMapper.class
>>>>> >>
>>>>> -----------------------------------------------------------------------------------------
>>>>> >>
>>>>> >>
>>>>> >> Thank you very much.
>>>>> >>
>>>>> >>
>>>>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com>
>>>>> wrote:
>>>>> >>>
>>>>> >>> Your point (4) explains the problem. The jar packed structure
>>>>> should
>>>>> >>> look like the below, and not how it is presently (one extra top
>>>>> level
>>>>> >>> dir is present):
>>>>> >>>
>>>>> >>> META-INF/
>>>>> >>> META-INF/MANIFEST.MF
>>>>> >>> org/
>>>>> >>> org/myorg/
>>>>> >>> org/myorg/WordCount.class
>>>>> >>> org/myorg/WordCount$TokenizerMapper.class
>>>>> >>> org/myorg/WordCount$IntSumReducer.class
>>>>>
>>>>> --
>>>>> Harsh J
>>>>>
>>>>
>>>
>>
>
>
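Harsh's diagnosis in the quoted thread above (an extra top-level directory inside the jar) can be checked mechanically: `hadoop jar <jar> org.myorg.WordCount` looks for the entry `org/myorg/WordCount.class`, so any extra directory above `org/` makes the lookup fail with ClassNotFoundException. A minimal sketch, with illustrative helper names (only the entry paths come from the thread):

```python
import io
import zipfile

def has_class(jar_bytes: bytes, fqcn: str) -> bool:
    """True if the jar contains the entry that `hadoop jar <jar> <fqcn>` expects."""
    entry = fqcn.replace(".", "/") + ".class"
    with zipfile.ZipFile(io.BytesIO(jar_bytes)) as jar:
        return entry in jar.namelist()

def make_jar(entries):
    """Build an in-memory jar with the given (empty) entries."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as jar:
        for name in entries:
            jar.writestr(name, b"")
    return buf.getvalue()

# Correctly packed: the class sits directly under org/myorg/.
good = make_jar(["META-INF/MANIFEST.MF", "org/myorg/WordCount.class"])
# Mis-packed: a hypothetical extra top-level directory above org/.
bad = make_jar(["META-INF/MANIFEST.MF", "classes/org/myorg/WordCount.class"])

print(has_class(good, "org.myorg.WordCount"))  # True
print(has_class(bad, "org.myorg.WordCount"))   # False
```

Running `jar tf` on the jar, as done in the thread, shows the same entry names this check inspects.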


-- 

Regards,
.....  Sudhakara.st
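The "Unsupported major.minor version 51.0" errors quoted above come straight from the class-file header: every .class file begins with the magic number 0xCAFEBABE followed by a two-byte minor and two-byte major version, and major 51 marks a class compiled for Java 7, which a Java 6 JVM (major 50) refuses to load. A small illustrative parser (the helper name and mapping table are ours, not from the thread):

```python
import struct

# Class-file major versions and the Java release that produces them.
MAJOR_TO_JAVA = {49: "Java 5", 50: "Java 6", 51: "Java 7", 52: "Java 8"}

def class_file_version(data: bytes):
    """Return (minor, major) from the first 8 bytes of a .class file."""
    magic, minor, major = struct.unpack(">IHH", data[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a Java class file")
    return minor, major

# A synthetic header: magic, minor 0, major 51 -- what javap reports
# for a class compiled by a Java 7 javac.
header = bytes.fromhex("cafebabe00000033")
minor, major = class_file_version(header)
print(minor, major, MAJOR_TO_JAVA[major])  # 0 51 Java 7
```

This is the same information the `javap -verbose ... | grep version` command suggested above extracts.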

Re: Trouble in running MapReduce application

Posted by sudhakara st <su...@gmail.com>.
Hi,
     You are specifying the input directory on the local file system, not
in HDFS. Copy some text files into your HDFS user home directory using
'-put' or '-copyFromLocal', then try to execute word count with that home
directory as the input.

On Sun, Feb 24, 2013 at 3:29 PM, Fatih Haltas <fa...@nyu.edu> wrote:

>
>
> Hi Hemanth;
>
> Thanks for your great help,
>
> I am really much obliged to you.
>
> I solved that problem by changing my Java compiler version, but now, even
> though I changed every node's configuration, I am getting this error when
> I try to run the wordcount example without making any changes.
>
> What may be the reason? I believe I checked all the config files and
> changed the home variables, as well as /etc/hosts.
>
> Here is my problem:
> ************************************************************
> [hadoop@ADUAE042-LAP-V logs]$ hadoop jar ../hadoop-examples-1.0.4.jar
> wordcount /home/hadoop/project/hadoop-data/NetFlow test1353.out
>
> Warning: $HADOOP_HOME is deprecated.
>
> 13/02/24 13:32:28 INFO input.FileInputFormat: Total input paths to process
> : 1
> 13/02/24 13:32:28 INFO util.NativeCodeLoader: Loaded the native-hadoop
> library
> 13/02/24 13:32:28 WARN snappy.LoadSnappy: Snappy native library not loaded
> 13/02/24 13:32:29 INFO mapred.JobClient: Running job: job_201301141457_0034
> 13/02/24 13:32:30 INFO mapred.JobClient:  map 0% reduce 0%
> 13/02/24 13:32:37 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_0, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000002_0: execvp: No such file or directory
> 13/02/24 13:32:43 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000002_0, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000002_0: execvp: No such file or directory
> 13/02/24 13:32:50 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000002_1: execvp: No such file or directory
> 13/02/24 13:32:56 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000002_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000002_1: execvp: No such file or directory
> 13/02/24 13:33:02 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000002_2, Status : FAILED
> Error initializing attempt_201301141457_0034_m_000002_2:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stdout
> 13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stderr
> 13/02/24 13:33:08 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_0, Status : FAILED
> Error initializing attempt_201301141457_0034_m_000001_0:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stdout
> 13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stderr
> 13/02/24 13:33:11 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000001_0, Status : FAILED
> Error initializing attempt_201301141457_0034_r_000001_0:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stdout
> 13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stderr
> 13/02/24 13:33:19 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000001_1: execvp: No such file or directory
> 13/02/24 13:33:25 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000001_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000001_1: execvp: No such file or directory
> 13/02/24 13:33:31 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_2, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000001_2: execvp: No such file or directory
> 13/02/24 13:33:37 INFO mapred.JobClient: Job complete:
> job_201301141457_0034
> 13/02/24 13:33:37 INFO mapred.JobClient: Counters: 4
> 13/02/24 13:33:37 INFO mapred.JobClient:   Job Counters
> 13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=15112
> 13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all
> reduces waiting after reserving slots (ms)=0
> 13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all maps
> waiting after reserving slots (ms)=0
> 13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=9091
>


-- 

Regards,
.....  Sudhakara.st

>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stdout
> 13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stderr
> 13/02/24 13:33:08 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_0, Status : FAILED
> Error initializing attempt_201301141457_0034_m_000001_0:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stdout
> 13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stderr
> 13/02/24 13:33:11 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000001_0, Status : FAILED
> Error initializing attempt_201301141457_0034_r_000001_0:
> java.lang.InternalError
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at
> java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at
> java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
>         at
> com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
>         at
> com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
>         at
> org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
>         at
> org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
>         at
> org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:416)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>         at
> org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
>         at
> org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
>         at
> org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
>         at java.lang.Thread.run(Thread.java:679)
> Caused by: java.io.FileNotFoundException:
> /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
>         at
> sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
>         at
> sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
>         at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at
> sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
>         at
> sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
>         ... 27 more
>
>  13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stdout
> 13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
> outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stderr
> 13/02/24 13:33:19 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000001_1: execvp: No such file or directory
> 13/02/24 13:33:25 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_r_000001_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000001_1: execvp: No such file or directory
> 13/02/24 13:33:31 INFO mapred.JobClient: Task Id :
> attempt_201301141457_0034_m_000001_2, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000001_2: execvp: No such file or directory
> 13/02/24 13:33:37 INFO mapred.JobClient: Job complete:
> job_201301141457_0034
> 13/02/24 13:33:37 INFO mapred.JobClient: Counters: 4
> 13/02/24 13:33:37 INFO mapred.JobClient:   Job Counters
> 13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=15112
> 13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all
> reduces waiting after reserving slots (ms)=0
> 13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all maps
> waiting after reserving slots (ms)=0
> 13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=9091
>
>
>
> On Sat, Feb 23, 2013 at 5:05 PM, Hemanth Yamijala <
> yhemanth@thoughtworks.com> wrote:
>
>> Can you try this ? Pick a class like WordCount from your package and
>> execute this command:
>>
>> javap -classpath <path to your jar> -verbose org.myorg.Wordcount | grep
>> version.
>>
>> For e.g. here's what I get for my class:
>>
>> $ javap -verbose WCMapper | grep version
>>   minor version: 0
>>   major version: 50
>>
>> Please paste the output of this - we can verify what the problem is.
>>
>> Thanks
>> Hemanth
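For interpreting those numbers: class-file major version 50 corresponds to Java 6 and 51 to Java 7. As a self-contained sketch, the major version can also be read directly from the class file's header bytes (a dummy Java 6 header is created here; in practice you would point od at the real WordCount.class extracted from the jar):

```shell
# Every .class file starts with the magic CA FE BA BE, then a 2-byte minor
# version, then a 2-byte big-endian major version (50 = Java 6, 51 = Java 7).
# Write a dummy header whose major version is 0x32 = 50:
printf '\312\376\272\276\000\000\000\062' > /tmp/Demo.class

# Skip the 6 leading bytes, read the 2 major-version bytes, combine them:
major=$(od -An -j6 -N2 -tu1 /tmp/Demo.class | awk '{print $1*256 + $2}')
echo "major version: $major"   # prints 50 for this Java 6 header
```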
>>
>>
>> On Sat, Feb 23, 2013 at 4:45 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>>
>>> Hi again,
>>>
>>> Thanks for your help, but now I am struggling with the same problem on
>>> another machine. As with the previous problem, I downgraded to Java 6, but
>>> this time I could not solve the problem.
>>>
>>> These are the outputs that may explain the situation:
>>>
>>> ---------------------------------------------------------------------------------------------------------------------------------------------
>>> 1. I could not run my own code; to check the system, I tried to run the
>>> basic wordcount example without any modification except the package info.
>>> **************************************************
>>> COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow
>>> NetFlow.out
>>> Warning: $HADOOP_HOME is deprecated.
>>>
>>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>> org/myorg/WordCount : Unsupported major.minor version 51.0
>>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>>         at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>         at java.lang.Class.forName0(Native Method)
>>>         at java.lang.Class.forName(Class.java:266)
>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>>
>>> **************************************************************************************
>>> 2. Java version:
>>> ********************************
>>> COMMAND EXECUTED: java -version
>>> java version "1.6.0_24"
>>> OpenJDK Runtime Environment (IcedTea6 1.11.6)
>>> (rhel-1.33.1.11.6.el5_9-x86_64)
>>> OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
>>> **********************************
>>> 3. JAVA_HOME variable:
>>> **********************************
>>> COMMAND EXECUTED: echo $JAVA_HOME
>>> /usr/lib/jvm/jre-1.6.0-openjdk.x86_64
>>> ********************************************
>>> 4. HADOOP version:
>>> *******************************************
>>> COMMAND EXECUTED: hadoop version
>>> Warning: $HADOOP_HOME is deprecated.
>>>
>>> Hadoop 1.0.4
>>> Subversion
>>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
>>> 1393290
>>> Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
>>> From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
>>> ********************************************************
>>>
>>> Are these still incompatible with each other (Hadoop version and Java
>>> version)?
>>>
>>>
>>> Thank you very much.
>>>
>>>
>>> On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>>>
>>>> Thank you all very much
>>>>
>>>> On Tuesday, 19 February 2013, Harsh J wrote:
>>>>
>>>>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>>>>> well. Lets drop this thread and carry on there :)
>>>>>
>>>>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>>>>> > Hi,
>>>>> >
>>>>> > The new error usually happens if you compile using Java 7 and try to
>>>>> > run via Java 6 (for example). That is, an incompatibility in the
>>>>> > runtimes for the binary artifact produced.
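A minimal sketch of that fix: compile with a Java 6 source/target so the produced classes get major version 50 and load on a Java 6 JVM. The javac flags are standard; the jar name and paths below are illustrative:

```shell
# Compile against the Java 6 class-file format (even when javac itself is
# from JDK 7), so the resulting classes run on a Java 6 runtime:
javac -source 1.6 -target 1.6 \
      -classpath hadoop-core-1.0.4.jar \
      -d classes/ WordCount.java

# The classes under classes/ will now have major version 50 instead of 51.
```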
>>>>> >
>>>>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>>>>> wrote:
>>>>> >> Thank you very much Harsh,
>>>>> >>
>>>>> >> Now, as I promised earlier I am much obliged to you.
>>>>> >>
>>>>> >> But now I solved that problem by just changing the directories and
>>>>> >> then creating a jar file of org again, but I am getting this error:
>>>>> >>
>>>>> >> 1.) What I got
>>>>> >>
>>>>> ------------------------------------------------------------------------------
>>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar
>>>>> flow19028pm.jar
>>>>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow
>>>>> 19_02.out
>>>>> >> Warning: $HADOOP_HOME is deprecated.
>>>>> >>
>>>>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>>>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>>>>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>>>> >>         at
>>>>> >>
>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>>> >>         at
>>>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>>>> >>         at
>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>>>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>>>> >>         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> >>         at
>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>>> >>         at java.lang.Class.forName0(Native Method)
>>>>> >>         at java.lang.Class.forName(Class.java:266)
>>>>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>>>> >>
>>>>> >> 2.) How I create my jar
>>>>> >>
>>>>> -------------------------------------------------------------------------------------
>>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar
>>>>> org
>>>>> >> added manifest
>>>>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>>>>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>>>>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>>>>> >> 690)(deflated 58%)
>>>>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated
>>>>> 43%)
>>>>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>>>>> >> 823)(deflated 56%)
>>>>> >>
>>>>> >> 3.) Content of my jar file
>>>>> >>
>>>>> ---------------------------------------------------------------------------------------
>>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>>>>> >> META-INF/
>>>>> >> META-INF/MANIFEST.MF
>>>>> >> org/
>>>>> >> org/myorg/
>>>>> >> org/myorg/MapReduce$FlowPortReducer.class
>>>>> >> org/myorg/MapReduce.class
>>>>> >> org/myorg/MapReduce$FlowPortMapper.class
>>>>> >>
>>>>> -----------------------------------------------------------------------------------------
>>>>> >>
>>>>> >>
>>>>> >> Thank you very much.
>>>>> >>
>>>>> >>
>>>>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com>
>>>>> wrote:
>>>>> >>>
>>>>> >>> Your point (4) explains the problem. The jar packed structure
>>>>> should
>>>>> >>> look like the below, and not how it is presently (one extra top
>>>>> level
>>>>> >>> dir is present):
>>>>> >>>
>>>>> >>> META-INF/
>>>>> >>> META-INF/MANIFEST.MF
>>>>> >>> org/
>>>>> >>> org/myorg/
>>>>> >>> org/myorg/WordCount.class
>>>>> >>> org/myorg/WordCount$TokenizerMapper.class
>>>>> >>> org/myorg/WordCount$IntSumReducer.class
>>>>> --
>>>>> Harsh J
>>>>>
>>>>
>>>
>>
>
>


-- 

Regards,
.....  Sudhakara.st

Fwd: Trouble in running MapReduce application

Posted by Fatih Haltas <fa...@nyu.edu>.
Hi Hemanth;

Thanks for your great help,

I am really much obliged to you.

I solved this problem by changing my Java compiler version, but now, though I
changed every node's configuration, I am getting this error even when I try to
run the wordcount example without making any changes.

What may be the reason? I believe I checked all config files and
changed the home variables, as well as /etc/hosts.

Here is my problem:
************************************************************
[hadoop@ADUAE042-LAP-V logs]$ hadoop jar ../hadoop-examples-1.0.4.jar
wordcount /home/hadoop/project/hadoop-data/NetFlow test1353.out

Warning: $HADOOP_HOME is deprecated.

13/02/24 13:32:28 INFO input.FileInputFormat: Total input paths to process
: 1
13/02/24 13:32:28 INFO util.NativeCodeLoader: Loaded the native-hadoop
library
13/02/24 13:32:28 WARN snappy.LoadSnappy: Snappy native library not loaded
13/02/24 13:32:29 INFO mapred.JobClient: Running job: job_201301141457_0034
13/02/24 13:32:30 INFO mapred.JobClient:  map 0% reduce 0%
13/02/24 13:32:37 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000002_0, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000002_0: execvp: No such file or directory
13/02/24 13:32:43 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000002_0, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_r_000002_0: execvp: No such file or directory
13/02/24 13:32:50 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000002_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000002_1: execvp: No such file or directory
13/02/24 13:32:56 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000002_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_r_000002_1: execvp: No such file or directory
13/02/24 13:33:02 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000002_2, Status : FAILED
Error initializing attempt_201301141457_0034_m_000002_2:
java.lang.InternalError
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
        at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
        at java.lang.ClassLoader.getResource(ClassLoader.java:974)
        at java.lang.ClassLoader.getResource(ClassLoader.java:972)
        at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
        at
java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
        at java.lang.Class.getResourceAsStream(Class.java:2045)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
        at java.security.AccessController.doPrivileged(Native Method)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
        at
org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
        at
org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at
org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
        at
org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
        at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.FileNotFoundException:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
        at sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
        at sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
        ... 27 more

13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stdout
13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stderr
13/02/24 13:33:08 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_0, Status : FAILED
Error initializing attempt_201301141457_0034_m_000001_0:
java.lang.InternalError
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
        at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
        at java.lang.ClassLoader.getResource(ClassLoader.java:974)
        at java.lang.ClassLoader.getResource(ClassLoader.java:972)
        at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
        at
java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
        at java.lang.Class.getResourceAsStream(Class.java:2045)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
        at java.security.AccessController.doPrivileged(Native Method)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
        at
org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
        at
org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at
org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
        at
org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
        at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.FileNotFoundException:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
        at sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
        at sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
        ... 27 more

13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stdout
13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stderr
13/02/24 13:33:11 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000001_0, Status : FAILED
Error initializing attempt_201301141457_0034_r_000001_0:
java.lang.InternalError
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
        at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
        at java.lang.ClassLoader.getResource(ClassLoader.java:974)
        at java.lang.ClassLoader.getResource(ClassLoader.java:972)
        at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
        at
java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
        at java.lang.Class.getResourceAsStream(Class.java:2045)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
        at java.security.AccessController.doPrivileged(Native Method)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
        at
org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
        at
org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at
org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
        at
org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
        at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.FileNotFoundException:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
        at sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
        at sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
        ... 27 more

13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stdout
13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stderr
13/02/24 13:33:19 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000001_1: execvp: No such file or directory
13/02/24 13:33:25 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000001_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_r_000001_1: execvp: No such file or directory
13/02/24 13:33:31 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_2, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000001_2: execvp: No such file or directory
13/02/24 13:33:37 INFO mapred.JobClient: Job complete: job_201301141457_0034
13/02/24 13:33:37 INFO mapred.JobClient: Counters: 4
13/02/24 13:33:37 INFO mapred.JobClient:   Job Counters
13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=15112
13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0
13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=9091



On Sat, Feb 23, 2013 at 5:05 PM, Hemanth Yamijala <yhemanth@thoughtworks.com
> wrote:

> Can you try this ? Pick a class like WordCount from your package and
> execute this command:
>
> javap -classpath <path to your jar> -verbose org.myorg.WordCount | grep
> version
>
> For e.g. here's what I get for my class:
>
> $ javap -verbose WCMapper | grep version
>   minor version: 0
>   major version: 50
>
> Please paste the output of this - we can verify what the problem is.
>
> Thanks
> Hemanth
>
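Hemanth's `javap` check reads the version fields from the class file header: every Java class file starts with the magic number 0xCAFEBABE followed by a two-byte minor and two-byte major version, and the major version identifies the compiler's target release (50 for Java 6, 51 for Java 7). As a rough illustration of what `javap -verbose ... | grep version` reports, here is a minimal Python sketch that parses those header bytes directly (the version table is the standard mapping, trimmed to the releases discussed in this thread):

```python
import struct

# Standard class-file major versions for the JDK releases discussed here.
MAJOR_TO_JDK = {49: "Java 5", 50: "Java 6", 51: "Java 7", 52: "Java 8"}

def class_file_version(data: bytes):
    """Return (minor, major) from the first 8 bytes of a .class file."""
    magic, minor, major = struct.unpack(">IHH", data[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a Java class file")
    return minor, major

# A class compiled for Java 7 starts: magic 0xCAFEBABE, minor 0, major 51 (0x33).
header = bytes.fromhex("cafebabe00000033")
minor, major = class_file_version(header)
print(minor, major, MAJOR_TO_JDK.get(major))  # 0 51 Java 7
```

A major version of 51 in this output would confirm the jar was compiled for Java 7, which a Java 6 runtime refuses with exactly the "Unsupported major.minor version 51.0" error seen below.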
>
> On Sat, Feb 23, 2013 at 4:45 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>
>> Hi again,
>>
>> Thanks for your help, but now I am struggling with the same problem on
>> another machine. As with the previous problem, I downgraded the Java
>> version to Java 6, but this time it did not solve the problem.
>>
>> These outputs may explain the situation:
>>
>> ---------------------------------------------------------------------------------------------------------------------------------------------
>> 1. I could not run my own code; to check the system, I tried to run the
>> basic WordCount example without any modification except the package info.
>> **************************************************
>> COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow
>> NetFlow.out
>> Warning: $HADOOP_HOME is deprecated.
>>
>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>> org/myorg/WordCount : Unsupported major.minor version 51.0
>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>         at
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>         at java.lang.Class.forName0(Native Method)
>>         at java.lang.Class.forName(Class.java:266)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>
>> **************************************************************************************
>> 2. Java version:
>> ********************************
>> COMMAND EXECUTED: java -version
>> java version "1.6.0_24"
>> OpenJDK Runtime Environment (IcedTea6 1.11.6)
>> (rhel-1.33.1.11.6.el5_9-x86_64)
>> OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
>> **********************************
>> 3. JAVA_HOME variable:
>> **********************************
>> COMMAND EXECUTED: echo $JAVA_HOME
>> /usr/lib/jvm/jre-1.6.0-openjdk.x86_64
>> ********************************************
>> 4. HADOOP version:
>> *******************************************
>> COMMAND EXECUTED: hadoop version
>> Warning: $HADOOP_HOME is deprecated.
>>
>> Hadoop 1.0.4
>> Subversion
>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
>> 1393290
>> Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
>> From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
>> ********************************************************
>>
>> Are these (the Hadoop version and the Java version) still incompatible
>> with each other?
>>
>>
>> Thank you very much.
>>
>>
>> On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>>
>>> Thank you all very much
>>>
>>> On Tuesday, 19 February 2013, Harsh J wrote:
>>>
>>>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>>>> well. Let's drop this thread and carry on there :)
>>>>
>>>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>>>> > Hi,
>>>> >
>>>> > The new error usually happens if you compile using Java 7 and try to
>>>> > run via Java 6 (for example). That is, an incompatibility in the
>>>> > runtimes for the binary artifact produced.
>>>> >
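Harsh's explanation can be modeled as a simple gate: a runtime accepts class files at or below the major version it supports (50 for Java 6, 51 for Java 7), so major-51 bytecode on a Java 6 JVM fails with UnsupportedClassVersionError, while older bytecode runs fine on newer JVMs. Recompiling on a Java 6 JDK, or with `javac -source 1.6 -target 1.6`, produces major-50 classes. A toy sketch of that check (not the JVM's actual implementation):

```python
# Toy model of the class-file version gate a JVM applies at class-load time.
RUNTIME_MAX_MAJOR = {"Java 6": 50, "Java 7": 51, "Java 8": 52}

def loads_ok(class_major: int, runtime: str) -> bool:
    """A runtime accepts bytecode at or below the major version it supports."""
    return class_major <= RUNTIME_MAX_MAJOR[runtime]

print(loads_ok(51, "Java 6"))  # False -> UnsupportedClassVersionError: 51.0
print(loads_ok(50, "Java 7"))  # True  -> newer runtimes run older bytecode
```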
>>>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>>>> wrote:
>>>> >> Thank you very much Harsh,
>>>> >>
>>>> >> Now, as I promised earlier I am much obliged to you.
>>>> >>
>>>> >> But now I solved that problem by just changing the directories and
>>>> >> then creating a jar file of org/ again, but I am getting this error:
>>>> >>
>>>> >> 1.) What I got
>>>> >>
>>>> ------------------------------------------------------------------------------
>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar
>>>> flow19028pm.jar
>>>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow
>>>> 19_02.out
>>>> >> Warning: $HADOOP_HOME is deprecated.
>>>> >>
>>>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>>>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>>> >>         at
>>>> >>
>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>> >>         at
>>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>>> >>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>>> >>         at java.security.AccessController.doPrivileged(Native Method)
>>>> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>> >>         at java.lang.Class.forName0(Native Method)
>>>> >>         at java.lang.Class.forName(Class.java:266)
>>>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>>> >>
>>>> >> 2.) How I create my jar
>>>> >>
>>>> -------------------------------------------------------------------------------------
>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar
>>>> org
>>>> >> added manifest
>>>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>>>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>>>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>>>> >> 690)(deflated 58%)
>>>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>>>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>>>> >> 823)(deflated 56%)
>>>> >>
>>>> >> 3.) Content of my jar file
>>>> >>
>>>> ---------------------------------------------------------------------------------------
>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>>>> >> META-INF/
>>>> >> META-INF/MANIFEST.MF
>>>> >> org/
>>>> >> org/myorg/
>>>> >> org/myorg/MapReduce$FlowPortReducer.class
>>>> >> org/myorg/MapReduce.class
>>>> >> org/myorg/MapReduce$FlowPortMapper.class
>>>> >>
>>>> -----------------------------------------------------------------------------------------
>>>> >>
>>>> >>
>>>> >> Thank you very much.
>>>> >>
>>>> >>
>>>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>>>> >>>
>>>> >>> Your point (4) explains the problem. The jar's packed structure
>>>> >>> should look like the below, and not how it is presently (one extra
>>>> >>> top-level dir is present):
>>>> >>>
>>>> >>> META-INF/
>>>> >>> META-INF/MANIFEST.MF
>>>> >>> org/
>>>> >>> org/myorg/
>>>> >>> org/myorg/WordCount.class
>>>> >>> org/myorg/WordCount$TokenizerMapper.class
>>>> >>> org/myorg/WordCount$IntSumReducer.class
>>>> --
>>>> Harsh J
>>>>
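Harsh's point is that the entry path inside the jar, not the directory you ran `jar cvf` from, is what the classloader resolves against: `hadoop jar my.jar org.myorg.WordCount` can only find the class if the jar contains the entry `org/myorg/WordCount.class` at its root. Since a jar is just a zip file, the layout is easy to verify (or reproduce) with Python's `zipfile`; this sketch builds an in-memory jar with the correct structure, using placeholder bytes instead of real compiled classes:

```python
import io
import zipfile

# Build a jar (a zip) in memory with the layout Harsh describes: the package
# path starts at the jar root, with no extra top-level directory.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as jar:
    jar.writestr("META-INF/MANIFEST.MF", "Manifest-Version: 1.0\n")
    jar.writestr("org/myorg/WordCount.class", b"\xca\xfe\xba\xbe")  # placeholder

with zipfile.ZipFile(buf) as jar:
    names = jar.namelist()

# The classloader resolves org.myorg.WordCount to exactly this entry path;
# something like 'classes/org/myorg/WordCount.class' would raise
# ClassNotFoundException instead.
print("org/myorg/WordCount.class" in names)  # True
```

Running `jar tf my.jar` (as done earlier in this thread) performs the same listing from the command line.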
>>>
>>
>

Fwd: Trouble in running MapReduce application

Posted by Fatih Haltas <fa...@nyu.edu>.
Hi Hemanth;

Thanks for your great help,

I am really much obliged to you.

I solved that problem by changing my Java compiler version, but now, even
though I changed every node's configuration, I am getting this error, even
when I try to run the WordCount example without making any changes.

What may be the reason? I believe I checked all the config files, changed
the home variables, and also /etc/hosts.

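The repeated `FileNotFoundException: /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar` and the `execvp: No such file or directory` failures in the log below suggest that on at least one TaskTracker node, the configured Java path (JAVA_HOME, or the java set in hadoop-env.sh) points at a JVM that is missing or was removed. A small sketch of the kind of per-node sanity check that helps here; the file list is a hypothetical minimal set chosen for illustration, not an authoritative one:

```python
import os

def missing_jvm_paths(java_home: str):
    """Return required JVM files that do not exist under java_home.

    An empty list means the basic layout is present; anything else is a
    likely cause of execvp/FileNotFoundException failures in task launch.
    """
    required = ["bin/java", "jre/lib/resources.jar"]  # hypothetical minimal set
    return [p for p in required
            if not os.path.exists(os.path.join(java_home, p))]

# With a deliberately bogus JAVA_HOME, everything is reported missing.
print(missing_jvm_paths("/no/such/jdk"))
```

Running a check like this (or simply `ls $JAVA_HOME/bin/java`) on every node would confirm whether the path in the cluster configuration actually exists everywhere.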
Here is my problem:
************************************************************
[hadoop@ADUAE042-LAP-V logs]$ hadoop jar ../hadoop-examples-1.0.4.jar
wordcount /home/hadoop/project/hadoop-data/NetFlow test1353.out

Warning: $HADOOP_HOME is deprecated.

13/02/24 13:32:28 INFO input.FileInputFormat: Total input paths to process
: 1
13/02/24 13:32:28 INFO util.NativeCodeLoader: Loaded the native-hadoop
library
13/02/24 13:32:28 WARN snappy.LoadSnappy: Snappy native library not loaded
13/02/24 13:32:29 INFO mapred.JobClient: Running job: job_201301141457_0034
13/02/24 13:32:30 INFO mapred.JobClient:  map 0% reduce 0%
13/02/24 13:32:37 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000002_0, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000002_0: execvp: No such file or directory
13/02/24 13:32:43 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000002_0, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_r_000002_0: execvp: No such file or directory
13/02/24 13:32:50 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000002_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000002_1: execvp: No such file or directory
13/02/24 13:32:56 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000002_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_r_000002_1: execvp: No such file or directory
13/02/24 13:33:02 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000002_2, Status : FAILED
Error initializing attempt_201301141457_0034_m_000002_2:
java.lang.InternalError
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
        at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
        at java.lang.ClassLoader.getResource(ClassLoader.java:974)
        at java.lang.ClassLoader.getResource(ClassLoader.java:972)
        at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
        at
java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
        at java.lang.Class.getResourceAsStream(Class.java:2045)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
        at java.security.AccessController.doPrivileged(Native Method)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
        at
org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
        at
org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at
org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
        at
org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
        at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.FileNotFoundException:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
        at sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
        at sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
        ... 27 more

13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stdout
13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stderr
13/02/24 13:33:08 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_0, Status : FAILED
Error initializing attempt_201301141457_0034_m_000001_0:
java.lang.InternalError
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
        at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
        at java.lang.ClassLoader.getResource(ClassLoader.java:974)
        at java.lang.ClassLoader.getResource(ClassLoader.java:972)
        at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
        at
java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
        at java.lang.Class.getResourceAsStream(Class.java:2045)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
        at java.security.AccessController.doPrivileged(Native Method)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
        at
org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
        at
org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at
org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
        at
org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
        at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.FileNotFoundException:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
        at sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
        at sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
        ... 27 more

13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stdout
13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stderr
13/02/24 13:33:11 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000001_0, Status : FAILED
Error initializing attempt_201301141457_0034_r_000001_0:
java.lang.InternalError
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
        at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
        at java.lang.ClassLoader.getResource(ClassLoader.java:974)
        at java.lang.ClassLoader.getResource(ClassLoader.java:972)
        at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
        at
java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
        at java.lang.Class.getResourceAsStream(Class.java:2045)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
        at java.security.AccessController.doPrivileged(Native Method)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
        at
org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
        at
org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at
org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
        at
org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
        at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.FileNotFoundException:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
        at sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
        at sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
        ... 27 more

13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stdout
13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stderr
13/02/24 13:33:19 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000001_1: execvp: No such file or directory
13/02/24 13:33:25 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000001_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_r_000001_1: execvp: No such file or directory
13/02/24 13:33:31 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_2, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000001_2: execvp: No such file or directory
13/02/24 13:33:37 INFO mapred.JobClient: Job complete: job_201301141457_0034
13/02/24 13:33:37 INFO mapred.JobClient: Counters: 4
13/02/24 13:33:37 INFO mapred.JobClient:   Job Counters
13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=15112
13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0
13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=9091



On Sat, Feb 23, 2013 at 5:05 PM, Hemanth Yamijala <yhemanth@thoughtworks.com
> wrote:

> Can you try this ? Pick a class like WordCount from your package and
> execute this command:
>
> javap -classpath <path to your jar> -verbose org.myorg.Wordcount | grep
> version.
>
> For e.g. here's what I get for my class:
>
> $ javap -verbose WCMapper | grep version
>   minor version: 0
>   major version: 50
>
> Please paste the output of this - we can verify what the problem is.
>
> Thanks
> Hemanth
>
>
> On Sat, Feb 23, 2013 at 4:45 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>
>> Hi again,
>>
>> Thanks for your help but now, I am struggling with the same problem on a
>> machine. As the preivous problem, I just decrease the Java version by Java
>> 6, but this time I could not solve the problem.
>>
>> those are outputs that may explain the situation:
>>
>> ---------------------------------------------------------------------------------------------------------------------------------------------
>> 1. I could not run my own code, to check the system I just tried to run
>> basic wordcount example without any modification, except package info.
>> **************************************************
>> COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow
>> NetFlow.out
>> Warning: $HADOOP_HOME is deprecated.
>>
>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>> org/myorg/WordCount : Unsupported major.minor version 51.0
>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>         at
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>         at java.lang.Class.forName0(Native Method)
>>         at java.lang.Class.forName(Class.java:266)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>
>> **************************************************************************************
>> 2. Java version:
>> ********************************
>> COMMAND EXECUTED: java -version
>> java version "1.6.0_24"
>> OpenJDK Runtime Environment (IcedTea6 1.11.6)
>> (rhel-1.33.1.11.6.el5_9-x86_64)
>> OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
>> **********************************
>> 3. JAVA_HOME variable:
>> **********************************
>> COMMAND EXECUTED: echo $JAVA_HOME
>> /usr/lib/jvm/jre-1.6.0-openjdk.x86_64
>> ********************************************
>> 4. HADOOP version:
>> *******************************************
>> COMMAND EXECUTED: hadoop version
>> Warning: $HADOOP_HOME is deprecated.
>>
>> Hadoop 1.0.4
>> Subversion
>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
>> 1393290
>> Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
>> From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
>> ********************************************************
>>
>> Are these still incompatible with each other? (the Hadoop version and the
>> Java version)
>>
>>
>> Thank you very much.
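
[Editor's note: the mismatch in the question above can be checked mechanically. A class file's major version is 44 plus the JDK release it targets, so "major.minor version 51.0" means Java 7 bytecode, which the 1.6 JVM shown in step 2 refuses to load. A minimal sketch of the decoding; the javac line in the comment is the usual fix, assuming a JDK 7 compiler produced the jar (file names hypothetical):]

```shell
# Decode the class-file major version from the error into a JDK release.
# The 44-offset mapping is standard for the class-file format: 50 -> Java 6,
# 51 -> Java 7.
class_major=51
echo "major $class_major => needs Java $((class_major - 44)) or newer"

# The usual fix is to recompile targeting the older runtime, e.g.:
#   javac -source 1.6 -target 1.6 -classpath hadoop-core-1.0.4.jar WordCount.java
```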
>>
>>
>> On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>>
>>> Thank you all very much
>>>
>>> On Tuesday, 19 February 2013, Harsh J wrote:
>>>
>>>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>>>> well. Let's drop this thread and carry on there :)
>>>>
>>>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>>>> > Hi,
>>>> >
>>>> > The new error usually happens if you compile using Java 7 and try to
>>>> > run via Java 6 (for example). That is, an incompatibility in the
>>>> > runtimes for the binary artifact produced.
>>>> >
>>>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>>>> wrote:
>>>> >> Thank you very much Harsh,
>>>> >>
>>>> >> Now, as I promised earlier I am much obliged to you.
>>>> >>
>>>> >> But, now I solved that problem by just changing the directories then
>>>> again
>>>> >> creating a jar file of org. but I am getting this error:
>>>> >>
>>>> >> 1.) What I got
>>>> >>
>>>> ------------------------------------------------------------------------------
>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar
>>>> flow19028pm.jar
>>>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow
>>>> 19_02.out
>>>> >> Warning: $HADOOP_HOME is deprecated.
>>>> >>
>>>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>>>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>>> >>         at
>>>> >>
>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>> >>         at
>>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>>> >>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>>> >>         at java.security.AccessController.doPrivileged(Native Method)
>>>> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>> >>         at java.lang.Class.forName0(Native Method)
>>>> >>         at java.lang.Class.forName(Class.java:266)
>>>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>>> >>
>>>> >> 2.) How I create my jar
>>>> >>
>>>> -------------------------------------------------------------------------------------
>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar
>>>> org
>>>> >> added manifest
>>>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>>>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>>>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>>>> >> 690)(deflated 58%)
>>>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>>>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>>>> >> 823)(deflated 56%)
>>>> >>
>>>> >> 3.) Content of my jar file
>>>> >>
>>>> ---------------------------------------------------------------------------------------
>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>>>> >> META-INF/
>>>> >> META-INF/MANIFEST.MF
>>>> >> org/
>>>> >> org/myorg/
>>>> >> org/myorg/MapReduce$FlowPortReducer.class
>>>> >> org/myorg/MapReduce.class
>>>> >> org/myorg/MapReduce$FlowPortMapper.class
>>>> >>
>>>> -----------------------------------------------------------------------------------------
>>>> >>
>>>> >>
>>>> >> Thank you very much.
>>>> >>
>>>> >>
>>>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>>>> >>>
>>>> >>> Your point (4) explains the problem. The jar packed structure should
>>>> >>> look like the below, and not how it is presently (one extra top
>>>> level
>>>> >>> dir is present):
>>>> >>>
>>>> >>> META-INF/
>>>> >>> META-INF/MANIFEST.MF
>>>> >>> org/
>>>> >>> org/myorg/
>>>> >>> org/myorg/WordCount.class
>>>> >>> org/myorg/WordCount$TokenizerMapper.class
>>>> >>> org/myorg/WordCount$IntSumReducer.class
>>>> >>> --
>>>> >>> Harsh J
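
[Editor's note: Harsh's fix boils down to packaging from the directory that contains org/, so the package path starts at the archive root. A sketch of the required layout; the real jar invocation is shown as a comment since it needs a JDK, and the directory and class names are illustrative:]

```shell
# Recreate the layout a runnable job jar must contain: org/ at the root.
mkdir -p build/org/myorg
touch build/org/myorg/WordCount.class

# Package from *inside* the output directory so entries start with org/:
#   jar cvf wordcount.jar -C build .
# Packaging the parent instead (jar cvf wordcount.jar build) prefixes every
# entry with build/, and RunJar then fails to resolve org.myorg.WordCount.

( cd build && find . -type f )   # -> ./org/myorg/WordCount.class
```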
>>>>
>>>
>>
>

Fwd: Trouble in running MapReduce application

Posted by Fatih Haltas <fa...@nyu.edu>.
Hi Hemanth,

Thanks for your great help; I am really much obliged to you.

I solved the previous problem by changing my Java compiler version. But now,
even though I changed the configuration on every node, I am getting the error
below, even when I run the stock wordcount example without any changes.

What may be the reason? I believe I checked all the config files, updated the
home variables, and also /etc/hosts.

Here is my problem:
************************************************************
[hadoop@ADUAE042-LAP-V logs]$ hadoop jar ../hadoop-examples-1.0.4.jar
wordcount /home/hadoop/project/hadoop-data/NetFlow test1353.out

Warning: $HADOOP_HOME is deprecated.

13/02/24 13:32:28 INFO input.FileInputFormat: Total input paths to process
: 1
13/02/24 13:32:28 INFO util.NativeCodeLoader: Loaded the native-hadoop
library
13/02/24 13:32:28 WARN snappy.LoadSnappy: Snappy native library not loaded
13/02/24 13:32:29 INFO mapred.JobClient: Running job: job_201301141457_0034
13/02/24 13:32:30 INFO mapred.JobClient:  map 0% reduce 0%
13/02/24 13:32:37 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000002_0, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000002_0: execvp: No such file or directory
13/02/24 13:32:43 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000002_0, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_r_000002_0: execvp: No such file or directory
13/02/24 13:32:50 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000002_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000002_1: execvp: No such file or directory
13/02/24 13:32:56 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000002_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_r_000002_1: execvp: No such file or directory
13/02/24 13:33:02 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000002_2, Status : FAILED
Error initializing attempt_201301141457_0034_m_000002_2:
java.lang.InternalError
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
        at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
        at java.lang.ClassLoader.getResource(ClassLoader.java:974)
        at java.lang.ClassLoader.getResource(ClassLoader.java:972)
        at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
        at
java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
        at java.lang.Class.getResourceAsStream(Class.java:2045)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
        at java.security.AccessController.doPrivileged(Native Method)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
        at
org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
        at
org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at
org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
        at
org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
        at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.FileNotFoundException:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
        at sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
        at sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
        ... 27 more

13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stdout
13/02/24 13:33:02 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stderr
13/02/24 13:33:08 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_0, Status : FAILED
Error initializing attempt_201301141457_0034_m_000001_0:
java.lang.InternalError
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
        at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
        at java.lang.ClassLoader.getResource(ClassLoader.java:974)
        at java.lang.ClassLoader.getResource(ClassLoader.java:972)
        at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
        at
java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
        at java.lang.Class.getResourceAsStream(Class.java:2045)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
        at java.security.AccessController.doPrivileged(Native Method)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
        at
org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
        at
org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at
org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
        at
org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
        at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.FileNotFoundException:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
        at sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
        at sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
        ... 27 more

13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stdout
13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stderr
13/02/24 13:33:11 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000001_0, Status : FAILED
Error initializing attempt_201301141457_0034_r_000001_0:
java.lang.InternalError
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
        at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
        at java.lang.ClassLoader.getResource(ClassLoader.java:974)
        at java.lang.ClassLoader.getResource(ClassLoader.java:972)
        at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
        at
java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
        at java.lang.Class.getResourceAsStream(Class.java:2045)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
        at java.security.AccessController.doPrivileged(Native Method)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
        at
org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
        at
org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at
org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
        at
org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
        at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.FileNotFoundException:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
        at sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
        at sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
        ... 27 more

13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stdout
13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stderr
13/02/24 13:33:19 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000001_1: execvp: No such file or directory
13/02/24 13:33:25 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000001_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_r_000001_1: execvp: No such file or directory
13/02/24 13:33:31 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_2, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000001_2: execvp: No such file or directory
13/02/24 13:33:37 INFO mapred.JobClient: Job complete: job_201301141457_0034
13/02/24 13:33:37 INFO mapred.JobClient: Counters: 4
13/02/24 13:33:37 INFO mapred.JobClient:   Job Counters
13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=15112
13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0
13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=9091
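
[Editor's note: a hedged reading of the log above: "execvp: No such file or directory" from the child launcher, plus the FileNotFoundException for /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar, both point at a JAVA_HOME (typically exported in conf/hadoop-env.sh) that no longer exists on the worker nodes after the JDK was switched. A small check one could run on each node; the helper function is illustrative, not part of Hadoop:]

```shell
# check_java_home: report whether a candidate JAVA_HOME has a runnable java.
check_java_home() {
    if [ -x "$1/bin/java" ]; then
        echo "ok: $1"
    else
        echo "broken: $1"
    fi
}

# The path from the stack trace; on the failing nodes this would report
# broken, which explains both the execvp failure (no java binary to exec)
# and the missing resources.jar on the bootstrap classpath.
check_java_home /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64
```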



On Sat, Feb 23, 2013 at 5:05 PM, Hemanth Yamijala <yhemanth@thoughtworks.com
> wrote:

> Can you try this ? Pick a class like WordCount from your package and
> execute this command:
>
> javap -classpath <path to your jar> -verbose org.myorg.Wordcount | grep
> version.
>
> For e.g. here's what I get for my class:
>
> $ javap -verbose WCMapper | grep version
>   minor version: 0
>   major version: 50
>
> Please paste the output of this - we can verify what the problem is.
>
> Thanks
> Hemanth
>
>

outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000002_2&filter=stderr
13/02/24 13:33:08 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_0, Status : FAILED
Error initializing attempt_201301141457_0034_m_000001_0:
java.lang.InternalError
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
        at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
        at java.lang.ClassLoader.getResource(ClassLoader.java:974)
        at java.lang.ClassLoader.getResource(ClassLoader.java:972)
        at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
        at
java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
        at java.lang.Class.getResourceAsStream(Class.java:2045)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
        at java.security.AccessController.doPrivileged(Native Method)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
        at
org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
        at
org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at
org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
        at
org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
        at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.FileNotFoundException:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
        at sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
        at sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
        ... 27 more

13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stdout
13/02/24 13:33:08 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_m_000001_0&filter=stderr
13/02/24 13:33:11 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000001_0, Status : FAILED
Error initializing attempt_201301141457_0034_r_000001_0:
java.lang.InternalError
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
        at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
        at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
        at java.lang.ClassLoader.getResource(ClassLoader.java:974)
        at java.lang.ClassLoader.getResource(ClassLoader.java:972)
        at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
        at
java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
        at java.lang.Class.getResourceAsStream(Class.java:2045)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
        at java.security.AccessController.doPrivileged(Native Method)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.loadPropertiesFile(OutputPropertiesFactory.java:366)
        at
com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory.getDefaultMethodProperties(OutputPropertiesFactory.java:267)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.setDefaults(TransformerImpl.java:1123)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.createOutputProperties(TransformerImpl.java:1084)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:250)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.<init>(TransformerImpl.java:241)
        at
com.sun.org.apache.xalan.internal.xsltc.trax.TransformerFactoryImpl.newTransformer(TransformerFactoryImpl.java:607)
        at
org.apache.hadoop.conf.Configuration.writeXml(Configuration.java:1310)
        at
org.apache.hadoop.mapred.JobLocalizer.writeLocalJobFile(JobLocalizer.java:559)
        at org.apache.hadoop.mapred.TaskTracker$4.run(TaskTracker.java:1227)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at
org.apache.hadoop.mapred.TaskTracker.initializeJob(TaskTracker.java:1203)
        at
org.apache.hadoop.mapred.TaskTracker.localizeJob(TaskTracker.java:1118)
        at org.apache.hadoop.mapred.TaskTracker$5.run(TaskTracker.java:2430)
        at java.lang.Thread.run(Thread.java:679)
Caused by: java.io.FileNotFoundException:
/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/jre/lib/resources.jar
        at sun.misc.URLClassPath$JarLoader.getJarFile(URLClassPath.java:658)
        at sun.misc.URLClassPath$JarLoader.access$600(URLClassPath.java:555)
        at sun.misc.URLClassPath$JarLoader$1.run(URLClassPath.java:622)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.misc.URLClassPath$JarLoader.ensureOpen(URLClassPath.java:614)
        at
sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:767)
        ... 27 more

13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stdout
13/02/24 13:33:11 WARN mapred.JobClient: Error reading task
outputhttp://ADUAE045-LAP-V:50060/tasklog?plaintext=true&attemptid=attempt_201301141457_0034_r_000001_0&filter=stderr
13/02/24 13:33:19 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000001_1: execvp: No such file or directory
13/02/24 13:33:25 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_r_000001_1, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_r_000001_1: execvp: No such file or directory
13/02/24 13:33:31 INFO mapred.JobClient: Task Id :
attempt_201301141457_0034_m_000001_2, Status : FAILED
java.lang.Throwable: Child Error
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
        at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

attempt_201301141457_0034_m_000001_2: execvp: No such file or directory
13/02/24 13:33:37 INFO mapred.JobClient: Job complete: job_201301141457_0034
13/02/24 13:33:37 INFO mapred.JobClient: Counters: 4
13/02/24 13:33:37 INFO mapred.JobClient:   Job Counters
13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=15112
13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
13/02/24 13:33:37 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0
13/02/24 13:33:37 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=9091
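[Editor's note: the "execvp: No such file or directory" failures and the FileNotFoundException for resources.jar in the trace above suggest the TaskTracker node's configured JVM path does not exist on that machine. A minimal per-node sanity check, as a sketch (the path below is the one from the stack trace; substitute the JAVA_HOME value from your conf/hadoop-env.sh):

```shell
# check_jvm <java-home>: report whether the JRE's resources.jar exists under
# the given Java home -- this is the file the TaskTracker failed to open above.
check_jvm() {
  if [ -f "$1/jre/lib/resources.jar" ] || [ -f "$1/lib/resources.jar" ]; then
    echo ok
  else
    echo missing
  fi
}

# Path taken from the stack trace; on a broken node this reports "missing".
check_jvm /usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64
```

Running this (e.g. over ssh) on every TaskTracker host quickly shows which node has a missing or moved JVM; fixing JAVA_HOME there and restarting the TaskTracker should clear the "Error initializing attempt_..." failures.]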



On Sat, Feb 23, 2013 at 5:05 PM, Hemanth Yamijala <yhemanth@thoughtworks.com
> wrote:

> Can you try this ? Pick a class like WordCount from your package and
> execute this command:
>
> javap -classpath <path to your jar> -verbose org.myorg.Wordcount | grep
> version.
>
> For e.g. here's what I get for my class:
>
> $ javap -verbose WCMapper | grep version
>   minor version: 0
>   major version: 50
>
> Please paste the output of this - we can verify what the problem is.
>
> Thanks
> Hemanth
>
>
> On Sat, Feb 23, 2013 at 4:45 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>
>> Hi again,
>>
>> Thanks for your help, but now I am struggling with the same problem on
>> another machine. As with the previous problem, I downgraded the Java
>> version to Java 6, but this time that did not solve the problem.
>>
>> Here are the outputs that may explain the situation:
>>
>> ---------------------------------------------------------------------------------------------------------------------------------------------
>> 1. I could not run my own code, so to check the system I tried to run the
>> basic WordCount example without any modification except the package info.
>> **************************************************
>> COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow
>> NetFlow.out
>> Warning: $HADOOP_HOME is deprecated.
>>
>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>> org/myorg/WordCount : Unsupported major.minor version 51.0
>>          at java.lang.ClassLoader.defineClass1(Native Method)
>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>         at
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>         at java.lang.Class.forName0(Native Method)
>>         at java.lang.Class.forName(Class.java:266)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>
>> **************************************************************************************
>> 2. Java version:
>> ********************************
>> COMMAND EXECUTED: java -version
>> java version "1.6.0_24"
>> OpenJDK Runtime Environment (IcedTea6 1.11.6)
>> (rhel-1.33.1.11.6.el5_9-x86_64)
>> OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
>> **********************************
>> 3. JAVA_HOME variable:
>> **********************************
>> COMMAND EXECUTED: echo $JAVA_HOME
>> /usr/lib/jvm/jre-1.6.0-openjdk.x86_64
>> ********************************************
>> 4. HADOOP version:
>> *******************************************
>> COMMAND EXECUTED: hadoop version
>> Warning: $HADOOP_HOME is deprecated.
>>
>> Hadoop 1.0.4
>> Subversion
>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
>> 1393290
>> Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
>> From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
>> ********************************************************
>>
>> Are these still incompatible with each other? (Hadoop version and Java
>> version)
>>
>>
>> Thank you very much.
>>
>>
>> On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>>
>>> Thank you all very much
>>>
>>> 19 Şubat 2013 Salı tarihinde Harsh J adlı kullanıcı şöyle yazdı:
>>>
>>>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>>>> well. Let's drop this thread and carry on there :)
>>>>
>>>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>>>> > Hi,
>>>> >
>>>> > The new error usually happens if you compile using Java 7 and try to
>>>> > run via Java 6 (for example). That is, an incompatibility in the
>>>> > runtimes for the binary artifact produced.
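
[Editor's note: the version check Harsh describes is visible directly in the class-file header: bytes 0-3 are the 0xCAFEBABE magic, bytes 4-5 the minor version, bytes 6-7 the major version (50 = Java 6, 51 = Java 7). A small sketch that fakes a header just to show where javap reads the number from:

```shell
# Write the 8-byte prefix of a Java 6 class file: CA FE BA BE, minor 0, major 50.
# Octal escapes keep this portable across printf implementations.
printf '\312\376\272\276\000\000\000\062' > Fake.class

# Read the low byte of the big-endian major-version field (byte offset 7).
od -An -j7 -N1 -tu1 Fake.class | tr -d ' '   # prints 50
```

Running the same `od` command against a real .class file from the jar shows at a glance whether it was compiled for a newer runtime than the cluster's JVM.]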
>>>> >
>>>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>>>> wrote:
>>>> >> Thank you very much Harsh,
>>>> >>
>>>> >> Now, as I promised earlier I am much obliged to you.
>>>> >>
>>>> >> But now I solved that problem by just changing the directories and
>>>> >> then creating a jar file of org, but I am getting this error:
>>>> >>
>>>> >> 1.) What I got
>>>> >>
>>>> ------------------------------------------------------------------------------
>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar
>>>> flow19028pm.jar
>>>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow
>>>> 19_02.out
>>>> >> Warning: $HADOOP_HOME is deprecated.
>>>> >>
>>>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>>>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>>> >>         at
>>>> >>
>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>> >>         at
>>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>>> >>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>>> >>         at java.security.AccessController.doPrivileged(Native Method)
>>>> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>> >>         at java.lang.Class.forName0(Native Method)
>>>> >>         at java.lang.Class.forName(Class.java:266)
>>>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>>> >>
>>>> >> 2.) How I create my jar
>>>> >>
>>>> -------------------------------------------------------------------------------------
>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar
>>>> org
>>>> >> added manifest
>>>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>>>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>>>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>>>> >> 690)(deflated 58%)
>>>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>>>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>>>> >> 823)(deflated 56%)
>>>> >>
>>>> >> 3.) Content of my jar file
>>>> >>
>>>> ---------------------------------------------------------------------------------------
>>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>>>> >> META-INF/
>>>> >> META-INF/MANIFEST.MF
>>>> >> org/
>>>> >> org/myorg/
>>>> >> org/myorg/MapReduce$FlowPortReducer.class
>>>> >> org/myorg/MapReduce.class
>>>> >> org/myorg/MapReduce$FlowPortMapper.class
>>>> >>
>>>> -----------------------------------------------------------------------------------------
>>>> >>
>>>> >>
>>>> >> Thank you very much.
>>>> >>
>>>> >>
>>>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>>>> >>>
>>>> >>> Your point (4) explains the problem. The jar packed structure should
>>>> >>> look like the below, and not how it is presently (one extra top
>>>> level
>>>> >>> dir is present):
>>>> >>>
>>>> >>> META-INF/
>>>> >>> META-INF/MANIFEST.MF
>>>> >>> org/
>>>> >>> org/myorg/
>>>> >>> org/myorg/WordCount.class
>>>> >>> org/myorg/WordCount$TokenizerMapper.class
>>>> >>> org/myorg/WordCount$IntSumReducer.class
>>>> --
>>>> Harsh J
>>>>
>>>
>>
>
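
[Editor's note: the packaging mistake Harsh points out above (an extra top-level directory inside the jar) usually comes from running `jar cvf` one directory too high. The `-C` flag of the JDK `jar` tool changes into the classes directory before adding entries, so archive paths start at `org/`. The same root-directory principle can be demonstrated with `tar`, whose `-C` flag behaves the same way; file names here are illustrative stand-ins:

```shell
# Lay out a fake compiled-classes tree (empty stand-in for a real .class file).
mkdir -p build/classes/org/myorg
touch build/classes/org/myorg/WordCount.class

# Right: enter the classes directory first, so entries start at org/.
# (With the JDK tool this would be: jar cf wordcount.jar -C build/classes org)
tar cf right.tar -C build/classes org

# Wrong: archiving from one level up packs an extra top-level build/ directory,
# which is exactly what makes org.myorg.WordCount unresolvable at runtime.
tar cf wrong.tar build

tar tf right.tar | head -n 1   # first entry: org/
tar tf wrong.tar | head -n 1   # first entry: build/
```

Listing the archive (`jar tf` for a real jar) is the quickest way to confirm the entries begin with `org/` and not some extra directory.]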

Re: Trouble in running MapReduce application

Posted by Hemanth Yamijala <yh...@thoughtworks.com>.
Can you try this ? Pick a class like WordCount from your package and
execute this command:

javap -classpath <path to your jar> -verbose org.myorg.Wordcount | grep
version.

For e.g. here's what I get for my class:

$ javap -verbose WCMapper | grep version
  minor version: 0
  major version: 50

Please paste the output of this - we can verify what the problem is.

Thanks
Hemanth
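
[Editor's note: for reference, the "major version" number that javap prints maps to a JDK release. A small illustrative helper:

```shell
# major_to_jdk <n>: translate a class-file major version into the JDK release
# that produces it by default.
major_to_jdk() {
  case "$1" in
    49) echo "Java 5" ;;
    50) echo "Java 6" ;;
    51) echo "Java 7" ;;
    52) echo "Java 8" ;;
    *)  echo "unknown" ;;
  esac
}

major_to_jdk 51   # a class at this version will not load on a Java 6 runtime
```

So "Unsupported major.minor version 51.0" on a Java 6 runtime means the jar was compiled with Java 7; rebuilding with a Java 6 compiler (or an equivalent -target setting) resolves it.]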


On Sat, Feb 23, 2013 at 4:45 PM, Fatih Haltas <fa...@nyu.edu> wrote:

> Hi again,
>
> Thanks for your help, but now I am struggling with the same problem on
> another machine. As with the previous problem, I downgraded the Java
> version to Java 6, but this time that did not solve the problem.
>
> Here are the outputs that may explain the situation:
>
> ---------------------------------------------------------------------------------------------------------------------------------------------
> 1. I could not run my own code, so to check the system I tried to run the
> basic WordCount example without any modification except the package info.
> **************************************************
> COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow NetFlow.out
> Warning: $HADOOP_HOME is deprecated.
>
> Exception in thread "main" java.lang.UnsupportedClassVersionError:
> org/myorg/WordCount : Unsupported major.minor version 51.0
>         at java.lang.ClassLoader.defineClass1(Native Method)
>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>         at
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:266)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>
> **************************************************************************************
> 2. Java version:
> ********************************
> COMMAND EXECUTED: java -version
> java version "1.6.0_24"
> OpenJDK Runtime Environment (IcedTea6 1.11.6)
> (rhel-1.33.1.11.6.el5_9-x86_64)
> OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
> **********************************
> 3. JAVA_HOME variable:
> **********************************
> COMMAND EXECUTED: echo $JAVA_HOME
> /usr/lib/jvm/jre-1.6.0-openjdk.x86_64
> ********************************************
> 4. HADOOP version:
> *******************************************
> COMMAND EXECUTED: hadoop version
> Warning: $HADOOP_HOME is deprecated.
>
> Hadoop 1.0.4
> Subversion
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1393290
> Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
> From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
> ********************************************************
>
> Are these still incompatible with each other? (Hadoop version and Java
> version)
>
>
> Thank you very much.
>
>
> On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>
>> Thank you all very much
>>
>> 19 Şubat 2013 Salı tarihinde Harsh J adlı kullanıcı şöyle yazdı:
>>
>>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>>> well. Let's drop this thread and carry on there :)
>>>
>>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>>> > Hi,
>>> >
>>> > The new error usually happens if you compile using Java 7 and try to
>>> > run via Java 6 (for example). That is, an incompatibility in the
>>> > runtimes for the binary artifact produced.
>>> >
>>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>>> wrote:
>>> >> Thank you very much Harsh,
>>> >>
>>> >> Now, as I promised earlier I am much obliged to you.
>>> >>
>>> >> But now I solved that problem by just changing the directories and
>>> >> then creating a jar file of org, but I am getting this error:
>>> >>
>>> >> 1.) What I got
>>> >>
>>> ------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
>>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
>>> >> Warning: $HADOOP_HOME is deprecated.
>>> >>
>>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>> >>         at
>>> >>
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>> >>         at
>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>> >>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>> >>         at java.security.AccessController.doPrivileged(Native Method)
>>> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>> >>         at java.lang.Class.forName0(Native Method)
>>> >>         at java.lang.Class.forName(Class.java:266)
>>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>> >>
>>> >> 2.) How I create my jar
>>> >>
>>> -------------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar
>>> org
>>> >> added manifest
>>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>>> >> 690)(deflated 58%)
>>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>>> >> 823)(deflated 56%)
>>> >>
>>> >> 3.) Content of my jar file
>>> >>
>>> ---------------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>>> >> META-INF/
>>> >> META-INF/MANIFEST.MF
>>> >> org/
>>> >> org/myorg/
>>> >> org/myorg/MapReduce$FlowPortReducer.class
>>> >> org/myorg/MapReduce.class
>>> >> org/myorg/MapReduce$FlowPortMapper.class
>>> >>
>>> -----------------------------------------------------------------------------------------
>>> >>
>>> >>
>>> >> Thank you very much.
>>> >>
>>> >>
>>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>>> >>>
>>> >>> Your point (4) explains the problem. The jar packed structure should
>>> >>> look like the below, and not how it is presently (one extra top level
>>> >>> dir is present):
>>> >>>
>>> >>> META-INF/
>>> >>> META-INF/MANIFEST.MF
>>> >>> org/
>>> >>> org/myorg/
>>> >>> org/myorg/WordCount.class
>>> >>> org/myorg/WordCount$TokenizerMapper.class
>>> >>> org/myorg/WordCount$IntSumReducer.class
>>> --
>>> Harsh J
>>>
>>
>

Re: Trouble in running MapReduce application

Posted by Hemanth Yamijala <yh...@thoughtworks.com>.
Can you try this ? Pick a class like WordCount from your package and
execute this command:

javap -classpath <path to your jar> -verbose org.myorg.Wordcount | grep
version.

For e.g. here's what I get for my class:

$ javap -verbose WCMapper | grep version
  minor version: 0
  major version: 50

Please paste the output of this - we can verify what the problem is.

Thanks
Hemanth


On Sat, Feb 23, 2013 at 4:45 PM, Fatih Haltas <fa...@nyu.edu> wrote:

> Hi again,
>
> Thanks for your help but now, I am struggling with the same problem on a
> machine. As the preivous problem, I just decrease the Java version by Java
> 6, but this time I could not solve the problem.
>
> those are outputs that may explain the situation:
>
> ---------------------------------------------------------------------------------------------------------------------------------------------
> 1. I could not run my own code, to check the system I just tried to run
> basic wordcount example without any modification, except package info.
> **************************************************
> COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow NetFlow.out
> Warning: $HADOOP_HOME is deprecated.
>
> Exception in thread "main" java.lang.UnsupportedClassVersionError:
> org/myorg/WordCount : Unsupported major.minor version 51.0
>         at java.lang.ClassLoader.defineClass1(Native Method)
>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>         at
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:266)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>
> **************************************************************************************
> 2. Java version:
> ********************************
> COMMAND EXECUTED: java -version
> java version "1.6.0_24"
> OpenJDK Runtime Environment (IcedTea6 1.11.6)
> (rhel-1.33.1.11.6.el5_9-x86_64)
> OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
> **********************************
> 3. JAVA_HOME variable:
> **********************************
> COMMAND EXECUTED: echo $JAVA_HOME
> /usr/lib/jvm/jre-1.6.0-openjdk.x86_64
> ********************************************
> 4. HADOOP version:
> *******************************************
> COMMAND EXECUTED: hadoop version
> Warning: $HADOOP_HOME is deprecated.
>
> Hadoop 1.0.4
> Subversion
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1393290
> Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
> From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
> ********************************************************
>
> Are these still incompatible with eachother? (Hadoop version and java
> version)
>
>
> Thank you very much.
>
>
> On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>
>> Thank you all very much
>>
>> 19 Şubat 2013 Salı tarihinde Harsh J adlı kullanıcı şöyle yazdı:
>>
>>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>>> well. Lets drop this thread and carry on there :)
>>>
>>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>>> > Hi,
>>> >
>>> > The new error usually happens if you compile using Java 7 and try to
>>> > run via Java 6 (for example). That is, an incompatibility in the
>>> > runtimes for the binary artifact produced.
>>> >
>>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>>> wrote:
>>> >> Thank you very much Harsh,
>>> >>
>>> >> Now, as I promised earlier I am much obliged to you.
>>> >>
>>> >> But, now I solved that problem by just changing the directories then
>>> again
>>> >> creating a jar file of org. but I am getting this error:
>>> >>
>>> >> 1.) What I got
>>> >>
>>> ------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
>>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
>>> >> Warning: $HADOOP_HOME is deprecated.
>>> >>
>>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>> >>         at
>>> >>
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>> >>         at
>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>> >>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>> >>         at java.security.AccessController.doPrivileged(Native Method)
>>> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>> >>         at java.lang.Class.forName0(Native Method)
>>> >>         at java.lang.Class.forName(Class.java:266)
>>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>> >>
>>> >> 2.) How I create my jar
>>> >>
>>> -------------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar
>>> org
>>> >> added manifest
>>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>>> >> 690)(deflated 58%)
>>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>>> >> 823)(deflated 56%)
>>> >>
>>> >> 3.) Content of my jar file
>>> >>
>>> ---------------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>>> >> META-INF/
>>> >> META-INF/MANIFEST.MF
>>> >> org/
>>> >> org/myorg/
>>> >> org/myorg/MapReduce$FlowPortReducer.class
>>> >> org/myorg/MapReduce.class
>>> >> org/myorg/MapReduce$FlowPortMapper.class
>>> >>
>>> -----------------------------------------------------------------------------------------
>>> >>
>>> >>
>>> >> Thank you very much.
>>> >>
>>> >>
>>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>>> >>>
>>> >>> Your point (4) explains the problem. The jar packed structure should
>>> >>> look like the below, and not how it is presently (one extra top level
>>> >>> dir is present):
>>> >>>
>>> >>> META-INF/
>>> >>> META-INF/MANIFEST.MF
>>> >>> org/
>>> >>> org/myorg/
>>> >>> org/myorg/WordCount.class
>>> >>> org/myorg/WordCount$TokenizerMapper.class
>>> >>> org/myorg/WordCount$IntSumReducer.class
>>> >>>
>>> >>> --
>>> Harsh J
>>>
>>
>
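Harsh's packing point can be checked without Hadoop at all: jar entry names are recorded relative to the directory you run `jar` from, so the build must be run from the directory that directly contains `org/`. A minimal sketch with placeholder files (the `classes` directory name is an assumption, not from the thread):

```shell
# Emulate the output layout of 'javac -d classes', then list the entry
# paths 'jar cvf' would record when run from inside 'classes'. Entries
# must start at org/, with no extra top-level directory above it.
mkdir -p classes/org/myorg
: > classes/org/myorg/WordCount.class   # empty placeholder, not real bytecode
( cd classes && find org -type f )      # prints: org/myorg/WordCount.class
```

Running `jar cvf ../wordcount.jar org` from inside `classes` then yields exactly the entry list shown above, rather than one with an extra top-level directory.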

Re: Trouble in running MapReduce application

Posted by Hemanth Yamijala <yh...@thoughtworks.com>.
Can you try this? Pick a class like WordCount from your package and
execute this command:

javap -classpath <path to your jar> -verbose org.myorg.WordCount | grep version

For example, here's what I get for my class:

$ javap -verbose WCMapper | grep version
  minor version: 0
  major version: 50

Please paste the output of this - we can verify what the problem is.

Thanks
Hemanth


On Sat, Feb 23, 2013 at 4:45 PM, Fatih Haltas <fa...@nyu.edu> wrote:

> Hi again,
>
> Thanks for your help, but now I am struggling with the same problem on
> another machine. As with the previous problem, I downgraded the Java
> version to Java 6, but this time that did not solve the problem.
>
> These are the outputs that may explain the situation:
>
> ---------------------------------------------------------------------------------------------------------------------------------------------
> 1. I could not run my own code; to check the system, I tried to run the
> basic WordCount example without any modification except the package declaration.
> **************************************************
> COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow NetFlow.out
> Warning: $HADOOP_HOME is deprecated.
>
> Exception in thread "main" java.lang.UnsupportedClassVersionError:
> org/myorg/WordCount : Unsupported major.minor version 51.0
>         at java.lang.ClassLoader.defineClass1(Native Method)
>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>         at
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:266)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>
> **************************************************************************************
> 2. Java version:
> ********************************
> COMMAND EXECUTED: java -version
> java version "1.6.0_24"
> OpenJDK Runtime Environment (IcedTea6 1.11.6)
> (rhel-1.33.1.11.6.el5_9-x86_64)
> OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
> **********************************
> 3. JAVA_HOME variable:
> **********************************
> COMMAND EXECUTED: echo $JAVA_HOME
> /usr/lib/jvm/jre-1.6.0-openjdk.x86_64
> ********************************************
> 4. HADOOP version:
> *******************************************
> COMMAND EXECUTED: hadoop version
> Warning: $HADOOP_HOME is deprecated.
>
> Hadoop 1.0.4
> Subversion
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
> 1393290
> Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
> From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
> ********************************************************
>
> Are these still incompatible with each other (Hadoop version and Java
> version)?
>
>
> Thank you very much.
>
>
> On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu>wrote:
>
>> Thank you all very much
>>
>> 19 Şubat 2013 Salı tarihinde Harsh J adlı kullanıcı şöyle yazdı:
>>
>>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>>> well. Lets drop this thread and carry on there :)
>>>
>>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>>> > Hi,
>>> >
>>> > The new error usually happens if you compile using Java 7 and try to
>>> > run via Java 6 (for example). That is, an incompatibility in the
>>> > runtimes for the binary artifact produced.
>>> >
>>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>>> wrote:
>>> >> Thank you very much Harsh,
>>> >>
>>> >> Now, as I promised earlier I am much obliged to you.
>>> >>
>>> >> But, now I solved that problem by just changing the directories then
>>> again
>>> >> creating a jar file of org. but I am getting this error:
>>> >>
>>> >> 1.) What I got
>>> >>
>>> ------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
>>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
>>> >> Warning: $HADOOP_HOME is deprecated.
>>> >>
>>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>> >>         at
>>> >>
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>> >>         at
>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>> >>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>> >>         at java.security.AccessController.doPrivileged(Native Method)
>>> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>> >>         at java.lang.Class.forName0(Native Method)
>>> >>         at java.lang.Class.forName(Class.java:266)
>>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>> >>
>>> >> 2.) How I create my jar
>>> >>
>>> -------------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar
>>> org
>>> >> added manifest
>>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>>> >> 690)(deflated 58%)
>>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>>> >> 823)(deflated 56%)
>>> >>
>>> >> 3.) Content of my jar file
>>> >>
>>> ---------------------------------------------------------------------------------------
>>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>>> >> META-INF/
>>> >> META-INF/MANIFEST.MF
>>> >> org/
>>> >> org/myorg/
>>> >> org/myorg/MapReduce$FlowPortReducer.class
>>> >> org/myorg/MapReduce.class
>>> >> org/myorg/MapReduce$FlowPortMapper.class
>>> >>
>>> -----------------------------------------------------------------------------------------
>>> >>
>>> >>
>>> >> Thank you very much.
>>> >>
>>> >>
>>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>>> >>>
>>> >>> Your point (4) explains the problem. The jar packed structure should
>>> >>> look like the below, and not how it is presently (one extra top level
>>> >>> dir is present):
>>> >>>
>>> >>> META-INF/
>>> >>> META-INF/MANIFEST.MF
>>> >>> org/
>>> >>> org/myorg/
>>> >>> org/myorg/WordCount.class
>>> >>> org/myorg/WordCount$TokenizerMapper.class
>>> >>> org/myorg/WordCount$IntSumReducer.class
>>> >>>
>>> >>> --
>>> Harsh J
>>>
>>
>


Re: Trouble in running MapReduce application

Posted by Fatih Haltas <fa...@nyu.edu>.
Hi again,

Thanks for your help, but now I am struggling with the same problem on
another machine. As with the previous problem, I downgraded the Java
version to Java 6, but this time that did not solve the problem.

These are the outputs that may explain the situation:
---------------------------------------------------------------------------------------------------------------------------------------------
1. I could not run my own code; to check the system, I tried to run the
basic WordCount example without any modification except the package declaration.
**************************************************
COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow NetFlow.out
Warning: $HADOOP_HOME is deprecated.

Exception in thread "main" java.lang.UnsupportedClassVersionError:
org/myorg/WordCount : Unsupported major.minor version 51.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
        at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:266)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
**************************************************************************************
2. Java version:
********************************
COMMAND EXECUTED: java -version
java version "1.6.0_24"
OpenJDK Runtime Environment (IcedTea6 1.11.6)
(rhel-1.33.1.11.6.el5_9-x86_64)
OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
**********************************
3. JAVA_HOME variable:
**********************************
COMMAND EXECUTED: echo $JAVA_HOME
/usr/lib/jvm/jre-1.6.0-openjdk.x86_64
********************************************
4. HADOOP version:
*******************************************
COMMAND EXECUTED: hadoop version
Warning: $HADOOP_HOME is deprecated.

Hadoop 1.0.4
Subversion
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1393290
Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
********************************************************

Are these still incompatible with each other (Hadoop version and Java
version)?


Thank you very much.
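The compatibility question above has an arithmetic answer: a 1.N JRE loads class files up to major version N + 44, so the Java 6 runtime shown (maximum 50) cannot load 51.0 classes, which must have been produced by a Java 7 compiler somewhere in the build. A minimal sketch (java_major=6 is an assumption taken from the java -version output above):

```shell
# A 1.N JRE accepts class files up to major version N + 44:
# Java 6 -> 50, Java 7 -> 51. java_major=6 comes from the
# 'java -version' output above (1.6.0_24).
java_major=6
max_class_major=$((java_major + 44))
echo "Java $java_major loads class-file major versions up to $max_class_major"
```

Since 50 is less than 51, recompiling the jar with a Java 6 javac (or with -target 1.6) should clear the error regardless of the Hadoop version.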


On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu> wrote:

> Thank you all very much
>
> 19 Şubat 2013 Salı tarihinde Harsh J adlı kullanıcı şöyle yazdı:
>
>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>> well. Lets drop this thread and carry on there :)
>>
>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>> > Hi,
>> >
>> > The new error usually happens if you compile using Java 7 and try to
>> > run via Java 6 (for example). That is, an incompatibility in the
>> > runtimes for the binary artifact produced.
>> >
>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>> wrote:
>> >> Thank you very much Harsh,
>> >>
>> >> Now, as I promised earlier I am much obliged to you.
>> >>
>> >> But, now I solved that problem by just changing the directories then
>> again
>> >> creating a jar file of org. but I am getting this error:
>> >>
>> >> 1.) What I got
>> >>
>> ------------------------------------------------------------------------------
>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
>> >> Warning: $HADOOP_HOME is deprecated.
>> >>
>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>> >>         at
>> >> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>> >>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>> >>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>> >>         at java.security.AccessController.doPrivileged(Native Method)
>> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>> >>         at java.lang.Class.forName0(Native Method)
>> >>         at java.lang.Class.forName(Class.java:266)
>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>> >>
>> >> 2.) How I create my jar
>> >>
>> -------------------------------------------------------------------------------------
>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
>> >> added manifest
>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>> >> 690)(deflated 58%)
>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>> >> 823)(deflated 56%)
>> >>
>> >> 3.) Content of my jar file
>> >>
>> ---------------------------------------------------------------------------------------
>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>> >> META-INF/
>> >> META-INF/MANIFEST.MF
>> >> org/
>> >> org/myorg/
>> >> org/myorg/MapReduce$FlowPortReducer.class
>> >> org/myorg/MapReduce.class
>> >> org/myorg/MapReduce$FlowPortMapper.class
>> >>
>> -----------------------------------------------------------------------------------------
>> >>
>> >>
>> >> Thank you very much.
>> >>
>> >>
>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>> >>>
>> >>> Your point (4) explains the problem. The jar packed structure should
>> >>> look like the below, and not how it is presently (one extra top level
>> >>> dir is present):
>> >>>
>> >>> META-INF/
>> >>> META-INF/MANIFEST.MF
>> >>> org/
>> >>> org/myorg/
>> >>> org/myorg/WordCount.class
>> >>> org/myorg/WordCount$TokenizerMapper.class
>> >>> org/myorg/WordCount$IntSumReducer.class
>> >>>
>> >>> --
>> Harsh J
>>
>

Re: Trouble in running MapReduce application

Posted by Fatih Haltas <fa...@nyu.edu>.
Hi again,

Thanks for your help but now, I am struggling with the same problem on a
machine. As the preivous problem, I just decrease the Java version by Java
6, but this time I could not solve the problem.

those are outputs that may explain the situation:
---------------------------------------------------------------------------------------------------------------------------------------------
1. I could not run my own code, to check the system I just tried to run
basic wordcount example without any modification, except package info.
**************************************************
COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow NetFlow.out
Warning: $HADOOP_HOME is deprecated.

Exception in thread "main" java.lang.UnsupportedClassVersionError:
org/myorg/WordCount : Unsupported major.minor version 51.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
        at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:266)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
**************************************************************************************
2. Java version:
********************************
COMMAND EXECUTED: java -version
java version "1.6.0_24"
OpenJDK Runtime Environment (IcedTea6 1.11.6)
(rhel-1.33.1.11.6.el5_9-x86_64)
OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
**********************************
3. JAVA_HOME variable:
**********************************
COMMAND EXECUTED: echo $JAVA_HOME
/usr/lib/jvm/jre-1.6.0-openjdk.x86_64
********************************************
4. HADOOP version:
*******************************************
COMMAND EXECUTED: hadoop version
Warning: $HADOOP_HOME is deprecated.

Hadoop 1.0.4
Subversion
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1393290
Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
>From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
********************************************************

Are these still incompatible with eachother? (Hadoop version and java
version)


Thank you very much.


On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu> wrote:

> Thank you all very much
>
> 19 Şubat 2013 Salı tarihinde Harsh J adlı kullanıcı şöyle yazdı:
>
>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>> well. Lets drop this thread and carry on there :)
>>
>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>> > Hi,
>> >
>> > The new error usually happens if you compile using Java 7 and try to
>> > run via Java 6 (for example). That is, an incompatibility in the
>> > runtimes for the binary artifact produced.
>> >
>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>> wrote:
>> >> Thank you very much Harsh,
>> >>
>> >> Now, as I promised earlier I am much obliged to you.
>> >>
>> >> But, now I solved that problem by just changing the directories then
>> again
>> >> creating a jar file of org. but I am getting this error:
>> >>
>> >> 1.) What I got
>> >>
>> ------------------------------------------------------------------------------
>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
>> >> Warning: $HADOOP_HOME is deprecated.
>> >>
>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>> >>         at
>> >> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>> >>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>> >>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>> >>         at java.security.AccessController.doPrivileged(Native Method)
>> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>> >>         at java.lang.Class.forName0(Native Method)
>> >>         at java.lang.Class.forName(Class.java:266)
>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>> >>
>> >> 2.) How I create my jar
>> >>
>> -------------------------------------------------------------------------------------
>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
>> >> added manifest
>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>> >> 690)(deflated 58%)
>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>> >> 823)(deflated 56%)
>> >>
>> >> 3.) Content of my jar file
>> >>
>> ---------------------------------------------------------------------------------------
>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>> >> META-INF/
>> >> META-INF/MANIFEST.MF
>> >> org/
>> >> org/myorg/
>> >> org/myorg/MapReduce$FlowPortReducer.class
>> >> org/myorg/MapReduce.class
>> >> org/myorg/MapReduce$FlowPortMapper.class
>> >>
>> -----------------------------------------------------------------------------------------
>> >>
>> >>
>> >> Thank you very much.
>> >>
>> >>
>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>> >>>
>> >>> Your point (4) explains the problem. The jar packed structure should
>> >>> look like the below, and not how it is presently (one extra top level
>> >>> dir is present):
>> >>>
>> >>> META-INF/
>> >>> META-INF/MANIFEST.MF
>> >>> org/
>> >>> org/myorg/
>> >>> org/myorg/WordCount.class
>> >>> org/myorg/WordCount$TokenizerMapper.class
>> >>> org/myorg/WordCount$IntSumReducer.clas--
>> Harsh J
>>
>

Re: Trouble in running MapReduce application

Posted by Fatih Haltas <fa...@nyu.edu>.
Hi again,

Thanks for your help but now, I am struggling with the same problem on a
machine. As the preivous problem, I just decrease the Java version by Java
6, but this time I could not solve the problem.

those are outputs that may explain the situation:
---------------------------------------------------------------------------------------------------------------------------------------------
1. I could not run my own code, to check the system I just tried to run
basic wordcount example without any modification, except package info.
**************************************************
COMMAND EXECUTED: hadoop jar my.jar org.myorg.WordCount NetFlow NetFlow.out
Warning: $HADOOP_HOME is deprecated.

Exception in thread "main" java.lang.UnsupportedClassVersionError:
org/myorg/WordCount : Unsupported major.minor version 51.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
        at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:266)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
**************************************************************************************
2. Java version:
********************************
COMMAND EXECUTED: java -version
java version "1.6.0_24"
OpenJDK Runtime Environment (IcedTea6 1.11.6)
(rhel-1.33.1.11.6.el5_9-x86_64)
OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
**********************************
3. JAVA_HOME variable:
**********************************
COMMAND EXECUTED: echo $JAVA_HOME
/usr/lib/jvm/jre-1.6.0-openjdk.x86_64
********************************************
4. HADOOP version:
*******************************************
COMMAND EXECUTED: hadoop version
Warning: $HADOOP_HOME is deprecated.

Hadoop 1.0.4
Subversion
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.0 -r
1393290
Compiled by hortonfo on Wed Oct  3 05:13:58 UTC 2012
From source with checksum fe2baea87c4c81a2c505767f3f9b71f4
********************************************************

Are these (the Hadoop version and the Java version) still incompatible with
each other?


Thank you very much.


On Tue, Feb 19, 2013 at 10:26 PM, Fatih Haltas <fa...@nyu.edu> wrote:

> Thank you all very much
>
> On Tuesday, 19 February 2013, Harsh J wrote:
>
>> Oops. I just noticed Hemanth has been answering on a dupe thread as
>> well. Lets drop this thread and carry on there :)
>>
>> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
>> > Hi,
>> >
>> > The new error usually happens if you compile using Java 7 and try to
>> > run via Java 6 (for example). That is, an incompatibility in the
>> > runtimes for the binary artifact produced.
>> >
>> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
>> wrote:
>> >> Thank you very much Harsh,
>> >>
>> >> Now, as I promised earlier I am much obliged to you.
>> >>
>> >> But, now I solved that problem by just changing the directories then
>> again
>> >> creating a jar file of org. but I am getting this error:
>> >>
>> >> 1.) What I got
>> >>
>> ------------------------------------------------------------------------------
>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
>> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
>> >> Warning: $HADOOP_HOME is deprecated.
>> >>
>> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
>> >>         at java.lang.ClassLoader.defineClass1(Native Method)
>> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>> >>         at
>> >> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>> >>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>> >>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>> >>         at java.security.AccessController.doPrivileged(Native Method)
>> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>> >>         at java.lang.Class.forName0(Native Method)
>> >>         at java.lang.Class.forName(Class.java:266)
>> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>> >>
>> >> 2.) How I create my jar
>> >>
>> -------------------------------------------------------------------------------------
>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
>> >> added manifest
>> >> adding: org/(in = 0) (out= 0)(stored 0%)
>> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>> >> 690)(deflated 58%)
>> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>> >> 823)(deflated 56%)
>> >>
>> >> 3.) Content of my jar file
>> >>
>> ---------------------------------------------------------------------------------------
>> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>> >> META-INF/
>> >> META-INF/MANIFEST.MF
>> >> org/
>> >> org/myorg/
>> >> org/myorg/MapReduce$FlowPortReducer.class
>> >> org/myorg/MapReduce.class
>> >> org/myorg/MapReduce$FlowPortMapper.class
>> >>
>> -----------------------------------------------------------------------------------------
>> >>
>> >>
>> >> Thank you very much.
>> >>
>> >>
>> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>> >>>
>> >>> Your point (4) explains the problem. The jar packed structure should
>> >>> look like the below, and not how it is presently (one extra top level
>> >>> dir is present):
>> >>>
>> >>> META-INF/
>> >>> META-INF/MANIFEST.MF
>> >>> org/
>> >>> org/myorg/
>> >>> org/myorg/WordCount.class
>> >>> org/myorg/WordCount$TokenizerMapper.class
>> >>> org/myorg/WordCount$IntSumReducer.class
>> --
>> Harsh J
>>
>
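The version mismatch discussed above is visible in the class file itself: its header records the compiler's target version, and "Unsupported major.minor version 51.0" means the class was compiled for Java 7 (major version 51), while a Java 6 JVM accepts at most 50. A minimal sketch of how that header is read (the bytes here are synthetic, not taken from the actual jar):

```python
import struct

def class_major_version(header: bytes) -> int:
    """Parse the major version from the first 8 bytes of a .class file:
    4-byte magic (0xCAFEBABE), 2-byte minor version, 2-byte major version."""
    magic, minor, major = struct.unpack(">IHH", header[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a Java class file")
    return major

# A Java 7 compiler emits major version 51; a Java 6 JVM accepts at most 50,
# so loading such a class raises UnsupportedClassVersionError (51.0).
java7_header = struct.pack(">IHH", 0xCAFEBABE, 0, 51)
print(class_major_version(java7_header))  # prints 51
```

Recompiling the sources with a Java 6 compiler (or with `-source 1.6 -target 1.6`) produces classes with major version 50 or lower, which the Java 6 runtime shown above can load.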

Re: Trouble in running MapReduce application

Posted by Fatih Haltas <fa...@nyu.edu>.
Thank you all very much

On Tuesday, 19 February 2013, Harsh J wrote:

> Oops. I just noticed Hemanth has been answering on a dupe thread as
> well. Lets drop this thread and carry on there :)
>
> On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
> > Hi,
> >
> > The new error usually happens if you compile using Java 7 and try to
> > run via Java 6 (for example). That is, an incompatibility in the
> > runtimes for the binary artifact produced.
> >
> > On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu>
> wrote:
> >> Thank you very much Harsh,
> >>
> >> Now, as I promised earlier I am much obliged to you.
> >>
> >> But, now I solved that problem by just changing the directories then
> again
> >> creating a jar file of org. but I am getting this error:
> >>
> >> 1.) What I got
> >>
> ------------------------------------------------------------------------------
> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
> >> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
> >> Warning: $HADOOP_HOME is deprecated.
> >>
> >> Exception in thread "main" java.lang.UnsupportedClassVersionError:
> >> org/myorg/MapReduce : Unsupported major.minor version 51.0
> >>         at java.lang.ClassLoader.defineClass1(Native Method)
> >>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
> >>         at
> >> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> >>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
> >>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
> >>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
> >>         at java.security.AccessController.doPrivileged(Native Method)
> >>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
> >>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
> >>         at java.lang.Class.forName0(Native Method)
> >>         at java.lang.Class.forName(Class.java:266)
> >>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
> >>
> >> 2.) How I create my jar
> >>
> -------------------------------------------------------------------------------------
> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
> >> added manifest
> >> adding: org/(in = 0) (out= 0)(stored 0%)
> >> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
> >> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
> >> 690)(deflated 58%)
> >> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
> >> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
> >> 823)(deflated 56%)
> >>
> >> 3.) Content of my jar file
> >>
> ---------------------------------------------------------------------------------------
> >> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
> >> META-INF/
> >> META-INF/MANIFEST.MF
> >> org/
> >> org/myorg/
> >> org/myorg/MapReduce$FlowPortReducer.class
> >> org/myorg/MapReduce.class
> >> org/myorg/MapReduce$FlowPortMapper.class
> >>
> -----------------------------------------------------------------------------------------
> >>
> >>
> >> Thank you very much.
> >>
> >>
> >> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
> >>>
> >>> Your point (4) explains the problem. The jar packed structure should
> >>> look like the below, and not how it is presently (one extra top level
> >>> dir is present):
> >>>
> >>> META-INF/
> >>> META-INF/MANIFEST.MF
> >>> org/
> >>> org/myorg/
> >>> org/myorg/WordCount.class
> >>> org/myorg/WordCount$TokenizerMapper.class
> >>> org/myorg/WordCount$IntSumReducer.class
> --
> Harsh J
>
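Harsh's jar-layout point, sketched: `hadoop jar my.jar org.myorg.WordCount` resolves the class name to the entry `org/myorg/WordCount.class` relative to the jar root, so an extra top-level directory (such as the `wordcount_classes/` prefix in point 4 of the original mail) hides the class and produces `ClassNotFoundException`. A toy illustration using Python's `zipfile` with hypothetical entry names mirroring the mail:

```python
import io
import zipfile

def resolve_class(jar_bytes: bytes, fqcn: str) -> bool:
    """Mimic classloader lookup: org.myorg.WordCount must exist as the
    entry org/myorg/WordCount.class relative to the jar root."""
    entry = fqcn.replace(".", "/") + ".class"
    with zipfile.ZipFile(io.BytesIO(jar_bytes)) as jar:
        return entry in jar.namelist()

def make_jar(*entries: str) -> bytes:
    """Build an in-memory jar (zip) containing the given entry names."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as jar:
        for name in entries:
            jar.writestr(name, b"placeholder")  # stand-in for class bytes
    return buf.getvalue()

# Wrong layout: an extra top-level directory hides the class.
bad = make_jar("wordcount_classes/org/myorg/WordCount.class")
# Right layout: the package path starts at the jar root.
good = make_jar("org/myorg/WordCount.class")

print(resolve_class(bad, "org.myorg.WordCount"))   # False -> ClassNotFoundException
print(resolve_class(good, "org.myorg.WordCount"))  # True
```

Packing from inside the classes directory (for example `jar cvf my.jar -C wordcount_classes org`) is the usual way to get the second layout.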

Re: Trouble in running MapReduce application

Posted by Harsh J <ha...@cloudera.com>.
Oops. I just noticed Hemanth has been answering on a dupe thread as
well. Let's drop this thread and carry on there :)

On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
> Hi,
>
> The new error usually happens if you compile using Java 7 and try to
> run via Java 6 (for example). That is, an incompatibility in the
> runtimes for the binary artifact produced.
>
> On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu> wrote:
>> Thank you very much Harsh,
>>
>> Now, as I promised earlier I am much obliged to you.
>>
>> But, now I solved that problem by just changing the directories then again
>> creating a jar file of org. but I am getting this error:
>>
>> 1.) What I got
>> ------------------------------------------------------------------------------
>> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
>> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
>> Warning: $HADOOP_HOME is deprecated.
>>
>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>         at java.lang.ClassLoader.defineClass1(Native Method)
>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>         at
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>         at java.lang.Class.forName0(Native Method)
>>         at java.lang.Class.forName(Class.java:266)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>
>> 2.) How I create my jar
>> -------------------------------------------------------------------------------------
>> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
>> added manifest
>> adding: org/(in = 0) (out= 0)(stored 0%)
>> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>> 690)(deflated 58%)
>> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>> 823)(deflated 56%)
>>
>> 3.) Content of my jar file
>> ---------------------------------------------------------------------------------------
>> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>> META-INF/
>> META-INF/MANIFEST.MF
>> org/
>> org/myorg/
>> org/myorg/MapReduce$FlowPortReducer.class
>> org/myorg/MapReduce.class
>> org/myorg/MapReduce$FlowPortMapper.class
>> -----------------------------------------------------------------------------------------
>>
>>
>> Thank you very much.
>>
>>
>> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>>>
>>> Your point (4) explains the problem. The jar packed structure should
>>> look like the below, and not how it is presently (one extra top level
>>> dir is present):
>>>
>>> META-INF/
>>> META-INF/MANIFEST.MF
>>> org/
>>> org/myorg/
>>> org/myorg/WordCount.class
>>> org/myorg/WordCount$TokenizerMapper.class
>>> org/myorg/WordCount$IntSumReducer.class
>>>
>>> On Tue, Feb 19, 2013 at 9:29 PM, Fatih Haltas <fa...@nyu.edu>
>>> wrote:
>>> > Hi everyone,
>>> >
>>> > I know this is the common mistake to not specify the class adress while
>>> > trying to run a jar, however,
>>> > although I specified, I am still getting the ClassNotFound exception.
>>> >
>>> > What may be the reason for it? I have been struggling for this problem
>>> > more
>>> > than a 2 days.
>>> > I just wrote different MapReduce application for some anlaysis. I got
>>> > this
>>> > problem.
>>> >
>>> > To check, is there something wrong with my system, i tried to run
>>> > WordCount
>>> > example.
>>> > When I just run hadoop-examples wordcount, it is working fine.
>>> >
>>> > But when I add just "package org.myorg;" command at the beginning, it
>>> > doesnot work.
>>> >
>>> > Here is what I have done so far
>>> >
>>> > *************************************************************************
>>> > 1. I just copied wordcount code from the apaches own examples source
>>> > code
>>> > and I just changed package decleration as "package org.myorg;"
>>> >
>>> > **************************************************************************
>>> > 2. Then I tried to run that command:
>>> >
>>> > *************************************************************************
>>> > "hadoop jar wordcount_19_02.jar org.myorg.WordCount
>>> > /home/hadoop/project/hadoop-data/NetFlow 19_02_wordcount.output"
>>> >
>>> > *************************************************************************
>>> > 3. I got the following error:
>>> >
>>> > **************************************************************************
>>> > [hadoop@ADUAE042-LAP-V project]$ hadoop jar wordcount_19_02.jar
>>> > org.myorg.WordCount /home/hadoop/project/hadoop-data/NetFlow
>>> > 19_02_wordcount.output
>>> > Warning: $HADOOP_HOME is deprecated.
>>> >
>>> > Exception in thread "main" java.lang.ClassNotFoundException:
>>> > org.myorg.WordCount
>>> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>>> >         at java.security.AccessController.doPrivileged(Native Method)
>>> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>> >         at java.lang.Class.forName0(Native Method)
>>> >         at java.lang.Class.forName(Class.java:266)
>>> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>> >
>>> >
>>> > **************************************************************************
>>> > 4. This is the content of my .jar file:
>>> > ****************************************************
>>> > [hadoop@ADUAE042-LAP-V project]$ jar tf wordcount_19_02.jar
>>> > META-INF/
>>> > META-INF/MANIFEST.MF
>>> > wordcount_classes/
>>> > wordcount_classes/org/
>>> > wordcount_classes/org/myorg/
>>> > wordcount_classes/org/myorg/WordCount.class
>>> > wordcount_classes/org/myorg/WordCount$TokenizerMapper.class
>>> > wordcount_classes/org/myorg/WordCount$IntSumReducer.class
>>> > **********************************************************
>>> > 5. This is the 'ls' output of my working directory:
>>> > **********************************************************
>>> > [hadoop@ADUAE042-LAP-V project]$ ls
>>> > flowclasses_18_02  flowclasses_18_02.jar  hadoop-1.0.4
>>> > hadoop-1.0.4.tar.gz
>>> > hadoop-data  MapReduce.java  sample  wordcount_19_02.jar
>>> > wordcount_classes
>>> > WordCountClasses  WordCount.java
>>> > *************************************************************
>>> > So as you see, the package declaration is fine, but I am really
>>> > helpless. I googled, and everyone says the same thing: you should
>>> > specify the package hierarchy of your main class. I already knew that,
>>> > and I am specifying it, but it doesn't work.
>>> >
>>> > I would be much obliged to anyone who helps me.
>>> >
>>> > Regards,
>>>
>>>
>>>
>>> --
>>> Harsh J
>>
>>
>
>
>
> --
> Harsh J



--
Harsh J

Re: Trouble in running MapReduce application

Posted by Harsh J <ha...@cloudera.com>.
Oops. I just noticed Hemanth has been answering on a dupe thread as
well. Let's drop this thread and carry on there :)

On Tue, Feb 19, 2013 at 11:14 PM, Harsh J <ha...@cloudera.com> wrote:
> Hi,
>
> The new error usually happens if you compile using Java 7 and try to
> run via Java 6 (for example). That is, an incompatibility in the
> runtimes for the binary artifact produced.
>
> On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu> wrote:
>> Thank you very much Harsh,
>>
>> Now, as I promised earlier, I am much obliged to you.
>>
>> I solved that problem by changing into the classes directory and creating
>> the jar from org/ again, but now I am getting this error:
>>
>> 1.) What I got
>> ------------------------------------------------------------------------------
>> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
>> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
>> Warning: $HADOOP_HOME is deprecated.
>>
>> Exception in thread "main" java.lang.UnsupportedClassVersionError:
>> org/myorg/MapReduce : Unsupported major.minor version 51.0
>>         at java.lang.ClassLoader.defineClass1(Native Method)
>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>>         at
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>         at java.lang.Class.forName0(Native Method)
>>         at java.lang.Class.forName(Class.java:266)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>
>> 2.) How I create my jar
>> -------------------------------------------------------------------------------------
>> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
>> added manifest
>> adding: org/(in = 0) (out= 0)(stored 0%)
>> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
>> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
>> 690)(deflated 58%)
>> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
>> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
>> 823)(deflated 56%)
>>
>> 3.) Content of my jar file
>> ---------------------------------------------------------------------------------------
>> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
>> META-INF/
>> META-INF/MANIFEST.MF
>> org/
>> org/myorg/
>> org/myorg/MapReduce$FlowPortReducer.class
>> org/myorg/MapReduce.class
>> org/myorg/MapReduce$FlowPortMapper.class
>> -----------------------------------------------------------------------------------------
>>
>>
>> Thank you very much.
>>
>>
>> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>>>
>>> Your point (4) explains the problem. The jar packed structure should
>>> look like the below, and not how it is presently (one extra top level
>>> dir is present):
>>>
>>> META-INF/
>>> META-INF/MANIFEST.MF
>>> org/
>>> org/myorg/
>>> org/myorg/WordCount.class
>>> org/myorg/WordCount$TokenizerMapper.class
>>> org/myorg/WordCount$IntSumReducer.class
>>>
>>> On Tue, Feb 19, 2013 at 9:29 PM, Fatih Haltas <fa...@nyu.edu>
>>> wrote:
>>> > Hi everyone,
>>> >
>>> > I know it is a common mistake not to specify the class address while
>>> > trying to run a jar; however,
>>> > although I specified it, I am still getting the ClassNotFound exception.
>>> >
>>> > What may be the reason for it? I have been struggling with this problem
>>> > for more than two days.
>>> > I just wrote a different MapReduce application for some analysis and got
>>> > this problem.
>>> >
>>> > To check whether something is wrong with my system, I tried to run the
>>> > WordCount example.
>>> > When I just run hadoop-examples wordcount, it works fine.
>>> >
>>> > But when I just add the "package org.myorg;" declaration at the
>>> > beginning, it does not work.
>>> >
>>> > Here is what I have done so far
>>> >
>>> > *************************************************************************
>>> > 1. I just copied the WordCount code from Apache's own examples source
>>> > code and only changed the package declaration to "package org.myorg;"
>>> >
>>> > **************************************************************************
>>> > 2. Then I tried to run this command:
>>> >
>>> > *************************************************************************
>>> > "hadoop jar wordcount_19_02.jar org.myorg.WordCount
>>> > /home/hadoop/project/hadoop-data/NetFlow 19_02_wordcount.output"
>>> >
>>> > *************************************************************************
>>> > 3. I got the following error:
>>> >
>>> > **************************************************************************
>>> > [hadoop@ADUAE042-LAP-V project]$ hadoop jar wordcount_19_02.jar
>>> > org.myorg.WordCount /home/hadoop/project/hadoop-data/NetFlow
>>> > 19_02_wordcount.output
>>> > Warning: $HADOOP_HOME is deprecated.
>>> >
>>> > Exception in thread "main" java.lang.ClassNotFoundException:
>>> > org.myorg.WordCount
>>> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>>> >         at java.security.AccessController.doPrivileged(Native Method)
>>> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>> >         at java.lang.Class.forName0(Native Method)
>>> >         at java.lang.Class.forName(Class.java:266)
>>> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>>> >
>>> >
>>> > **************************************************************************
>>> > 4. This is the content of my .jar file:
>>> > ****************************************************
>>> > [hadoop@ADUAE042-LAP-V project]$ jar tf wordcount_19_02.jar
>>> > META-INF/
>>> > META-INF/MANIFEST.MF
>>> > wordcount_classes/
>>> > wordcount_classes/org/
>>> > wordcount_classes/org/myorg/
>>> > wordcount_classes/org/myorg/WordCount.class
>>> > wordcount_classes/org/myorg/WordCount$TokenizerMapper.class
>>> > wordcount_classes/org/myorg/WordCount$IntSumReducer.class
>>> > **********************************************************
>>> > 5. This is the 'ls' output of my working directory:
>>> > **********************************************************
>>> > [hadoop@ADUAE042-LAP-V project]$ ls
>>> > flowclasses_18_02  flowclasses_18_02.jar  hadoop-1.0.4
>>> > hadoop-1.0.4.tar.gz
>>> > hadoop-data  MapReduce.java  sample  wordcount_19_02.jar
>>> > wordcount_classes
>>> > WordCountClasses  WordCount.java
>>> > *************************************************************
>>> > So as you see, the package declaration is fine, but I am really
>>> > helpless. I googled, and everyone says the same thing: you should
>>> > specify the package hierarchy of your main class. I already knew that,
>>> > and I am specifying it, but it doesn't work.
>>> >
>>> > I would be much obliged to anyone who helps me.
>>> >
>>> > Regards,
>>>
>>>
>>>
>>> --
>>> Harsh J
>>
>>
>
>
>
> --
> Harsh J



--
Harsh J

Re: Trouble in running MapReduce application

Posted by Harsh J <ha...@cloudera.com>.
Hi,

The new error usually happens if you compile with Java 7 and try to
run on Java 6 (for example). That is, the runtime is older than the
class-file version of the binary artifact produced.
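Harsh's diagnosis can be checked directly: the number in the error is the class-file major version, which for Java 5 and later maps to a JDK release by subtracting 44, so 50 is Java 6 and 51 is Java 7. A small sketch of that mapping (the helper function name is illustrative):

```shell
# Class-file major version N corresponds to Java release (N - 44) for
# Java 5+; 51 matches the "version 51.0" in the error, i.e. Java 7.
java_release() { echo $(( $1 - 44 )); }
java_release 51   # -> 7
java_release 50   # -> 6
```

To inspect a real class file, `javap -verbose -classpath . org.myorg.MapReduce | grep major` prints its major version; recompiling with `javac -source 1.6 -target 1.6 ...` emits Java 6-compatible classes the older runtime can load.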

On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu> wrote:
> Thank you very much Harsh,
>
> Now, as I promised earlier, I am much obliged to you.
>
> I solved that problem by changing into the classes directory and creating
> the jar from org/ again, but now I am getting this error:
>
> 1.) What I got
> ------------------------------------------------------------------------------
> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
> Warning: $HADOOP_HOME is deprecated.
>
> Exception in thread "main" java.lang.UnsupportedClassVersionError:
> org/myorg/MapReduce : Unsupported major.minor version 51.0
>         at java.lang.ClassLoader.defineClass1(Native Method)
>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>         at
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:266)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>
> 2.) How I create my jar
> -------------------------------------------------------------------------------------
> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
> added manifest
> adding: org/(in = 0) (out= 0)(stored 0%)
> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
> 690)(deflated 58%)
> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
> 823)(deflated 56%)
>
> 3.) Content of my jar file
> ---------------------------------------------------------------------------------------
> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
> META-INF/
> META-INF/MANIFEST.MF
> org/
> org/myorg/
> org/myorg/MapReduce$FlowPortReducer.class
> org/myorg/MapReduce.class
> org/myorg/MapReduce$FlowPortMapper.class
> -----------------------------------------------------------------------------------------
>
>
> Thank you very much.
>
>
> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>>
>> Your point (4) explains the problem. The jar packed structure should
>> look like the below, and not how it is presently (one extra top level
>> dir is present):
>>
>> META-INF/
>> META-INF/MANIFEST.MF
>> org/
>> org/myorg/
>> org/myorg/WordCount.class
>> org/myorg/WordCount$TokenizerMapper.class
>> org/myorg/WordCount$IntSumReducer.class
>>
>> On Tue, Feb 19, 2013 at 9:29 PM, Fatih Haltas <fa...@nyu.edu>
>> wrote:
>> > Hi everyone,
>> >
>> > I know it is a common mistake not to specify the class address while
>> > trying to run a jar; however,
>> > although I specified it, I am still getting the ClassNotFound exception.
>> >
>> > What may be the reason for it? I have been struggling with this problem
>> > for more than two days.
>> > I just wrote a different MapReduce application for some analysis and got
>> > this problem.
>> >
>> > To check whether something is wrong with my system, I tried to run the
>> > WordCount example.
>> > When I just run hadoop-examples wordcount, it works fine.
>> >
>> > But when I just add the "package org.myorg;" declaration at the
>> > beginning, it does not work.
>> >
>> > Here is what I have done so far
>> >
>> > *************************************************************************
>> > 1. I just copied the WordCount code from Apache's own examples source
>> > code and only changed the package declaration to "package org.myorg;"
>> >
>> > **************************************************************************
>> > 2. Then I tried to run this command:
>> >
>> > *************************************************************************
>> > "hadoop jar wordcount_19_02.jar org.myorg.WordCount
>> > /home/hadoop/project/hadoop-data/NetFlow 19_02_wordcount.output"
>> >
>> > *************************************************************************
>> > 3. I got the following error:
>> >
>> > **************************************************************************
>> > [hadoop@ADUAE042-LAP-V project]$ hadoop jar wordcount_19_02.jar
>> > org.myorg.WordCount /home/hadoop/project/hadoop-data/NetFlow
>> > 19_02_wordcount.output
>> > Warning: $HADOOP_HOME is deprecated.
>> >
>> > Exception in thread "main" java.lang.ClassNotFoundException:
>> > org.myorg.WordCount
>> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>> >         at java.security.AccessController.doPrivileged(Native Method)
>> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>> >         at java.lang.Class.forName0(Native Method)
>> >         at java.lang.Class.forName(Class.java:266)
>> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>> >
>> >
>> > **************************************************************************
>> > 4. This is the content of my .jar file:
>> > ****************************************************
>> > [hadoop@ADUAE042-LAP-V project]$ jar tf wordcount_19_02.jar
>> > META-INF/
>> > META-INF/MANIFEST.MF
>> > wordcount_classes/
>> > wordcount_classes/org/
>> > wordcount_classes/org/myorg/
>> > wordcount_classes/org/myorg/WordCount.class
>> > wordcount_classes/org/myorg/WordCount$TokenizerMapper.class
>> > wordcount_classes/org/myorg/WordCount$IntSumReducer.class
>> > **********************************************************
>> > 5. This is the 'ls' output of my working directory:
>> > **********************************************************
>> > [hadoop@ADUAE042-LAP-V project]$ ls
>> > flowclasses_18_02  flowclasses_18_02.jar  hadoop-1.0.4
>> > hadoop-1.0.4.tar.gz
>> > hadoop-data  MapReduce.java  sample  wordcount_19_02.jar
>> > wordcount_classes
>> > WordCountClasses  WordCount.java
>> > *************************************************************
>> > So as you see, the package declaration is fine, but I am really helpless.
>> > I googled, but everyone says the same thing: specify the package
>> > hierarchy of your main class. I already knew that and I am specifying it,
>> > but it doesn't work.
>> >
>> > I would be much obliged to anyone who helps me.
>> >
>> > Regards,
>>
>>
>>
>> --
>> Harsh J
>
>



--
Harsh J

Re: Trouble in running MapReduce application

Posted by Harsh J <ha...@cloudera.com>.
Hi,

The new error usually happens if you compile using Java 7 and try to
run via Java 6 (for example): the runtime does not support the
class-file version of the binary artifact produced.
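Editor's aside: a quick way to confirm the mismatch Harsh describes is to read the class-file header, since the major version is simply the Java release plus 44 (Java 6 emits 50, Java 7 emits 51). A minimal sketch follows; the simulated header file `/tmp/Header.class` is hypothetical, standing in for a real compiled class:

```shell
# Class-file major version = Java release + 44 (Java 6 -> 50, Java 7 -> 51),
# which is why the error quoted below reports
# "Unsupported major.minor version 51.0" when Java 7 output meets a
# Java 6 runtime.
for v in 6 7; do
  echo "Java $v -> class-file major version $((v + 44))"
done

# A .class file stores the major version in bytes 6-7 of its header.
# Simulate a Java 7 header (magic CA FE BA BE, minor 0, major 51) and
# read the major version back with od:
printf '\312\376\272\276\000\000\000\063' > /tmp/Header.class
major=$(od -An -tu1 -j7 -N1 /tmp/Header.class | tr -d ' ')
echo "major version: $major"
```

Running the same `od` line against the real `org/myorg/MapReduce.class` would print 51 here if a Java 7 javac produced it.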

On Tue, Feb 19, 2013 at 10:09 PM, Fatih Haltas <fa...@nyu.edu> wrote:
> Thank you very much Harsh,
>
> Now, as I promised earlier I am much obliged to you.
>
> But now I solved that problem by changing directories and re-creating the
> jar from org/. However, I am now getting this error:
>
> 1.) What I got
> ------------------------------------------------------------------------------
> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
> org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
> Warning: $HADOOP_HOME is deprecated.
>
> Exception in thread "main" java.lang.UnsupportedClassVersionError:
> org/myorg/MapReduce : Unsupported major.minor version 51.0
>         at java.lang.ClassLoader.defineClass1(Native Method)
>         at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
>         at
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
>         at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:266)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>
> 2.) How I create my jar
> -------------------------------------------------------------------------------------
> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
> added manifest
> adding: org/(in = 0) (out= 0)(stored 0%)
> adding: org/myorg/(in = 0) (out= 0)(stored 0%)
> adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
> 690)(deflated 58%)
> adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
> adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
> 823)(deflated 56%)
>
> 3.) Content of my jar file
> ---------------------------------------------------------------------------------------
> [hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
> META-INF/
> META-INF/MANIFEST.MF
> org/
> org/myorg/
> org/myorg/MapReduce$FlowPortReducer.class
> org/myorg/MapReduce.class
> org/myorg/MapReduce$FlowPortMapper.class
> -----------------------------------------------------------------------------------------
>
>
> Thank you very much.
>
>
> On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:
>>
>> Your point (4) explains the problem. The jar packed structure should
>> look like the below, and not how it is presently (one extra top level
>> dir is present):
>>
>> META-INF/
>> META-INF/MANIFEST.MF
>> org/
>> org/myorg/
>> org/myorg/WordCount.class
>> org/myorg/WordCount$TokenizerMapper.class
>> org/myorg/WordCount$IntSumReducer.class
>>
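[Editor's sketch of why that layout matters: RunJar resolves the class named on the command line by turning its fully qualified name into a jar entry path, so any extra leading directory defeats the lookup. Rebuilding with `jar cvf wordcount_19_02.jar -C wordcount_classes .` from the project directory is the usual way to drop the stray prefix, assuming the classes sit under `wordcount_classes/`.]

```shell
# The class loader maps a fully qualified class name onto a jar entry path:
class="org.myorg.WordCount"
expected_entry="$(echo "$class" | tr '.' '/').class"
echo "expected entry: $expected_entry"

# The broken jar stored this entry instead, with a stray top-level dir:
actual_entry="wordcount_classes/org/myorg/WordCount.class"
# Rebuilding the jar from inside wordcount_classes (e.g. with `jar ... -C`)
# removes that prefix, so the entry matches what the loader computes:
fixed_entry="${actual_entry#wordcount_classes/}"
[ "$fixed_entry" = "$expected_entry" ] && echo "layout ok"
```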
>> On Tue, Feb 19, 2013 at 9:29 PM, Fatih Haltas <fa...@nyu.edu>
>> wrote:
>> > Hi everyone,
>> >
>> > I know it is a common mistake not to specify the fully qualified class
>> > name while trying to run a jar; however, although I specified it, I am
>> > still getting the ClassNotFound exception.
>> >
>> > What may be the reason for it? I have been struggling with this problem
>> > for more than 2 days.
>> > I just wrote a different MapReduce application for some analysis and got
>> > this problem.
>> >
>> > To check whether something is wrong with my system, I tried to run the
>> > WordCount example.
>> > When I just run hadoop-examples wordcount, it works fine.
>> >
>> > But when I just add the "package org.myorg;" declaration at the
>> > beginning, it does not work.
>> >
>> > Here is what I have done so far
>> >
>> > *************************************************************************
>> > 1. I just copied the WordCount code from Apache's own examples source
>> > code and changed the package declaration to "package org.myorg;"
>> >
>> > **************************************************************************
>> > 2. Then I tried to run that command:
>> >
>> > *************************************************************************
>> > "hadoop jar wordcount_19_02.jar org.myorg.WordCount
>> > /home/hadoop/project/hadoop-data/NetFlow 19_02_wordcount.output"
>> >
>> > *************************************************************************
>> > 3. I got the following error:
>> >
>> > **************************************************************************
>> > [hadoop@ADUAE042-LAP-V project]$ hadoop jar wordcount_19_02.jar
>> > org.myorg.WordCount /home/hadoop/project/hadoop-data/NetFlow
>> > 19_02_wordcount.output
>> > Warning: $HADOOP_HOME is deprecated.
>> >
>> > Exception in thread "main" java.lang.ClassNotFoundException:
>> > org.myorg.WordCount
>> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>> >         at java.security.AccessController.doPrivileged(Native Method)
>> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>> >         at java.lang.Class.forName0(Native Method)
>> >         at java.lang.Class.forName(Class.java:266)
>> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>> >
>> >
>> > **************************************************************************
>> > 4. This is the content of my .jar file:
>> > ****************************************************
>> > [hadoop@ADUAE042-LAP-V project]$ jar tf wordcount_19_02.jar
>> > META-INF/
>> > META-INF/MANIFEST.MF
>> > wordcount_classes/
>> > wordcount_classes/org/
>> > wordcount_classes/org/myorg/
>> > wordcount_classes/org/myorg/WordCount.class
>> > wordcount_classes/org/myorg/WordCount$TokenizerMapper.class
>> > wordcount_classes/org/myorg/WordCount$IntSumReducer.class
>> > **********************************************************
>> > 5. This is the 'ls' output of my working directory:
>> > **********************************************************
>> > [hadoop@ADUAE042-LAP-V project]$ ls
>> > flowclasses_18_02  flowclasses_18_02.jar  hadoop-1.0.4
>> > hadoop-1.0.4.tar.gz
>> > hadoop-data  MapReduce.java  sample  wordcount_19_02.jar
>> > wordcount_classes
>> > WordCountClasses  WordCount.java
>> > *************************************************************
>> > So as you see, the package declaration is fine, but I am really helpless.
>> > I googled, but everyone says the same thing: specify the package
>> > hierarchy of your main class. I already knew that and I am specifying it,
>> > but it doesn't work.
>> >
>> > I would be much obliged to anyone who helps me.
>> >
>> > Regards,
>>
>>
>>
>> --
>> Harsh J
>
>



--
Harsh J
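
[Editor's note: two usual ways out of this version mismatch, sketched below on the assumption that a Java 7 javac built the classes while Hadoop 1.0.4 runs on a Java 6 JRE. The `-source`/`-target` flags are standard javac options; the `hadoop classpath` subcommand and the JAVA_HOME path are illustrative and should be verified on your install.]

```shell
# Option 1: run Hadoop under a Java 7 JRE by setting JAVA_HOME in
# conf/hadoop-env.sh (example path, adjust to your system):
#   export JAVA_HOME=/usr/lib/jvm/java-7-openjdk

# Option 2: recompile targeting the Java 6 class-file format:
#   javac -source 1.6 -target 1.6 \
#         -classpath "$(hadoop classpath)" \
#         -d wordcount_classes WordCount.java

# Sanity check: targeting 1.6 yields class-file major version 50,
# which a Java 6 runtime accepts (51 is what it rejected above).
echo "target 1.6 -> class-file major version $((6 + 44))"
```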

Re: Trouble in running MapReduce application

Posted by Fatih Haltas <fa...@nyu.edu>.
Thank you very much Harsh,

Now, as I promised earlier I am much obliged to you.

But now I solved that problem by changing directories and re-creating the
jar from org/. However, I am now getting this error:

1.) What I got
------------------------------------------------------------------------------
[hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
Warning: $HADOOP_HOME is deprecated.

Exception in thread "main" java.lang.UnsupportedClassVersionError:
org/myorg/MapReduce : Unsupported major.minor version 51.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
        at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:266)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)

2.) How I create my jar
-------------------------------------------------------------------------------------
[hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
added manifest
adding: org/(in = 0) (out= 0)(stored 0%)
adding: org/myorg/(in = 0) (out= 0)(stored 0%)
adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
690)(deflated 58%)
adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
823)(deflated 56%)

3.) Content of my jar file
---------------------------------------------------------------------------------------
[hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
META-INF/
META-INF/MANIFEST.MF
org/
org/myorg/
org/myorg/MapReduce$FlowPortReducer.class
org/myorg/MapReduce.class
org/myorg/MapReduce$FlowPortMapper.class
-----------------------------------------------------------------------------------------


Thank you very much.


On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:

> Your point (4) explains the problem. The jar packed structure should
> look like the below, and not how it is presently (one extra top level
> dir is present):
>
> META-INF/
> META-INF/MANIFEST.MF
> org/
> org/myorg/
> org/myorg/WordCount.class
> org/myorg/WordCount$TokenizerMapper.class
> org/myorg/WordCount$IntSumReducer.class
>
> On Tue, Feb 19, 2013 at 9:29 PM, Fatih Haltas <fa...@nyu.edu>
> wrote:
> > Hi everyone,
> >
> > I know it is a common mistake not to specify the fully qualified class
> > name while trying to run a jar; however, although I specified it, I am
> > still getting the ClassNotFound exception.
> >
> > What may be the reason for it? I have been struggling with this problem
> > for more than 2 days.
> > I just wrote a different MapReduce application for some analysis and got
> > this problem.
> >
> > To check whether something is wrong with my system, I tried to run the
> > WordCount example.
> > When I just run hadoop-examples wordcount, it works fine.
> >
> > But when I just add the "package org.myorg;" declaration at the
> > beginning, it does not work.
> >
> > Here is what I have done so far
> > *************************************************************************
> > 1. I just copied the WordCount code from Apache's own examples source
> > code and changed the package declaration to "package org.myorg;"
> >
> **************************************************************************
> > 2. Then I tried to run that command:
> >
>  *************************************************************************
> > "hadoop jar wordcount_19_02.jar org.myorg.WordCount
> > /home/hadoop/project/hadoop-data/NetFlow 19_02_wordcount.output"
> > *************************************************************************
> > 3. I got the following error:
> >
> **************************************************************************
> > [hadoop@ADUAE042-LAP-V project]$ hadoop jar wordcount_19_02.jar
> > org.myorg.WordCount /home/hadoop/project/hadoop-data/NetFlow
> > 19_02_wordcount.output
> > Warning: $HADOOP_HOME is deprecated.
> >
> > Exception in thread "main" java.lang.ClassNotFoundException:
> > org.myorg.WordCount
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
> >         at java.lang.Class.forName0(Native Method)
> >         at java.lang.Class.forName(Class.java:266)
> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
> >
> >
> **************************************************************************
> > 4. This is the content of my .jar file:
> > ****************************************************
> > [hadoop@ADUAE042-LAP-V project]$ jar tf wordcount_19_02.jar
> > META-INF/
> > META-INF/MANIFEST.MF
> > wordcount_classes/
> > wordcount_classes/org/
> > wordcount_classes/org/myorg/
> > wordcount_classes/org/myorg/WordCount.class
> > wordcount_classes/org/myorg/WordCount$TokenizerMapper.class
> > wordcount_classes/org/myorg/WordCount$IntSumReducer.class
> > **********************************************************
> > 5. This is the 'ls' output of my working directory:
> > **********************************************************
> > [hadoop@ADUAE042-LAP-V project]$ ls
> > flowclasses_18_02  flowclasses_18_02.jar  hadoop-1.0.4
>  hadoop-1.0.4.tar.gz
> > hadoop-data  MapReduce.java  sample  wordcount_19_02.jar
>  wordcount_classes
> > WordCountClasses  WordCount.java
> > *************************************************************
> > So as you see, the package declaration is fine, but I am really helpless.
> > I googled, but everyone says the same thing: specify the package
> > hierarchy of your main class. I already knew that and I am specifying it,
> > but it doesn't work.
> >
> > I would be much obliged to anyone who helps me.
> >
> > Regards,
>
>
>
> --
> Harsh J
>

> > ****************************************************
> > [hadoop@ADUAE042-LAP-V project]$ jar tf wordcount_19_02.jar
> > META-INF/
> > META-INF/MANIFEST.MF
> > wordcount_classes/
> > wordcount_classes/org/
> > wordcount_classes/org/myorg/
> > wordcount_classes/org/myorg/WordCount.class
> > wordcount_classes/org/myorg/WordCount$TokenizerMapper.class
> > wordcount_classes/org/myorg/WordCount$IntSumReducer.class
> > **********************************************************
> > 5. This is the 'ls' output of my working directory:
> > **********************************************************
> > [hadoop@ADUAE042-LAP-V project]$ ls
> > flowclasses_18_02  flowclasses_18_02.jar  hadoop-1.0.4
>  hadoop-1.0.4.tar.gz
> > hadoop-data  MapReduce.java  sample  wordcount_19_02.jar
>  wordcount_classes
> > WordCountClasses  WordCount.java
> > *************************************************************
> > So as you see, package decleration is fine but I am really helpless, I
> > googled but they are all saying samething you should specify the package
> > hierarchy of your main class. I did know it already I am specifying but
> > doesn't work.
> >
> > I would be much obliged to anyone helped me
> >
> > Regards,
>
>
>
> --
> Harsh J
>

Re: Trouble in running MapReduce application

Posted by Fatih Haltas <fa...@nyu.edu>.
Thank you very much, Harsh.

As I promised earlier, I am much obliged to you.

I solved that problem by changing into the classes directory and re-creating
the jar from the org directory, but now I am getting this error:

1.) What I got
------------------------------------------------------------------------------
[hadoop@ADUAE042-LAP-V flowclasses_18_02]$ hadoop jar flow19028pm.jar
org.myorg.MapReduce /home/hadoop/project/hadoop-data/NetFlow 19_02.out
Warning: $HADOOP_HOME is deprecated.

Exception in thread "main" java.lang.UnsupportedClassVersionError:
org/myorg/MapReduce : Unsupported major.minor version 51.0
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:634)
        at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:277)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:73)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:212)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:266)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)

2.) How I create my jar
-------------------------------------------------------------------------------------
[hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar cvf flow19028pm.jar org
added manifest
adding: org/(in = 0) (out= 0)(stored 0%)
adding: org/myorg/(in = 0) (out= 0)(stored 0%)
adding: org/myorg/MapReduce$FlowPortReducer.class(in = 1661) (out=
690)(deflated 58%)
adding: org/myorg/MapReduce.class(in = 1587) (out= 903)(deflated 43%)
adding: org/myorg/MapReduce$FlowPortMapper.class(in = 1874) (out=
823)(deflated 56%)

3.) Content of my jar file
---------------------------------------------------------------------------------------
[hadoop@ADUAE042-LAP-V flowclasses_18_02]$ jar tf flow19028pm.jar
META-INF/
META-INF/MANIFEST.MF
org/
org/myorg/
org/myorg/MapReduce$FlowPortReducer.class
org/myorg/MapReduce.class
org/myorg/MapReduce$FlowPortMapper.class
-----------------------------------------------------------------------------------------


Thank you very much.
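For reference, the "Unsupported major.minor version 51.0" above means the class was compiled for Java 7 (class-file major version 51) but is being loaded by an older JVM. The version sits in the class-file header; a small sketch (using a fabricated header, not the real MapReduce.class) of how it is read:

```python
import struct

# Partial map from class-file major versions to Java releases.
MAJOR_TO_JAVA = {49: "5", 50: "6", 51: "7", 52: "8"}

def java_version_of(class_bytes):
    # Header layout: u4 magic (0xCAFEBABE), u2 minor_version, u2 major_version
    magic, minor, major = struct.unpack(">IHH", class_bytes[:8])
    assert magic == 0xCAFEBABE, "not a class file"
    return major, MAJOR_TO_JAVA.get(major, "unknown")

# A fabricated 8-byte header with major version 51, as in the error message:
header = struct.pack(">IHH", 0xCAFEBABE, 0, 51)
print(java_version_of(header))  # (51, '7') -> needs a Java 7 (or newer) JVM
```

Recompiling with `javac -source 1.6 -target 1.6`, or running Hadoop under the same JDK that compiled the classes, resolves this mismatch.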


On Tue, Feb 19, 2013 at 8:20 PM, Harsh J <ha...@cloudera.com> wrote:

> Your point (4) explains the problem. The jar packed structure should
> look like the below, and not how it is presently (one extra top level
> dir is present):
>
> META-INF/
> META-INF/MANIFEST.MF
> org/
> org/myorg/
> org/myorg/WordCount.class
> org/myorg/WordCount$TokenizerMapper.class
> org/myorg/WordCount$IntSumReducer.class
>
> On Tue, Feb 19, 2013 at 9:29 PM, Fatih Haltas <fa...@nyu.edu>
> wrote:
> > Hi everyone,
> >
> > I know it is a common mistake not to specify the fully qualified class
> > name when trying to run a jar; however, although I specified it, I am
> > still getting the ClassNotFoundException.
> >
> > What may be the reason for it? I have been struggling with this problem
> > for more than two days.
> > I wrote a different MapReduce application for some analysis and ran into
> > this problem.
> >
> > To check whether something is wrong with my system, I tried to run the
> > WordCount example.
> > When I just run hadoop-examples wordcount, it works fine.
> >
> > But when I add just the "package org.myorg;" declaration at the
> > beginning, it does not work.
> >
> > Here is what I have done so far
> > *************************************************************************
> > 1. I copied the wordcount code from Apache's own examples source code
> > and changed only the package declaration to "package org.myorg;"
> >
> **************************************************************************
> > 2. Then I tried to run that command:
> >
>  *************************************************************************
> > "hadoop jar wordcount_19_02.jar org.myorg.WordCount
> > /home/hadoop/project/hadoop-data/NetFlow 19_02_wordcount.output"
> > *************************************************************************
> > 3. I got following error:
> >
> **************************************************************************
> > [hadoop@ADUAE042-LAP-V project]$ hadoop jar wordcount_19_02.jar
> > org.myorg.WordCount /home/hadoop/project/hadoop-data/NetFlow
> > 19_02_wordcount.output
> > Warning: $HADOOP_HOME is deprecated.
> >
> > Exception in thread "main" java.lang.ClassNotFoundException:
> > org.myorg.WordCount
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
> >         at java.lang.Class.forName0(Native Method)
> >         at java.lang.Class.forName(Class.java:266)
> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
> >
> >
> **************************************************************************
> > 4. This is the content of my .jar file:
> > ****************************************************
> > [hadoop@ADUAE042-LAP-V project]$ jar tf wordcount_19_02.jar
> > META-INF/
> > META-INF/MANIFEST.MF
> > wordcount_classes/
> > wordcount_classes/org/
> > wordcount_classes/org/myorg/
> > wordcount_classes/org/myorg/WordCount.class
> > wordcount_classes/org/myorg/WordCount$TokenizerMapper.class
> > wordcount_classes/org/myorg/WordCount$IntSumReducer.class
> > **********************************************************
> > 5. This is the 'ls' output of my working directory:
> > **********************************************************
> > [hadoop@ADUAE042-LAP-V project]$ ls
> > flowclasses_18_02  flowclasses_18_02.jar  hadoop-1.0.4
>  hadoop-1.0.4.tar.gz
> > hadoop-data  MapReduce.java  sample  wordcount_19_02.jar
>  wordcount_classes
> > WordCountClasses  WordCount.java
> > *************************************************************
> > So as you see, the package declaration is fine, but I am really helpless.
> > I googled, but everyone says the same thing: you should specify the
> > package hierarchy of your main class. I already knew that, and I am
> > specifying it, but it doesn't work.
> >
> > I would be much obliged to anyone who helped me.
> >
> > Regards,
>
>
>
> --
> Harsh J
>


Re: Trouble in running MapReduce application

Posted by Harsh J <ha...@cloudera.com>.
Your point (4) explains the problem. The jar packed structure should
look like the below, and not how it is presently (one extra top level
dir is present):

META-INF/
META-INF/MANIFEST.MF
org/
org/myorg/
org/myorg/WordCount.class
org/myorg/WordCount$TokenizerMapper.class
org/myorg/WordCount$IntSumReducer.class
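A jar is just a zip archive, and the classloader resolves `org.myorg.WordCount` by looking up the literal entry path `org/myorg/WordCount.class`. A minimal sketch (hypothetical in-memory jars, not the actual files) of why the extra top-level directory breaks that lookup:

```python
import io
import zipfile

# Build two in-memory "jars": one packed from the wrong directory level,
# one packed correctly from inside the classes directory.
def make_jar(entries):
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        for name in entries:
            zf.writestr(name, b"")  # contents are irrelevant for the lookup
    return zipfile.ZipFile(io.BytesIO(buf.getvalue()))

wrong = make_jar(["wordcount_classes/org/myorg/WordCount.class"])
right = make_jar(["org/myorg/WordCount.class"])

# A classloader resolves org.myorg.WordCount via this exact entry path:
path = "org/myorg/WordCount.class"
print(path in wrong.namelist())  # False -> ClassNotFoundException
print(path in right.namelist())  # True  -> class loads
```

The usual fix is to build the jar from inside the classes directory, e.g. `jar cvf wordcount_19_02.jar -C wordcount_classes .`, so the entry paths start at `org/`.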

On Tue, Feb 19, 2013 at 9:29 PM, Fatih Haltas <fa...@nyu.edu> wrote:
> Hi everyone,
>
> I know it is a common mistake not to specify the fully qualified class
> name when trying to run a jar; however, although I specified it, I am
> still getting the ClassNotFoundException.
>
> What may be the reason for it? I have been struggling with this problem
> for more than two days.
> I wrote a different MapReduce application for some analysis and ran into
> this problem.
>
> To check whether something is wrong with my system, I tried to run the
> WordCount example.
> When I just run hadoop-examples wordcount, it works fine.
>
> But when I add just the "package org.myorg;" declaration at the beginning,
> it does not work.
>
> Here is what I have done so far
> *************************************************************************
> 1. I copied the wordcount code from Apache's own examples source code and
> changed only the package declaration to "package org.myorg;"
> **************************************************************************
> 2. Then I tried to run that command:
>  *************************************************************************
> "hadoop jar wordcount_19_02.jar org.myorg.WordCount
> /home/hadoop/project/hadoop-data/NetFlow 19_02_wordcount.output"
> *************************************************************************
> 3. I got following error:
> **************************************************************************
> [hadoop@ADUAE042-LAP-V project]$ hadoop jar wordcount_19_02.jar
> org.myorg.WordCount /home/hadoop/project/hadoop-data/NetFlow
> 19_02_wordcount.output
> Warning: $HADOOP_HOME is deprecated.
>
> Exception in thread "main" java.lang.ClassNotFoundException:
> org.myorg.WordCount
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:266)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
>
> **************************************************************************
> 4. This is the content of my .jar file:
> ****************************************************
> [hadoop@ADUAE042-LAP-V project]$ jar tf wordcount_19_02.jar
> META-INF/
> META-INF/MANIFEST.MF
> wordcount_classes/
> wordcount_classes/org/
> wordcount_classes/org/myorg/
> wordcount_classes/org/myorg/WordCount.class
> wordcount_classes/org/myorg/WordCount$TokenizerMapper.class
> wordcount_classes/org/myorg/WordCount$IntSumReducer.class
> **********************************************************
> 5. This is the 'ls' output of my working directory:
> **********************************************************
> [hadoop@ADUAE042-LAP-V project]$ ls
> flowclasses_18_02  flowclasses_18_02.jar  hadoop-1.0.4  hadoop-1.0.4.tar.gz
> hadoop-data  MapReduce.java  sample  wordcount_19_02.jar  wordcount_classes
> WordCountClasses  WordCount.java
> *************************************************************
> So as you see, the package declaration is fine, but I am really helpless.
> I googled, but everyone says the same thing: you should specify the package
> hierarchy of your main class. I already knew that, and I am specifying it,
> but it doesn't work.
>
> I would be much obliged to anyone who helped me.
>
> Regards,



--
Harsh J
