Posted to hdfs-user@hadoop.apache.org by YIMEN YIMGA Gael <ga...@sgcib.com> on 2014/09/10 15:09:47 UTC

Error when executing a WordCount Program

Hello Hadoopers,

Here is the error I'm facing when running a WordCount example program I wrote myself.
Kindly find attached the file of my WordCount program.
The error is below.

===========================================================================================================================================
-bash-4.1$ bin/hadoop jar WordCount.jar
Entrée dans le programme MAIN !!!
14/09/10 15:00:24 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
14/09/10 15:00:24 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14/09/10 15:00:24 INFO util.NativeCodeLoader: Loaded the native-hadoop library
14/09/10 15:00:24 WARN snappy.LoadSnappy: Snappy native library not loaded
14/09/10 15:00:24 INFO mapred.JobClient: Cleaning up the staging area hdfs://latdevweb02:9000/user/hadoop/.staging/job_201409101141_0001
14/09/10 15:00:24 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
        at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1081)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1073)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:910)
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1353)
        at fr.societegenerale.bigdata.lactool.WordCountDriver.main(WordCountDriver.java:50)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
-bash-4.1$
===========================================================================================================================================

Thanks in advance for your help.

Warm regards
GYY
*************************************************************************
This message and any attachments (the "message") are confidential, intended solely for the addressee(s), and may contain legally privileged information.
Any unauthorised use or dissemination is prohibited. E-mails are susceptible to alteration.   
Neither SOCIETE GENERALE nor any of its subsidiaries or affiliates shall be liable for the message if altered, changed or
falsified.
Please visit http://swapdisclosure.sgcib.com for important information with respect to derivative products.
                              ************
Ce message et toutes les pieces jointes (ci-apres le "message") sont confidentiels et susceptibles de contenir des informations couvertes 
par le secret professionnel. 
Ce message est etabli a l'intention exclusive de ses destinataires. Toute utilisation ou diffusion non autorisee est interdite.
Tout message electronique est susceptible d'alteration. 
La SOCIETE GENERALE et ses filiales declinent toute responsabilite au titre de ce message s'il a ete altere, deforme ou falsifie.
Veuillez consulter le site http://swapdisclosure.sgcib.com afin de recueillir d'importantes informations sur les produits derives.
*************************************************************************

RE: Error when executing a WordCount Program

Posted by YIMEN YIMGA Gael <ga...@sgcib.com>.
Hi,

In fact,

hdfs://latdevweb02:9000/home/hadoop/hadoop/input

is not a folder on HDFS.

I created a folder /tmp/hadoop-hadoop/dfs/data, where data will be saved in HDFS.

And in my HADOOP_HOME folder, there are two folders “input” and “output”, but I don’t know how to configure them in the program.

Could you please look into my code and advise?

Standing by …

Warm regards

From: Shahab Yunus [mailto:shahab.yunus@gmail.com]
Sent: Wednesday 10 September 2014 15:19
To: user@hadoop.apache.org
Subject: Re: Error when executing a WordCount Program

hdfs://latdevweb02:9000/home/hadoop/hadoop/input

Is this a valid path on HDFS? Can you access this path outside of the program, for example with the hadoop fs -ls command? Also, were this path and the files in it created by a different user?

The exception seems to say that the path either does not exist or the running user does not have permission to read it.

Regards,
Shahab
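Shahab's checks can be run from the cluster's shell. A minimal sketch, assuming the Hadoop 1.x CLI seen in the log above; the *.txt glob for the local input files is an assumption, not taken from the attached code:

```shell
# List the exact path the job is trying to read; an error here
# confirms the directory really is missing (or unreadable).
bin/hadoop fs -ls hdfs://latdevweb02:9000/home/hadoop/hadoop/input

# If it is missing, create it on HDFS and upload the local input
# files from the HADOOP_HOME/input folder mentioned earlier.
bin/hadoop fs -mkdir /home/hadoop/hadoop/input
bin/hadoop fs -put input/*.txt /home/hadoop/hadoop/input

# Check owner and permissions as seen by the running user (hadoop).
bin/hadoop fs -ls /home/hadoop/hadoop
```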







RE: Error when executing a WordCount Program

Posted by YIMEN YIMGA Gael <ga...@sgcib.com>.
Hi,

Precisely, that is my real problem.
Could you please look into my attached code and tell me how to update it?

How do I set a job jar file?

And now, here is my hdfs-site.xml

==
-bash-4.1$ cat conf/hdfs-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
   <property>
      <name>dfs.replication</name>
      <value>1</value>
   </property>
   <property>
      <name>dfs.data.dir</name>
      <value>/tmp/hadoop-hadoop/dfs/data</value>
   </property>
</configuration>
-bash-4.1$
==

Could you advise on how to solve the “input path does not exist” error?

Standing by …

Cheers


From: Chris MacKenzie [mailto:studio@chrismackenziephotography.co.uk]
Sent: Wednesday 10 September 2014 15:27
To: user@hadoop.apache.org
Subject: Re: Error when executing a WordCount Program

Hi, have you set a class in your code?

WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).

Also, you need to check the path for your input file:

Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input

These are pretty straightforward errors; resolve them and you should be good to go.

Sent from my iPhone
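The JobConf(Class) fix Chris points at can be sketched as a minimal driver under the old mapred API visible in the stack trace. This is an illustrative sketch, not the attached code: the mapper/reducer class names are placeholders, and the relative "input"/"output" paths are an assumption about where the data was uploaded.

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        // Passing the driver class lets Hadoop locate the jar that
        // contains it, which resolves the "No job jar file set" warning.
        JobConf conf = new JobConf(WordCountDriver.class);
        conf.setJobName("wordcount");

        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        // Hypothetical mapper/reducer classes would be wired up here:
        // conf.setMapperClass(WordCountMapper.class);
        // conf.setReducerClass(WordCountReducer.class);

        // Relative paths resolve under /user/<username> on HDFS.
        // The input directory must exist before submission; the
        // output directory must NOT exist.
        FileInputFormat.setInputPaths(conf, new Path("input"));
        FileOutputFormat.setOutputPath(conf, new Path("output"));

        JobClient.runJob(conf);
    }
}
```

Alternatively, the input and output paths can be read from args so the same jar works against any HDFS location without recompiling.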



RE: Error when executing a WordCount Program

Posted by YIMEN YIMGA Gael <ga...@sgcib.com>.
Hi,

Please that is my real problem.
Could you please look into my code in attached and tell me how I can update this, please ?

How to set a job jar file?

And now, here is my hdfs-site.xml

==
-bash-4.1$ cat conf/hdfs-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
   <property>
      <name>dfs.replication</name>
      <value>1</value>
   </property>
   <property>
      <name>dfs.data.dir</name>
      <value>/tmp/hadoop-hadoop/dfs/data</value>
   </property>
</configuration>
-bash-4.1$
==

Could you advice on how to solve the error of “input path does not exist”?

Standing by …

Cheers


From: Chris MacKenzie [mailto:studio@chrismackenziephotography.co.uk]
Sent: Wednesday 10 September 2014 15:27
To: user@hadoop.apache.org
Subject: Re: Error when executing a WordCount Program

Hi have you set a class in your code ?

WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).

Also you need to check the path for your input file

Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input

These are pretty straight forward errors resolve them and you should be good to go.

Sent from my iPhone

On 10 Sep 2014, at 14:19, Shahab Yunus <sh...@gmail.com>> wrote:
hdfs://latdevweb02:9000/home/hadoop/hadoop/input

is this is a valid path on hdfs? Can you access this path outside of the program? For example using hadoop fs -ls command? Also, was this path and files in it, created by a different user?

The exception seem to say that it does not exist or the running user does not have permission to read it.

Regards,
Shahab



On Wed, Sep 10, 2014 at 9:09 AM, YIMEN YIMGA Gael <ga...@sgcib.com>> wrote:
Hello Hadoopers,

Here is the error, I’m facing when running WordCount example program written by myself.
Kind find attached the file of my WordCount program.
Below the error.

===========================================================================================================================================
-bash-4.1$ bin/hadoop jar WordCount.jar
Entr?e dans le programme MAIN !!!
14/09/10 15:00:24 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
14/09/10 15:00:24 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14/09/10 15:00:24 INFO util.NativeCodeLoader: Loaded the native-hadoop library
14/09/10 15:00:24 WARN snappy.LoadSnappy: Snappy native library not loaded
14/09/10 15:00:24 INFO mapred.JobClient: Cleaning up the staging area hdfs://latdevweb02:9000/user/hadoop/.staging/job_201409101141_0001
14/09/10 15:00:24 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
        at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1081)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1073)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:910)
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1353)
        at fr.societegenerale.bigdata.lactool.WordCountDriver.main(WordCountDriver.java:50)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
-bash-4.1$
===========================================================================================================================================

Thanks in advance for your help.

Warm regards
GYY

*************************************************************************
This message and any attachments (the "message") are confidential, intended solely for the addressee(s), and may contain legally privileged information.
Any unauthorised use or dissemination is prohibited. E-mails are susceptible to alteration.
Neither SOCIETE GENERALE nor any of its subsidiaries or affiliates shall be liable for the message if altered, changed or
falsified.
Please visit http://swapdisclosure.sgcib.com for important information with respect to derivative products.
                              ************
Ce message et toutes les pieces jointes (ci-apres le "message") sont confidentiels et susceptibles de contenir des informations couvertes
par le secret professionnel.
Ce message est etabli a l'intention exclusive de ses destinataires. Toute utilisation ou diffusion non autorisee est interdite.
Tout message electronique est susceptible d'alteration.
La SOCIETE GENERALE et ses filiales declinent toute responsabilite au titre de ce message s'il a ete altere, deforme ou falsifie.
Veuillez consulter le site http://swapdisclosure.sgcib.com afin de recueillir d'importantes informations sur les produits derives.
*************************************************************************


RE: Error when executing a WordCount Program

Posted by YIMEN YIMGA Gael <ga...@sgcib.com>.
Hi,

Please that is my real problem.
Could you please look into my code in attached and tell me how I can update this, please ?

How to set a job jar file?

And now, here is my hdfs-site.xml

==
-bash-4.1$ cat conf/hdfs-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
   <property>
      <name>dfs.replication</name>
      <value>1</value>
   </property>
   <property>
      <name>dfs.data.dir</name>
      <value>/tmp/hadoop-hadoop/dfs/data</value>
   </property>
</configuration>
-bash-4.1$
==

Could you advice on how to solve the error of “input path does not exist”?

Standing by …

Cheers


From: Chris MacKenzie [mailto:studio@chrismackenziephotography.co.uk]
Sent: Wednesday 10 September 2014 15:27
To: user@hadoop.apache.org
Subject: Re: Error when executing a WordCount Program

Hi have you set a class in your code ?

WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).

Also you need to check the path for your input file

Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input

These are pretty straight forward errors resolve them and you should be good to go.

Sent from my iPhone

On 10 Sep 2014, at 14:19, Shahab Yunus <sh...@gmail.com>> wrote:
hdfs://latdevweb02:9000/home/hadoop/hadoop/input

is this is a valid path on hdfs? Can you access this path outside of the program? For example using hadoop fs -ls command? Also, was this path and files in it, created by a different user?

The exception seem to say that it does not exist or the running user does not have permission to read it.

Regards,
Shahab



On Wed, Sep 10, 2014 at 9:09 AM, YIMEN YIMGA Gael <ga...@sgcib.com>> wrote:
Hello Hadoopers,

Here is the error, I’m facing when running WordCount example program written by myself.
Kind find attached the file of my WordCount program.
Below the error.

===========================================================================================================================================
-bash-4.1$ bin/hadoop jar WordCount.jar
Entr?e dans le programme MAIN !!!
14/09/10 15:00:24 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
14/09/10 15:00:24 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14/09/10 15:00:24 INFO util.NativeCodeLoader: Loaded the native-hadoop library
14/09/10 15:00:24 WARN snappy.LoadSnappy: Snappy native library not loaded
14/09/10 15:00:24 INFO mapred.JobClient: Cleaning up the staging area hdfs://latdevweb02:9000/user/hadoop/.staging/job_201409101141_0001
14/09/10 15:00:24 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
        at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1081)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1073)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:910)
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1353)
        at fr.societegenerale.bigdata.lactool.WordCountDriver.main(WordCountDriver.java:50)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
-bash-4.1$
===========================================================================================================================================

Thanks in advance for your help.

Warm regards
GYY

*************************************************************************
This message and any attachments (the "message") are confidential, intended solely for the addressee(s), and may contain legally privileged information.
Any unauthorised use or dissemination is prohibited. E-mails are susceptible to alteration.
Neither SOCIETE GENERALE nor any of its subsidiaries or affiliates shall be liable for the message if altered, changed or
falsified.
Please visit http://swapdisclosure.sgcib.com for important information with respect to derivative products.
                              ************
Ce message et toutes les pieces jointes (ci-apres le "message") sont confidentiels et susceptibles de contenir des informations couvertes
par le secret professionnel.
Ce message est etabli a l'intention exclusive de ses destinataires. Toute utilisation ou diffusion non autorisee est interdite.
Tout message electronique est susceptible d'alteration.
La SOCIETE GENERALE et ses filiales declinent toute responsabilite au titre de ce message s'il a ete altere, deforme ou falsifie.
Veuillez consulter le site http://swapdisclosure.sgcib.com afin de recueillir d'importantes informations sur les produits derives.
*************************************************************************


RE: Error when executing a WordCount Program

Posted by YIMEN YIMGA Gael <ga...@sgcib.com>.
Hi,

Yes, that is exactly my problem.
Could you please look at my attached code and tell me how to update it?

How to set a job jar file?

And now, here is my hdfs-site.xml

==
-bash-4.1$ cat conf/hdfs-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
   <property>
      <name>dfs.replication</name>
      <value>1</value>
   </property>
   <property>
      <name>dfs.data.dir</name>
      <value>/tmp/hadoop-hadoop/dfs/data</value>
   </property>
</configuration>
-bash-4.1$
==

Could you advise on how to solve the “input path does not exist” error?

Standing by …

Cheers


From: Chris MacKenzie [mailto:studio@chrismackenziephotography.co.uk]
Sent: Wednesday 10 September 2014 15:27
To: user@hadoop.apache.org
Subject: Re: Error when executing a WordCount Program

Hi, have you set a class in your code?

WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).

Also, you need to check the path to your input file:

Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input

These are pretty straightforward errors; resolve them and you should be good to go.

Sent from my iPhone

On 10 Sep 2014, at 14:19, Shahab Yunus <sh...@gmail.com>> wrote:
hdfs://latdevweb02:9000/home/hadoop/hadoop/input

Is this a valid path on HDFS? Can you access this path outside of the program, for example using the hadoop fs -ls command? Also, were this path and the files in it created by a different user?

The exception seems to say that it does not exist or that the running user does not have permission to read it.

Regards,
Shahab



On Wed, Sep 10, 2014 at 9:09 AM, YIMEN YIMGA Gael <ga...@sgcib.com>> wrote:
Hello Hadoopers,

Here is the error I’m facing when running a WordCount example program I wrote myself.
Kindly find attached the file of my WordCount program.
Below the error.

===========================================================================================================================================
-bash-4.1$ bin/hadoop jar WordCount.jar
Entrée dans le programme MAIN !!!
14/09/10 15:00:24 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
14/09/10 15:00:24 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14/09/10 15:00:24 INFO util.NativeCodeLoader: Loaded the native-hadoop library
14/09/10 15:00:24 WARN snappy.LoadSnappy: Snappy native library not loaded
14/09/10 15:00:24 INFO mapred.JobClient: Cleaning up the staging area hdfs://latdevweb02:9000/user/hadoop/.staging/job_201409101141_0001
14/09/10 15:00:24 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
        at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1081)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1073)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:910)
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1353)
        at fr.societegenerale.bigdata.lactool.WordCountDriver.main(WordCountDriver.java:50)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
-bash-4.1$
===========================================================================================================================================

Thanks in advance for your help.

Warm regards
GYY



Re: Error when executing a WordCount Program

Posted by Chris MacKenzie <st...@chrismackenziephotography.co.uk>.
Hi, have you set a class in your code?

>> WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
>> 


Also, you need to check the path to your input file:

>> Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
>> 

These are pretty straightforward errors; resolve them and you should be good to go.
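For reference, a driver using the old org.apache.hadoop.mapred API (which the stack trace shows) can address both issues roughly as follows. This is only a sketch: the class name WordCountDriver and the HDFS paths are illustrative, not taken from the attached code, and the mapper/reducer setup from the original program would still be needed.

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        // Passing the driver class tells Hadoop which jar to ship to the
        // cluster, which silences the "No job jar file set" warning.
        JobConf conf = new JobConf(WordCountDriver.class);
        conf.setJobName("wordcount");
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);
        // These paths are resolved against fs.default.name, i.e. on HDFS,
        // not on the local filesystem; the input directory must exist there
        // before the job is submitted.
        FileInputFormat.setInputPaths(conf, new Path("/user/hadoop/input"));
        FileOutputFormat.setOutputPath(conf, new Path("/user/hadoop/output"));
        JobClient.runJob(conf);
    }
}
```

The key points are the JobConf(Class) constructor (or equivalently JobConf#setJar(String)) and HDFS-resident input/output paths.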

Sent from my iPhone

> On 10 Sep 2014, at 14:19, Shahab Yunus <sh...@gmail.com> wrote:
> 
> hdfs://latdevweb02:9000/home/hadoop/hadoop/input
> 
> Is this a valid path on HDFS? Can you access this path outside of the program, for example using the hadoop fs -ls command? Also, were this path and the files in it created by a different user?
> 
> The exception seems to say that it does not exist or that the running user does not have permission to read it.
> 
> Regards,
> Shahab
> 
> 
> 
>> On Wed, Sep 10, 2014 at 9:09 AM, YIMEN YIMGA Gael <ga...@sgcib.com> wrote:
>> Hello Hadoopers,
>> 
>>  
>> 
>> Here is the error I’m facing when running a WordCount example program I wrote myself.
>> 
>> Kindly find attached the file of my WordCount program.
>> 
>> Below the error.
>> 
>>  
>> 
>> ===========================================================================================================================================
>> 
>> -bash-4.1$ bin/hadoop jar WordCount.jar
>> 
>> Entrée dans le programme MAIN !!!
>> 
>> 14/09/10 15:00:24 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
>> 
>> 14/09/10 15:00:24 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
>> 
>> 14/09/10 15:00:24 INFO util.NativeCodeLoader: Loaded the native-hadoop library
>> 
>> 14/09/10 15:00:24 WARN snappy.LoadSnappy: Snappy native library not loaded
>> 
>> 14/09/10 15:00:24 INFO mapred.JobClient: Cleaning up the staging area hdfs://latdevweb02:9000/user/hadoop/.staging/job_201409101141_0001
>> 
>> 14/09/10 15:00:24 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
>> 
>> org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
>> 
>>         at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
>> 
>>         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
>> 
>>         at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1081)
>> 
>>         at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1073)
>> 
>>         at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
>> 
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
>> 
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
>> 
>>         at java.security.AccessController.doPrivileged(Native Method)
>> 
>>         at javax.security.auth.Subject.doAs(Subject.java:415)
>> 
>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>> 
>>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
>> 
>>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:910)
>> 
>>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1353)
>> 
>>         at fr.societegenerale.bigdata.lactool.WordCountDriver.main(WordCountDriver.java:50)
>> 
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> 
>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> 
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> 
>>         at java.lang.reflect.Method.invoke(Method.java:601)
>> 
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
>> 
>> -bash-4.1$
>> 
>> ===========================================================================================================================================
>> 
>>  
>> 
>> Thanks in advance for your help.
>> 
>>  
>> 
>> Warm regards
>> 
>> GYY
>> 
>> 
> 


RE: Error when executing a WordCount Program

Posted by YIMEN YIMGA Gael <ga...@sgcib.com>.
Hi,

In fact,

hdfs://latdevweb02:9000/home/hadoop/hadoop/input
is not a folder on HDFS.

I created a folder, /tmp/hadoop-hadoop/dfs/data, where HDFS data will be saved.

And in my HADOOP_HOME folder, there are two folders, “input” and “output”, but I don’t know how to configure them in the program.

Could you please look at my code and advise?

Standing by …

Warm regards
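As a sketch of one way to fix the missing input path (directory names here are illustrative, and the commands assume they are run from HADOOP_HOME):

```shell
# The job reads its input from HDFS, so the directory must be created
# there; an "input" folder under HADOOP_HOME on the local disk is not
# visible to the job.
bin/hadoop fs -mkdir /user/hadoop/input
# Copy the local input files into HDFS.
bin/hadoop fs -put input/* /user/hadoop/input
# Verify the files are visible before re-running the job.
bin/hadoop fs -ls /user/hadoop/input
```

The driver would then point FileInputFormat at /user/hadoop/input, and at an output directory that does not yet exist, rather than at a local path.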

From: Shahab Yunus [mailto:shahab.yunus@gmail.com]
Sent: Wednesday 10 September 2014 15:19
To: user@hadoop.apache.org
Subject: Re: Error when executing a WordCount Program

hdfs://latdevweb02:9000/home/hadoop/hadoop/input

Is this a valid path on HDFS? Can you access this path outside of the program, for example using the hadoop fs -ls command? Also, were this path and the files in it created by a different user?

The exception seems to say that it does not exist or that the running user does not have permission to read it.

Regards,
Shahab



On Wed, Sep 10, 2014 at 9:09 AM, YIMEN YIMGA Gael <ga...@sgcib.com>> wrote:
Hello Hadoopers,

Here is the error I’m facing when running a WordCount example program I wrote myself.
Kindly find attached the file of my WordCount program.
Below the error.

===========================================================================================================================================
-bash-4.1$ bin/hadoop jar WordCount.jar
Entrée dans le programme MAIN !!!
14/09/10 15:00:24 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
14/09/10 15:00:24 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14/09/10 15:00:24 INFO util.NativeCodeLoader: Loaded the native-hadoop library
14/09/10 15:00:24 WARN snappy.LoadSnappy: Snappy native library not loaded
14/09/10 15:00:24 INFO mapred.JobClient: Cleaning up the staging area hdfs://latdevweb02:9000/user/hadoop/.staging/job_201409101141_0001
14/09/10 15:00:24 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
        at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1081)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1073)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:910)
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1353)
        at fr.societegenerale.bigdata.lactool.WordCountDriver.main(WordCountDriver.java:50)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
-bash-4.1$
===========================================================================================================================================

Thanks in advance for your help.

Warm regards
GYY



RE: Error when executing a WordCount Program

Posted by YIMEN YIMGA Gael <ga...@sgcib.com>.
Hi,

In fact,

hdfs://latdevweb02:9000/home/hadoop/hadoop/input
is not a folder on hdfs.

I created a folder /tmp/hadoop-hadoop/dfs/data, where data will be saved in hdfs.

And in my HADOOP_HOME folder, there is two folders “input” and “output”, but I don’t know how to configure them in the program.

Please could you look into my code and advise please ?

Standing by …

Warm regards

From: Shahab Yunus [mailto:shahab.yunus@gmail.com]
Sent: Wednesday 10 September 2014 15:19
To: user@hadoop.apache.org
Subject: Re: Error when executing a WordCount Program

hdfs://latdevweb02:9000/home/hadoop/hadoop/input

is this is a valid path on hdfs? Can you access this path outside of the program? For example using hadoop fs -ls command? Also, was this path and files in it, created by a different user?

The exception seem to say that it does not exist or the running user does not have permission to read it.

Regards,
Shahab



On Wed, Sep 10, 2014 at 9:09 AM, YIMEN YIMGA Gael <ga...@sgcib.com>> wrote:
Hello Hadoopers,

Here is the error, I’m facing when running WordCount example program written by myself.
Kind find attached the file of my WordCount program.
Below the error.

===========================================================================================================================================
-bash-4.1$ bin/hadoop jar WordCount.jar
Entrée dans le programme MAIN !!!
14/09/10 15:00:24 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
14/09/10 15:00:24 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
14/09/10 15:00:24 INFO util.NativeCodeLoader: Loaded the native-hadoop library
14/09/10 15:00:24 WARN snappy.LoadSnappy: Snappy native library not loaded
14/09/10 15:00:24 INFO mapred.JobClient: Cleaning up the staging area hdfs://latdevweb02:9000/user/hadoop/.staging/job_201409101141_0001
14/09/10 15:00:24 ERROR security.UserGroupInformation: PriviledgedActionException as:hadoop cause:org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
        at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:197)
        at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:208)
        at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1081)
        at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1073)
        at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:983)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:936)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:936)
        at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:910)
        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1353)
        at fr.societegenerale.bigdata.lactool.WordCountDriver.main(WordCountDriver.java:50)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:160)
-bash-4.1$
===========================================================================================================================================

Thanks in advance for your help.

Warm regards
GYY

*************************************************************************
This message and any attachments (the "message") are confidential, intended solely for the addressee(s), and may contain legally privileged information.
Any unauthorised use or dissemination is prohibited. E-mails are susceptible to alteration.
Neither SOCIETE GENERALE nor any of its subsidiaries or affiliates shall be liable for the message if altered, changed or
falsified.
Please visit http://swapdisclosure.sgcib.com for important information with respect to derivative products.
*************************************************************************


Re: Error when executing a WordCount Program

Posted by Chris MacKenzie <st...@chrismackenziephotography.co.uk>.
Hi, have you set a class in your code?

>> WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
>> 


Also, you need to check the path of your input file:

>> Input path does not exist: hdfs://latdevweb02:9000/home/hadoop/hadoop/input
>> 

These are pretty straightforward errors; resolve them and you should be good to go.
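For reference, a minimal driver sketch that addresses both warnings, using the old mapred API that appears in the stack trace. This is not the poster's actual code: the class name, and taking the paths from the command line, are assumptions, and the mapper/reducer setup is omitted:

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        // Passing the driver class lets Hadoop locate the containing jar,
        // which silences the "No job jar file set" warning.
        JobConf conf = new JobConf(WordCountDriver.class);
        conf.setJobName("wordcount");

        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        // Take the HDFS input/output paths from the command line instead of
        // hard-coding them, so a missing path is easy to spot and fix.
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}
```

Invoked as, e.g., bin/hadoop jar WordCount.jar WordCountDriver /user/hadoop/input /user/hadoop/output — the paths are then plain HDFS paths, not local HADOOP_HOME folders.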

Sent from my iPhone

> On 10 Sep 2014, at 14:19, Shahab Yunus <sh...@gmail.com> wrote:
> 
> hdfs://latdevweb02:9000/home/hadoop/hadoop/input
> 
> Is this a valid path on HDFS? Can you access this path outside of the program, for example with the hadoop fs -ls command? Also, were this path and the files in it created by a different user?
> 
> The exception seems to say that the path does not exist or that the running user does not have permission to read it.
> 
> Regards,
> Shahab
> 
> 
> 

Re: Error when executing a WordCount Program

Posted by Shahab Yunus <sh...@gmail.com>.
hdfs://latdevweb02:9000/home/hadoop/hadoop/input

Is this a valid path on HDFS? Can you access this path outside of the
program, for example with the hadoop fs -ls command? Also, were this path
and the files in it created by a different user?

The exception seems to say that the path does not exist or that the running
user does not have permission to read it.

Regards,
Shahab



On Wed, Sep 10, 2014 at 9:09 AM, YIMEN YIMGA Gael <
gael.yimen-yimga@sgcib.com> wrote:

