Posted to user@hadoop.apache.org by Visioner Sadak <vi...@gmail.com> on 2012/08/30 10:32:11 UTC

Integrating hadoop with java UI application deployed on tomcat

Hi,

  I have a WAR deployed on a Tomcat server. The WAR contains some Java
classes that upload files. Will I be able to upload directly into Hadoop?
I am using the code below in one of my Java classes:

       Configuration hadoopConf = new Configuration();
       // get the default associated file system
       FileSystem fileSystem = FileSystem.get(hadoopConf);
       // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
       // copy from the local file system to HDFS
       fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
               new Path("/user/TestDir/"));

but it is throwing this error:

java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration

When this code is run independently, as a single jar deployed in the
Hadoop bin directory, it works fine.
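A NoClassDefFoundError for org.apache.hadoop.conf.Configuration usually means
the Hadoop client jars are not on the webapp's classpath. One common fix is to
bundle them in the WAR's WEB-INF/lib. The sketch below is self-contained: the
HADOOP_HOME and webapp paths and the touched jar names are stand-ins, not the
poster's real layout; adjust them to your install.

```shell
# Stand-in layout so this sketch runs on its own; in practice HADOOP_HOME
# points at your real Hadoop install and WEBAPP at the exploded WAR under
# Tomcat's webapps directory.
HADOOP_HOME=/tmp/hadoop-demo/hadoop-0.22.0
WEBAPP=/tmp/hadoop-demo/tomcat/webapps/uploader
mkdir -p "$HADOOP_HOME/lib" "$WEBAPP/WEB-INF/lib"
touch "$HADOOP_HOME/hadoop-common-0.22.0.jar" \
      "$HADOOP_HOME/lib/commons-logging-1.1.1.jar"

# Copy the Hadoop client jars (and their library dependencies) into the WAR,
# so the webapp classloader can resolve org.apache.hadoop.conf.Configuration.
cp "$HADOOP_HOME"/*.jar     "$WEBAPP/WEB-INF/lib/"
cp "$HADOOP_HOME"/lib/*.jar "$WEBAPP/WEB-INF/lib/"
ls "$WEBAPP/WEB-INF/lib"
```

The jars must come from the same Hadoop version the cluster runs; mismatched
jars produce errors like the IPC version mismatch reported later in this
thread.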

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Any solution, guys? Badly stuck on this.

On Tue, Sep 4, 2012 at 4:28 PM, Visioner Sadak <vi...@gmail.com> wrote:

> Thanks Bejoy. Actually my Hadoop is also on Windows (I have installed it
> in pseudo-distributed mode for testing); it's not a remote cluster.
>
>
> On Tue, Sep 4, 2012 at 3:38 PM, Bejoy KS <be...@gmail.com> wrote:
>
>> Hi
>>
>> You are running Tomcat on a Windows machine and trying to connect to a
>> remote Hadoop cluster from there. Your core-site.xml has
>>
>> <name>fs.default.name</name>
>> <value>hdfs://localhost:9000</value>
>>
>> But it is localhost here. (I assume you are not running Hadoop on this
>> Windows environment just for testing.)
>>
>> You need the exact configuration files and Hadoop jars from the cluster
>> machines on this Tomcat environment as well, I mean on the classpath of
>> your application.
>> Regards
>> Bejoy KS
>>
>> Sent from handheld, please excuse typos.
>> ------------------------------
>> *From: *Visioner Sadak <vi...@gmail.com>
>> *Date: *Tue, 4 Sep 2012 15:31:25 +0530
>> *To: *<us...@hadoop.apache.org>
>> *ReplyTo: *user@hadoop.apache.org
>>  *Subject: *Re: Integrating hadoop with java UI application deployed on
>> tomcat
>>
>> Also getting one more error:
>>
>> org.apache.hadoop.ipc.RemoteException: Server IPC version 5 cannot
>> communicate with client version 4
>>
>> On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <vi...@gmail.com> wrote:
>>
>>> Thanks Shobha. I tried adding the conf folder to Tomcat's classpath;
>>> still getting the same error:
>>>
>>> Call to localhost/127.0.0.1:9000 failed on local exception:
>>> java.io.IOException: An established connection was aborted by the software
>>> in your host machine
>>>
>>>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
>>> Shobha.Mahadevappa@nttdata.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> Try adding the hadoop/conf directory to Tomcat's classpath.
>>>>
>>>> Ex:
>>>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>>>>
>>>> Regards,
>>>> Shobha M
>>>>
>>>> *From:* Visioner Sadak [mailto:visioner.sadak@gmail.com]
>>>> *Sent:* 03 September 2012 PM 04:01
>>>> *To:* user@hadoop.apache.org
>>>>
>>>> *Subject:* Re: Integrating hadoop with java UI application deployed on
>>>> tomcat
>>>>
>>>> Thanks Steve. There's nothing in the logs and no exceptions either. I
>>>> found that a file is created in my F:\user with the directory name, but
>>>> it's not visible inside my Hadoop browse-filesystem directories. I also
>>>> added the config using the method below:
>>>>
>>>> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>>>
>>>> When running through the WAR and printing out the filesystem I get
>>>> org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>>
>>>> When running an independent jar within Hadoop I get
>>>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>>
>>>> When running an independent jar I am able to do uploads.
>>>>
>>>> I just wanted to know whether I have to add something to Tomcat's
>>>> classpath, or whether there is some other core-site.xml configuration
>>>> that I am missing. Thanks for your help.
>>>>
>>>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>>>> wrote:
>>>>
>>>> Well, it's worked for me in the past outside Hadoop itself:
>>>>
>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>>
>>>>    1. Turn logging up to DEBUG
>>>>    2. Make sure that the filesystem you've just loaded is what you
>>>>    expect, by logging its value. It may turn out to be file:///,
>>>>    because the normal Hadoop site-config.xml isn't being picked up
>>>>
>>>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <
>>>> visioner.sadak@gmail.com> wrote:
>>>>
>>>> But the problem is that my code gets executed with the warning, yet the
>>>> file is not copied to HDFS. Actually I am trying to copy a file from the
>>>> local file system to HDFS:
>>>>
>>>>    Configuration hadoopConf = new Configuration();
>>>>    // get the default associated file system
>>>>    FileSystem fileSystem = FileSystem.get(hadoopConf);
>>>>    // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>>>    // copy from the local file system to HDFS
>>>>    fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>>>            new Path("/user/TestDir/"));
>>>>
>>>>
>>>> ______________________________________________________________________
>>>> Disclaimer:This email and any attachments are sent in strictest
>>>> confidence for the sole use of the addressee and may contain legally
>>>> privileged, confidential, and proprietary data. If you are not the intended
>>>> recipient, please advise the sender by replying promptly to this email and
>>>> then delete and destroy this email and any attachments without any further
>>>> use, copying or forwarding
>>>>
>>>
>>>
>>
>
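Taken together, the advice in this thread amounts to putting the cluster's
conf directory (and matching Hadoop jars) on Tomcat's classpath, so that
FileSystem.get() picks up core-site.xml and returns the DFS rather than the
default LocalFileSystem. A minimal sketch using Tomcat's setenv.sh mechanism;
all paths here are assumptions borrowed from the examples above, adjust them
to your install:

```shell
# Hypothetical Tomcat location; substitute your real CATALINA_BASE.
CATALINA_BASE=/tmp/tomcat-demo
mkdir -p "$CATALINA_BASE/bin"

# catalina.sh sources bin/setenv.sh on startup; prepending the Hadoop conf
# directory to CLASSPATH makes core-site.xml visible to the webapp, so
# fs.default.name is read from the cluster config instead of defaulting
# to the local file system.
cat > "$CATALINA_BASE/bin/setenv.sh" <<'EOF'
CLASSPATH=/usr/local/Apps/hadoop-0.20.203.0/conf:$CLASSPATH
export CLASSPATH
EOF
chmod +x "$CATALINA_BASE/bin/setenv.sh"
cat "$CATALINA_BASE/bin/setenv.sh"
```

Alternatively, copying core-site.xml (and hdfs-site.xml) into the WAR's
WEB-INF/classes puts them on the webapp classpath without touching the
server install.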

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks Bejoy. Actually my Hadoop is also on Windows (I have installed it
in pseudo-distributed mode for testing); it's not a remote cluster.

On Tue, Sep 4, 2012 at 3:38 PM, Bejoy KS <be...@gmail.com> wrote:

> Hi
>
> You are running Tomcat on a Windows machine and trying to connect to a
> remote Hadoop cluster from there. Your core-site.xml has
>
> <name>fs.default.name</name>
> <value>hdfs://localhost:9000</value>
>
> But it is localhost here. (I assume you are not running Hadoop on this
> Windows environment just for testing.)
>
> You need the exact configuration files and Hadoop jars from the cluster
> machines on this Tomcat environment as well, I mean on the classpath of
> your application.
> Regards
> Bejoy KS
>
> Sent from handheld, please excuse typos.
> ------------------------------
> *From: *Visioner Sadak <vi...@gmail.com>
> *Date: *Tue, 4 Sep 2012 15:31:25 +0530
> *To: *<us...@hadoop.apache.org>
> *ReplyTo: *user@hadoop.apache.org
>  *Subject: *Re: Integrating hadoop with java UI application deployed on
> tomcat
>
> Also getting one more error:
>
> org.apache.hadoop.ipc.RemoteException: Server IPC version 5 cannot
> communicate with client version 4
>
> On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <vi...@gmail.com> wrote:
>
>> Thanks Shobha. I tried adding the conf folder to Tomcat's classpath;
>> still getting the same error:
>>
>> Call to localhost/127.0.0.1:9000 failed on local exception:
>> java.io.IOException: An established connection was aborted by the software
>> in your host machine
>>
>>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
>> Shobha.Mahadevappa@nttdata.com> wrote:
>>
>>> Hi,
>>>
>>> Try adding the hadoop/conf directory to Tomcat's classpath.
>>>
>>> Ex:
>>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>>>
>>> Regards,
>>> Shobha M
>>>
>>> *From:* Visioner Sadak [mailto:visioner.sadak@gmail.com]
>>> *Sent:* 03 September 2012 PM 04:01
>>> *To:* user@hadoop.apache.org
>>>
>>> *Subject:* Re: Integrating hadoop with java UI application deployed on
>>> tomcat
>>>
>>> Thanks Steve. There's nothing in the logs and no exceptions either. I
>>> found that a file is created in my F:\user with the directory name, but
>>> it's not visible inside my Hadoop browse-filesystem directories. I also
>>> added the config using the method below:
>>>
>>> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>>
>>> When running through the WAR and printing out the filesystem I get
>>> org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>
>>> When running an independent jar within Hadoop I get
>>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>
>>> When running an independent jar I am able to do uploads.
>>>
>>> I just wanted to know whether I have to add something to Tomcat's
>>> classpath, or whether there is some other core-site.xml configuration
>>> that I am missing. Thanks for your help.
>>>
>>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>>> wrote:
>>>
>>> Well, it's worked for me in the past outside Hadoop itself:
>>>
>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>
>>>    1. Turn logging up to DEBUG
>>>    2. Make sure that the filesystem you've just loaded is what you
>>>    expect, by logging its value. It may turn out to be file:///,
>>>    because the normal Hadoop site-config.xml isn't being picked up
>>>
>>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <
>>> visioner.sadak@gmail.com> wrote:
>>>
>>> But the problem is that my code gets executed with the warning, yet the
>>> file is not copied to HDFS. Actually I am trying to copy a file from the
>>> local file system to HDFS:
>>>
>>>    Configuration hadoopConf = new Configuration();
>>>    // get the default associated file system
>>>    FileSystem fileSystem = FileSystem.get(hadoopConf);
>>>    // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>>    // copy from the local file system to HDFS
>>>    fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>>            new Path("/user/TestDir/"));
>>>
>>>
>>>
>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks bejoy, actually my hadoop is also on windows(i have installed it in
psuedo-distributed mode for testing) its not a remote cluster....

On Tue, Sep 4, 2012 at 3:38 PM, Bejoy KS <be...@gmail.com> wrote:

> **
> Hi
>
> You are running tomact on a windows machine and trying to connect to a
> remote hadoop cluster from there. Your core site has
>
> <name>
> fs.default.name</name>
> <value>hdfs://localhost:9000</value>
>
> But It is localhost here.( I assume you are not running hadoop on this
> windows environment for some testing)
>
> You need to have the exact configuration files and hadoop jars from the
> cluster machines on this tomcat environment as well. I mean on the
> classpath of your application.
> Regards
> Bejoy KS
>
> Sent from handheld, please excuse typos.
> ------------------------------
> *From: *Visioner Sadak <vi...@gmail.com>
> *Date: *Tue, 4 Sep 2012 15:31:25 +0530
> *To: *<us...@hadoop.apache.org>
> *ReplyTo: *user@hadoop.apache.org
>  *Subject: *Re: Integrating hadoop with java UI application deployed on
> tomcat
>
> also getting one more error
>
> *
>
> org.apache.hadoop.ipc.RemoteException
> *: Server IPC version 5 cannot communicate with client version 4
>
> On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <vi...@gmail.com>wrote:
>
>> Thanks shobha tried adding conf folder to tomcats classpath  still
>> getting same error
>>
>>
>> Call to localhost/127.0.0.1:9000 failed on local exception:
>> java.io.IOException: An established connection was aborted by the software
>> in your host machine
>>
>>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
>> Shobha.Mahadevappa@nttdata.com> wrote:
>>
>>>  Hi,****
>>>
>>> Try adding the hadoop/conf directory in the TOMCAT’s classpath ****
>>>
>>> ** **
>>>
>>> Ex :
>>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>>> ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> Regards,****
>>>
>>> *Shobha M *****
>>>
>>> ** **
>>>
>>> *From:* Visioner Sadak [mailto:visioner.sadak@gmail.com]
>>> *Sent:* 03 September 2012 PM 04:01
>>> *To:* user@hadoop.apache.org
>>>
>>> *Subject:* Re: Integrating hadoop with java UI application deployed on
>>> tomcat****
>>>
>>> ** **
>>>
>>> Thanks steve thers nothing in logs and no exceptions as well i found
>>> that some file is created in my F:\user with directory name but its not
>>> visible inside my hadoop browse filesystem directories i also added the
>>> config by using the below method ****
>>>
>>> hadoopConf.addResource(****
>>>
>>> "F:/hadoop-0.22.0/conf/core-site.xml"); ****
>>>
>>> when running thru WAR printing out the filesystem i m getting
>>> org.apache.hadoop.fs.LocalFileSystem@9cd8db ****
>>>
>>> when running an independet jar within hadoop i m getting
>>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]****
>>>
>>> when running an independet jar i m able to do uploads....****
>>>
>>>  ****
>>>
>>> just wanted to know will i have to add something in my classpath of
>>> tomcat or is there any other configurations of core-site.xml that i am
>>> missing out..thanks for your help.....****
>>>
>>>  ****
>>>
>>> ** **
>>>
>>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>>> wrote:****
>>>
>>> ** **
>>>
>>> well, it's worked for me in the past outside Hadoop itself:****
>>>
>>> ** **
>>>
>>>
>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>> ****
>>>
>>> ** **
>>>
>>>    1. Turn logging up to DEBUG****
>>>    2. Make sure that the filesystem you've just loaded is what you
>>>    expect, by logging its value. It may turn out to be file:///,
>>>    because the normal Hadoop site-config.xml isn't being picked up****
>>>
>>>   ****
>>>
>>>  ** **
>>>
>>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <
>>> visioner.sadak@gmail.com> wrote:****
>>>
>>> but the problem is that my  code gets executed with the warning but file
>>> is not copied to hdfs , actually i m trying to copy a file from local to
>>> hdfs ****
>>>
>>>  ****
>>>
>>>    Configuration hadoopConf=new Configuration();
>>>         //get the default associated file system
>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>>         //copy from lfs to hdfs
>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>> Path("/user/TestDir/")); ****
>>>
>>>  ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>>
>>> ______________________________________________________________________
>>> Disclaimer:This email and any attachments are sent in strictest
>>> confidence for the sole use of the addressee and may contain legally
>>> privileged, confidential, and proprietary data. If you are not the intended
>>> recipient, please advise the sender by replying promptly to this email and
>>> then delete and destroy this email and any attachments without any further
>>> use, copying or forwarding
>>>
>>
>>
>
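[Editor's note: the symptom described above is the one Steve flags: FileSystem.get() quietly fell back to org.apache.hadoop.fs.LocalFileSystem@9cd8db instead of a DFS client, which is why files land under F:\user rather than in HDFS. A small string-level helper (illustrative only; it merely classifies the two toString() outputs quoted in this thread, so it needs no Hadoop jars to run) makes that sanity check explicit:]

```java
public class FsSanity {
    // The thread shows two FileSystem.toString() outputs:
    //   "org.apache.hadoop.fs.LocalFileSystem@9cd8db"              -> local fallback (wrong)
    //   "DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]" -> HDFS client (right)
    // This helper classifies a toString() value along those lines.
    static boolean looksLikeHdfs(String fsToString) {
        return fsToString.startsWith("DFS[")
            || fsToString.contains("DistributedFileSystem");
    }

    public static void main(String[] args) {
        System.out.println(looksLikeHdfs("org.apache.hadoop.fs.LocalFileSystem@9cd8db"));          // false
        System.out.println(looksLikeHdfs("DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]")); // true
    }
}
```

[In a real deployment the equivalent check is `fileSystem instanceof DistributedFileSystem` right after `FileSystem.get(conf)`, failing fast if the local filesystem came back.]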

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks bejoy, actually my hadoop is also on windows(i have installed it in
psuedo-distributed mode for testing) its not a remote cluster....
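[Editor's note: one likely cause of the local-filesystem fallback earlier in this thread is that `Configuration.addResource(String)` treats its argument as a *classpath resource name*, not a filesystem path; a file path like "F:/hadoop-0.22.0/conf/core-site.xml" needs the `Path` overload, and a missing resource is skipped silently, leaving the default fs.default.name (file:///). A minimal pre-flight check, in plain java.io so it runs without Hadoop jars (the F:/ path is the poster's; adjust to taste):]

```java
import java.io.File;

public class ConfigPreflight {
    // Returns a short status string instead of letting Configuration
    // silently skip a missing core-site.xml and fall back to file:///
    static String checkConfig(String path) {
        File f = new File(path);
        if (!f.isFile())   return "missing: " + path;
        if (!f.canRead())  return "unreadable: " + path;
        return "ok: " + path;
    }

    public static void main(String[] args) {
        System.out.println(checkConfig("F:/hadoop-0.22.0/conf/core-site.xml"));
        // If this prints "ok: ...", load it with the Path overload:
        //   hadoopConf.addResource(new org.apache.hadoop.fs.Path(path));
    }
}
```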

On Tue, Sep 4, 2012 at 3:38 PM, Bejoy KS <be...@gmail.com> wrote:

> **
> Hi
>
> You are running tomact on a windows machine and trying to connect to a
> remote hadoop cluster from there. Your core site has
>
> <name>
> fs.default.name</name>
> <value>hdfs://localhost:9000</value>
>
> But It is localhost here.( I assume you are not running hadoop on this
> windows environment for some testing)
>
> You need to have the exact configuration files and hadoop jars from the
> cluster machines on this tomcat environment as well. I mean on the
> classpath of your application.
> Regards
> Bejoy KS
>
> Sent from handheld, please excuse typos.
> ------------------------------
> *From: *Visioner Sadak <vi...@gmail.com>
> *Date: *Tue, 4 Sep 2012 15:31:25 +0530
> *To: *<us...@hadoop.apache.org>
> *ReplyTo: *user@hadoop.apache.org
>  *Subject: *Re: Integrating hadoop with java UI application deployed on
> tomcat
>
> also getting one more error
>
> *
>
> org.apache.hadoop.ipc.RemoteException
> *: Server IPC version 5 cannot communicate with client version 4
>
> On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <vi...@gmail.com>wrote:
>
>> Thanks shobha tried adding conf folder to tomcats classpath  still
>> getting same error
>>
>>
>> Call to localhost/127.0.0.1:9000 failed on local exception:
>> java.io.IOException: An established connection was aborted by the software
>> in your host machine
>>
>>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
>> Shobha.Mahadevappa@nttdata.com> wrote:
>>
>>>  Hi,****
>>>
>>> Try adding the hadoop/conf directory in the TOMCAT’s classpath ****
>>>
>>> ** **
>>>
>>> Ex :
>>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>>> ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> Regards,****
>>>
>>> *Shobha M *****
>>>
>>> ** **
>>>
>>> *From:* Visioner Sadak [mailto:visioner.sadak@gmail.com]
>>> *Sent:* 03 September 2012 PM 04:01
>>> *To:* user@hadoop.apache.org
>>>
>>> *Subject:* Re: Integrating hadoop with java UI application deployed on
>>> tomcat****
>>>
>>> ** **
>>>
>>> Thanks steve thers nothing in logs and no exceptions as well i found
>>> that some file is created in my F:\user with directory name but its not
>>> visible inside my hadoop browse filesystem directories i also added the
>>> config by using the below method ****
>>>
>>> hadoopConf.addResource(****
>>>
>>> "F:/hadoop-0.22.0/conf/core-site.xml"); ****
>>>
>>> when running thru WAR printing out the filesystem i m getting
>>> org.apache.hadoop.fs.LocalFileSystem@9cd8db ****
>>>
>>> when running an independet jar within hadoop i m getting
>>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]****
>>>
>>> when running an independet jar i m able to do uploads....****
>>>
>>>  ****
>>>
>>> just wanted to know will i have to add something in my classpath of
>>> tomcat or is there any other configurations of core-site.xml that i am
>>> missing out..thanks for your help.....****
>>>
>>>  ****
>>>
>>> ** **
>>>
>>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>>> wrote:****
>>>
>>> ** **
>>>
>>> well, it's worked for me in the past outside Hadoop itself:****
>>>
>>> ** **
>>>
>>>
>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>> ****
>>>
>>> ** **
>>>
>>>    1. Turn logging up to DEBUG****
>>>    2. Make sure that the filesystem you've just loaded is what you
>>>    expect, by logging its value. It may turn out to be file:///,
>>>    because the normal Hadoop site-config.xml isn't being picked up****
>>>
>>>   ****
>>>
>>>  ** **
>>>
>>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <
>>> visioner.sadak@gmail.com> wrote:****
>>>
>>> but the problem is that my  code gets executed with the warning but file
>>> is not copied to hdfs , actually i m trying to copy a file from local to
>>> hdfs ****
>>>
>>>  ****
>>>
>>>    Configuration hadoopConf=new Configuration();
>>>         //get the default associated file system
>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>>         //copy from lfs to hdfs
>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>> Path("/user/TestDir/")); ****
>>>
>>>  ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>>
>>> ______________________________________________________________________
>>> Disclaimer:This email and any attachments are sent in strictest
>>> confidence for the sole use of the addressee and may contain legally
>>> privileged, confidential, and proprietary data. If you are not the intended
>>> recipient, please advise the sender by replying promptly to this email and
>>> then delete and destroy this email and any attachments without any further
>>> use, copying or forwarding
>>>
>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Bejoy KS <be...@gmail.com>.
Hi

You are running tomact on a windows machine and trying to connect to a remote hadoop cluster from there. Your core site has
<name>
fs.default.name</name>
<value>hdfs://localhost:9000</value>

But It is localhost here.( I assume you are not running hadoop on this windows environment for some testing)

 You need to have the exact configuration files and hadoop jars from the cluster machines on this tomcat environment as well. I mean on the classpath of your application. 
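[Editor's note: a sketch of collecting that classpath, using the install locations quoted earlier in the thread (hadoop-0.20.203.0; adjust to your layout). The jar globs are assumptions about the 0.20.x directory structure:]

```shell
# Build the classpath the webapp needs: the conf dir first, then the client jars.
HADOOP_HOME=/usr/local/Apps/hadoop-0.20.203.0
CP="$HADOOP_HOME/conf"
for jar in "$HADOOP_HOME"/hadoop-core-*.jar "$HADOOP_HOME"/lib/*.jar; do
  [ -e "$jar" ] && CP="$CP:$jar"
done
echo "$CP"
```

[For Tomcat specifically, a plain CLASSPATH export is usually ignored by catalina.sh; the conventional routes are dropping the jars into the webapp's WEB-INF/lib or adding the entries to `shared.loader` in conf/catalina.properties.]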
 
Regards
Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: Visioner Sadak <vi...@gmail.com>
Date: Tue, 4 Sep 2012 15:31:25 
To: <us...@hadoop.apache.org>
Reply-To: user@hadoop.apache.org
Subject: Re: Integrating hadoop with java UI application deployed on tomcat

also getting one more error

*

org.apache.hadoop.ipc.RemoteException*: Server IPC version 5 cannot
communicate with client version 4


On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <vi...@gmail.com>wrote:

> Thanks shobha tried adding conf folder to tomcats classpath  still getting
> same error
>
>
> Call to localhost/127.0.0.1:9000 failed on local exception:
> java.io.IOException: An established connection was aborted by the software
> in your host machine
>
>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
> Shobha.Mahadevappa@nttdata.com> wrote:
>
>>  Hi,****
>>
>> Try adding the hadoop/conf directory in the TOMCAT’s classpath ****
>>
>> ** **
>>
>> Ex :
>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>> ****
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> Regards,****
>>
>> *Shobha M *****
>>
>> ** **
>>
>> *From:* Visioner Sadak [mailto:visioner.sadak@gmail.com]
>> *Sent:* 03 September 2012 PM 04:01
>> *To:* user@hadoop.apache.org
>>
>> *Subject:* Re: Integrating hadoop with java UI application deployed on
>> tomcat****
>>
>> ** **
>>
>> Thanks steve thers nothing in logs and no exceptions as well i found that
>> some file is created in my F:\user with directory name but its not visible
>> inside my hadoop browse filesystem directories i also added the config by
>> using the below method ****
>>
>> hadoopConf.addResource(****
>>
>> "F:/hadoop-0.22.0/conf/core-site.xml"); ****
>>
>> when running thru WAR printing out the filesystem i m getting
>> org.apache.hadoop.fs.LocalFileSystem@9cd8db ****
>>
>> when running an independet jar within hadoop i m getting
>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]****
>>
>> when running an independet jar i m able to do uploads....****
>>
>>  ****
>>
>> just wanted to know will i have to add something in my classpath of
>> tomcat or is there any other configurations of core-site.xml that i am
>> missing out..thanks for your help.....****
>>
>>  ****
>>
>> ** **
>>
>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>> wrote:****
>>
>> ** **
>>
>> well, it's worked for me in the past outside Hadoop itself:****
>>
>> ** **
>>
>>
>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>> ****
>>
>> ** **
>>
>>    1. Turn logging up to DEBUG****
>>    2. Make sure that the filesystem you've just loaded is what you
>>    expect, by logging its value. It may turn out to be file:///, because
>>    the normal Hadoop site-config.xml isn't being picked up****
>>
>>   ****
>>
>>  ** **
>>
>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com>
>> wrote:****
>>
>> but the problem is that my  code gets executed with the warning but file
>> is not copied to hdfs , actually i m trying to copy a file from local to
>> hdfs ****
>>
>>  ****
>>
>>    Configuration hadoopConf=new Configuration();
>>         //get the default associated file system
>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>         //copy from lfs to hdfs
>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>> Path("/user/TestDir/")); ****
>>
>>  ****
>>
>> ** **
>>
>> ** **
>>
>>
>> ______________________________________________________________________
>> Disclaimer:This email and any attachments are sent in strictest
>> confidence for the sole use of the addressee and may contain legally
>> privileged, confidential, and proprietary data. If you are not the intended
>> recipient, please advise the sender by replying promptly to this email and
>> then delete and destroy this email and any attachments without any further
>> use, copying or forwarding
>>
>
>


Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
also getting one more error

*

org.apache.hadoop.ipc.RemoteException*: Server IPC version 5 cannot
communicate with client version 4
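[Editor's note: this error means the Hadoop jars bundled with the WAR are an older release than the cluster's NameNode, so the first step is finding out which client version the webapp actually loads. A minimal probe, done reflectively so it also degrades gracefully when the jars are missing entirely (VersionInfo is Hadoop's standard version class; the reflective access is this sketch's choice, not required by the API):]

```java
public class HadoopClasspathCheck {
    // Probes the webapp's classpath for the Hadoop client and reports its version,
    // to compare against the cluster. Safe to call even when the jars are absent.
    static String probe() {
        try {
            Class<?> version = Class.forName("org.apache.hadoop.util.VersionInfo");
            Object v = version.getMethod("getVersion").invoke(null);
            return "classpath check: hadoop client version " + v;
        } catch (ClassNotFoundException e) {
            return "classpath check: hadoop jars missing (" + e.getMessage() + ")";
        } catch (ReflectiveOperationException e) {
            return "classpath check: hadoop present, version lookup failed (" + e + ")";
        }
    }

    public static void main(String[] args) {
        System.out.println(probe());
    }
}
```

[If the printed version differs from the cluster's, replacing the WAR's hadoop jars with the exact jars from the cluster machines, as suggested earlier in the thread, resolves the IPC mismatch.]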


On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <vi...@gmail.com>wrote:

> Thanks Shobha, tried adding the conf folder to Tomcat's classpath, still
> getting the same error:
>
>
> Call to localhost/127.0.0.1:9000 failed on local exception:
> java.io.IOException: An established connection was aborted by the software
> in your host machine
>
>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
> Shobha.Mahadevappa@nttdata.com> wrote:
>
>>  Hi,
>>
>> Try adding the hadoop/conf directory in the TOMCAT's classpath
>>
>> Ex :
>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>>
>> Regards,
>> Shobha M
>>
>> From: Visioner Sadak [mailto:visioner.sadak@gmail.com]
>> Sent: 03 September 2012 PM 04:01
>> To: user@hadoop.apache.org
>>
>> Subject: Re: Integrating hadoop with java UI application deployed on
>> tomcat
>>
>> Thanks Steve, there's nothing in the logs and no exceptions as well. I
>> found that some file is created in my F:\user with the directory name,
>> but it is not visible inside my hadoop browse-filesystem directories. I
>> also added the config by using the below method:
>>
>> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>
>> When running through the WAR, printing out the filesystem I am getting
>> org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>
>> When running an independent jar within hadoop I am getting
>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>
>> When running an independent jar I am able to do uploads....
>>
>> Just wanted to know, will I have to add something to the classpath of
>> tomcat, or is there any other configuration of core-site.xml that I am
>> missing out? Thanks for your help.
>>
>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>> wrote:
>>
>> well, it's worked for me in the past outside Hadoop itself:
>>
>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>
>>    1. Turn logging up to DEBUG
>>    2. Make sure that the filesystem you've just loaded is what you
>>    expect, by logging its value. It may turn out to be file:///, because
>>    the normal Hadoop site-config.xml isn't being picked up
>>
>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com>
>> wrote:
>>
>> but the problem is that my code gets executed with the warning but the
>> file is not copied to hdfs; actually I am trying to copy a file from
>> local to hdfs:
>>
>>        // Load the default configuration and get the associated file system
>>        Configuration hadoopConf = new Configuration();
>>        FileSystem fileSystem = FileSystem.get(hadoopConf);
>>        // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>        // Copy a file from the local file system to HDFS
>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>                new Path("/user/TestDir/"));
>>
>>
>>
>
>
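For readers following the archive: the `LocalFileSystem` result quoted above usually means no core-site.xml was found on the classpath, so the default filesystem fell back to file:///. A minimal core-site.xml sketch matching the thread's setup (localhost:9000 is the value given later in the thread; `fs.default.name` is the pre-Hadoop-2 key):

```xml
<?xml version="1.0"?>
<!-- core-site.xml: must sit on the webapp or Tomcat classpath
     (e.g. WEB-INF/classes) for new Configuration() to pick it up -->
<configuration>
  <property>
    <!-- hdfs://host:port of the namenode; localhost:9000 per this thread -->
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```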


Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks Shobha, tried adding the conf folder to Tomcat's classpath, still
getting the same error:


Call to localhost/127.0.0.1:9000 failed on local exception:
java.io.IOException: An established connection was aborted by the software
in your host machine

On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
Shobha.Mahadevappa@nttdata.com> wrote:

>  Hi,
>
> Try adding the hadoop/conf directory in the TOMCAT's classpath
>
> Ex :
> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>
> Regards,
> Shobha M
>
> From: Visioner Sadak [mailto:visioner.sadak@gmail.com]
> Sent: 03 September 2012 PM 04:01
> To: user@hadoop.apache.org
>
> Subject: Re: Integrating hadoop with java UI application deployed on
> tomcat
>
> Thanks Steve, there's nothing in the logs and no exceptions as well. I
> found that some file is created in my F:\user with the directory name,
> but it is not visible inside my hadoop browse-filesystem directories. I
> also added the config by using the below method:
>
> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>
> When running through the WAR, printing out the filesystem I am getting
> org.apache.hadoop.fs.LocalFileSystem@9cd8db
>
> When running an independent jar within hadoop I am getting
> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>
> When running an independent jar I am able to do uploads....
>
> Just wanted to know, will I have to add something to the classpath of
> tomcat, or is there any other configuration of core-site.xml that I am
> missing out? Thanks for your help.
>
> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
> wrote:
>
> well, it's worked for me in the past outside Hadoop itself:
>
> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>
>    1. Turn logging up to DEBUG
>    2. Make sure that the filesystem you've just loaded is what you
>    expect, by logging its value. It may turn out to be file:///, because
>    the normal Hadoop site-config.xml isn't being picked up
>
> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com>
> wrote:
>
> but the problem is that my code gets executed with the warning but the
> file is not copied to hdfs; actually I am trying to copy a file from
> local to hdfs:
>
>        // Load the default configuration and get the associated file system
>        Configuration hadoopConf = new Configuration();
>        FileSystem fileSystem = FileSystem.get(hadoopConf);
>        // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>        // Copy a file from the local file system to HDFS
>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>                new Path("/user/TestDir/"));
>
>
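One concrete way to act on the classpath advice above (and on Senthil's suggestion later in the thread to put core-site.xml under WEB-INF\classes) is to copy the cluster's config file into the webapp itself, so `new Configuration()` finds it without touching Tomcat's startup classpath. A sketch, where both paths are assumptions:

```shell
#!/bin/sh
# Copy the cluster's core-site.xml into the webapp's own classpath
# (WEB-INF/classes) so new Configuration() picks it up.
# Both locations below are assumed -- adjust to your install.
HADOOP_CONF_DIR="${HADOOP_CONF_DIR:-/usr/local/Apps/hadoop-0.20.203.0/conf}"
WEBAPP_CLASSES="${WEBAPP_CLASSES:-/usr/local/tomcat/webapps/myapp/WEB-INF/classes}"
if [ -f "$HADOOP_CONF_DIR/core-site.xml" ]; then
  mkdir -p "$WEBAPP_CLASSES"
  cp "$HADOOP_CONF_DIR/core-site.xml" "$WEBAPP_CLASSES/"
  echo "copied core-site.xml into $WEBAPP_CLASSES"
else
  echo "no core-site.xml at $HADOOP_CONF_DIR" >&2
fi
```

After copying, restart Tomcat (or redeploy the WAR) so the webapp class loader sees the new resource.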


RE: Integrating hadoop with java UI application deployed on tomcat

Posted by "Mahadevappa, Shobha" <Sh...@nttdata.com>.
Hi,
Try adding the hadoop/conf directory in the TOMCAT's classpath

Ex : CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:



Regards,
Shobha M

From: Visioner Sadak [mailto:visioner.sadak@gmail.com]
Sent: 03 September 2012 PM 04:01
To: user@hadoop.apache.org
Subject: Re: Integrating hadoop with java UI application deployed on tomcat

Thanks Steve, there's nothing in the logs and no exceptions as well. I found that some file is created in my F:\user with the directory name, but it is not visible inside my hadoop browse-filesystem directories. I also added the config by using the below method:
hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
When running through the WAR, printing out the filesystem I am getting org.apache.hadoop.fs.LocalFileSystem@9cd8db
When running an independent jar within hadoop I am getting DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
When running an independent jar I am able to do uploads....

Just wanted to know, will I have to add something to the classpath of tomcat, or is there any other configuration of core-site.xml that I am missing out? Thanks for your help.


On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com> wrote:

well, it's worked for me in the past outside Hadoop itself:

http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup


  1.  Turn logging up to DEBUG
  2.  Make sure that the filesystem you've just loaded is what you expect, by logging its value. It may turn out to be file:///, because the normal Hadoop site-config.xml isn't being picked up


On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com> wrote:
but the problem is that my code gets executed with the warning but the file is not copied to hdfs; actually I am trying to copy a file from local to hdfs:

       // Load the default configuration and get the associated file system
       Configuration hadoopConf = new Configuration();
       FileSystem fileSystem = FileSystem.get(hadoopConf);
       // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
       // Copy a file from the local file system to HDFS
       fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
               new Path("/user/TestDir/"));


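Steve's first suggestion above (turn logging up to DEBUG) can be done with a log4j.properties on the webapp classpath; a sketch, where the file location is an assumption and the category names are standard log4j 1.x:

```properties
# WEB-INF/classes/log4j.properties -- raise Hadoop client logging to DEBUG
log4j.rootLogger=INFO, stdout
log4j.logger.org.apache.hadoop=DEBUG
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} %-5p %c: %m%n
```

With this in place, Tomcat's catalina.out shows which configuration resources the Hadoop `Configuration` object loads, making it easier to see whether core-site.xml was picked up at all.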



Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks Senthil, the namenode is up and running, and in core-site.xml I have:

<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>

Should I change my IP, or any other config?

On Mon, Sep 3, 2012 at 10:11 PM, Senthil Kumar <
senthilkumar@thoughtworks.com> wrote:

> The error says call to 127.0.0.1:9000 fails. It is failing when it tries
> to contact the namenode (9000 is the default namenode port) configured in
> core-site.xml. You should also check whether the namenode is configured
> correctly and also whether the namenode is up.
>
>
> On Mon, Sep 3, 2012 at 7:43 PM, Visioner Sadak <vi...@gmail.com>wrote:
>
>> Thanks Senthil i tried on trying with new path getting this error do i
>> have to do any ssl setting on tomcat as well
>>
>> *
>>
>> java.io.IOException
>> *: Call to localhost/127.0.0.1:9000 failed on local exception: *
>> java.io.IOException*: An established connection was aborted by the
>> software in your host machine
>>
>> at org.apache.hadoop.ipc.Client.wrapException(*Client.java:1107*)
>>
>> at org.apache.hadoop.ipc.Client.call(*Client.java:1075*)
>>
>> at org.apache.hadoop.ipc.RPC$Invoker.invoke(*RPC.java:225*)
>>
>>
>>  On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <
>> senthilkumar@thoughtworks.com> wrote:
>>
>>> Try using hadoopConf.addResource(*new
>>> Path("F:/hadoop-0.22.0/conf/core-site.xml")*); instead
>>> of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>>
>>> or you should add your core-site.xml to a location which is in your
>>> class path(WEB-INF\classes or WEB-INF\lib in case of a web application)
>>>
>>>
>>> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <visioner.sadak@gmail.com
>>> > wrote:
>>>
>>>> thanks hemanth i tried adding ext folder conf and extn root folder
>>>> unable to add xml only but still same problem thanks for the help
>>>>
>>>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> If you are getting the LocalFileSystem, you could try by putting
>>>>> core-site.xml in a directory that's there in the classpath for the
>>>>> Tomcat App (or include such a path in the classpath, if that's
>>>>> possible)
>>>>>
>>>>> Thanks
>>>>> hemanth
>>>>>
>>>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <
>>>>> visioner.sadak@gmail.com> wrote:
>>>>> > Thanks steve thers nothing in logs and no exceptions as well i found
>>>>> that
>>>>> > some file is created in my F:\user with directory name but its not
>>>>> visible
>>>>> > inside my hadoop browse filesystem directories i also added the
>>>>> config by
>>>>> > using the below method
>>>>> > hadoopConf.addResource(
>>>>> > "F:/hadoop-0.22.0/conf/core-site.xml");
>>>>> > when running thru WAR printing out the filesystem i m getting
>>>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>>> > when running an independet jar within hadoop i m getting
>>>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>>> > when running an independet jar i m able to do uploads....
>>>>> >
>>>>> > just wanted to know will i have to add something in my classpath of
>>>>> tomcat
>>>>> > or is there any other configurations of core-site.xml that i am
>>>>> missing
>>>>> > out..thanks for your help.....
>>>>> >
>>>>> >
>>>>> >
>>>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <
>>>>> stevel@hortonworks.com>
>>>>> > wrote:
>>>>> >>
>>>>> >>
>>>>> >> well, it's worked for me in the past outside Hadoop itself:
>>>>> >>
>>>>> >>
>>>>> >>
>>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>>> >>
>>>>> >> Turn logging up to DEBUG
>>>>> >> Make sure that the filesystem you've just loaded is what you
>>>>> expect, by
>>>>> >> logging its value. It may turn out to be file:///, because the
>>>>> normal Hadoop
>>>>> >> site-config.xml isn't being picked up
>>>>> >>
>>>>> >>
>>>>> >>>
>>>>> >>>
>>>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>>>> >>> <vi...@gmail.com> wrote:
>>>>> >>>>
>>>>> >>>> but the problem is that my  code gets executed with the warning
>>>>> but file
>>>>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>>>>> local to
>>>>> >>>> hdfs
>>>>> >>>>
>>>>> >>>>    Configuration hadoopConf=new Configuration();
>>>>> >>>>         //get the default associated file system
>>>>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>>> >>>>        // HarFileSystem harFileSystem= new
>>>>> HarFileSystem(fileSystem);
>>>>> >>>>         //copy from lfs to hdfs
>>>>> >>>>        fileSystem.copyFromLocalFile(new
>>>>> Path("E:/test/GANI.jpg"),new
>>>>> >>>> Path("/user/TestDir/"));
>>>>> >>>>
>>>>> >>
>>>>> >>
>>>>> >
>>>>>
>>>>
>>>>
>>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Senthil Kumar <se...@thoughtworks.com>.
The error says the call to 127.0.0.1:9000 fails. It fails when the client tries to
contact the namenode (9000 is the namenode RPC port configured in
core-site.xml). You should check whether the namenode is configured
correctly and also whether the namenode is up.
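Two quick ways to check that (hedged sketches, assuming a typical Hadoop 0.2x install on the same box; on Windows, run these from Cygwin or substitute the equivalent commands):

```shell
# 1. Is a NameNode JVM running at all?
jps | grep -i namenode

# 2. Is anything listening on the RPC port from core-site.xml?
#    (bash-only /dev/tcp probe; "telnet localhost 9000" works too)
(exec 3<>/dev/tcp/localhost/9000) && echo "port 9000 open"
```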


On Mon, Sep 3, 2012 at 7:43 PM, Visioner Sadak <vi...@gmail.com>wrote:

> Thanks Senthil i tried on trying with new path getting this error do i
> have to do any ssl setting on tomcat as well
>
> *
>
> java.io.IOException
> *: Call to localhost/127.0.0.1:9000 failed on local exception: *
> java.io.IOException*: An established connection was aborted by the
> software in your host machine
>
> at org.apache.hadoop.ipc.Client.wrapException(*Client.java:1107*)
>
> at org.apache.hadoop.ipc.Client.call(*Client.java:1075*)
>
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(*RPC.java:225*)
>
>
> On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <
> senthilkumar@thoughtworks.com> wrote:
>
>> Try using hadoopConf.addResource(*new
>> Path("F:/hadoop-0.22.0/conf/core-site.xml")*); instead
>> of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>
>> or you should add your core-site.xml to a location which is in your class
>> path(WEB-INF\classes or WEB-INF\lib in case of a web application)
>>
>>
>> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <vi...@gmail.com>wrote:
>>
>>> thanks hemanth i tried adding ext folder conf and extn root folder
>>> unable to add xml only but still same problem thanks for the help
>>>
>>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>>>
>>>> Hi,
>>>>
>>>> If you are getting the LocalFileSystem, you could try by putting
>>>> core-site.xml in a directory that's there in the classpath for the
>>>> Tomcat App (or include such a path in the classpath, if that's
>>>> possible)
>>>>
>>>> Thanks
>>>> hemanth
>>>>
>>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <
>>>> visioner.sadak@gmail.com> wrote:
>>>> > Thanks steve thers nothing in logs and no exceptions as well i found
>>>> that
>>>> > some file is created in my F:\user with directory name but its not
>>>> visible
>>>> > inside my hadoop browse filesystem directories i also added the
>>>> config by
>>>> > using the below method
>>>> > hadoopConf.addResource(
>>>> > "F:/hadoop-0.22.0/conf/core-site.xml");
>>>> > when running thru WAR printing out the filesystem i m getting
>>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>> > when running an independet jar within hadoop i m getting
>>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>> > when running an independet jar i m able to do uploads....
>>>> >
>>>> > just wanted to know will i have to add something in my classpath of
>>>> tomcat
>>>> > or is there any other configurations of core-site.xml that i am
>>>> missing
>>>> > out..thanks for your help.....
>>>> >
>>>> >
>>>> >
>>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <
>>>> stevel@hortonworks.com>
>>>> > wrote:
>>>> >>
>>>> >>
>>>> >> well, it's worked for me in the past outside Hadoop itself:
>>>> >>
>>>> >>
>>>> >>
>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>> >>
>>>> >> Turn logging up to DEBUG
>>>> >> Make sure that the filesystem you've just loaded is what you expect,
>>>> by
>>>> >> logging its value. It may turn out to be file:///, because the
>>>> normal Hadoop
>>>> >> site-config.xml isn't being picked up
>>>> >>
>>>> >>
>>>> >>>
>>>> >>>
>>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>>> >>> <vi...@gmail.com> wrote:
>>>> >>>>
>>>> >>>> but the problem is that my  code gets executed with the warning
>>>> but file
>>>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>>>> local to
>>>> >>>> hdfs
>>>> >>>>
>>>> >>>>    Configuration hadoopConf=new Configuration();
>>>> >>>>         //get the default associated file system
>>>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>> >>>>        // HarFileSystem harFileSystem= new
>>>> HarFileSystem(fileSystem);
>>>> >>>>         //copy from lfs to hdfs
>>>> >>>>        fileSystem.copyFromLocalFile(new
>>>> Path("E:/test/GANI.jpg"),new
>>>> >>>> Path("/user/TestDir/"));
>>>> >>>>
>>>> >>
>>>> >>
>>>> >
>>>>
>>>
>>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Senthil Kumar <se...@thoughtworks.com>.
The error says the call to 127.0.0.1:9000 failed. It fails when the client tries to
contact the namenode (9000 is the default namenode port) configured in core-site.xml.
Check whether the namenode address is configured correctly and whether the namenode
is actually up.
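
As Bejoy noted earlier in the thread, the address the client dials comes from
fs.default.name in core-site.xml; for the pseudo-distributed setup discussed here the
entry looks like this (host and port must match the namenode that is actually running):

```xml
<configuration>
  <property>
    <!-- namenode RPC address; clients obtained via FileSystem.get() connect here -->
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```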


On Mon, Sep 3, 2012 at 7:43 PM, Visioner Sadak <vi...@gmail.com>wrote:

> Thanks Senthil i tried on trying with new path getting this error do i
> have to do any ssl setting on tomcat as well
>
> java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local
> exception: java.io.IOException: An established connection was aborted by the
> software in your host machine
>
>     at org.apache.hadoop.ipc.Client.wrapException(Client.java:1107)
>     at org.apache.hadoop.ipc.Client.call(Client.java:1075)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>
>
> On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <
> senthilkumar@thoughtworks.com> wrote:
>
>> Try using hadoopConf.addResource(new
>> Path("F:/hadoop-0.22.0/conf/core-site.xml")); instead
>> of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>
>> or you should add your core-site.xml to a location which is in your class
>> path(WEB-INF\classes or WEB-INF\lib in case of a web application)
>>
>>
>> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <vi...@gmail.com>wrote:
>>
>>> thanks hemanth i tried adding ext folder conf and extn root folder
>>> unable to add xml only but still same problem thanks for the help
>>>
>>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>>>
>>>> Hi,
>>>>
>>>> If you are getting the LocalFileSystem, you could try by putting
>>>> core-site.xml in a directory that's there in the classpath for the
>>>> Tomcat App (or include such a path in the classpath, if that's
>>>> possible)
>>>>
>>>> Thanks
>>>> hemanth
>>>>
>>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <
>>>> visioner.sadak@gmail.com> wrote:
>>>> > Thanks steve thers nothing in logs and no exceptions as well i found
>>>> that
>>>> > some file is created in my F:\user with directory name but its not
>>>> visible
>>>> > inside my hadoop browse filesystem directories i also added the
>>>> config by
>>>> > using the below method
>>>> > hadoopConf.addResource(
>>>> > "F:/hadoop-0.22.0/conf/core-site.xml");
>>>> > when running thru WAR printing out the filesystem i m getting
>>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>> > when running an independet jar within hadoop i m getting
>>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>> > when running an independet jar i m able to do uploads....
>>>> >
>>>> > just wanted to know will i have to add something in my classpath of
>>>> tomcat
>>>> > or is there any other configurations of core-site.xml that i am
>>>> missing
>>>> > out..thanks for your help.....
>>>> >
>>>> >
>>>> >
>>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <
>>>> stevel@hortonworks.com>
>>>> > wrote:
>>>> >>
>>>> >>
>>>> >> well, it's worked for me in the past outside Hadoop itself:
>>>> >>
>>>> >>
>>>> >>
>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>> >>
>>>> >> Turn logging up to DEBUG
>>>> >> Make sure that the filesystem you've just loaded is what you expect,
>>>> by
>>>> >> logging its value. It may turn out to be file:///, because the
>>>> normal Hadoop
>>>> >> site-config.xml isn't being picked up
>>>> >>
>>>> >>
>>>> >>>
>>>> >>>
>>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>>> >>> <vi...@gmail.com> wrote:
>>>> >>>>
>>>> >>>> but the problem is that my  code gets executed with the warning
>>>> but file
>>>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>>>> local to
>>>> >>>> hdfs
>>>> >>>>
>>>> >>>>    Configuration hadoopConf=new Configuration();
>>>> >>>>         //get the default associated file system
>>>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>> >>>>        // HarFileSystem harFileSystem= new
>>>> HarFileSystem(fileSystem);
>>>> >>>>         //copy from lfs to hdfs
>>>> >>>>        fileSystem.copyFromLocalFile(new
>>>> Path("E:/test/GANI.jpg"),new
>>>> >>>> Path("/user/TestDir/"));
>>>> >>>>
>>>> >>
>>>> >>
>>>> >
>>>>
>>>
>>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks Senthil, i tried with the new Path and am getting the error below; do i have
to do any ssl setting on tomcat as well?

java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local
exception: java.io.IOException: An established connection was aborted by
the software in your host machine

    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1107)
    at org.apache.hadoop.ipc.Client.call(Client.java:1075)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)


On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <senthilkumar@thoughtworks.com
> wrote:

> Try using hadoopConf.addResource(new
> Path("F:/hadoop-0.22.0/conf/core-site.xml")); instead
> of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>
> or you should add your core-site.xml to a location which is in your class
> path(WEB-INF\classes or WEB-INF\lib in case of a web application)
>
>
> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <vi...@gmail.com>wrote:
>
>> thanks hemanth i tried adding ext folder conf and extn root folder
>> unable to add xml only but still same problem thanks for the help
>>
>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>>
>>> Hi,
>>>
>>> If you are getting the LocalFileSystem, you could try by putting
>>> core-site.xml in a directory that's there in the classpath for the
>>> Tomcat App (or include such a path in the classpath, if that's
>>> possible)
>>>
>>> Thanks
>>> hemanth
>>>
>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <vi...@gmail.com>
>>> wrote:
>>> > Thanks steve thers nothing in logs and no exceptions as well i found
>>> that
>>> > some file is created in my F:\user with directory name but its not
>>> visible
>>> > inside my hadoop browse filesystem directories i also added the config
>>> by
>>> > using the below method
>>> > hadoopConf.addResource(
>>> > "F:/hadoop-0.22.0/conf/core-site.xml");
>>> > when running thru WAR printing out the filesystem i m getting
>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>> > when running an independet jar within hadoop i m getting
>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>> > when running an independet jar i m able to do uploads....
>>> >
>>> > just wanted to know will i have to add something in my classpath of
>>> tomcat
>>> > or is there any other configurations of core-site.xml that i am missing
>>> > out..thanks for your help.....
>>> >
>>> >
>>> >
>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <stevel@hortonworks.com
>>> >
>>> > wrote:
>>> >>
>>> >>
>>> >> well, it's worked for me in the past outside Hadoop itself:
>>> >>
>>> >>
>>> >>
>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>> >>
>>> >> Turn logging up to DEBUG
>>> >> Make sure that the filesystem you've just loaded is what you expect,
>>> by
>>> >> logging its value. It may turn out to be file:///, because the normal
>>> Hadoop
>>> >> site-config.xml isn't being picked up
>>> >>
>>> >>
>>> >>>
>>> >>>
>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>> >>> <vi...@gmail.com> wrote:
>>> >>>>
>>> >>>> but the problem is that my  code gets executed with the warning but
>>> file
>>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>>> local to
>>> >>>> hdfs
>>> >>>>
>>> >>>>    Configuration hadoopConf=new Configuration();
>>> >>>>         //get the default associated file system
>>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>> >>>>        // HarFileSystem harFileSystem= new
>>> HarFileSystem(fileSystem);
>>> >>>>         //copy from lfs to hdfs
>>> >>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>> >>>> Path("/user/TestDir/"));
>>> >>>>
>>> >>
>>> >>
>>> >
>>>
>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Senthil Kumar <se...@thoughtworks.com>.
Try using hadoopConf.addResource(new
Path("F:/hadoop-0.22.0/conf/core-site.xml")); instead
of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");

Or add your core-site.xml to a location that is on your classpath
(WEB-INF\classes, or WEB-INF\lib in the case of a web application).
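
Both suggestions in this thread come down to whether core-site.xml is visible to the
webapp's classloader: Configuration.addResource(String) resolves its argument as a
classpath resource, while addResource(Path) reads a local file. A quick servlet-side
check (pure JDK, no Hadoop jars needed; the class name here is illustrative) is:

```java
// Sketch: check whether core-site.xml is visible to the current classloader.
// If this returns null inside the webapp, Configuration silently falls back to
// its defaults and FileSystem.get() hands back a LocalFileSystem instead of HDFS.
public class ConfCheck {
    static java.net.URL findCoreSite() {
        return Thread.currentThread().getContextClassLoader()
                .getResource("core-site.xml");
    }

    public static void main(String[] args) {
        java.net.URL url = findCoreSite();
        System.out.println(url == null
                ? "core-site.xml NOT on classpath"
                : "core-site.xml found at " + url);
    }
}
```

If this prints "NOT on classpath" from inside Tomcat, drop core-site.xml into
WEB-INF/classes and redeploy before debugging anything else.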


On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <vi...@gmail.com>wrote:

> thanks hemanth i tried adding ext folder conf and extn root folder
> unable to add xml only but still same problem thanks for the help
>
> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>
>> Hi,
>>
>> If you are getting the LocalFileSystem, you could try by putting
>> core-site.xml in a directory that's there in the classpath for the
>> Tomcat App (or include such a path in the classpath, if that's
>> possible)
>>
>> Thanks
>> hemanth
>>
>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <vi...@gmail.com>
>> wrote:
>> > Thanks steve thers nothing in logs and no exceptions as well i found
>> that
>> > some file is created in my F:\user with directory name but its not
>> visible
>> > inside my hadoop browse filesystem directories i also added the config
>> by
>> > using the below method
>> > hadoopConf.addResource(
>> > "F:/hadoop-0.22.0/conf/core-site.xml");
>> > when running thru WAR printing out the filesystem i m getting
>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>> > when running an independet jar within hadoop i m getting
>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>> > when running an independet jar i m able to do uploads....
>> >
>> > just wanted to know will i have to add something in my classpath of
>> tomcat
>> > or is there any other configurations of core-site.xml that i am missing
>> > out..thanks for your help.....
>> >
>> >
>> >
>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>> > wrote:
>> >>
>> >>
>> >> well, it's worked for me in the past outside Hadoop itself:
>> >>
>> >>
>> >>
>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>> >>
>> >> Turn logging up to DEBUG
>> >> Make sure that the filesystem you've just loaded is what you expect, by
>> >> logging its value. It may turn out to be file:///, because the normal
>> Hadoop
>> >> site-config.xml isn't being picked up
>> >>
>> >>
>> >>>
>> >>>
>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>> >>> <vi...@gmail.com> wrote:
>> >>>>
>> >>>> but the problem is that my  code gets executed with the warning but
>> file
>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>> local to
>> >>>> hdfs
>> >>>>
>> >>>>    Configuration hadoopConf=new Configuration();
>> >>>>         //get the default associated file system
>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>> >>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>> >>>>         //copy from lfs to hdfs
>> >>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>> >>>> Path("/user/TestDir/"));
>> >>>>
>> >>
>> >>
>> >
>>
>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks Hemanth, I tried adding the conf folder under the ext folder and under
the extension root folder (I was unable to add just the XML), but I still have
the same problem. Thanks for the help.

On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com> wrote:

> Hi,
>
> If you are getting the LocalFileSystem, you could try by putting
> core-site.xml in a directory that's there in the classpath for the
> Tomcat App (or include such a path in the classpath, if that's
> possible)
>
> Thanks
> hemanth
>
> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <vi...@gmail.com>
> wrote:
> > Thanks steve thers nothing in logs and no exceptions as well i found that
> > some file is created in my F:\user with directory name but its not
> visible
> > inside my hadoop browse filesystem directories i also added the config by
> > using the below method
> > hadoopConf.addResource(
> > "F:/hadoop-0.22.0/conf/core-site.xml");
> > when running thru WAR printing out the filesystem i m getting
> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
> > when running an independet jar within hadoop i m getting
> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
> > when running an independet jar i m able to do uploads....
> >
> > just wanted to know will i have to add something in my classpath of
> tomcat
> > or is there any other configurations of core-site.xml that i am missing
> > out..thanks for your help.....
> >
> >
> >
> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
> > wrote:
> >>
> >>
> >> well, it's worked for me in the past outside Hadoop itself:
> >>
> >>
> >>
> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
> >>
> >> Turn logging up to DEBUG
> >> Make sure that the filesystem you've just loaded is what you expect, by
> >> logging its value. It may turn out to be file:///, because the normal
> Hadoop
> >> site-config.xml isn't being picked up
> >>
> >>
> >>>
> >>>
> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
> >>> <vi...@gmail.com> wrote:
> >>>>
> >>>> but the problem is that my  code gets executed with the warning but
> file
> >>>> is not copied to hdfs , actually i m trying to copy a file from local
> to
> >>>> hdfs
> >>>>
> >>>>    Configuration hadoopConf=new Configuration();
> >>>>         //get the default associated file system
> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
> >>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
> >>>>         //copy from lfs to hdfs
> >>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
> >>>> Path("/user/TestDir/"));
> >>>>
> >>
> >>
> >
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Hemanth Yamijala <yh...@gmail.com>.
Hi,

If you are getting the LocalFileSystem, you could try putting
core-site.xml in a directory that is on the classpath of the
Tomcat app (or add such a path to the classpath, if that's
possible).
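
[Editor's note: for reference, a minimal core-site.xml of the kind that must end up on that classpath. The NameNode host/port are placeholders for the actual cluster; fs.default.name is the setting name used by Hadoop 0.2x, as quoted earlier in this thread.]

```xml
<?xml version="1.0"?>
<configuration>
  <!-- Without this file on the classpath, FileSystem.get() falls back
       to the default file:/// and you get LocalFileSystem. -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```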

Thanks
hemanth

On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <vi...@gmail.com> wrote:
> Thanks steve thers nothing in logs and no exceptions as well i found that
> some file is created in my F:\user with directory name but its not visible
> inside my hadoop browse filesystem directories i also added the config by
> using the below method
> hadoopConf.addResource(
> "F:/hadoop-0.22.0/conf/core-site.xml");
> when running thru WAR printing out the filesystem i m getting
> org.apache.hadoop.fs.LocalFileSystem@9cd8db
> when running an independet jar within hadoop i m getting
> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
> when running an independet jar i m able to do uploads....
>
> just wanted to know will i have to add something in my classpath of tomcat
> or is there any other configurations of core-site.xml that i am missing
> out..thanks for your help.....
>
>
>
> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
> wrote:
>>
>>
>> well, it's worked for me in the past outside Hadoop itself:
>>
>>
>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>
>> Turn logging up to DEBUG
>> Make sure that the filesystem you've just loaded is what you expect, by
>> logging its value. It may turn out to be file:///, because the normal Hadoop
>> site-config.xml isn't being picked up
>>
>>
>>>
>>>
>>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>> <vi...@gmail.com> wrote:
>>>>
>>>> but the problem is that my  code gets executed with the warning but file
>>>> is not copied to hdfs , actually i m trying to copy a file from local to
>>>> hdfs
>>>>
>>>>    Configuration hadoopConf=new Configuration();
>>>>         //get the default associated file system
>>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>>>         //copy from lfs to hdfs
>>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>>> Path("/user/TestDir/"));
>>>>
>>
>>
>

RE: Integrating hadoop with java UI application deployed on tomcat

Posted by "Mahadevappa, Shobha" <Sh...@nttdata.com>.
Hi,
Try adding the hadoop/conf directory to Tomcat's classpath.

Ex : CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
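
[Editor's note: on Tomcat this is usually done via a setenv script rather than a global environment variable; a sketch, assuming the Hadoop layout from the example above (file name and paths are illustrative):]

```shell
#!/bin/sh
# $CATALINA_BASE/bin/setenv.sh -- sourced by catalina.sh at startup.
# Prepend the Hadoop conf directory (core-site.xml etc.) so the webapp's
# Configuration objects can find the site files on the classpath.
HADOOP_CONF=/usr/local/Apps/hadoop-0.20.203.0/conf
CLASSPATH="$HADOOP_CONF${CLASSPATH:+:$CLASSPATH}"
export CLASSPATH
echo "$CLASSPATH"
```

[Restart Tomcat after adding this; alternatively, simply drop core-site.xml into WEB-INF/classes of the WAR, as suggested elsewhere in this thread.]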



Regards,
Shobha M

From: Visioner Sadak [mailto:visioner.sadak@gmail.com]
Sent: 03 September 2012 PM 04:01
To: user@hadoop.apache.org
Subject: Re: Integrating hadoop with java UI application deployed on tomcat

Thanks steve thers nothing in logs and no exceptions as well i found that some file is created in my F:\user with directory name but its not visible inside my hadoop browse filesystem directories i also added the config by using the below method
hadoopConf.addResource(
"F:/hadoop-0.22.0/conf/core-site.xml");
when running thru WAR printing out the filesystem i m getting org.apache.hadoop.fs.LocalFileSystem@9cd8db<ma...@9cd8db>
when running an independet jar within hadoop i m getting DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
when running an independet jar i m able to do uploads....

just wanted to know will i have to add something in my classpath of tomcat or is there any other configurations of core-site.xml that i am missing out..thanks for your help.....


On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>> wrote:

well, it's worked for me in the past outside Hadoop itself:

http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup


  1.  Turn logging up to DEBUG
  2.  Make sure that the filesystem you've just loaded is what you expect, by logging its value. It may turn out to be file:///<file:///\\>, because the normal Hadoop site-config.xml isn't being picked up


On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com>> wrote:
but the problem is that my  code gets executed with the warning but file is not copied to hdfs , actually i m trying to copy a file from local to hdfs

   Configuration hadoopConf=new Configuration();
        //get the default associated file system
       FileSystem fileSystem=FileSystem.get(hadoopConf);
       // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
        //copy from lfs to hdfs
       fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new Path("/user/TestDir/"));




______________________________________________________________________
Disclaimer:This email and any attachments are sent in strictest confidence for the sole use of the addressee and may contain legally privileged, confidential, and proprietary data.  If you are not the intended recipient, please advise the sender by replying promptly to this email and then delete and destroy this email and any attachments without any further use, copying or forwarding

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks Steve, there's nothing in the logs and no exceptions either. I found that
a directory with the expected name does get created under F:\user, but it is not
visible when I browse the filesystem through the Hadoop web UI. I also added the
config explicitly with the method below:
hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
When running through the WAR, printing out the filesystem gives
org.apache.hadoop.fs.LocalFileSystem@9cd8db
whereas running an independent jar within Hadoop gives
DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
and with the independent jar I am able to upload.

I just wanted to know whether I have to add something to Tomcat's classpath, or
whether there is some other core-site.xml configuration I am missing.
Thanks for your help.
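A LocalFileSystem result usually means `new Configuration()` could not find a core-site.xml on the webapp classpath and fell back to the default fs.default.name of file:///. A minimal core-site.xml sketch follows; the hdfs://localhost:9000 value matches the pseudo-distributed setup mentioned elsewhere in this thread and is an assumption for any other cluster:

```xml
<?xml version="1.0"?>
<configuration>
  <!-- NameNode address; hdfs://localhost:9000 is the pseudo-distributed
       value from this thread -- adjust host and port to your NameNode -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

Placing this file on the webapp classpath (for example under WEB-INF/classes) lets `new Configuration()` pick it up without an explicit addResource() call.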



On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>wrote:

>
> well, it's worked for me in the past outside Hadoop itself:
>
>
> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>
>
>    1. Turn logging up to DEBUG
>    2. Make sure that the filesystem you've just loaded is what you
>    expect, by logging its value. It may turn out to be file:///, because the
>    normal Hadoop site-config.xml isn't being picked up
>
>
>
>>
>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <visioner.sadak@gmail.com
>> > wrote:
>>
>>> but the problem is that my  code gets executed with the warning but file
>>> is not copied to hdfs , actually i m trying to copy a file from local to
>>> hdfs
>>>
>>>    Configuration hadoopConf=new Configuration();
>>>         //get the default associated file system
>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>>         //copy from lfs to hdfs
>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>> Path("/user/TestDir/"));
>>>
>>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Steve Loughran <st...@hortonworks.com>.
well, it's worked for me in the past outside Hadoop itself:

http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup


   1. Turn logging up to DEBUG
   2. Make sure that the filesystem you've just loaded is what you expect,
   by logging its value. It may turn out to be file:///, because the normal
   Hadoop site-config.xml isn't being picked up
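Point 2 above can be checked even before the FileSystem is created. A minimal sketch (`FsUriCheck` is a hypothetical helper name, not Hadoop API): inspect the scheme of the configured default-filesystem URI, since a webapp missing core-site.xml silently degrades to file:/// and uploads land on the local disk.

```java
import java.net.URI;

// Sketch: validate the default-FS URI a webapp would hand to Hadoop.
// If core-site.xml is absent from the classpath, the effective value
// is "file:///", i.e. the "file" scheme, matching the LocalFileSystem
// symptom reported in this thread.
public class FsUriCheck {
    public static String schemeOf(String defaultFsUri) {
        // URI.create("hdfs://localhost:9000").getScheme() yields "hdfs"
        String scheme = URI.create(defaultFsUri).getScheme();
        return scheme == null ? "file" : scheme;
    }

    public static void main(String[] args) {
        System.out.println(schemeOf("hdfs://localhost:9000")); // hdfs
        System.out.println(schemeOf("file:///"));              // file
    }
}
```

Logging this scheme (or simply the value of `FileSystem.get(conf)`) at startup makes the misconfiguration obvious instead of silent.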



>
> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com>wrote:
>
>> but the problem is that my  code gets executed with the warning but file
>> is not copied to hdfs , actually i m trying to copy a file from local to
>> hdfs
>>
>>    Configuration hadoopConf=new Configuration();
>>         //get the default associated file system
>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>         //copy from lfs to hdfs
>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>> Path("/user/TestDir/"));
>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Do I have to change some Tomcat configuration settings?

On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com>wrote:

> but the problem is that my  code gets executed with the warning but file
> is not copied to hdfs , actually i m trying to copy a file from local to
> hdfs
>
>    Configuration hadoopConf=new Configuration();
>         //get the default associated file system
>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>         //copy from lfs to hdfs
>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
> Path("/user/TestDir/"));
>
>
>  On Thu, Aug 30, 2012 at 9:57 PM, Steve Loughran <st...@hortonworks.com>wrote:
>
>>
>>
>>  On 30 August 2012 13:54, Visioner Sadak <vi...@gmail.com>wrote:
>>
>>> Thanks a ton guys for your help i did used hadoop-core-1.0.3.jar &
>>> commons-lang-2.1.jar to get rid of the class not found error now i am
>>> getting this error is this becoz i am using my app and hadoop on windows???
>>>
>>> util.NativeCodeLoader: Unable to load native-hadoop library for your
>>> platform... using builtin-java classes where applicable
>>>
>>
>> no, that's warning you that the native code to help with some operations
>> (especially compression) aren't loading as your JVM's native lib path
>> aren't set up right.
>>
>> Just edit log4j to hide that classes log messages.
>>
>> FWIW, I've downgraded some other messages that are over noisy, especially
>> if you bring up a MiniMR/MiniDFS cluster for test runs:
>>
>>  log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataStorage=WARNING
>>
>> log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataXceiverServer=WARNING
>>
>> log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataBlockScanner=WARNING
>> log4j.logger.org.apache.hadoop.hdfs.server.datanode.FSDataset=FATAL
>> log4j.logger.org.apache.hadoop.metrics2=FATAL
>> log4j.logger.org.apache.hadoop.ipc.metrics.RpcInstrumentation=WARNING
>> log4j.logger.org.apache.hadoop.ipc.Server=WARNING
>> log4j.logger.org.apache.hadoop.metrics=FATAL
>>
>>
>>
>>>
>>>
>>>
>>>
>>> On Thu, Aug 30, 2012 at 5:22 PM, Steve Loughran <st...@hortonworks.com>wrote:
>>>
>>>> you will need almost the entire hadoop client-side JAR set and
>>>> dependencies for this, I'm afraid.
>>>>
>>>> The new webhdfs filesys -HDFS over HTTP- is designed to be lighter
>>>> weight and only need an HTTP client, but I'm not aware of any ultra-thin
>>>> client yet (apache http components should suffice).
>>>>
>>>> If you are using any of the build tools with dependency management:
>>>> Ant+Ivy, Maven, Gradle, ask for Hadoop JARs and have the dependencies
>>>> pulled in.
>>>>
>>>> If you aren't using any of the build tools w/ dependency management,
>>>> now is the time.
>>>>
>>>>
>>>> On 30 August 2012 09:32, Visioner Sadak <vi...@gmail.com>wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>>   I have a WAR which is deployed on tomcat server the WAR contains
>>>>> some java classes which uploads files, will i be able to upload directly in
>>>>> to hadoop iam using the below code in one of my java class
>>>>>
>>>>>        Configuration hadoopConf=new Configuration();
>>>>>         //get the default associated file system
>>>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>>>>         //copy from lfs to hdfs
>>>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>>>> Path("/user/TestDir/"));
>>>>>
>>>>> but its throwing up this error
>>>>>
>>>>> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>>>>>
>>>>> when this code is run independtly using a single jar deployed in
>>>>> hadoop bin it wrks fine
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>
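Steve's suggestion to use a build tool with dependency management can be sketched for Maven. The coordinates below match the hadoop-core-1.0.3.jar mentioned earlier in the thread; adjust the version to whatever your cluster runs:

```xml
<!-- pulls hadoop-core and its transitive dependencies
     (commons-lang, commons-configuration, etc.) into WEB-INF/lib -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.0.3</version>
</dependency>
```

This avoids hand-copying individual jars into the WAR and the NoClassDefFoundError that started this thread.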

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
But the problem is that my code executes with the warning, yet the file is not
copied to HDFS; I am actually trying to copy a file from the local disk to HDFS:

   Configuration hadoopConf = new Configuration();
   // get the default associated file system
   FileSystem fileSystem = FileSystem.get(hadoopConf);
   // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
   // copy from the local file system to HDFS
   fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"), new Path("/user/TestDir/"));


On Thu, Aug 30, 2012 at 9:57 PM, Steve Loughran <st...@hortonworks.com>wrote:

>
>
>  On 30 August 2012 13:54, Visioner Sadak <vi...@gmail.com> wrote:
>
>> Thanks a ton guys for your help i did used hadoop-core-1.0.3.jar &
>> commons-lang-2.1.jar to get rid of the class not found error now i am
>> getting this error is this becoz i am using my app and hadoop on windows???
>>
>> util.NativeCodeLoader: Unable to load native-hadoop library for your
>> platform... using builtin-java classes where applicable
>>
>
> no, that's warning you that the native code to help with some operations
> (especially compression) aren't loading as your JVM's native lib path
> aren't set up right.
>
> Just edit log4j to hide that classes log messages.
>
> FWIW, I've downgraded some other messages that are over noisy, especially
> if you bring up a MiniMR/MiniDFS cluster for test runs:
>
>  log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataStorage=WARNING
>
> log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataXceiverServer=WARNING
>
> log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataBlockScanner=WARNING
> log4j.logger.org.apache.hadoop.hdfs.server.datanode.FSDataset=FATAL
> log4j.logger.org.apache.hadoop.metrics2=FATAL
> log4j.logger.org.apache.hadoop.ipc.metrics.RpcInstrumentation=WARNING
> log4j.logger.org.apache.hadoop.ipc.Server=WARNING
> log4j.logger.org.apache.hadoop.metrics=FATAL
>
>
>
>>
>>
>>
>>
>> On Thu, Aug 30, 2012 at 5:22 PM, Steve Loughran <st...@hortonworks.com>wrote:
>>
>>> you will need almost the entire hadoop client-side JAR set and
>>> dependencies for this, I'm afraid.
>>>
>>> The new webhdfs filesys -HDFS over HTTP- is designed to be lighter
>>> weight and only need an HTTP client, but I'm not aware of any ultra-thin
>>> client yet (apache http components should suffice).
>>>
>>> If you are using any of the build tools with dependency management:
>>> Ant+Ivy, Maven, Gradle, ask for Hadoop JARs and have the dependencies
>>> pulled in.
>>>
>>> If you aren't using any of the build tools w/ dependency management, now
>>> is the time.
>>>
>>>
>>> On 30 August 2012 09:32, Visioner Sadak <vi...@gmail.com>wrote:
>>>
>>>> Hi,
>>>>
>>>>   I have a WAR which is deployed on tomcat server the WAR contains some
>>>> java classes which uploads files, will i be able to upload directly in to
>>>> hadoop iam using the below code in one of my java class
>>>>
>>>>        Configuration hadoopConf=new Configuration();
>>>>         //get the default associated file system
>>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>>>         //copy from lfs to hdfs
>>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>>> Path("/user/TestDir/"));
>>>>
>>>> but its throwing up this error
>>>>
>>>> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>>>>
>>>> when this code is run independtly using a single jar deployed in hadoop
>>>> bin it wrks fine
>>>>
>>>>
>>>>
>>>
>>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
but the problem is that my code now executes with only that warning, yet the
file is not copied to HDFS; I am actually trying to copy a file from the
local filesystem to HDFS

   Configuration hadoopConf = new Configuration();
   // get the default associated file system
   FileSystem fileSystem = FileSystem.get(hadoopConf);
   // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
   // copy from the local file system to HDFS
   fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
       new Path("/user/TestDir/"));

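[Editor's note: a likely cause, inferred from the symptoms rather than confirmed in the thread: if the Configuration created inside Tomcat cannot see core-site.xml on its classpath, fs.default.name silently stays at its default of file:///, so FileSystem.get() returns the local filesystem and the "copy" lands on the local disk instead of HDFS. A minimal sketch of the scheme check, using only java.net.URI so it runs without Hadoop jars:]

```java
import java.net.URI;

public class FsUriCheck {
    // Returns true when the given default-filesystem URI actually points at HDFS.
    // If core-site.xml is not on the classpath, Hadoop falls back to "file:///",
    // and FileSystem.get() then returns the local filesystem without any error.
    static boolean isHdfs(String defaultFsUri) {
        return "hdfs".equalsIgnoreCase(URI.create(defaultFsUri).getScheme());
    }

    public static void main(String[] args) {
        System.out.println(isHdfs("hdfs://localhost:9000")); // true: copies go to HDFS
        System.out.println(isHdfs("file:///"));              // false: copies stay local
    }
}
```

In the Hadoop code itself the equivalent guard would be setting hadoopConf.set("fs.default.name", "hdfs://localhost:9000") before calling FileSystem.get(hadoopConf), then asserting that fileSystem.getUri().getScheme() equals "hdfs"; the address hdfs://localhost:9000 here is an assumption taken from the poster's core-site.xml earlier in the thread.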

On Thu, Aug 30, 2012 at 9:57 PM, Steve Loughran <st...@hortonworks.com>wrote:

>
>
>  On 30 August 2012 13:54, Visioner Sadak <vi...@gmail.com> wrote:
>
>> Thanks a ton guys for your help i did used hadoop-core-1.0.3.jar &
>> commons-lang-2.1.jar to get rid of the class not found error now i am
>> getting this error is this becoz i am using my app and hadoop on windows???
>>
>> util.NativeCodeLoader: Unable to load native-hadoop library for your
>> platform... using builtin-java classes where applicable
>>
>
> no, that's warning you that the native code to help with some operations
> (especially compression) aren't loading as your JVM's native lib path
> aren't set up right.
>
> Just edit log4j to hide that classes log messages.
>
> FWIW, I've downgraded some other messages that are over noisy, especially
> if you bring up a MiniMR/MiniDFS cluster for test runs:
>
>  log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataStorage=WARNING
>
> log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataXceiverServer=WARNING
>
> log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataBlockScanner=WARNING
> log4j.logger.org.apache.hadoop.hdfs.server.datanode.FSDataset=FATAL
> log4j.logger.org.apache.hadoop.metrics2=FATAL
> log4j.logger.org.apache.hadoop.ipc.metrics.RpcInstrumentation=WARNING
> log4j.logger.org.apache.hadoop.ipc.Server=WARNING
> log4j.logger.org.apache.hadoop.metrics=FATAL
>
>
>
>>
>>
>>
>>
>> On Thu, Aug 30, 2012 at 5:22 PM, Steve Loughran <st...@hortonworks.com>wrote:
>>
>>> you will need almost the entire hadoop client-side JAR set and
>>> dependencies for this, I'm afraid.
>>>
>>> The new webhdfs filesys -HDFS over HTTP- is designed to be lighter
>>> weight and only need an HTTP client, but I'm not aware of any ultra-thin
>>> client yet (apache http components should suffice).
>>>
>>> If you are using any of the build tools with dependency management:
>>> Ant+Ivy, Maven, Gradle, ask for Hadoop JARs and have the dependencies
>>> pulled in.
>>>
>>> If you aren't using any of the build tools w/ dependency management, now
>>> is the time.
>>>
>>>
>>> On 30 August 2012 09:32, Visioner Sadak <vi...@gmail.com>wrote:
>>>
>>>> Hi,
>>>>
>>>>   I have a WAR which is deployed on tomcat server the WAR contains some
>>>> java classes which uploads files, will i be able to upload directly in to
>>>> hadoop iam using the below code in one of my java class
>>>>
>>>>        Configuration hadoopConf=new Configuration();
>>>>         //get the default associated file system
>>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>>>         //copy from lfs to hdfs
>>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>>> Path("/user/TestDir/"));
>>>>
>>>> but its throwing up this error
>>>>
>>>> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>>>>
>>>> when this code is run independtly using a single jar deployed in hadoop
>>>> bin it wrks fine
>>>>
>>>>
>>>>
>>>
>>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Steve Loughran <st...@hortonworks.com>.
On 30 August 2012 13:54, Visioner Sadak <vi...@gmail.com> wrote:

> Thanks a ton guys for your help i did used hadoop-core-1.0.3.jar &
> commons-lang-2.1.jar to get rid of the class not found error now i am
> getting this error is this becoz i am using my app and hadoop on windows???
>
> util.NativeCodeLoader: Unable to load native-hadoop library for your
> platform... using builtin-java classes where applicable
>

no, that's warning you that the native code that helps with some operations
(especially compression) isn't loading, because your JVM's native library
path isn't set up right.

Just edit log4j to hide that class's log messages.

FWIW, I've downgraded some other messages that are overly noisy, especially
if you bring up a MiniMR/MiniDFS cluster for test runs:

log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataStorage=WARN
log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataXceiverServer=WARN
log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataBlockScanner=WARN
log4j.logger.org.apache.hadoop.hdfs.server.datanode.FSDataset=FATAL
log4j.logger.org.apache.hadoop.metrics2=FATAL
log4j.logger.org.apache.hadoop.ipc.metrics.RpcInstrumentation=WARN
log4j.logger.org.apache.hadoop.ipc.Server=WARN
log4j.logger.org.apache.hadoop.metrics=FATAL
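
[Editor's note: in the same spirit, and as an addition rather than something stated in the thread, the NativeCodeLoader warning itself can be silenced by raising that class's threshold in log4j.properties. Note that log4j 1.x expects the level name WARN, not WARNING:]

```properties
# Hide the "Unable to load native-hadoop library" message
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```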



>
>
>
>
> On Thu, Aug 30, 2012 at 5:22 PM, Steve Loughran <st...@hortonworks.com>wrote:
>
>> you will need almost the entire hadoop client-side JAR set and
>> dependencies for this, I'm afraid.
>>
>> The new webhdfs filesys -HDFS over HTTP- is designed to be lighter weight
>> and only need an HTTP client, but I'm not aware of any ultra-thin client
>> yet (apache http components should suffice).
>>
>> If you are using any of the build tools with dependency management:
>> Ant+Ivy, Maven, Gradle, ask for Hadoop JARs and have the dependencies
>> pulled in.
>>
>> If you aren't using any of the build tools w/ dependency management, now
>> is the time.
>>
>>
>> On 30 August 2012 09:32, Visioner Sadak <vi...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>>   I have a WAR which is deployed on tomcat server the WAR contains some
>>> java classes which uploads files, will i be able to upload directly in to
>>> hadoop iam using the below code in one of my java class
>>>
>>>        Configuration hadoopConf=new Configuration();
>>>         //get the default associated file system
>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>>         //copy from lfs to hdfs
>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>> Path("/user/TestDir/"));
>>>
>>> but its throwing up this error
>>>
>>> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>>>
>>> when this code is run independtly using a single jar deployed in hadoop
>>> bin it wrks fine
>>>
>>>
>>>
>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks a ton guys for your help. I used hadoop-core-1.0.3.jar &
commons-lang-2.1.jar to get rid of the class-not-found error; now I am
getting this warning. Is this because I am running my app and Hadoop on Windows?

util.NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable




On Thu, Aug 30, 2012 at 5:22 PM, Steve Loughran <st...@hortonworks.com>wrote:

> you will need almost the entire hadoop client-side JAR set and
> dependencies for this, I'm afraid.
>
> The new webhdfs filesys -HDFS over HTTP- is designed to be lighter weight
> and only need an HTTP client, but I'm not aware of any ultra-thin client
> yet (apache http components should suffice).
>
> If you are using any of the build tools with dependency management:
> Ant+Ivy, Maven, Gradle, ask for Hadoop JARs and have the dependencies
> pulled in.
>
> If you aren't using any of the build tools w/ dependency management, now
> is the time.
>
>
> On 30 August 2012 09:32, Visioner Sadak <vi...@gmail.com> wrote:
>
>> Hi,
>>
>>   I have a WAR which is deployed on tomcat server the WAR contains some
>> java classes which uploads files, will i be able to upload directly in to
>> hadoop iam using the below code in one of my java class
>>
>>        Configuration hadoopConf=new Configuration();
>>         //get the default associated file system
>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>         //copy from lfs to hdfs
>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>> Path("/user/TestDir/"));
>>
>> but its throwing up this error
>>
>> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>>
>> when this code is run independtly using a single jar deployed in hadoop
>> bin it wrks fine
>>
>>
>>
>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Steve Loughran <st...@hortonworks.com>.
you will need almost the entire hadoop client-side JAR set and dependencies
for this, I'm afraid.

The new webhdfs filesystem -HDFS over HTTP- is designed to be lighter weight
and to need only an HTTP client, but I'm not aware of any ultra-thin client
yet (Apache HttpComponents should suffice).

If you are using any of the build tools with dependency management:
Ant+Ivy, Maven, Gradle, ask for Hadoop JARs and have the dependencies
pulled in.

If you aren't using any of the build tools w/ dependency management, now is
the time.
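
[Editor's note: to make the WebHDFS route concrete, here is a hedged sketch; the host, port 50070, user name, and target path are all illustrative assumptions, and WebHDFS must be enabled on the cluster via dfs.webhdfs.enabled. An upload is a two-step HTTP exchange: PUT with op=CREATE to the namenode, which replies with a redirect to a datanode, and then PUT the file bytes to the redirected URL. Only the URL construction is shown as runnable code, since the HTTP calls need a live cluster:]

```java
public class WebHdfsUrl {
    // Builds the first-step WebHDFS CREATE URL. The namenode answers this
    // request with a 307 redirect whose Location header names a datanode;
    // the file content is then PUT to that datanode URL.
    static String createUrl(String namenodeHost, int port, String hdfsPath, String user) {
        return "http://" + namenodeHost + ":" + port
                + "/webhdfs/v1" + hdfsPath
                + "?op=CREATE&overwrite=true&user.name=" + user;
    }

    public static void main(String[] args) {
        // Assumed values: adjust the host, port, and user to your cluster.
        System.out.println(createUrl("localhost", 50070, "/user/TestDir/GANI.jpg", "hadoop"));
    }
}
```

The two PUT requests themselves can be made with plain java.net.HttpURLConnection, so no Hadoop jars are needed on the Tomcat classpath at all, which is the point of this approach.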

On 30 August 2012 09:32, Visioner Sadak <vi...@gmail.com> wrote:

> Hi,
>
>   I have a WAR which is deployed on tomcat server the WAR contains some
> java classes which uploads files, will i be able to upload directly in to
> hadoop iam using the below code in one of my java class
>
>        Configuration hadoopConf=new Configuration();
>         //get the default associated file system
>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>         //copy from lfs to hdfs
>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
> Path("/user/TestDir/"));
>
> but its throwing up this error
>
> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>
> when this code is run independtly using a single jar deployed in hadoop
> bin it wrks fine
>
>
>

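Steve's webhdfs suggestion can be sketched with nothing but the JDK's HTTP classes. This is a hypothetical illustration, not code from the thread: the host, port (50070 was the usual 1.x namenode HTTP port), HDFS path, and user name are placeholders, and it assumes a cluster with WebHDFS enabled. File creation over WebHDFS is a two-step protocol: the namenode answers the first PUT with a 307 redirect, and the file bytes go in a second PUT to the datanode URL from the Location header.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

public class WebHdfsUpload {

    // Builds the WebHDFS CREATE URL for the namenode's HTTP port.
    // Pure string construction, so it is easy to unit-test.
    static String createUrl(String host, int port, String hdfsPath, String user) {
        return "http://" + host + ":" + port + "/webhdfs/v1" + hdfsPath
                + "?op=CREATE&user.name=" + user + "&overwrite=true";
    }

    // Two-step WebHDFS upload: PUT to the namenode with no body, read the
    // 307 Location header, then PUT the file bytes to the datanode URL.
    static void upload(String nnUrl, String localFile) throws IOException {
        HttpURLConnection nn = (HttpURLConnection) new URL(nnUrl).openConnection();
        nn.setRequestMethod("PUT");
        nn.setInstanceFollowRedirects(false);   // we must read Location ourselves
        String dataNodeUrl = nn.getHeaderField("Location");
        nn.disconnect();
        if (dataNodeUrl == null) {
            throw new IOException("namenode did not send a redirect");
        }

        HttpURLConnection dn = (HttpURLConnection) new URL(dataNodeUrl).openConnection();
        dn.setRequestMethod("PUT");
        dn.setDoOutput(true);
        try (OutputStream out = dn.getOutputStream()) {
            Files.copy(Paths.get(localFile), out);  // stream the local file up
        }
        if (dn.getResponseCode() != 201) {          // 201 Created on success
            throw new IOException("upload failed: HTTP " + dn.getResponseCode());
        }
    }

    public static void main(String[] args) {
        // Print the URL the first PUT would hit; no network traffic here.
        System.out.println(createUrl("localhost", 50070, "/user/TestDir/GANI.jpg", "hadoop"));
    }
}
```

The point of going through HTTP is that the WAR then needs no Hadoop jars at all, which sidesteps both the classpath and the IPC-version problems discussed in this thread.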
Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks, experts, for your help. I finally found the issue; I was getting
this error:

org.apache.hadoop.ipc.RemoteException: Server IPC version 5 cannot communicate with client version 4

because the libraries in Tomcat were a different version than the
libraries of the Hadoop installation. Thanks a ton for your help.

On Thu, Aug 30, 2012 at 2:02 PM, Visioner Sadak <vi...@gmail.com>wrote:

> Hi,
>
>   I have a WAR which is deployed on tomcat server the WAR contains some
> java classes which uploads files, will i be able to upload directly in to
> hadoop iam using the below code in one of my java class
>
>        Configuration hadoopConf=new Configuration();
>         //get the default associated file system
>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>         //copy from lfs to hdfs
>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
> Path("/user/TestDir/"));
>
> but its throwing up this error
>
> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>
> when this code is run independtly using a single jar deployed in hadoop
> bin it wrks fine
>
>
>

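The version-mismatch fix above generalizes: the Hadoop jars in WEB-INF/lib must be the same release as the cluster. If the WAR is built with Maven, as Steve suggested, that means pinning the Hadoop artifact to the cluster's exact version. A sketch only — the version below is a placeholder and must be replaced by whatever release the cluster actually runs (hadoop-core was the client artifact name in the 1.x era):

```xml
<!-- WEB-INF/lib is populated from these coordinates; the version
     element must match the version installed on the cluster -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.0.3</version>
</dependency>
```

Dependency management then pulls in the transitive jars (commons-logging, commons-configuration, and the rest) at versions consistent with that release, instead of whatever happened to be copied into the WAR by hand.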
Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Hemanth Yamijala <yh...@gmail.com>.
Hi,

The error is about the Hadoop Configuration class, so you probably need to
put the hadoop-core jar in the lib folder. That said, there might be other
dependencies you need as well, but you can try this first.

Thanks
hemanth

On Thu, Aug 30, 2012 at 3:53 PM, Visioner Sadak <vi...@gmail.com>wrote:

> Tried putting it in; still the same error.
>
>
> On Thu, Aug 30, 2012 at 3:06 PM, John Hancock <jh...@gmail.com>wrote:
>
>> You might need to put the apache commons configuration library jar in
>> web-inf/lib to clear this error.
>>
>>
>> On Thu, Aug 30, 2012 at 4:32 AM, Visioner Sadak <visioner.sadak@gmail.com
>> > wrote:
>>
>>> Hi,
>>>
>>>   I have a WAR which is deployed on tomcat server the WAR contains some
>>> java classes which uploads files, will i be able to upload directly in to
>>> hadoop iam using the below code in one of my java class
>>>
>>>        Configuration hadoopConf=new Configuration();
>>>         //get the default associated file system
>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>>         //copy from lfs to hdfs
>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>> Path("/user/TestDir/"));
>>>
>>> but its throwing up this error
>>>
>>> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>>>
>>> when this code is run independtly using a single jar deployed in hadoop
>>> bin it wrks fine
>>>
>>>
>>>
>>
>>
>

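When a NoClassDefFoundError like the one in this thread appears, it helps to know exactly which classes the webapp's classloader can and cannot see. The probe below is a generic sketch, not code from the thread: the class names listed are the usual suspects for this error, but the exact set of required jars depends on the Hadoop version.

```java
// Quick classpath probe: reports which of the classes a Hadoop client
// typically needs are visible to this class's classloader. Drop it into
// the WAR and run it from a servlet or a main() to see what is missing.
public class ClasspathProbe {

    // Returns true if the named class is loadable, without initializing it.
    static boolean isLoadable(String className) {
        try {
            Class.forName(className, false, ClasspathProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String[] needed = {
            "org.apache.hadoop.conf.Configuration",           // hadoop-core jar
            "org.apache.commons.logging.Log",                 // commons-logging jar
            "org.apache.commons.configuration.Configuration"  // commons-configuration jar
        };
        for (String c : needed) {
            System.out.println(c + " -> " + (isLoadable(c) ? "found" : "MISSING"));
        }
    }
}
```

Each "MISSING" line maps directly to a jar that still has to be copied into WEB-INF/lib (or declared as a dependency), which takes the guesswork out of Hemanth's and John's advice above.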
Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Tried putting it in; still the same error.

On Thu, Aug 30, 2012 at 3:06 PM, John Hancock <jh...@gmail.com>wrote:

> You might need to put the apache commons configuration library jar in
> web-inf/lib to clear this error.
>
>
> On Thu, Aug 30, 2012 at 4:32 AM, Visioner Sadak <vi...@gmail.com>wrote:
>
>> Hi,
>>
>>   I have a WAR which is deployed on tomcat server the WAR contains some
>> java classes which uploads files, will i be able to upload directly in to
>> hadoop iam using the below code in one of my java class
>>
>>        Configuration hadoopConf=new Configuration();
>>         //get the default associated file system
>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>         //copy from lfs to hdfs
>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>> Path("/user/TestDir/"));
>>
>> but its throwing up this error
>>
>> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>>
>> when this code is run independtly using a single jar deployed in hadoop
>> bin it wrks fine
>>
>>
>>
>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by John Hancock <jh...@gmail.com>.
You might need to put the Apache Commons Configuration library jar in
WEB-INF/lib to clear this error.

On Thu, Aug 30, 2012 at 4:32 AM, Visioner Sadak <vi...@gmail.com>wrote:

> Hi,
>
>   I have a WAR which is deployed on tomcat server the WAR contains some
> java classes which uploads files, will i be able to upload directly in to
> hadoop iam using the below code in one of my java class
>
>        Configuration hadoopConf=new Configuration();
>         //get the default associated file system
>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>         //copy from lfs to hdfs
>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
> Path("/user/TestDir/"));
>
> but its throwing up this error
>
> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>
> when this code is run independtly using a single jar deployed in hadoop
> bin it wrks fine
>
>
>

Fwd: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
---------- Forwarded message ----------
From: Visioner Sadak <vi...@gmail.com>
Date: Thu, Aug 30, 2012 at 2:02 PM
Subject: Integrating hadoop with java UI application deployed on tomcat
To: user@hadoop.apache.org


Hi,

  I have a WAR deployed on a Tomcat server; the WAR contains some Java
classes which upload files. Will I be able to upload directly into Hadoop?
I am using the below code in one of my Java classes:

       Configuration hadoopConf = new Configuration();
       // get the default associated file system
       FileSystem fileSystem = FileSystem.get(hadoopConf);
       // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
       // copy from the local file system to HDFS
       fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
               new Path("/user/TestDir/"));

but it's throwing up this error:

java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration

When this code is run independently, using a single jar deployed in the
hadoop bin, it works fine.
