Posted to common-user@hadoop.apache.org by Steve Loughran <st...@hortonworks.com> on 2012/09/01 10:08:03 UTC

Re: Integrating hadoop with java UI application deployed on tomcat

well, it's worked for me in the past outside Hadoop itself:

http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup


   1. Turn logging up to DEBUG
   2. Make sure that the filesystem you've just loaded is what you expect,
   by logging its value. It may turn out to be file:///, because the normal
   Hadoop core-site.xml isn't being picked up
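The second check above can be sketched as a tiny standalone class. This is only a sketch: it assumes the Hadoop client jars are on the classpath, and the `F:/hadoop-0.22.0/conf/core-site.xml` path is the one quoted later in this thread. One detail worth knowing: `Configuration.addResource(String)` treats its argument as a classpath resource name and (in the default quiet mode) silently skips it if not found, so an absolute file path should be wrapped in a `Path`:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FsCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // addResource(String) looks the name up on the classpath and, in quiet
        // mode, ignores it if absent; use the Path overload for a file path.
        conf.addResource(new Path("F:/hadoop-0.22.0/conf/core-site.xml"));
        FileSystem fs = FileSystem.get(conf);
        // LocalFileSystem / file:/// here means the site config was not picked up.
        System.out.println(fs.getClass().getName() + " -> " + fs.getUri());
    }
}
```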



>
> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com> wrote:
>
>> but the problem is that my code runs with only the warning, and the file
>> is not copied to HDFS; I am trying to copy a file from local disk to
>> HDFS
>>
>>     Configuration hadoopConf = new Configuration();
>>     // get the default associated file system
>>     FileSystem fileSystem = FileSystem.get(hadoopConf);
>>     // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>     // copy from the local file system to HDFS
>>     fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>             new Path("/user/TestDir/"));
>>
>>
>
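To confirm whether a copy like the one quoted above actually reached HDFS, the copy can be followed by an explicit check. A sketch only, assuming the Hadoop client jars and a reachable cluster; the paths are taken from the quoted code:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class VerifiedCopy {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        fs.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
                new Path("/user/TestDir/"));
        // getUri() shows which filesystem was really used; exists() shows
        // whether the file landed in the target directory.
        System.out.println("filesystem = " + fs.getUri());
        System.out.println("copied = "
                + fs.exists(new Path("/user/TestDir/GANI.jpg")));
    }
}
```

If this prints a file:/// URI, the copy went to the local disk (which would explain a file appearing under F:\user), not to HDFS.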

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Any solution, guys? Badly stuck on this.

On Tue, Sep 4, 2012 at 4:28 PM, Visioner Sadak <vi...@gmail.com> wrote:

> Thanks Bejoy. Actually my Hadoop is also on Windows (I have installed it
> in pseudo-distributed mode for testing); it's not a remote cluster.
>
>
> On Tue, Sep 4, 2012 at 3:38 PM, Bejoy KS <be...@gmail.com> wrote:
>
>> Hi
>>
>> You are running Tomcat on a Windows machine and trying to connect to a
>> remote Hadoop cluster from there. Your core-site.xml has
>>
>> <name>fs.default.name</name>
>> <value>hdfs://localhost:9000</value>
>>
>> But it is localhost here. (I assume you are not running Hadoop on this
>> Windows environment just for testing.)
>>
>> You need to have the exact configuration files and hadoop jars from the
>> cluster machines on this tomcat environment as well. I mean on the
>> classpath of your application.
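For illustration, the core-site.xml entry for a genuinely remote cluster would name the namenode host instead of localhost. `namenode-host` below is a placeholder, not a value from this thread:

```xml
<property>
  <name>fs.default.name</name>
  <value>hdfs://namenode-host:9000</value>
</property>
```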
>> Regards
>> Bejoy KS
>>
>> Sent from handheld, please excuse typos.
>> ------------------------------
>> *From: *Visioner Sadak <vi...@gmail.com>
>> *Date: *Tue, 4 Sep 2012 15:31:25 +0530
>> *To: *<us...@hadoop.apache.org>
>> *ReplyTo: *user@hadoop.apache.org
>>  *Subject: *Re: Integrating hadoop with java UI application deployed on
>> tomcat
>>
>> I am also getting one more error:
>>
>> org.apache.hadoop.ipc.RemoteException: Server IPC version 5 cannot
>> communicate with client version 4
>>
>> On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <vi...@gmail.com> wrote:
>>
>>> Thanks Shobha, I tried adding the conf folder to Tomcat's classpath;
>>> still getting the same error:
>>>
>>>
>>> Call to localhost/127.0.0.1:9000 failed on local exception:
>>> java.io.IOException: An established connection was aborted by the software
>>> in your host machine
>>>
>>>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
>>> Shobha.Mahadevappa@nttdata.com> wrote:
>>>
>>>>  Hi,
>>>>
>>>> Try adding the hadoop/conf directory to TOMCAT's classpath.
>>>>
>>>> Ex:
>>>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
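Concretely, this can go in a small `bin/setenv.sh` fragment on the Tomcat side. A sketch using the path from the example above; depending on the Tomcat version, the conf directory may instead need to go on the webapp's own classpath (e.g. under WEB-INF/classes):

```shell
# Prepend the Hadoop conf directory so core-site.xml is found before
# any defaults bundled elsewhere on the classpath.
HADOOP_CONF_DIR=/usr/local/Apps/hadoop-0.20.203.0/conf
CLASSPATH="$HADOOP_CONF_DIR${CLASSPATH:+:$CLASSPATH}"
export CLASSPATH
echo "$CLASSPATH"
```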
>>>>
>>>> Regards,
>>>> Shobha M
>>>>
>>>> *From:* Visioner Sadak [mailto:visioner.sadak@gmail.com]
>>>> *Sent:* 03 September 2012 PM 04:01
>>>> *To:* user@hadoop.apache.org
>>>>
>>>> *Subject:* Re: Integrating hadoop with java UI application deployed on
>>>> tomcat
>>>>
>>>>
>>>> Thanks Steve. There is nothing in the logs and no exceptions either. I
>>>> found that a file is created under my F:\user directory with the right
>>>> name, but it is not visible in my Hadoop "browse filesystem" view. I
>>>> also added the config using the method below:
>>>>
>>>> hadoopConf.addResource(
>>>>     "F:/hadoop-0.22.0/conf/core-site.xml");
>>>>
>>>> When running through the WAR and printing out the filesystem, I get
>>>> org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>>
>>>> When running an independent jar within Hadoop, I get
>>>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>>
>>>> When running an independent jar, I am able to do uploads.
>>>>
>>>> Just wanted to know: will I have to add something to Tomcat's
>>>> classpath, or is there some other core-site.xml configuration that I am
>>>> missing? Thanks for your help.
>>>>
>>>>
>>>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>>>> wrote:
>>>>
>>>> well, it's worked for me in the past outside Hadoop itself:
>>>>
>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>>
>>>>    1. Turn logging up to DEBUG
>>>>    2. Make sure that the filesystem you've just loaded is what you
>>>>    expect, by logging its value. It may turn out to be file:///,
>>>>    because the normal Hadoop core-site.xml isn't being picked up
>>>>
>>>>
>>>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <
>>>> visioner.sadak@gmail.com> wrote:
>>>>
>>>> but the problem is that my code runs with only the warning, and the
>>>> file is not copied to HDFS; I am trying to copy a file from local disk
>>>> to HDFS
>>>>
>>>>     Configuration hadoopConf = new Configuration();
>>>>     // get the default associated file system
>>>>     FileSystem fileSystem = FileSystem.get(hadoopConf);
>>>>     // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>>>     // copy from the local file system to HDFS
>>>>     fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>>>             new Path("/user/TestDir/"));
>>>>
>>>>
>>>>
>>>> ______________________________________________________________________
>>>> Disclaimer:This email and any attachments are sent in strictest
>>>> confidence for the sole use of the addressee and may contain legally
>>>> privileged, confidential, and proprietary data. If you are not the intended
>>>> recipient, please advise the sender by replying promptly to this email and
>>>> then delete and destroy this email and any attachments without any further
>>>> use, copying or forwarding
>>>>
>>>
>>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks Bejoy. Actually my Hadoop is also on Windows (I have installed it in
pseudo-distributed mode for testing); it's not a remote cluster.

On Tue, Sep 4, 2012 at 3:38 PM, Bejoy KS <be...@gmail.com> wrote:

> Hi
>
> You are running Tomcat on a Windows machine and trying to connect to a
> remote Hadoop cluster from there. Your core-site.xml has
>
> <name>fs.default.name</name>
> <value>hdfs://localhost:9000</value>
>
> But it is localhost here. (I assume you are not running Hadoop on this
> Windows environment just for testing.)
>
> You need to have the exact configuration files and hadoop jars from the
> cluster machines on this tomcat environment as well. I mean on the
> classpath of your application.
> Regards
> Bejoy KS
>
> Sent from handheld, please excuse typos.
> ------------------------------
> *From: *Visioner Sadak <vi...@gmail.com>
> *Date: *Tue, 4 Sep 2012 15:31:25 +0530
> *To: *<us...@hadoop.apache.org>
> *ReplyTo: *user@hadoop.apache.org
>  *Subject: *Re: Integrating hadoop with java UI application deployed on
> tomcat
>
> I am also getting one more error:
>
> org.apache.hadoop.ipc.RemoteException: Server IPC version 5 cannot
> communicate with client version 4
>
> On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <vi...@gmail.com> wrote:
>
>> Thanks Shobha, I tried adding the conf folder to Tomcat's classpath;
>> still getting the same error:
>>
>>
>> Call to localhost/127.0.0.1:9000 failed on local exception:
>> java.io.IOException: An established connection was aborted by the software
>> in your host machine
>>
>>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
>> Shobha.Mahadevappa@nttdata.com> wrote:
>>
>>>  Hi,
>>>
>>> Try adding the hadoop/conf directory to TOMCAT's classpath.
>>>
>>> Ex:
>>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>>>
>>> Regards,
>>> Shobha M
>>>
>>> *From:* Visioner Sadak [mailto:visioner.sadak@gmail.com]
>>> *Sent:* 03 September 2012 PM 04:01
>>> *To:* user@hadoop.apache.org
>>>
>>> *Subject:* Re: Integrating hadoop with java UI application deployed on
>>> tomcat
>>>
>>>
>>> Thanks Steve. There is nothing in the logs and no exceptions either. I
>>> found that a file is created under my F:\user directory with the right
>>> name, but it is not visible in my Hadoop "browse filesystem" view. I
>>> also added the config using the method below:
>>>
>>> hadoopConf.addResource(
>>>     "F:/hadoop-0.22.0/conf/core-site.xml");
>>>
>>> When running through the WAR and printing out the filesystem, I get
>>> org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>
>>> When running an independent jar within Hadoop, I get
>>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>
>>> When running an independent jar, I am able to do uploads.
>>>
>>> Just wanted to know: will I have to add something to Tomcat's classpath,
>>> or is there some other core-site.xml configuration that I am missing?
>>> Thanks for your help.
>>>
>>>
>>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>>> wrote:
>>>
>>> well, it's worked for me in the past outside Hadoop itself:
>>>
>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>
>>>    1. Turn logging up to DEBUG
>>>    2. Make sure that the filesystem you've just loaded is what you
>>>    expect, by logging its value. It may turn out to be file:///,
>>>    because the normal Hadoop core-site.xml isn't being picked up
>>>
>>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <
>>> visioner.sadak@gmail.com> wrote:
>>>
>>> but the problem is that my code runs with only the warning, and the file
>>> is not copied to HDFS; I am trying to copy a file from local disk to
>>> HDFS
>>>
>>>     Configuration hadoopConf = new Configuration();
>>>     // get the default associated file system
>>>     FileSystem fileSystem = FileSystem.get(hadoopConf);
>>>     // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>>     // copy from the local file system to HDFS
>>>     fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>>             new Path("/user/TestDir/"));
>>>
>>>
>>>
>>>
>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks bejoy, actually my hadoop is also on windows(i have installed it in
psuedo-distributed mode for testing) its not a remote cluster....

On Tue, Sep 4, 2012 at 3:38 PM, Bejoy KS <be...@gmail.com> wrote:

> **
> Hi
>
> You are running tomact on a windows machine and trying to connect to a
> remote hadoop cluster from there. Your core site has
>
> <name>
> fs.default.name</name>
> <value>hdfs://localhost:9000</value>
>
> But It is localhost here.( I assume you are not running hadoop on this
> windows environment for some testing)
>
> You need to have the exact configuration files and hadoop jars from the
> cluster machines on this tomcat environment as well. I mean on the
> classpath of your application.
> Regards
> Bejoy KS
>
> Sent from handheld, please excuse typos.
> ------------------------------
> *From: *Visioner Sadak <vi...@gmail.com>
> *Date: *Tue, 4 Sep 2012 15:31:25 +0530
> *To: *<us...@hadoop.apache.org>
> *ReplyTo: *user@hadoop.apache.org
>  *Subject: *Re: Integrating hadoop with java UI application deployed on
> tomcat
>
> also getting one more error
>
> *
>
> org.apache.hadoop.ipc.RemoteException
> *: Server IPC version 5 cannot communicate with client version 4
>
> On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <vi...@gmail.com>wrote:
>
>> Thanks shobha tried adding conf folder to tomcats classpath  still
>> getting same error
>>
>>
>> Call to localhost/127.0.0.1:9000 failed on local exception:
>> java.io.IOException: An established connection was aborted by the software
>> in your host machine
>>
>>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
>> Shobha.Mahadevappa@nttdata.com> wrote:
>>
>>> Hi,
>>>
>>> Try adding the hadoop/conf directory to TOMCAT's classpath.
>>>
>>> Ex:
>>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>>>
>>> Regards,
>>> *Shobha M*
>>>
>>> *From:* Visioner Sadak [mailto:visioner.sadak@gmail.com]
>>> *Sent:* 03 September 2012 PM 04:01
>>> *To:* user@hadoop.apache.org
>>> *Subject:* Re: Integrating hadoop with java UI application deployed on
>>> tomcat
>>>
>>> Thanks Steve, there's nothing in the logs and no exceptions either. I found
>>> that a file is created in my F:\user with the directory name, but it's not
>>> visible inside my hadoop browse-filesystem directories. I also added the
>>> config by using the method below:
>>>
>>> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>>
>>> When running through the WAR and printing out the filesystem I get
>>> org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>
>>> When running an independent jar within hadoop I get
>>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>
>>> When running an independent jar I am able to do uploads....
>>>
>>> Just wanted to know: will I have to add something to Tomcat's classpath,
>>> or is there some other core-site.xml configuration that I am
>>> missing? Thanks for your help.....
>>>
>>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>>> wrote:
>>>
>>> well, it's worked for me in the past outside Hadoop itself:
>>>
>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>
>>>    1. Turn logging up to DEBUG
>>>    2. Make sure that the filesystem you've just loaded is what you
>>>    expect, by logging its value. It may turn out to be file:///,
>>>    because the normal Hadoop site-config.xml isn't being picked up
>>>
>>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <
>>> visioner.sadak@gmail.com> wrote:
>>>
>>> but the problem is that my code gets executed with the warning, but the file
>>> is not copied to hdfs; actually I am trying to copy a file from local to
>>> hdfs:
>>>
>>>        Configuration hadoopConf = new Configuration();
>>>        // get the default associated file system
>>>        FileSystem fileSystem = FileSystem.get(hadoopConf);
>>>        // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>>        // copy from the local file system (lfs) to hdfs
>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>>                new Path("/user/TestDir/"));
>>>
>>> ______________________________________________________________________
>>> Disclaimer: This email and any attachments are sent in strictest
>>> confidence for the sole use of the addressee and may contain legally
>>> privileged, confidential, and proprietary data. If you are not the intended
>>> recipient, please advise the sender by replying promptly to this email and
>>> then delete and destroy this email and any attachments without any further
>>> use, copying or forwarding.
>>>
>>
>>
>

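Since the thread keeps coming back to whether core-site.xml is actually being picked up, here is a minimal sketch of the sanity check Steve suggests. It deliberately uses only the plain JDK (no Hadoop jars, so it can run anywhere, including inside a servlet): it parses a core-site.xml and reports its fs.default.name. The class name and the example path are illustrative assumptions, not from the original posts. If this prints file:/// for the file your WAR is loading, FileSystem.get() will resolve to LocalFileSystem exactly as described above.

```java
import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Hypothetical helper: reports which default filesystem a given
// core-site.xml declares, so you can tell HDFS from LocalFileSystem
// before the webapp ever calls FileSystem.get().
public class CheckDefaultFs {

    // Returns the value of fs.default.name, or Hadoop's built-in
    // default "file:///" when the property is absent.
    static String defaultFs(File coreSite) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(coreSite);
        NodeList props = doc.getElementsByTagName("property");
        for (int i = 0; i < props.getLength(); i++) {
            Element p = (Element) props.item(i);
            String name = p.getElementsByTagName("name")
                    .item(0).getTextContent().trim();
            if ("fs.default.name".equals(name)) {
                return p.getElementsByTagName("value")
                        .item(0).getTextContent().trim();
            }
        }
        return "file:///";
    }

    public static void main(String[] args) throws Exception {
        // e.g. java CheckDefaultFs F:/hadoop-0.22.0/conf/core-site.xml
        String uri = defaultFs(new File(args[0]));
        System.out.println("fs.default.name = " + uri);
        if (!uri.startsWith("hdfs://")) {
            System.out.println("WARNING: clients loading this config "
                    + "will use the local filesystem, not HDFS");
        }
    }
}
```

Running it against the exact file passed to hadoopConf.addResource(...) tells you whether the "file is created in F:\user" symptom is the LocalFileSystem fallback rather than a failed HDFS write.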
Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Bejoy KS <be...@gmail.com>.
Hi

You are running Tomcat on a Windows machine and trying to connect to a remote hadoop cluster from there. Your core-site has

<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>

But it is localhost here. (I assume you are not running hadoop on this Windows environment for some testing.)

You need to have the exact configuration files and hadoop jars from the cluster machines on this Tomcat environment as well. I mean on the classpath of your application.

Regards
Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: Visioner Sadak <vi...@gmail.com>
Date: Tue, 4 Sep 2012 15:31:25 
To: <us...@hadoop.apache.org>
Reply-To: user@hadoop.apache.org
Subject: Re: Integrating hadoop with java UI application deployed on tomcat

also getting one more error

*org.apache.hadoop.ipc.RemoteException*: Server IPC version 5 cannot
communicate with client version 4

On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <vi...@gmail.com> wrote:

> Thanks Shobha, tried adding the conf folder to Tomcat's classpath, still
> getting the same error:
>
> Call to localhost/127.0.0.1:9000 failed on local exception:
> java.io.IOException: An established connection was aborted by the software
> in your host machine
>
> On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
> Shobha.Mahadevappa@nttdata.com> wrote:
>
>> Hi,
>>
>> Try adding the hadoop/conf directory to TOMCAT's classpath.
>>
>> Ex:
>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>>
>> Regards,
>> *Shobha M*
>>
>> *From:* Visioner Sadak [mailto:visioner.sadak@gmail.com]
>> *Sent:* 03 September 2012 PM 04:01
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: Integrating hadoop with java UI application deployed on
>> tomcat
>>
>> Thanks Steve, there's nothing in the logs and no exceptions either. I found
>> that a file is created in my F:\user with the directory name, but it's not
>> visible inside my hadoop browse-filesystem directories. I also added the
>> config by using the method below:
>>
>> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>
>> When running through the WAR and printing out the filesystem I get
>> org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>
>> When running an independent jar within hadoop I get
>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>
>> When running an independent jar I am able to do uploads....
>>
>> Just wanted to know: will I have to add something to Tomcat's classpath,
>> or is there some other core-site.xml configuration that I am
>> missing? Thanks for your help.....
>>
>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>> wrote:
>>
>> well, it's worked for me in the past outside Hadoop itself:
>>
>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>
>>    1. Turn logging up to DEBUG
>>    2. Make sure that the filesystem you've just loaded is what you
>>    expect, by logging its value. It may turn out to be file:///, because
>>    the normal Hadoop site-config.xml isn't being picked up
>>
>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com>
>> wrote:
>>
>> but the problem is that my code gets executed with the warning, but the file
>> is not copied to hdfs; actually I am trying to copy a file from local to
>> hdfs:
>>
>>        Configuration hadoopConf = new Configuration();
>>        // get the default associated file system
>>        FileSystem fileSystem = FileSystem.get(hadoopConf);
>>        // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>        // copy from the local file system (lfs) to hdfs
>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>                new Path("/user/TestDir/"));
>>
>> ______________________________________________________________________
>> Disclaimer: This email and any attachments are sent in strictest
>> confidence for the sole use of the addressee and may contain legally
>> privileged, confidential, and proprietary data. If you are not the intended
>> recipient, please advise the sender by replying promptly to this email and
>> then delete and destroy this email and any attachments without any further
>> use, copying or forwarding.
>>
>
>


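Bejoy's point about matching jars is what the "Server IPC version 5 cannot communicate with client version 4" error in the quoted message is about: the hadoop jar on Tomcat's classpath comes from a different release than the cluster. One hedged way to check, sketched below with plain JDK code (the Hadoop class name in the comment is just the obvious candidate to pass in; the helper itself is generic and nothing here is from the original posts), is to print which jar a class was actually loaded from:

```java
// Generic helper: print the jar (or directory) a class was loaded from.
// In the webapp you would call it with
// Class.forName("org.apache.hadoop.conf.Configuration") to spot a stale
// hadoop-core jar bundled under WEB-INF/lib.
public class JarLocator {

    static String locationOf(Class<?> c) {
        java.security.CodeSource src =
                c.getProtectionDomain().getCodeSource();
        // Bootstrap classes (java.lang.*, etc.) may have no code source.
        return (src == null || src.getLocation() == null)
                ? "(bootstrap classpath)"
                : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // e.g. java JarLocator org.apache.hadoop.conf.Configuration
        String name = args.length > 0 ? args[0] : "JarLocator";
        Class<?> c = Class.forName(name);
        System.out.println(c.getName() + " loaded from " + locationOf(c));
    }
}
```

If the printed path is a jar under WEB-INF/lib whose version differs from the cluster's, replacing it with the cluster's own jars, as Bejoy advises, should clear the IPC version mismatch.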

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
also getting one more error

*

org.apache.hadoop.ipc.RemoteException*: Server IPC version 5 cannot
communicate with client version 4


On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <vi...@gmail.com>wrote:

> Thanks shobha tried adding conf folder to tomcats classpath  still getting
> same error
>
>
> Call to localhost/127.0.0.1:9000 failed on local exception:
> java.io.IOException: An established connection was aborted by the software
> in your host machine
>
>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
> Shobha.Mahadevappa@nttdata.com> wrote:
>
>>  Hi,
>>
>> Try adding the hadoop/conf directory in the TOMCAT's classpath
>>
>> Ex :
>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>>
>> Regards,
>> Shobha M
>>
>> From: Visioner Sadak [mailto:visioner.sadak@gmail.com]
>> Sent: 03 September 2012 PM 04:01
>> To: user@hadoop.apache.org
>>
>> Subject: Re: Integrating hadoop with java UI application deployed on
>> tomcat
>>
>> Thanks Steve, there is nothing in the logs and no exceptions either. I
>> found that some file is created in my F:\user with the directory name,
>> but it is not visible inside my hadoop browse-filesystem directories. I
>> also added the config by using the method below:
>>
>> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>
>> When running through the WAR and printing out the filesystem I get
>> org.apache.hadoop.fs.LocalFileSystem@9cd8db
>> When running an independent jar within hadoop I get
>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>> When running an independent jar I am able to do uploads....
>>
>> Just wanted to know whether I have to add something to Tomcat's
>> classpath, or whether there is any other core-site.xml configuration
>> that I am missing. Thanks for your help.
>>
>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>> wrote:
>>
>> well, it's worked for me in the past outside Hadoop itself:
>>
>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>
>>    1. Turn logging up to DEBUG
>>    2. Make sure that the filesystem you've just loaded is what you
>>    expect, by logging its value. It may turn out to be file:///, because
>>    the normal Hadoop site-config.xml isn't being picked up
>>
>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com>
>> wrote:
>>
>> but the problem is that my code gets executed with the warning, but the
>> file is not copied to HDFS; actually I am trying to copy a file from
>> local to HDFS:
>>
>>        // Load the default configuration from the classpath
>>        Configuration hadoopConf = new Configuration();
>>        // Get the file system associated with the default configuration
>>        FileSystem fileSystem = FileSystem.get(hadoopConf);
>>        // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>        // Copy from the local file system to HDFS
>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>                new Path("/user/TestDir/"));
>>
>
>


Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks Shobha, I tried adding the conf folder to Tomcat's classpath but am
still getting the same error:


Call to localhost/127.0.0.1:9000 failed on local exception:
java.io.IOException: An established connection was aborted by the software
in your host machine

On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
Shobha.Mahadevappa@nttdata.com> wrote:

>  Hi,
>
> Try adding the hadoop/conf directory in the TOMCAT's classpath
>
> Ex :
> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
>
> Regards,
> Shobha M
>
> From: Visioner Sadak [mailto:visioner.sadak@gmail.com]
> Sent: 03 September 2012 PM 04:01
> To: user@hadoop.apache.org
>
> Subject: Re: Integrating hadoop with java UI application deployed on
> tomcat
>
> Thanks Steve, there is nothing in the logs and no exceptions either. I
> found that some file is created in my F:\user with the directory name,
> but it is not visible inside my hadoop browse-filesystem directories. I
> also added the config by using the method below:
>
> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>
> When running through the WAR and printing out the filesystem I get
> org.apache.hadoop.fs.LocalFileSystem@9cd8db
> When running an independent jar within hadoop I get
> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
> When running an independent jar I am able to do uploads....
>
> Just wanted to know whether I have to add something to Tomcat's
> classpath, or whether there is any other core-site.xml configuration
> that I am missing. Thanks for your help.
>
> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
> wrote:
>
> well, it's worked for me in the past outside Hadoop itself:
>
> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>
>    1. Turn logging up to DEBUG
>    2. Make sure that the filesystem you've just loaded is what you
>    expect, by logging its value. It may turn out to be file:///, because
>    the normal Hadoop site-config.xml isn't being picked up
>
> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com>
> wrote:
>
> but the problem is that my code gets executed with the warning, but the
> file is not copied to HDFS; actually I am trying to copy a file from
> local to HDFS:
>
>        // Load the default configuration from the classpath
>        Configuration hadoopConf = new Configuration();
>        // Get the file system associated with the default configuration
>        FileSystem fileSystem = FileSystem.get(hadoopConf);
>        // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>        // Copy from the local file system to HDFS
>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>                new Path("/user/TestDir/"));
>


RE: Integrating hadoop with java UI application deployed on tomcat

Posted by "Mahadevappa, Shobha" <Sh...@nttdata.com>.
Hi,
Try adding the hadoop/conf directory in the TOMCAT's classpath

Ex : CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
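On Linux, the usual place to put that line is Tomcat's bin/setenv.sh, which catalina.sh sources automatically if present (a config sketch; the paths are taken from the example above and are assumptions, adjust them to your install):

```shell
# $CATALINA_BASE/bin/setenv.sh
# Paths below are hypothetical examples -- point them at your real conf dirs.
CLASSPATH="$CLASSPATH:/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf"
export CLASSPATH
```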



Regards,
Shobha M

From: Visioner Sadak [mailto:visioner.sadak@gmail.com]
Sent: 03 September 2012 PM 04:01
To: user@hadoop.apache.org
Subject: Re: Integrating hadoop with java UI application deployed on tomcat

Thanks Steve, there is nothing in the logs and no exceptions either. I found that some file is created in my F:\user with the directory name, but it is not visible inside my hadoop browse-filesystem directories. I also added the config by using the method below:

hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");

When running through the WAR and printing out the filesystem I get org.apache.hadoop.fs.LocalFileSystem@9cd8db
When running an independent jar within hadoop I get DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
When running an independent jar I am able to do uploads....

Just wanted to know whether I have to add something to Tomcat's classpath, or whether there is any other core-site.xml configuration that I am missing. Thanks for your help.


On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>> wrote:

well, it's worked for me in the past outside Hadoop itself:

http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup


  1.  Turn logging up to DEBUG
  2.  Make sure that the filesystem you've just loaded is what you expect, by logging its value. It may turn out to be file:///, because the normal Hadoop site-config.xml isn't being picked up
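For step 1, when the client logs through log4j (as Hadoop 0.2x clients do), a fragment in the webapp's log4j.properties is usually enough (a config sketch; assumes log4j is the logging backend in use):

```
# Turn Hadoop client logging up to DEBUG so FileSystem resolution is visible.
log4j.logger.org.apache.hadoop=DEBUG
```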


On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com>> wrote:
but the problem is that my code gets executed with the warning, but the file is not copied to HDFS; actually I am trying to copy a file from local to HDFS:

   // Load the default configuration from the classpath
   Configuration hadoopConf = new Configuration();
   // Get the file system associated with the default configuration
   FileSystem fileSystem = FileSystem.get(hadoopConf);
   // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
   // Copy from the local file system to HDFS
   fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"), new Path("/user/TestDir/"));





Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks Senthil, the namenode is up and running, and in core-site.xml I have:

<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>

Should I change my IP, or any other config?

On Mon, Sep 3, 2012 at 10:11 PM, Senthil Kumar <
senthilkumar@thoughtworks.com> wrote:

> The error says call to 127.0.0.1:9000 fails. It is failing when it tries
> to contact the namenode (9000 is the default namenode port) configured in
> core-site.xml. You should also check whether the namenode is configured
> correctly and also whether the namenode is up.
>
>
> On Mon, Sep 3, 2012 at 7:43 PM, Visioner Sadak <vi...@gmail.com>wrote:
>
>> Thanks Senthil, I tried with the new Path and am getting this error. Do I
>> have to do any SSL setting on Tomcat as well?
>>
>> java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local
>> exception: java.io.IOException: An established connection was aborted by
>> the software in your host machine
>>
>> at org.apache.hadoop.ipc.Client.wrapException(Client.java:1107)
>>
>> at org.apache.hadoop.ipc.Client.call(Client.java:1075)
>>
>> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>
>>  On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <
>> senthilkumar@thoughtworks.com> wrote:
>>
>>> Try using hadoopConf.addResource(*new
>>> Path("F:/hadoop-0.22.0/conf/core-site.xml")*); instead
>>> of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>>
>>> or you should add your core-site.xml to a location which is in your
>>> class path(WEB-INF\classes or WEB-INF\lib in case of a web application)
>>>
>>>
>>> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <visioner.sadak@gmail.com
>>> > wrote:
>>>
>>>> Thanks Hemanth, I tried adding the conf folder and the extracted root
>>>> folder (I was unable to add only the XML file), but still the same
>>>> problem. Thanks for the help.
>>>>
>>>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> If you are getting the LocalFileSystem, you could try by putting
>>>>> core-site.xml in a directory that's there in the classpath for the
>>>>> Tomcat App (or include such a path in the classpath, if that's
>>>>> possible)
>>>>>
>>>>> Thanks
>>>>> hemanth
>>>>>
>>>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <
>>>>> visioner.sadak@gmail.com> wrote:
>>>>> > Thanks Steve, there's nothing in the logs and no exceptions as well.
>>>>> > I found that some file is created in my F:\user with the directory
>>>>> > name, but it's not visible inside my Hadoop browse-filesystem
>>>>> > directories. I also added the config by using the below method:
>>>>> > hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>>>> > When running through the WAR, printing out the filesystem I get
>>>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>>> > When running an independent jar within Hadoop I get
>>>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>>> > When running an independent jar I am able to do uploads.
>>>>> >
>>>>> > Just wanted to know: will I have to add something to the classpath of
>>>>> > Tomcat, or is there any other configuration of core-site.xml that I am
>>>>> > missing? Thanks for your help.
>>>>> >
>>>>> >
>>>>> >
>>>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <
>>>>> stevel@hortonworks.com>
>>>>> > wrote:
>>>>> >>
>>>>> >>
>>>>> >> well, it's worked for me in the past outside Hadoop itself:
>>>>> >>
>>>>> >>
>>>>> >>
>>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>>> >>
>>>>> >> Turn logging up to DEBUG
>>>>> >> Make sure that the filesystem you've just loaded is what you
>>>>> expect, by
>>>>> >> logging its value. It may turn out to be file:///, because the
>>>>> normal Hadoop
>>>>> >> site-config.xml isn't being picked up
>>>>> >>
>>>>> >>
>>>>> >>>
>>>>> >>>
>>>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>>>> >>> <vi...@gmail.com> wrote:
>>>>> >>>>
>>>>> >>>> but the problem is that my  code gets executed with the warning
>>>>> but file
>>>>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>>>>> local to
>>>>> >>>> hdfs
>>>>> >>>>
>>>>> >>>>    Configuration hadoopConf=new Configuration();
>>>>> >>>>         //get the default associated file system
>>>>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>>> >>>>        // HarFileSystem harFileSystem= new
>>>>> HarFileSystem(fileSystem);
>>>>> >>>>         //copy from lfs to hdfs
>>>>> >>>>        fileSystem.copyFromLocalFile(new
>>>>> Path("E:/test/GANI.jpg"),new
>>>>> >>>> Path("/user/TestDir/"));
>>>>> >>>>
>>>>> >>
>>>>> >>
>>>>> >
>>>>>
>>>>
>>>>
>>>
>>
>


Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Senthil Kumar <se...@thoughtworks.com>.
The error says call to 127.0.0.1:9000 fails. It is failing when it tries to
contact the namenode (9000 is the default namenode port) configured in
core-site.xml. You should also check whether the namenode is configured
correctly and also whether the namenode is up.


On Mon, Sep 3, 2012 at 7:43 PM, Visioner Sadak <vi...@gmail.com>wrote:

> Thanks Senthil, I tried with the new Path and am getting this error. Do I
> have to do any SSL setting on Tomcat as well?
>
> java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local
> exception: java.io.IOException: An established connection was aborted by
> the software in your host machine
>
> at org.apache.hadoop.ipc.Client.wrapException(Client.java:1107)
>
> at org.apache.hadoop.ipc.Client.call(Client.java:1075)
>
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>
>
> On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <
> senthilkumar@thoughtworks.com> wrote:
>
>> Try using hadoopConf.addResource(*new
>> Path("F:/hadoop-0.22.0/conf/core-site.xml")*); instead
>> of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>
>> or you should add your core-site.xml to a location which is in your class
>> path(WEB-INF\classes or WEB-INF\lib in case of a web application)
>>
>>
>> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <vi...@gmail.com>wrote:
>>
>>> Thanks Hemanth, I tried adding the conf folder and the extracted root
>>> folder (I was unable to add only the XML file), but still the same
>>> problem. Thanks for the help.
>>>
>>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>>>
>>>> Hi,
>>>>
>>>> If you are getting the LocalFileSystem, you could try by putting
>>>> core-site.xml in a directory that's there in the classpath for the
>>>> Tomcat App (or include such a path in the classpath, if that's
>>>> possible)
>>>>
>>>> Thanks
>>>> hemanth
>>>>
>>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <
>>>> visioner.sadak@gmail.com> wrote:
>>>> > Thanks Steve, there's nothing in the logs and no exceptions as well.
>>>> > I found that some file is created in my F:\user with the directory
>>>> > name, but it's not visible inside my Hadoop browse-filesystem
>>>> > directories. I also added the config by using the below method:
>>>> > hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>>> > When running through the WAR, printing out the filesystem I get
>>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>> > When running an independent jar within Hadoop I get
>>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>> > When running an independent jar I am able to do uploads.
>>>> >
>>>> > Just wanted to know: will I have to add something to the classpath of
>>>> > Tomcat, or is there any other configuration of core-site.xml that I am
>>>> > missing? Thanks for your help.
>>>> >
>>>> >
>>>> >
>>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <
>>>> stevel@hortonworks.com>
>>>> > wrote:
>>>> >>
>>>> >>
>>>> >> well, it's worked for me in the past outside Hadoop itself:
>>>> >>
>>>> >>
>>>> >>
>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>> >>
>>>> >> Turn logging up to DEBUG
>>>> >> Make sure that the filesystem you've just loaded is what you expect,
>>>> by
>>>> >> logging its value. It may turn out to be file:///, because the
>>>> normal Hadoop
>>>> >> site-config.xml isn't being picked up
>>>> >>
>>>> >>
>>>> >>>
>>>> >>>
>>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>>> >>> <vi...@gmail.com> wrote:
>>>> >>>>
>>>> >>>> but the problem is that my  code gets executed with the warning
>>>> but file
>>>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>>>> local to
>>>> >>>> hdfs
>>>> >>>>
>>>> >>>>    Configuration hadoopConf=new Configuration();
>>>> >>>>         //get the default associated file system
>>>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>> >>>>        // HarFileSystem harFileSystem= new
>>>> HarFileSystem(fileSystem);
>>>> >>>>         //copy from lfs to hdfs
>>>> >>>>        fileSystem.copyFromLocalFile(new
>>>> Path("E:/test/GANI.jpg"),new
>>>> >>>> Path("/user/TestDir/"));
>>>> >>>>
>>>> >>
>>>> >>
>>>> >
>>>>
>>>
>>>
>>
>
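[Editor's note: Senthil's suggestion above — passing a `Path` to `addResource` instead of a `String` — can be sketched as a small diagnostic, assuming the Hadoop client jars are available; the configuration path is the poster's example and the class name is hypothetical.]

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsConfigCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // addResource(Path) reads the named file directly; addResource(String)
        // looks the name up on the classpath and, if it is not found there,
        // the configuration silently keeps the built-in defaults.
        conf.addResource(new Path("F:/hadoop-0.22.0/conf/core-site.xml"));

        // If this prints LocalFileSystem rather than DistributedFileSystem,
        // fs.default.name was not picked up and uploads go to the local disk.
        FileSystem fs = FileSystem.get(conf);
        System.out.println("fs.default.name = " + conf.get("fs.default.name"));
        System.out.println("FileSystem impl = " + fs.getClass().getName());
    }
}
```

Running this inside the Tomcat webapp (or placing core-site.xml under WEB-INF/classes, as suggested in the thread) distinguishes a configuration-loading problem from a namenode-connectivity problem.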

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Senthil Kumar <se...@thoughtworks.com>.
The error says call to 127.0.0.1:9000 fails. It is failing when it tries to
contact the namenode (9000 is the default namenode port) configured in
core-site.xml. You should also check whether the namenode is configured
correctly and also whether the namenode is up.


On Mon, Sep 3, 2012 at 7:43 PM, Visioner Sadak <vi...@gmail.com>wrote:

> Thanks Senthil i tried on trying with new path getting this error do i
> have to do any ssl setting on tomcat as well
>
> *
>
> java.io.IOException
> *: Call to localhost/127.0.0.1:9000 failed on local exception: *
> java.io.IOException*: An established connection was aborted by the
> software in your host machine
>
> at org.apache.hadoop.ipc.Client.wrapException(*Client.java:1107*)
>
> at org.apache.hadoop.ipc.Client.call(*Client.java:1075*)
>
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(*RPC.java:225*)
>
>
> On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <
> senthilkumar@thoughtworks.com> wrote:
>
>> Try using hadoopConf.addResource(*new
>> Path("F:/hadoop-0.22.0/conf/core-site.xml")*); instead
>> of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>
>> or you should add your core-site.xml to a location which is in your class
>> path(WEB-INF\classes or WEB-INF\lib in case of a web application)
>>
>>
>> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <vi...@gmail.com>wrote:
>>
>>> thanks hemanth i tried adding ext folder conf and extn root folder
>>> unable to add xml only but still same problem thanks for the help
>>>
>>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>>>
>>>> Hi,
>>>>
>>>> If you are getting the LocalFileSystem, you could try by putting
>>>> core-site.xml in a directory that's there in the classpath for the
>>>> Tomcat App (or include such a path in the classpath, if that's
>>>> possible)
>>>>
>>>> Thanks
>>>> hemanth
>>>>
>>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <
>>>> visioner.sadak@gmail.com> wrote:
>>>> > Thanks steve thers nothing in logs and no exceptions as well i found
>>>> that
>>>> > some file is created in my F:\user with directory name but its not
>>>> visible
>>>> > inside my hadoop browse filesystem directories i also added the
>>>> config by
>>>> > using the below method
>>>> > hadoopConf.addResource(
>>>> > "F:/hadoop-0.22.0/conf/core-site.xml");
>>>> > when running thru WAR printing out the filesystem i m getting
>>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>> > when running an independet jar within hadoop i m getting
>>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>> > when running an independet jar i m able to do uploads....
>>>> >
>>>> > just wanted to know will i have to add something in my classpath of
>>>> tomcat
>>>> > or is there any other configurations of core-site.xml that i am
>>>> missing
>>>> > out..thanks for your help.....
>>>> >
>>>> >
>>>> >
>>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <
>>>> stevel@hortonworks.com>
>>>> > wrote:
>>>> >>
>>>> >>
>>>> >> well, it's worked for me in the past outside Hadoop itself:
>>>> >>
>>>> >>
>>>> >>
>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>> >>
>>>> >> Turn logging up to DEBUG
>>>> >> Make sure that the filesystem you've just loaded is what you expect,
>>>> by
>>>> >> logging its value. It may turn out to be file:///, because the
>>>> normal Hadoop
>>>> >> site-config.xml isn't being picked up
>>>> >>
>>>> >>
>>>> >>>
>>>> >>>
>>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>>> >>> <vi...@gmail.com> wrote:
>>>> >>>>
>>>> >>>> but the problem is that my  code gets executed with the warning
>>>> but file
>>>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>>>> local to
>>>> >>>> hdfs
>>>> >>>>
>>>> >>>>    Configuration hadoopConf=new Configuration();
>>>> >>>>         //get the default associated file system
>>>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>> >>>>        // HarFileSystem harFileSystem= new
>>>> HarFileSystem(fileSystem);
>>>> >>>>         //copy from lfs to hdfs
>>>> >>>>        fileSystem.copyFromLocalFile(new
>>>> Path("E:/test/GANI.jpg"),new
>>>> >>>> Path("/user/TestDir/"));
>>>> >>>>
>>>> >>
>>>> >>
>>>> >
>>>>
>>>
>>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Senthil Kumar <se...@thoughtworks.com>.
The error says call to 127.0.0.1:9000 fails. It is failing when it tries to
contact the namenode (9000 is the default namenode port) configured in
core-site.xml. You should also check whether the namenode is configured
correctly and also whether the namenode is up.


On Mon, Sep 3, 2012 at 7:43 PM, Visioner Sadak <vi...@gmail.com>wrote:

> Thanks Senthil i tried on trying with new path getting this error do i
> have to do any ssl setting on tomcat as well
>
> *
>
> java.io.IOException
> *: Call to localhost/127.0.0.1:9000 failed on local exception: *
> java.io.IOException*: An established connection was aborted by the
> software in your host machine
>
> at org.apache.hadoop.ipc.Client.wrapException(*Client.java:1107*)
>
> at org.apache.hadoop.ipc.Client.call(*Client.java:1075*)
>
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(*RPC.java:225*)
>
>
> On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <
> senthilkumar@thoughtworks.com> wrote:
>
>> Try using hadoopConf.addResource(*new
>> Path("F:/hadoop-0.22.0/conf/core-site.xml")*); instead
>> of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>
>> or you should add your core-site.xml to a location which is in your class
>> path(WEB-INF\classes or WEB-INF\lib in case of a web application)
>>
>>
>> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <vi...@gmail.com>wrote:
>>
>>> thanks hemanth i tried adding ext folder conf and extn root folder
>>> unable to add xml only but still same problem thanks for the help
>>>
>>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>>>
>>>> Hi,
>>>>
>>>> If you are getting the LocalFileSystem, you could try by putting
>>>> core-site.xml in a directory that's there in the classpath for the
>>>> Tomcat App (or include such a path in the classpath, if that's
>>>> possible)
>>>>
>>>> Thanks
>>>> hemanth
>>>>
>>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <
>>>> visioner.sadak@gmail.com> wrote:
>>>> > Thanks steve thers nothing in logs and no exceptions as well i found
>>>> that
>>>> > some file is created in my F:\user with directory name but its not
>>>> visible
>>>> > inside my hadoop browse filesystem directories i also added the
>>>> config by
>>>> > using the below method
>>>> > hadoopConf.addResource(
>>>> > "F:/hadoop-0.22.0/conf/core-site.xml");
>>>> > when running thru WAR printing out the filesystem i m getting
>>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>> > when running an independet jar within hadoop i m getting
>>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>> > when running an independet jar i m able to do uploads....
>>>> >
>>>> > just wanted to know will i have to add something in my classpath of
>>>> tomcat
>>>> > or is there any other configurations of core-site.xml that i am
>>>> missing
>>>> > out..thanks for your help.....
>>>> >
>>>> >
>>>> >
>>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <
>>>> stevel@hortonworks.com>
>>>> > wrote:
>>>> >>
>>>> >>
>>>> >> well, it's worked for me in the past outside Hadoop itself:
>>>> >>
>>>> >>
>>>> >>
>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>> >>
>>>> >> Turn logging up to DEBUG
>>>> >> Make sure that the filesystem you've just loaded is what you expect,
>>>> by
>>>> >> logging its value. It may turn out to be file:///, because the
>>>> normal Hadoop
>>>> >> site-config.xml isn't being picked up
>>>> >>
>>>> >>
>>>> >>>
>>>> >>>
>>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>>> >>> <vi...@gmail.com> wrote:
>>>> >>>>
>>>> >>>> but the problem is that my  code gets executed with the warning
>>>> but file
>>>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>>>> local to
>>>> >>>> hdfs
>>>> >>>>
>>>> >>>>    Configuration hadoopConf=new Configuration();
>>>> >>>>         //get the default associated file system
>>>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>> >>>>        // HarFileSystem harFileSystem= new
>>>> HarFileSystem(fileSystem);
>>>> >>>>         //copy from lfs to hdfs
>>>> >>>>        fileSystem.copyFromLocalFile(new
>>>> Path("E:/test/GANI.jpg"),new
>>>> >>>> Path("/user/TestDir/"));
>>>> >>>>
>>>> >>
>>>> >>
>>>> >
>>>>
>>>
>>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Senthil Kumar <se...@thoughtworks.com>.
The error says the call to 127.0.0.1:9000 fails. It is failing when the
client tries to contact the namenode (9000 is the default namenode port)
configured in core-site.xml. Check that the namenode is configured
correctly and that it is actually up.
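A quick way to check the "namenode is up" part without any Hadoop jars is a plain socket probe of the RPC port. This is only a sketch: the host/port below (localhost:9000) are assumed from the error message, so adjust them to whatever your fs.default.name actually says.

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class NameNodeProbe {
    public static void main(String[] args) {
        // localhost:9000 is assumed from the stack trace above;
        // change it to match fs.default.name in your core-site.xml.
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress("localhost", 9000), 2000);
            System.out.println("namenode port reachable");
        } catch (IOException e) {
            System.out.println("cannot reach namenode port: " + e.getMessage());
        }
    }
}
```

If this prints "cannot reach namenode port", the problem is connectivity or the namenode process itself, not the Tomcat webapp.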


On Mon, Sep 3, 2012 at 7:43 PM, Visioner Sadak <vi...@gmail.com>wrote:

> Thanks Senthil i tried on trying with new path getting this error do i
> have to do any ssl setting on tomcat as well
>
> *
>
> java.io.IOException
> *: Call to localhost/127.0.0.1:9000 failed on local exception: *
> java.io.IOException*: An established connection was aborted by the
> software in your host machine
>
> at org.apache.hadoop.ipc.Client.wrapException(*Client.java:1107*)
>
> at org.apache.hadoop.ipc.Client.call(*Client.java:1075*)
>
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(*RPC.java:225*)
>
>
> On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <
> senthilkumar@thoughtworks.com> wrote:
>
>> Try using hadoopConf.addResource(*new
>> Path("F:/hadoop-0.22.0/conf/core-site.xml")*); instead
>> of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>
>> or you should add your core-site.xml to a location which is in your class
>> path(WEB-INF\classes or WEB-INF\lib in case of a web application)
>>
>>
>> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <vi...@gmail.com>wrote:
>>
>>> thanks hemanth i tried adding ext folder conf and extn root folder
>>> unable to add xml only but still same problem thanks for the help
>>>
>>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>>>
>>>> Hi,
>>>>
>>>> If you are getting the LocalFileSystem, you could try by putting
>>>> core-site.xml in a directory that's there in the classpath for the
>>>> Tomcat App (or include such a path in the classpath, if that's
>>>> possible)
>>>>
>>>> Thanks
>>>> hemanth
>>>>
>>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <
>>>> visioner.sadak@gmail.com> wrote:
>>>> > Thanks steve thers nothing in logs and no exceptions as well i found
>>>> that
>>>> > some file is created in my F:\user with directory name but its not
>>>> visible
>>>> > inside my hadoop browse filesystem directories i also added the
>>>> config by
>>>> > using the below method
>>>> > hadoopConf.addResource(
>>>> > "F:/hadoop-0.22.0/conf/core-site.xml");
>>>> > when running thru WAR printing out the filesystem i m getting
>>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>> > when running an independet jar within hadoop i m getting
>>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>> > when running an independet jar i m able to do uploads....
>>>> >
>>>> > just wanted to know will i have to add something in my classpath of
>>>> tomcat
>>>> > or is there any other configurations of core-site.xml that i am
>>>> missing
>>>> > out..thanks for your help.....
>>>> >
>>>> >
>>>> >
>>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <
>>>> stevel@hortonworks.com>
>>>> > wrote:
>>>> >>
>>>> >>
>>>> >> well, it's worked for me in the past outside Hadoop itself:
>>>> >>
>>>> >>
>>>> >>
>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>> >>
>>>> >> Turn logging up to DEBUG
>>>> >> Make sure that the filesystem you've just loaded is what you expect,
>>>> by
>>>> >> logging its value. It may turn out to be file:///, because the
>>>> normal Hadoop
>>>> >> site-config.xml isn't being picked up
>>>> >>
>>>> >>
>>>> >>>
>>>> >>>
>>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>>> >>> <vi...@gmail.com> wrote:
>>>> >>>>
>>>> >>>> but the problem is that my  code gets executed with the warning
>>>> but file
>>>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>>>> local to
>>>> >>>> hdfs
>>>> >>>>
>>>> >>>>    Configuration hadoopConf=new Configuration();
>>>> >>>>         //get the default associated file system
>>>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>> >>>>        // HarFileSystem harFileSystem= new
>>>> HarFileSystem(fileSystem);
>>>> >>>>         //copy from lfs to hdfs
>>>> >>>>        fileSystem.copyFromLocalFile(new
>>>> Path("E:/test/GANI.jpg"),new
>>>> >>>> Path("/user/TestDir/"));
>>>> >>>>
>>>> >>
>>>> >>
>>>> >
>>>>
>>>
>>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks Senthil, I tried with the new Path but am getting this error. Do I
have to do any SSL setting on Tomcat as well?

java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local
exception: java.io.IOException: An established connection was aborted by
the software in your host machine

at org.apache.hadoop.ipc.Client.wrapException(Client.java:1107)

at org.apache.hadoop.ipc.Client.call(Client.java:1075)

at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)


On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <senthilkumar@thoughtworks.com
> wrote:

> Try using hadoopConf.addResource(*new
> Path("F:/hadoop-0.22.0/conf/core-site.xml")*); instead
> of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>
> or you should add your core-site.xml to a location which is in your class
> path(WEB-INF\classes or WEB-INF\lib in case of a web application)
>
>
> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <vi...@gmail.com>wrote:
>
>> thanks hemanth i tried adding ext folder conf and extn root folder
>> unable to add xml only but still same problem thanks for the help
>>
>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>>
>>> Hi,
>>>
>>> If you are getting the LocalFileSystem, you could try by putting
>>> core-site.xml in a directory that's there in the classpath for the
>>> Tomcat App (or include such a path in the classpath, if that's
>>> possible)
>>>
>>> Thanks
>>> hemanth
>>>
>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <vi...@gmail.com>
>>> wrote:
>>> > Thanks steve thers nothing in logs and no exceptions as well i found
>>> that
>>> > some file is created in my F:\user with directory name but its not
>>> visible
>>> > inside my hadoop browse filesystem directories i also added the config
>>> by
>>> > using the below method
>>> > hadoopConf.addResource(
>>> > "F:/hadoop-0.22.0/conf/core-site.xml");
>>> > when running thru WAR printing out the filesystem i m getting
>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>> > when running an independet jar within hadoop i m getting
>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>> > when running an independet jar i m able to do uploads....
>>> >
>>> > just wanted to know will i have to add something in my classpath of
>>> tomcat
>>> > or is there any other configurations of core-site.xml that i am missing
>>> > out..thanks for your help.....
>>> >
>>> >
>>> >
>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <stevel@hortonworks.com
>>> >
>>> > wrote:
>>> >>
>>> >>
>>> >> well, it's worked for me in the past outside Hadoop itself:
>>> >>
>>> >>
>>> >>
>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>> >>
>>> >> Turn logging up to DEBUG
>>> >> Make sure that the filesystem you've just loaded is what you expect,
>>> by
>>> >> logging its value. It may turn out to be file:///, because the normal
>>> Hadoop
>>> >> site-config.xml isn't being picked up
>>> >>
>>> >>
>>> >>>
>>> >>>
>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>> >>> <vi...@gmail.com> wrote:
>>> >>>>
>>> >>>> but the problem is that my  code gets executed with the warning but
>>> file
>>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>>> local to
>>> >>>> hdfs
>>> >>>>
>>> >>>>    Configuration hadoopConf=new Configuration();
>>> >>>>         //get the default associated file system
>>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>> >>>>        // HarFileSystem harFileSystem= new
>>> HarFileSystem(fileSystem);
>>> >>>>         //copy from lfs to hdfs
>>> >>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>> >>>> Path("/user/TestDir/"));
>>> >>>>
>>> >>
>>> >>
>>> >
>>>
>>
>>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Senthil Kumar <se...@thoughtworks.com>.
Try using hadoopConf.addResource(new
Path("F:/hadoop-0.22.0/conf/core-site.xml")); instead
of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");

Alternatively, add your core-site.xml to a location that is on your
classpath (WEB-INF\classes or WEB-INF\lib in the case of a web application).
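You can verify the classpath suggestion without Hadoop on the classpath at all: ask the context classloader whether it can see core-site.xml. This is a sketch; "core-site.xml" with no path is the resource name Hadoop's Configuration looks up on the classpath, and inside Tomcat the context classloader is the webapp's classloader.

```java
import java.net.URL;

public class ClasspathCheck {
    public static void main(String[] args) {
        // If this returns null inside your webapp, Configuration will not
        // pick up core-site.xml either, and you will get LocalFileSystem.
        URL res = Thread.currentThread().getContextClassLoader()
                .getResource("core-site.xml");
        System.out.println(res == null
                ? "core-site.xml NOT on classpath"
                : "core-site.xml found at: " + res);
    }
}
```

Run it (or an equivalent log line) from inside the deployed WAR, since it is the webapp classloader's view that matters, not the JVM you launch from the command line.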


On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <vi...@gmail.com>wrote:

> thanks hemanth i tried adding ext folder conf and extn root folder
> unable to add xml only but still same problem thanks for the help
>
> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com>wrote:
>
>> Hi,
>>
>> If you are getting the LocalFileSystem, you could try by putting
>> core-site.xml in a directory that's there in the classpath for the
>> Tomcat App (or include such a path in the classpath, if that's
>> possible)
>>
>> Thanks
>> hemanth
>>
>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <vi...@gmail.com>
>> wrote:
>> > Thanks steve thers nothing in logs and no exceptions as well i found
>> that
>> > some file is created in my F:\user with directory name but its not
>> visible
>> > inside my hadoop browse filesystem directories i also added the config
>> by
>> > using the below method
>> > hadoopConf.addResource(
>> > "F:/hadoop-0.22.0/conf/core-site.xml");
>> > when running thru WAR printing out the filesystem i m getting
>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>> > when running an independet jar within hadoop i m getting
>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>> > when running an independet jar i m able to do uploads....
>> >
>> > just wanted to know will i have to add something in my classpath of
>> tomcat
>> > or is there any other configurations of core-site.xml that i am missing
>> > out..thanks for your help.....
>> >
>> >
>> >
>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
>> > wrote:
>> >>
>> >>
>> >> well, it's worked for me in the past outside Hadoop itself:
>> >>
>> >>
>> >>
>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>> >>
>> >> Turn logging up to DEBUG
>> >> Make sure that the filesystem you've just loaded is what you expect, by
>> >> logging its value. It may turn out to be file:///, because the normal
>> Hadoop
>> >> site-config.xml isn't being picked up
>> >>
>> >>
>> >>>
>> >>>
>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>> >>> <vi...@gmail.com> wrote:
>> >>>>
>> >>>> but the problem is that my  code gets executed with the warning but
>> file
>> >>>> is not copied to hdfs , actually i m trying to copy a file from
>> local to
>> >>>> hdfs
>> >>>>
>> >>>>    Configuration hadoopConf=new Configuration();
>> >>>>         //get the default associated file system
>> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>> >>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>> >>>>         //copy from lfs to hdfs
>> >>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>> >>>> Path("/user/TestDir/"));
>> >>>>
>> >>
>> >>
>> >
>>
>
>

>> >>>>         //copy from lfs to hdfs
>> >>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>> >>>> Path("/user/TestDir/"));
>> >>>>
>> >>
>> >>
>> >
>>
>
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks Hemanth, I tried adding the conf folder and the root folder to the ext
folder (I was unable to add just the XML), but it's still the same problem.
Thanks for the help.

On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <yh...@gmail.com> wrote:

> Hi,
>
> If you are getting the LocalFileSystem, you could try by putting
> core-site.xml in a directory that's there in the classpath for the
> Tomcat App (or include such a path in the classpath, if that's
> possible)
>
> Thanks
> hemanth
>
> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <vi...@gmail.com>
> wrote:
> > Thanks steve thers nothing in logs and no exceptions as well i found that
> > some file is created in my F:\user with directory name but its not
> visible
> > inside my hadoop browse filesystem directories i also added the config by
> > using the below method
> > hadoopConf.addResource(
> > "F:/hadoop-0.22.0/conf/core-site.xml");
> > when running thru WAR printing out the filesystem i m getting
> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
> > when running an independet jar within hadoop i m getting
> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
> > when running an independet jar i m able to do uploads....
> >
> > just wanted to know will i have to add something in my classpath of
> tomcat
> > or is there any other configurations of core-site.xml that i am missing
> > out..thanks for your help.....
> >
> >
> >
> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
> > wrote:
> >>
> >>
> >> well, it's worked for me in the past outside Hadoop itself:
> >>
> >>
> >>
> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
> >>
> >> Turn logging up to DEBUG
> >> Make sure that the filesystem you've just loaded is what you expect, by
> >> logging its value. It may turn out to be file:///, because the normal
> Hadoop
> >> site-config.xml isn't being picked up
> >>
> >>
> >>>
> >>>
> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
> >>> <vi...@gmail.com> wrote:
> >>>>
> >>>> but the problem is that my  code gets executed with the warning but
> file
> >>>> is not copied to hdfs , actually i m trying to copy a file from local
> to
> >>>> hdfs
> >>>>
> >>>>    Configuration hadoopConf=new Configuration();
> >>>>         //get the default associated file system
> >>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
> >>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
> >>>>         //copy from lfs to hdfs
> >>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
> >>>> Path("/user/TestDir/"));
> >>>>
> >>
> >>
> >
>

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Hemanth Yamijala <yh...@gmail.com>.
Hi,

If you are getting the LocalFileSystem, you could try putting
core-site.xml in a directory that is on the classpath for the
Tomcat app (or include such a path in the classpath, if that's
possible).

Thanks
hemanth
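
A quick way to check this condition from inside the webapp is to ask the
context class loader whether it can see core-site.xml, since that is the same
lookup the client configuration performs. A minimal diagnostic sketch (the
helper method here is my own, not part of any Hadoop or Tomcat API):

```java
import java.net.URL;

public class ConfCheck {
    /**
     * Returns where core-site.xml would be loaded from, or null if it is
     * not on the classpath, in which case the default filesystem falls
     * back to file:/// and you get a LocalFileSystem.
     */
    static URL locateCoreSite() {
        ClassLoader cl = Thread.currentThread().getContextClassLoader();
        if (cl == null) {
            cl = ConfCheck.class.getClassLoader();
        }
        return cl.getResource("core-site.xml");
    }

    public static void main(String[] args) {
        URL url = locateCoreSite();
        if (url == null) {
            System.out.println("core-site.xml NOT on classpath "
                    + "-> expect LocalFileSystem");
        } else {
            System.out.println("core-site.xml found at " + url);
        }
    }
}
```

Logging this from a servlet on startup tells you immediately whether the
deployment (WEB-INF/classes, shared classpath entry, etc.) actually made the
config visible.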

On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <vi...@gmail.com> wrote:
> Thanks steve thers nothing in logs and no exceptions as well i found that
> some file is created in my F:\user with directory name but its not visible
> inside my hadoop browse filesystem directories i also added the config by
> using the below method
> hadoopConf.addResource(
> "F:/hadoop-0.22.0/conf/core-site.xml");
> when running thru WAR printing out the filesystem i m getting
> org.apache.hadoop.fs.LocalFileSystem@9cd8db
> when running an independet jar within hadoop i m getting
> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
> when running an independet jar i m able to do uploads....
>
> just wanted to know will i have to add something in my classpath of tomcat
> or is there any other configurations of core-site.xml that i am missing
> out..thanks for your help.....
>
>
>
> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>
> wrote:
>>
>>
>> well, it's worked for me in the past outside Hadoop itself:
>>
>>
>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>
>> Turn logging up to DEBUG
>> Make sure that the filesystem you've just loaded is what you expect, by
>> logging its value. It may turn out to be file:///, because the normal Hadoop
>> site-config.xml isn't being picked up
>>
>>
>>>
>>>
>>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>> <vi...@gmail.com> wrote:
>>>>
>>>> but the problem is that my  code gets executed with the warning but file
>>>> is not copied to hdfs , actually i m trying to copy a file from local to
>>>> hdfs
>>>>
>>>>    Configuration hadoopConf=new Configuration();
>>>>         //get the default associated file system
>>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>>>         //copy from lfs to hdfs
>>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>>> Path("/user/TestDir/"));
>>>>
>>
>>
>

RE: Integrating hadoop with java UI application deployed on tomcat

Posted by "Mahadevappa, Shobha" <Sh...@nttdata.com>.
Hi,
Try adding the hadoop/conf directory to Tomcat's classpath.

Ex : CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
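
If editing Tomcat's startup classpath is awkward, the same effect can be had
by copying the client config onto the webapp's own classpath
(WEB-INF/classes). A runnable sketch using a throwaway sandbox so it works
anywhere; on a real machine HADOOP_CONF_DIR and WEBAPP_DIR would point at
your actual installs (e.g. /usr/local/Apps/hadoop-0.20.203.0/conf and your
deployed webapp):

```shell
#!/bin/sh
set -e

# Sandbox standing in for the real directory layout (hypothetical paths).
SANDBOX=$(mktemp -d)
HADOOP_CONF_DIR="$SANDBOX/hadoop/conf"
WEBAPP_DIR="$SANDBOX/tomcat/webapps/myapp"

mkdir -p "$HADOOP_CONF_DIR" "$WEBAPP_DIR/WEB-INF/classes"
printf '<configuration/>' > "$HADOOP_CONF_DIR/core-site.xml"

# The actual fix: put the client config on the webapp classpath, where
# the Hadoop Configuration class can find it by name.
cp "$HADOOP_CONF_DIR/core-site.xml" "$WEBAPP_DIR/WEB-INF/classes/"

ls "$WEBAPP_DIR/WEB-INF/classes/core-site.xml"
```

After redeploying (or restarting Tomcat), FileSystem.get() should pick up the
configured fs.default.name instead of defaulting to the local filesystem.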



Regards,
Shobha M

From: Visioner Sadak [mailto:visioner.sadak@gmail.com]
Sent: 03 September 2012 PM 04:01
To: user@hadoop.apache.org
Subject: Re: Integrating hadoop with java UI application deployed on tomcat

Thanks steve thers nothing in logs and no exceptions as well i found that some file is created in my F:\user with directory name but its not visible inside my hadoop browse filesystem directories i also added the config by using the below method
hadoopConf.addResource(
"F:/hadoop-0.22.0/conf/core-site.xml");
when running thru WAR printing out the filesystem i m getting org.apache.hadoop.fs.LocalFileSystem@9cd8db
when running an independet jar within hadoop i m getting DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
when running an independet jar i m able to do uploads....

just wanted to know will i have to add something in my classpath of tomcat or is there any other configurations of core-site.xml that i am missing out..thanks for your help.....


On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com>> wrote:

well, it's worked for me in the past outside Hadoop itself:

http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup


  1.  Turn logging up to DEBUG
  2.  Make sure that the filesystem you've just loaded is what you expect, by logging its value. It may turn out to be file:///, because the normal Hadoop site-config.xml isn't being picked up


On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <vi...@gmail.com>> wrote:
but the problem is that my  code gets executed with the warning but file is not copied to hdfs , actually i m trying to copy a file from local to hdfs

   Configuration hadoopConf=new Configuration();
        //get the default associated file system
       FileSystem fileSystem=FileSystem.get(hadoopConf);
       // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
        //copy from lfs to hdfs
       fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new Path("/user/TestDir/"));




______________________________________________________________________
Disclaimer:This email and any attachments are sent in strictest confidence for the sole use of the addressee and may contain legally privileged, confidential, and proprietary data.  If you are not the intended recipient, please advise the sender by replying promptly to this email and then delete and destroy this email and any attachments without any further use, copying or forwarding

Re: Integrating hadoop with java UI application deployed on tomcat

Posted by Visioner Sadak <vi...@gmail.com>.
Thanks steve, there's nothing in the logs and no exceptions either. i found that
a file is created in my F:\user with the directory name, but it is not visible
inside my hadoop browse-filesystem directories. i also added the config by
using the below method
hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
when running thru the WAR, printing out the filesystem i get
org.apache.hadoop.fs.LocalFileSystem@9cd8db
when running an independent jar within hadoop i get
DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
when running an independent jar i am able to do uploads....

just wanted to know whether i have to add something to tomcat's classpath,
or is there some other core-site.xml configuration that i am missing?
thanks for your help.....
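On the Tomcat classpath question: one common approach (a sketch only, assuming a standard Tomcat layout; the Hadoop path below is taken from the addResource call above and may differ on your machine) is to put the Hadoop conf directory on the shared class loader via conf/catalina.properties:

```properties
# conf/catalina.properties -- add the Hadoop conf dir so core-site.xml
# is visible to webapps (path is an example from this thread)
shared.loader=F:/hadoop-0.22.0/conf
```

An alternative with the same effect is to copy core-site.xml (and hdfs-site.xml) into the webapp's WEB-INF/classes, since anything there is on the webapp classpath and Configuration loads core-site.xml as a classpath resource.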



On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <st...@hortonworks.com> wrote:

>
> well, it's worked for me in the past outside Hadoop itself:
>
>
> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>
>
>    1. Turn logging up to DEBUG
>    2. Make sure that the filesystem you've just loaded is what you
>    expect, by logging its value. It may turn out to be file:///, because the
>    normal Hadoop site-config.xml isn't being picked up
>
>
>
>>
>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <visioner.sadak@gmail.com
>> > wrote:
>>
>>> but the problem is that my  code gets executed with the warning but file
>>> is not copied to hdfs , actually i m trying to copy a file from local to
>>> hdfs
>>>
>>>    Configuration hadoopConf=new Configuration();
>>>         //get the default associated file system
>>>        FileSystem fileSystem=FileSystem.get(hadoopConf);
>>>        // HarFileSystem harFileSystem= new HarFileSystem(fileSystem);
>>>         //copy from lfs to hdfs
>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),new
>>> Path("/user/TestDir/"));
>>>
>>>
>>
>
