Posted to user@hadoop.apache.org by Irfan Sayed <ir...@gmail.com> on 2013/09/03 06:51:01 UTC

Re: about replication

thanks. sorry for the long break; i got involved in some other priorities.
i downloaded the installer, and while installing i got the following error:

[image: Inline image 1]

do i need to make any configuration changes prior to installation?

regards
irfan



On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault
<or...@hortonworks.com>wrote:

> Here is the link
>
> http://download.hortonworks.com/products/hdp-windows/
>
> Olivier
> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> thanks.
>> i just followed the instructions to set up the pseudo-distributed mode
>> first, using this url:
>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>
>> i don't think i am running a DN on both machines
>> please find the attached log
>>
>> hi olivier
>>
>> can you please give me download link ?
>> let me try please
>>
>> regards
>> irfan
>>
>>
>>
>>
>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>
>>> Are you running DN on both the machines? Could you please show me your
>>> DN logs?
>>>
>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>
>>>
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Irfu,
>>>>
>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>> may want to try our distribution for Windows. You will be able to find
>>>> the MSI on our website.
>>>>
>>>> Regards
>>>> Olivier
>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>> thanks.
>>>>> ok. i think i need to change the plan here.
>>>>> let me create two environments: 1) all windows, 2) all unix.
>>>>> on windows, i still have to try and see how hadoop works;
>>>>> on unix, it is already known to be working fine.
>>>>>
>>>>> so, on windows , here is the setup:
>>>>>
>>>>> namenode : windows 2012 R2
>>>>> datanode : windows 2012 R2
>>>>>
>>>>> now, the exact problem is :
>>>>> 1: datanode is not getting started
>>>>> 2: replication : if i put any file/folder on any datanode , it should
>>>>> get replicated to all other available datanodes
>>>>>
>>>>> regards
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>
>>>>>> Seriously? You are planning to develop something using Hadoop on
>>>>>> windows? Not a good idea. Anyway, could you please show me your log
>>>>>> files? I also need some additional info :
>>>>>> -The exact problem which you are facing right now
>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>> -Your latest configuration files
>>>>>> -Your /etc/hosts file
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> ok. thanks
>>>>>>> now, i need to start with the all-windows setup first, as our product
>>>>>>> will be based on windows
>>>>>>> so, please tell me how to resolve the issue
>>>>>>>
>>>>>>> the datanode is not starting . please suggest
>>>>>>>
>>>>>>> regards,
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>
>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> can i have a setup like this :
>>>>>>>>>> namenode on linux (flavour may be RHEL, CentOS, Ubuntu,
>>>>>>>>>> etc.)
>>>>>>>>>> and datanodes on a combination of OSes (windows , linux ,
>>>>>>>>>> unix etc )
>>>>>>>>>>
>>>>>>>>>> however, my doubt is: since the file systems of the two systems
>>>>>>>>>> (windows and linux) are different, datanodes on these systems cannot
>>>>>>>>>> be part of a single cluster, right? do i have to make the windows
>>>>>>>>>> cluster and the unix cluster separate?
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>>>> same as Cygwin PIDs, so that may be causing the discrepancy. I don't know
>>>>>>>>>>> how well Hadoop works in Cygwin, as I have never tried it. Work is in
>>>>>>>>>>> progress for native Windows support; however, there are no official releases
>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> thanks
>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>>>>>>> command
>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>
>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>
>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>
>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>
>>>>>>>>>>>> however, the namenode pid file shows the pid as 4560; it should
>>>>>>>>>>>> show 2076
>>>>>>>>>>>>
>>>>>>>>>>>> please suggest
>>>>>>>>>>>>
>>>>>>>>>>>> regards
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>
>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>
>>>>>>>>>>>>> -Arpit
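Arpit's stale-pid-file check can be sketched as a small shell helper. This is a sketch under assumptions: /tmp/hadoop-*-datanode.pid is the default pid location, and HADOOP_PID_DIR in hadoop-env.sh may place the files elsewhere on a given install.

```shell
# is_stale PIDFILE -> exit 0 when the pid file exists but the process it
# names is gone, i.e. the file is stale and safe to delete before restart.
is_stale() {
  pidfile=$1
  [ -e "$pidfile" ] || return 1        # no pid file: nothing to clean up
  pid=$(cat "$pidfile")
  ! kill -0 "$pid" 2>/dev/null         # stale when no such live process
}

# One cleanup pass over the assumed default pid location:
for f in /tmp/hadoop-*-datanode.pid; do
  if is_stale "$f"; then
    rm -v "$f"                         # delete the stale file, then restart the DN
  fi
done
```

After removing a stale file, restarting the datanode (e.g. via start-dfs.sh) should no longer hit the "Stop it first" check.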
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it
>>>>>>>>>>>>>> first.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> these two outputs are contradictory
>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>>>>>>>> helpful.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> i followed the url and did the steps mentioned in it. i
>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> now, i am able to browse the namenode url : http://localhost:50070
>>>>>>>>>>>>>>>> however, i am not able to browse the url : http://localhost:50030
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>> let me follow the url you gave for the pseudo-distributed
>>>>>>>>>>>>>>>>> setup, and then i will switch to distributed mode
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the
>>>>>>>>>>>>>>>>>> property *fs.default.name* from *hdfs-site.xml* and add
>>>>>>>>>>>>>>>>>> it to *core-site.xml*. Remove *mapred.job.tracker* as
>>>>>>>>>>>>>>>>>> well; it belongs in *mapred-site.xml*.
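As a sketch of the split described above, the two files would look like this. The localhost addresses and ports are assumptions for a Hadoop 1.x pseudo-distributed setup, not values taken from the thread:

```xml
<!-- core-site.xml : fs.default.name lives here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml : mapred.job.tracker lives here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```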
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> I would suggest you do a pseudo-distributed setup
>>>>>>>>>>>>>>>>>> first in order to get familiar with the process, and then proceed
>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link
>>>>>>>>>>>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>>>>>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> thanks tariq for the response.
>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the config
>>>>>>>>>>>>>>>>>>> files in my setup .
>>>>>>>>>>>>>>>>>>> can you please go through them ?
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. I was out of touch for
>>>>>>>>>>>>>>>>>>>> some time because of Ramzan and Eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> First of all, read the concepts. I hope you will like
>>>>>>>>>>>>>>>>>>>>> them:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for the datanode;
>>>>>>>>>>>>>>>>>>>>>>>>>> however, in the web browser i can see one live datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of the
>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> OK, we'll start fresh. Could you please show me
>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use jps to
>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created the directories "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both the datanode and the namenode,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in the "hdfs-site.xml"
>>>>>>>>>>>>>>>>>>>>>>>>>>>> file,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode, and
>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> but i am still not able to browse the file system
>>>>>>>>>>>>>>>>>>>>>>>>>>>> through the web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> do these directories need to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, does hdfs-site.xml need to be updated on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> both datanodes and namenodes with these new directories?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually corresponding
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to the values of dfs.name.dir and dfs.data.dir properties and change the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> permissions of these directories to 755. When you start pushing data into
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your HDFS, data will start going inside the directory specified by
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working in a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment, as a project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, at this stage, is c:\\wksp the HDFS file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system, or do i need to create it from the command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some important work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> The HDFS web UI doesn't provide the ability
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to create a file or directory. You can browse HDFS, view files, download
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> files, etc., but operations like create, move, and copy are not supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though: try getting a Linux
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> machine (if possible), or at least use a VM. I personally feel that using
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i get the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> following; i haven't seen any make-directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> do i need to create it from the command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i have
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> given the following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
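A side note on the snippet above: dfs.name.dir and dfs.data.dir point at the same directory (c:\\wksp). They hold different things (namenode metadata versus datanode blocks) and are normally kept distinct. A sketch with illustrative paths only:

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```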
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you wrote both paths as local
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> paths. Also, you do not need to copy hadoop into hdfs; hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check in the browser after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting your single-node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then follow the "browse the filesystem" link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory, then make a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there; that is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copy hadoop there), because that text file is the data you are going to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> process. That is what hadoop is used for; first you need to make this clear
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in your mind, and then it will work; otherwise it is not possible.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
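The "copyFromLocal: File ... does not exist" errors quoted earlier in this message complain about the local source path, not about HDFS. A sketch of a wrapper that checks the local path first; the "./bin/hadoop" call assumes the hadoop-1.1.2 directory as the current directory:

```shell
# copy_from_local SRC DST -- verify the local SRC path before handing off
# to hadoop, so a mistyped local path fails with a clear message instead
# of a confusing copyFromLocal error.
copy_from_local() {
  src=$1; dst=$2
  if [ ! -f "$src" ]; then
    echo "local file not found: $src" >&2
    return 1
  fi
  ./bin/hadoop dfs -copyFromLocal "$src" "$dst"
}

# Example (paths from the thread):
# copy_from_local /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
```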
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am a newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need a windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will surely refer to the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> you sent, but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>> NOTICE: This message is intended for the use of the individual
>>>>>>>>>>>>> or entity to which it is addressed and may contain information that is
>>>>>>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest. i am stuck
i haven't found anything in the log

regards
irfan



On Fri, Sep 13, 2013 at 3:07 PM, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
>
>
>
> On Thu, Sep 12, 2013 at 12:00 PM, Irfan Sayed <ir...@gmail.com>wrote:
>
>> thanks.
>> finally it got installed :)
>>
>> further, when i tried to start the namenode, it failed with the following log
>>
>> C:\hdp>start_remote_hdp_services.cmd
>> Master nodes: start DFS-DC
>> 0 Master nodes successfully started.
>> 1 Master nodes failed to start.
>>
>> PSComputerName      Service             Message             Status
>> --------------      -------             -------             ------
>>                                         Connecting to re...
>>
>>
>> StartStop-HDPServices : Manually start services on Master nodes then
>> retry full
>>  cluster start.  Exiting.
>> At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
>> + if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
>>     + CategoryInfo          : NotSpecified: (:) [Write-Error],
>> WriteErrorExcep
>>    tion
>>     + FullyQualifiedErrorId :
>> Microsoft.PowerShell.Commands.WriteErrorExceptio
>>    n,StartStop-HDPServices
>>
>>
>> C:\hdp>
>>
>> i tried starting it manually as well, but no luck
>> is anything missing in the configuration ?
>>
>> regards
>>
>>
>>
>> On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> You can put the same FQDN as your NameNode for example.
>>>
>>> Thanks
>>> Olivier
>>> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> i do not have any HIVE server host, so what should i put here? if i
>>>> comment the property out, then i guess it throws an error.
>>>> can i put the fqdn of the namenode as the HIVE server host ?
>>>>
>>>> will that really be a working configuration?
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Your cluster-properties.txt should look something like :
>>>>>
>>>>>
>>>>> #Log directory
>>>>> HDP_LOG_DIR=c:\hadoop\logs
>>>>>
>>>>> #Data directory
>>>>> HDP_DATA_DIR=c:\hdp\data
>>>>>
>>>>> #Hosts
>>>>> NAMENODE_HOST=yourmaster.fqdn.com
>>>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>>>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>>>>
>>>>>
>>>>> #Database host
>>>>> DB_FLAVOR=derby
>>>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>>>
>>>>>
>>>>> #Hive properties
>>>>> HIVE_DB_NAME=hive
>>>>> HIVE_DB_USERNAME=hive
>>>>> HIVE_DB_PASSWORD=hive
>>>>>
>>>>> #Oozie properties
>>>>> OOZIE_DB_NAME=oozie
>>>>> OOZIE_DB_USERNAME=oozie
>>>>> OOZIE_DB_PASSWORD=oozie
>>>>>
>>>>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
>>>>> your servers' names. For the time being, I suggest that you do not install
>>>>> HBase, Oozie,
>>>>>
>>>>> regards,
>>>>> Olivier
>>>>>
>>>>>
>>>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> ok.. now i made some changes and the installation went ahead
>>>>>>>> but failed on the "HIVE_SERVER_HOST" property declaration.
>>>>>>>> in the cluster config file, i have commented out this property. if i
>>>>>>>> uncomment it, then what server address should i give ???
>>>>>>>>
>>>>>>>> i have only two windows machines setup.
>>>>>>>> 1: for namenode and another for datanode
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> i installed the latest java in c:\java folder and now no error in
>>>>>>>>> log file related to java
>>>>>>>>> however, now it is throwing error on not having cluster properties
>>>>>>>>> file.
>>>>>>>>> in fact i am running/installing hdp from the location where this
>>>>>>>>> file exist . still it is throwing error
>>>>>>>>>
>>>>>>>>> please find the attached
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>>>
>>>>>>>>>>  Here’s your issue (from the logs you attached earlier):
>>>>>>>>>>
>>>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>>>
>>>>>>>>>> It seems that you installed the Java prerequisite in the default
>>>>>>>>>> path, which is %PROGRAMFILES% (expands to C:\Program Files in your case).
>>>>>>>>>> HDP 1.3 does not like spaces in paths, so you need to reinstall Java under
>>>>>>>>>> c:\java\ or something similar (in a path with no spaces).
>>>>>>>>>>
>>>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>>>> *Subject:* Re: about replication****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please find the attached.****
>>>>>>>>>>
>>>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>>>> as it is not generated ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Could you share the log files ( c:\hdp.log,
>>>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>>>>>>>>>> well as your clusterproperties.txt ?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Thanks, ****
>>>>>>>>>>
>>>>>>>>>> Olivier****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks. i followed the user manual for deployment and installed
>>>>>>>>>> all pre-requisites ****
>>>>>>>>>>
>>>>>>>>>> i modified the command and still the issue persist. please
>>>>>>>>>> suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please refer below ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>> The command to install it is msiexec /i msifile /...  ****
>>>>>>>>>>
>>>>>>>>>> You will find the correct syntax as part of doc. ****
>>>>>>>>>>
>>>>>>>>>> Happy reading
>>>>>>>>>> Olivier ****
>>>>>>>>>>
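The doc has the full syntax; as a sketch, the invocation generally looks like this (the flag values are placeholders -- verify the exact option names against the HDP 1.3 for Windows install guide before running):

```
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "hdp.log" ^
    HDP_LAYOUT="C:\hdp\clusterproperties.txt" ^
    HDP_DIR="C:\hdp\hadoop" DESTROY_DATA="no"
```

Run it from an elevated prompt in the folder containing both the msi and the cluster properties file.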
>>>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:*
>>>>>>>>>> ***
>>>>>>>>>>
>>>>>>>>>>  thanks. ****
>>>>>>>>>>
>>>>>>>>>> i referred the logs and manuals. i modified the clusterproperties
>>>>>>>>>> file and then double click on the msi file ****
>>>>>>>>>>
>>>>>>>>>> however, it still failed.****
>>>>>>>>>>
>>>>>>>>>> further i started the installation on command line by giving
>>>>>>>>>> HDP_LAYOUT=clusterproperties file path, ****
>>>>>>>>>>
>>>>>>>>>> installation went ahead and it failed for .NET framework 4.0 and
>>>>>>>>>> VC++ redistributable package dependency   ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> i installed both and started again the installation. ****
>>>>>>>>>>
>>>>>>>>>> failed again with following error ****
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> when i search for the logs mentioned in the error , i never found
>>>>>>>>>> that ****
>>>>>>>>>>
>>>>>>>>>> please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>> Correct, you need to define the cluster configuration as part of
>>>>>>>>>> a file. You will find some information on the configuration file as part of
>>>>>>>>>> the documentation. ****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> You should make sure to have also installed the prerequisites.
>>>>>>>>>>
>>>>>>>>>> Thanks
>>>>>>>>>> Olivier ****
>>>>>>>>>>
>>>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:*
>>>>>>>>>> ***
>>>>>>>>>>
>>>>>>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>>>>>>> other priorities****
>>>>>>>>>>
>>>>>>>>>> i downloaded the installer and while installing i got following
>>>>>>>>>> error ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> do i need to make any configuration prior to installation ??****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>> Here is the link ****
>>>>>>>>>>
>>>>>>>>>> http://download.hortonworks.com/products/hdp-windows/****
>>>>>>>>>>
>>>>>>>>>> Olivier ****
>>>>>>>>>>
>>>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>>  thanks.****
>>>>>>>>>>
>>>>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>>>>> setup first using the url :
>>>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>>  ****
>>>>>>>>>>
>>>>>>>>>> i don't think i am running a DN on both machines
>>>>>>>>>>
>>>>>>>>>> please find the attached log****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> hi olivier ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> can you please give me download link ?****
>>>>>>>>>>
>>>>>>>>>> let me try please ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Are you running DN on both the machines? Could you please show
>>>>>>>>>> me your DN logs?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Also, consider Oliver's suggestion. It's definitely a better
>>>>>>>>>> option.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>> Irfu, ****
>>>>>>>>>>
>>>>>>>>>> If you want to quickly get Hadoop running on the Windows platform,
>>>>>>>>>> you may want to try our distribution for Windows. You will be able to find
>>>>>>>>>> the msi on our website. ****
>>>>>>>>>>
>>>>>>>>>> Regards
>>>>>>>>>> Olivier ****
>>>>>>>>>>
>>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>>  thanks. ****
>>>>>>>>>>
>>>>>>>>>> ok. i think i need to change the plan over here ****
>>>>>>>>>>
>>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>>> works ****
>>>>>>>>>>
>>>>>>>>>> on UNIX, it is already known that ,  it is working fine. ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> so, on windows , here is the setup:****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> namenode : windows 2012 R2 ****
>>>>>>>>>>
>>>>>>>>>> datanode : windows 2012 R2 ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> now, the exact problem is :****
>>>>>>>>>>
>>>>>>>>>> 1: datanode is not getting started ****
>>>>>>>>>>
>>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>>> should get replicated to all another available datanodes ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Seriously?? You are planning to develop something using Hadoop
>>>>>>>>>> on Windows. Not a good idea. Anyway, could you please show me your log
>>>>>>>>>> files? I also need some additional info:****
>>>>>>>>>>
>>>>>>>>>> -The exact problem which you are facing right now****
>>>>>>>>>>
>>>>>>>>>> -Your cluster summary(no. of nodes etc)****
>>>>>>>>>>
>>>>>>>>>> -Your latest configuration files****
>>>>>>>>>>
>>>>>>>>>> -Your /etc/hosts file****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  ok. thanks****
>>>>>>>>>>
>>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>>> will be based on windows ****
>>>>>>>>>>
>>>>>>>>>> so, now, please tell me how to resolve the issue ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> datanode is not starting . please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards,****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>> doing that. But it is not a very wise setup.****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  please suggest****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks.****
>>>>>>>>>>
>>>>>>>>>> can i have setup like this :****
>>>>>>>>>>
>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>> etc)****
>>>>>>>>>>
>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>> unix etc )****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> however, my doubt is,  as the file systems of  both the systems
>>>>>>>>>> (win and linux ) are different ,  datanodes of these systems can not be
>>>>>>>>>> part of single cluster . i have to make windows cluster separate and UNIX
>>>>>>>>>> cluster separate ?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>>> same as Cygwin PIDs so that may be causing the discrepancy. I don't know
>>>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/>
>>>>>>>>>> on Linux if you are new to it.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks ****
>>>>>>>>>>
>>>>>>>>>> here is what i did .****
>>>>>>>>>>
>>>>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>>>>> command ****
>>>>>>>>>>
>>>>>>>>>> then deleted all pid files for namenodes and datanodes ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> started dfs again with command : "./start-dfs.sh"****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> when i ran the "Jps" command . it shows****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>>>>
>>>>>>>>>> $ ./jps.exe****
>>>>>>>>>>
>>>>>>>>>> 4536 Jps****
>>>>>>>>>>
>>>>>>>>>> 2076 NameNode****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> however, when i open the pid file for the namenode, it shows the
>>>>>>>>>> pid as 4560. on the contrary, it should show 2076****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Most likely there is a stale pid file. Something like
>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>> the datanode.
>>>>>>>>>>
>>>>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>>>>> already.
>>>>>>>>>>
>>>>>>>>>> -Arpit****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
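In shell form, the cleanup described above might look like this (a sketch: the `hadoop-<user>-<daemon>.pid` naming is the Hadoop 1.x convention, and the pid directory is an assumption -- it defaults to /tmp unless HADOOP_PID_DIR is set in hadoop-env.sh):

```shell
# Remove stale pid files so start-dfs.sh stops claiming the daemons
# are already running. Sketch only -- point it at your real pid directory.
clear_stale_pids() {
    dir=$1
    for f in "$dir"/hadoop-*-namenode.pid \
             "$dir"/hadoop-*-datanode.pid \
             "$dir"/hadoop-*-secondarynamenode.pid; do
        # an unmatched glob comes back as a literal string, so test first
        [ -e "$f" ] && rm -f "$f" && echo "removed $f"
    done
    return 0
}

# usage: clear_stale_pids /tmp   # then ./start-dfs.sh and check jps again
```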
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  datanode is trying to connect to namenode continuously but
>>>>>>>>>> fails ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> when i try to run "jps" command it says :****
>>>>>>>>>>
>>>>>>>>>> $ ./jps.exe****
>>>>>>>>>>
>>>>>>>>>> 4584 NameNode****
>>>>>>>>>>
>>>>>>>>>> 4016 Jps****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> $ ./start-dfs.sh****
>>>>>>>>>>
>>>>>>>>>> namenode running as process 3544. Stop it first.****
>>>>>>>>>>
>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.****
>>>>>>>>>>
>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it
>>>>>>>>>> first.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> both these logs are contradictory ****
>>>>>>>>>>
>>>>>>>>>> please find the attached logs ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> should i attach the conf files as well ?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>>  ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Your DN is still not running. Showing me the logs would be
>>>>>>>>>> helpful.****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  i followed the url and did the steps mention in that. i have
>>>>>>>>>> deployed on the windows platform****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node
>>>>>>>>>> )****
>>>>>>>>>>
>>>>>>>>>> however, not able to browse url : http://localhost:50030****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please refer below****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> i have modified all the config files as mentioned and formatted
>>>>>>>>>> the hdfs file system as well ****
>>>>>>>>>>
>>>>>>>>>> please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks. i followed this url :
>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> let me follow the url which you gave for pseudo distributed setup
>>>>>>>>>> and then will switch to distributed mode****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  You are welcome. Which link have you followed for the
>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well; it belongs
>>>>>>>>>> in *mapred-site.xml*.****
>>>>>>>>>>
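As a concrete sketch of that split for Hadoop 1.x (the hostname and ports below are placeholders -- substitute the addresses of your own namenode and jobtracker):

```xml
<!-- core-site.xml: the filesystem root goes here, not in hdfs-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>
```

```xml
<!-- mapred-site.xml: mapred.job.tracker belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>namenode-host:9001</value>
  </property>
</configuration>
```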
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>>  I would suggest you do a pseudo-distributed setup first in
>>>>>>>>>> order to get yourself familiar with the process and then proceed to the
>>>>>>>>>> distributed mode. You can visit this link
>>>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> HTH****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks tariq for response. ****
>>>>>>>>>>
>>>>>>>>>> as discussed last time, i have sent you all the config files in
>>>>>>>>>> my setup . ****
>>>>>>>>>>
>>>>>>>>>> can you please go through that ?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please let me know ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  I'm sorry for being unresponsive. Was out of touch for some time
>>>>>>>>>> because of Ramzan and Eid. Resuming work today.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> What's the current status?****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>> manishd207@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  First of all read the concepts ..I hope you will like it..****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  hey Tariq,****
>>>>>>>>>>
>>>>>>>>>> i am still stuck .. ****
>>>>>>>>>>
>>>>>>>>>> can you please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  attachment got quarantined ****
>>>>>>>>>>
>>>>>>>>>> resending in txt format. please rename it to conf.rar ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> if i run the jps command on namenode :****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>>>>
>>>>>>>>>> $ ./jps.exe****
>>>>>>>>>>
>>>>>>>>>> 3164 NameNode****
>>>>>>>>>>
>>>>>>>>>> 1892 Jps****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> same command on datanode :****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>>>>
>>>>>>>>>> $ ./jps.exe****
>>>>>>>>>>
>>>>>>>>>> 3848 Jps****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> jps does not list any process for datanode. however, on web
>>>>>>>>>> browser i can see one live data node ****
>>>>>>>>>>
>>>>>>>>>> please find the attached conf rar file of namenode ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  OK. we'll start fresh. Could you plz show me your latest config
>>>>>>>>>> files?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> BTW, are your daemons running fine?Use JPS to verify that.****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  i have created these dir "wksp_data" and "wksp_name" on both
>>>>>>>>>> datanode and namenode ****
>>>>>>>>>>
>>>>>>>>>> made the respective changes in "hdfs-site.xml" file ****
>>>>>>>>>>
>>>>>>>>>> formatted the namenode ****
>>>>>>>>>>
>>>>>>>>>> started the dfs ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> but still, not able to browse the file system through web browser
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> please refer below ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> anything still missing ?****
>>>>>>>>>>
>>>>>>>>>> please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  these dir needs to be created on all datanodes and namenodes ?**
>>>>>>>>>> **
>>>>>>>>>>
>>>>>>>>>> further,  hdfs-site.xml needs to be updated on both datanodes and
>>>>>>>>>> namenodes for these new dir?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Create 2 directories manually corresponding to the values of
>>>>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>>>>> cannot see this data directly on your local/native FS.****
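A sketch of the corresponding hdfs-site.xml, with the two directories kept separate (the paths are placeholders; the wksp_name / wksp_data names echo the ones used elsewhere in this thread):

```xml
<configuration>
  <property>
    <name>dfs.name.dir</name>   <!-- namenode metadata: fsimage, edits -->
    <value>c:\\wksp_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>   <!-- datanode block storage -->
    <value>c:\\wksp_data</value>
  </property>
</configuration>
```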
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks. ****
>>>>>>>>>>
>>>>>>>>>> however, i need this to be working on windows environment as
>>>>>>>>>> project requirement.****
>>>>>>>>>>
>>>>>>>>>> i will add/work on Linux later ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> so, now , at this stage , c:\\wksp is the HDFS file system OR do
>>>>>>>>>> i need to create it from command line ?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please suggest****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards,****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Hello Irfan,****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> HDFS webUI doesn't provide us the ability to create file or
>>>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>>>> operation like create, move, copy etc are not supported.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> These values look fine to me.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> One suggestion though. Try getting a Linux machine(if possible).
>>>>>>>>>> Or at least use a VM. I personally feel that using Hadoop on windows is
>>>>>>>>>> always messy.****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks.****
>>>>>>>>>>
>>>>>>>>>> when i browse the file system , i am getting following :****
>>>>>>>>>>
>>>>>>>>>> i haven't seen any make directory option there ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> i need to create it from command line ?****
>>>>>>>>>>
>>>>>>>>>> further, in the hdfs-site.xml file , i have given following
>>>>>>>>>> entries. are they correct ? ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> <property>
>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>> </property>
>>>>>>>>>> <property>
>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>> </property>
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>>>> manishd207@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  *You are wrong at this:*
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>> copyFromLocal: File
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>> copyFromLocal: File
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>
>>>>>>>>>> Because you gave local paths for both arguments. And you do not need to
>>>>>>>>>> copy hadoop itself into hdfs... Hadoop is already working..
>>>>>>>>>>
>>>>>>>>>> Just check in the browser after starting your single node
>>>>>>>>>> cluster:
>>>>>>>>>>
>>>>>>>>>> localhost:50070
>>>>>>>>>>
>>>>>>>>>> then go to the "browse the filesystem" link in it..
>>>>>>>>>>
>>>>>>>>>> If there is no directory then make a directory there.
>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>> Then copy any text file there (no need to copy hadoop
>>>>>>>>>> there), because you are going to do processing on the data in that
>>>>>>>>>> text file. That's what hadoop is used for; first you need to make that
>>>>>>>>>> clear in your mind. Then and only then will you do it... otherwise it is
>>>>>>>>>> not possible..
>>>>>>>>>>
>>>>>>>>>> *Try this:*
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>>>> /hdfs/directory/path
>>>>>>>>>>
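The flow described above can be sketched as below; the paths are examples, and the hadoop calls are left commented out since they need a running single-node cluster (only the local-file check runs as written):

```shell
# Guard against the "does not exist" error shown above: make sure the
# local source file really exists before calling copyFromLocal.
SRC=/tmp/sample.txt                     # example path, not from the thread
echo "some input data" > "$SRC"         # stand-in for a real data file

if [ -f "$SRC" ]; then
    echo "local file present: $SRC"
    # ./bin/hadoop dfs -mkdir /wksp                  # create the HDFS dir once
    # ./bin/hadoop dfs -copyFromLocal "$SRC" /wksp   # then copy the data file
else
    echo "missing: $SRC" >&2
fi
```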
>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>
>>>>>>>>>> let me surely refer the doc and link which u sent but i need this
>>>>>>>>>> to be working ...
>>>>>>>>>> can you please help
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> MANISH DUNANI
>>>>>>>>>> -THANX
>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Regards
>>>>>>>>>> *Manish Dunani*
>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>> *skype id* : manish.dunani
>>>>>>>>>>
>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Olivier Renault
>>>>>>>>>>
>>>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>>>> +44 7500 933 036
>>>>>>>>>> orenault@hortonworks.com
>>>>>>>>>> www.hortonworks.com
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest. i am stuck
i haven't found anything in the log

regards
irfan



On Fri, Sep 13, 2013 at 3:07 PM, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
>
>
>
> On Thu, Sep 12, 2013 at 12:00 PM, Irfan Sayed <ir...@gmail.com>wrote:
>
>> thanks.
>> finally it got installed :)
>>
>> further, when i try to start the namenode, it failed with following log
>>
>> C:\hdp>start_remote_hdp_services.cmd
>> Master nodes: start DFS-DC
>> 0 Master nodes successfully started.
>> 1 Master nodes failed to start.
>>
>> PSComputerName      Service             Message             Status
>> --------------      -------             -------             ------
>>                                         Connecting to re...
>>
>>
>> StartStop-HDPServices : Manually start services on Master nodes then
>> retry full
>>  cluster start.  Exiting.
>> At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
>> + if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
>>     + CategoryInfo          : NotSpecified: (:) [Write-Error],
>> WriteErrorExcep
>>    tion
>>     + FullyQualifiedErrorId :
>> Microsoft.PowerShell.Commands.WriteErrorExceptio
>>    n,StartStop-HDPServices
>>
>>
>> C:\hdp>
>>
>> i tried starting manually as well but no luck
>> anything missing in configuration ?
>>
>> regards
>>
>>
>>
>> On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> You can put the same FQDN as your NameNode for example.
>>>
>>> Thanks
>>> Olivier
>>> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> i do not have any HIVE server host, so what should i put over
>>>> here?? if i comment it out, then i guess it throws an error for that.
>>>> can i put the fqdn of the namenode for the HIVE server host ?
>>>>
>>>> will it be a really working configuration?
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Your cluster-properties.txt should look something like :
>>>>>
>>>>>
>>>>> #Log directory
>>>>> HDP_LOG_DIR=c:\hadoop\logs
>>>>>
>>>>> #Data directory
>>>>> HDP_DATA_DIR=c:\hdp\data
>>>>>
>>>>> #Hosts
>>>>> NAMENODE_HOST=yourmaster.fqdn.com
>>>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>>>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>>>>
>>>>>
>>>>> #Database host
>>>>> DB_FLAVOR=derby
>>>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>>>
>>>>>
>>>>> #Hive properties
>>>>> HIVE_DB_NAME=hive
>>>>> HIVE_DB_USERNAME=hive
>>>>> HIVE_DB_PASSWORD=hive
>>>>>
>>>>> #Oozie properties
>>>>> OOZIE_DB_NAME=oozie
>>>>> OOZIE_DB_USERNAME=oozie
>>>>> OOZIE_DB_PASSWORD=oozie
>>>>>
>>>>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com
>>>>> with your servers' names. For the time being, I suggest that you do not
>>>>> install HBase or Oozie.
>>>>>
>>>>> regards,
>>>>> Olivier
>>>>>
>>>>>
>>>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> ok.. now i made some changes and the installation went ahead
>>>>>>>> but failed at the "HIVE_SERVER_HOST" property declaration.
>>>>>>>> in the cluster config file, i have commented out this property. if i
>>>>>>>> uncomment it, then what server address should i give ???
>>>>>>>>
>>>>>>>> i have only two windows machines setup.
>>>>>>>> 1: for namenode and another for datanode
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> i installed the latest java in the c:\java folder and now there is
>>>>>>>>> no java-related error in the log file.
>>>>>>>>> however, it is now throwing an error about a missing cluster
>>>>>>>>> properties file.
>>>>>>>>> in fact i am running/installing hdp from the location where this
>>>>>>>>> file exists. still it is throwing the error
>>>>>>>>>
>>>>>>>>> please find the attached
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>>>
>>>>>>>>>> Here's your issue (from the logs you attached earlier):
>>>>>>>>>>
>>>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>>>>
>>>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>>>
>>>>>>>>>> It seems that you installed the Java prerequisite in the default
>>>>>>>>>> path, which is %PROGRAMFILES% (expands to C:\Program Files in your case).
>>>>>>>>>> HDP 1.3 does not like spaces in paths, so you need to reinstall Java under
>>>>>>>>>> c:\java\ or something similar (in a path with no spaces).
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
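Ravi's diagnosis (a space in the Java path) can be checked quickly; the JAVA_HOME value below is a made-up example of the bad case:

```shell
# Detect the space-in-path problem that breaks the HDP 1.3 installer.
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'   # example of the problematic value

case "$JAVA_HOME" in
    *' '*) echo "JAVA_HOME contains spaces - reinstall Java under a path like C:\java" ;;
    *)     echo "JAVA_HOME looks OK" ;;
esac
```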
>>>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>>>> *Subject:* Re: about replication****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please find the attached.
>>>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>>>> as it is not generated
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Could you share the log files ( c:\hdp.log,
>>>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>>>>>>> well as your clusterproperties.txt ?
>>>>>>>>>>
>>>>>>>>>> Thanks,
>>>>>>>>>> Olivier
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks. i followed the user manual for deployment and installed
>>>>>>>>>> all pre-requisites
>>>>>>>>>> i modified the command and still the issue persists. please
>>>>>>>>>> suggest
>>>>>>>>>>
>>>>>>>>>> please refer below
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>>>>>
>>>>>>>>>> You will find the correct syntax as part of the doc.
>>>>>>>>>>
>>>>>>>>>> Happy reading
>>>>>>>>>> Olivier
>>>>>>>>>>
>>>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> i referred the logs and manuals. i modified the clusterproperties
>>>>>>>>>> file and then double clicked on the msi file
>>>>>>>>>> however, it still failed.
>>>>>>>>>> further i started the installation on the command line by giving
>>>>>>>>>> HDP_LAYOUT=clusterproperties file path,
>>>>>>>>>> installation went ahead and it failed on the .NET framework 4.0 and
>>>>>>>>>> VC++ redistributable package dependency
>>>>>>>>>>
>>>>>>>>>> i installed both and started the installation again.
>>>>>>>>>> it failed again with the following error
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>> when i searched for the logs mentioned in the error , i never found
>>>>>>>>>> them
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Correct, you need to define the cluster configuration as part of
>>>>>>>>>> a file. You will find some information on the configuration file as part of
>>>>>>>>>> the documentation.
>>>>>>>>>>
>>>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>>>
>>>>>>>>>> You should make sure you have also installed the prerequisites.
>>>>>>>>>>
>>>>>>>>>> Thanks
>>>>>>>>>> Olivier
>>>>>>>>>>
>>>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks. sorry for the long break. actually got involved in some
>>>>>>>>>> other priorities
>>>>>>>>>> i downloaded the installer and while installing i got following
>>>>>>>>>> error
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>> do i need to make any configuration prior to installation ??
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Here is the link
>>>>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>>>> Olivier
>>>>>>>>>>
>>>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>>>>> setup first using the url :
>>>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>>>
>>>>>>>>>> i don't think i am running a DN on both machines
>>>>>>>>>> please find the attached log
>>>>>>>>>>
>>>>>>>>>> hi olivier
>>>>>>>>>> can you please give me the download link ?
>>>>>>>>>> let me try please
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Are you running DN on both the machines? Could you please show
>>>>>>>>>> me your DN logs?
>>>>>>>>>>
>>>>>>>>>> Also, consider Oliver's suggestion. It's definitely a better
>>>>>>>>>> option.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Irfu,
>>>>>>>>>> If you want to quickly get Hadoop running on the windows platform,
>>>>>>>>>> you may want to try our distribution for Windows. You will be able to find
>>>>>>>>>> the msi on our website.
>>>>>>>>>>
>>>>>>>>>> Regards
>>>>>>>>>> Olivier
>>>>>>>>>>
>>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>>
>>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>>> works
>>>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>>>
>>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>>
>>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>>
>>>>>>>>>> now, the exact problem is :
>>>>>>>>>> 1: datanode is not getting started
>>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop
>>>>>>>>>> on windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>>>> also need some additional info :
>>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>>>> -Your latest configuration files
>>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> ok. thanks
>>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>>> will be based on windows
>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>
>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>
>>>>>>>>>> regards,
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> can i have a setup like this :
>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>> etc)
>>>>>>>>>> and datanodes are a combination of any OS (windows , linux ,
>>>>>>>>>> unix etc )
>>>>>>>>>>
>>>>>>>>>> however, my doubt is, as the file systems of both the systems
>>>>>>>>>> (win and linux ) are different , datanodes of these systems can not be
>>>>>>>>>> part of a single cluster . do i have to make a windows cluster and a
>>>>>>>>>> UNIX cluster separately ?
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>>> same as Cygwin PIDs so that may be causing the discrepancy. I don't know
>>>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks
>>>>>>>>>> here is what i did .
>>>>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>>>>> command
>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>
>>>>>>>>>> started dfs again with the command : "./start-dfs.sh"
>>>>>>>>>>
>>>>>>>>>> when i ran the "jps" command , it shows
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>> $ ./jps.exe
>>>>>>>>>> 4536 Jps
>>>>>>>>>> 2076 NameNode
>>>>>>>>>>
>>>>>>>>>> however, when i open the pid file for the namenode, it shows the
>>>>>>>>>> pid as 4560. on the contrary, it shud show : 2076
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>> the datanode.
>>>>>>>>>>
>>>>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>>>>> already.
>>>>>>>>>>
>>>>>>>>>> -Arpit
>>>>>>>>>>
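Arpit's stale-pid-file tip can be scripted roughly as follows; the pid-file path and the simulated dead daemon are illustrative (real Hadoop pid files live under HADOOP_PID_DIR, often /tmp), and the restart line is commented out:

```shell
# Delete a pid file whose process is no longer alive so the daemon can
# be restarted cleanly. A short-lived background job simulates a daemon
# that crashed and left its pid file behind.
PID_FILE=/tmp/hadoop-demo-datanode.pid

sleep 0 &
DEMO_PID=$!
wait "$DEMO_PID"                  # the "daemon" is now dead
echo "$DEMO_PID" > "$PID_FILE"    # ...but its pid file remains

PID=$(cat "$PID_FILE")
if ! kill -0 "$PID" 2>/dev/null; then
    echo "pid $PID is not running, removing stale $PID_FILE"
    rm -f "$PID_FILE"
    # ./bin/hadoop-daemon.sh start datanode   # would now start without the warning
fi
```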
>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> datanode is trying to connect to the namenode continuously but
>>>>>>>>>> fails
>>>>>>>>>>
>>>>>>>>>> when i try to run the "jps" command it says :
>>>>>>>>>> $ ./jps.exe
>>>>>>>>>> 4584 NameNode
>>>>>>>>>> 4016 Jps
>>>>>>>>>>
>>>>>>>>>> and when i ran "./start-dfs.sh" then it says :
>>>>>>>>>>
>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it
>>>>>>>>>> first.
>>>>>>>>>>
>>>>>>>>>> both these logs are contradictory
>>>>>>>>>> please find the attached logs
>>>>>>>>>>
>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>>> helpful.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> i followed the url and did the steps mentioned in it. i have
>>>>>>>>>> deployed on the windows platform
>>>>>>>>>>
>>>>>>>>>> Now, i am able to browse the url : http://localhost:50070 (name node)
>>>>>>>>>> however, i am not able to browse the url : http://localhost:50030
>>>>>>>>>>
>>>>>>>>>> please refer below
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>> i have modified all the config files as mentioned and formatted
>>>>>>>>>> the hdfs file system as well
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>
>>>>>>>>>> let me follow the url which you gave for the pseudo distributed setup
>>>>>>>>>> and then will switch to distributed mode
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well. It is
>>>>>>>>>> required in *mapred-site.xml*.
>>>>>>>>>>
>>>>>>>>>> I would suggest you do a pseudo distributed setup first in
>>>>>>>>>> order to get yourself familiar with the process and then proceed to the
>>>>>>>>>> distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>
>>>>>>>>>> HTH
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> thanks tariq for response.
>>>>>>>>>> as discussed last time, i have sent you all the config files in
>>>>>>>>>> my setup .
>>>>>>>>>> can you please go through that ?
>>>>>>>>>>
>>>>>>>>>> please let me know
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch for some time
>>>>>>>>>> because of Ramzan and Eid. Resuming work today.
>>>>>>>>>>
>>>>>>>>>> What's the current status?
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> First of all read the concepts. I hope you will like it..
>>>>>>>>>>
>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>
>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> hey Tariq,
>>>>>>>>>> i am still stuck ..
>>>>>>>>>> can you please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> attachment got quarantined
>>>>>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>>
>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>> $ ./jps.exe
>>>>>>>>>> 3164 NameNode
>>>>>>>>>> 1892 Jps
>>>>>>>>>>
>>>>>>>>>> same command on datanode :
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>> $ ./jps.exe
>>>>>>>>>> 3848 Jps
>>>>>>>>>>
>>>>>>>>>> jps does not list any process for datanode. however, on web
>>>>>>>>>> browser i can see one live data node
>>>>>>>>>> please find the attached conf rar file of namenode
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> OK, we'll start fresh. Could you please show me your latest config
>>>>>>>>>> files?
>>>>>>>>>>
>>>>>>>>>> BTW, are your daemons running fine? Use jps to verify that.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> i have created these dir "wksp_data" and "wksp_name" on both
>>>>>>>>>> datanode and namenode
>>>>>>>>>> made the respective changes in "hdfs-site.xml" file
>>>>>>>>>> formatted the namenode
>>>>>>>>>> started the dfs
>>>>>>>>>>
>>>>>>>>>> but still, not able to browse the file system through web browser
>>>>>>>>>> please refer below
>>>>>>>>>>
>>>>>>>>>> anything still missing ?
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> these dir needs to be created on all datanodes and namenodes ?
>>>>>>>>>> further, hdfs-site.xml needs to be updated on both datanodes and
>>>>>>>>>> namenodes for these new dir?
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Create 2 directories manually corresponding to the values of
>>>>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>>>>> cannot see this data directly on your local/native FS.
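[The two steps above can be sketched as a small shell session; the directory locations are illustrative assumptions, not the poster's actual paths — on Windows/Cygwin they would look like /cygdrive/c/...:]

```shell
# Create the two local directories that dfs.name.dir and dfs.data.dir
# will point at, then set their permissions to 755.
NAME_DIR=/tmp/hdfs-demo/dfs/name
DATA_DIR=/tmp/hdfs-demo/dfs/data

mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"

# Show the resulting modes as a quick sanity check.
ls -ld "$NAME_DIR" "$DATA_DIR"
```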
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> however, i need this to be working on windows environment as
>>>>>>>>>> project requirement.
>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>
>>>>>>>>>> so, now , at this stage , c:\\wksp is the HDFS file system OR do
>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards,
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Hello Irfan,
>>>>>>>>>>
>>>>>>>>>> Sorry for being unresponsive. Got stuck with some important work.
>>>>>>>>>>
>>>>>>>>>> HDFS webUI doesn't provide us the ability to create a file or
>>>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>>>> operations like create, move, copy etc are not supported.
>>>>>>>>>>
>>>>>>>>>> These values look fine to me.
>>>>>>>>>>
>>>>>>>>>> One suggestion though. Try getting a Linux machine (if possible).
>>>>>>>>>> Or at least use a VM. I personally feel that using Hadoop on windows is
>>>>>>>>>> always messy.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> when i browse the file system , i am getting following :
>>>>>>>>>> i haven't seen any make directory option there
>>>>>>>>>>
>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>> further, in the hdfs-site.xml file , i have given following
>>>>>>>>>> entries. are they correct ?
>>>>>>>>>>
>>>>>>>>>> <property>
>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>> </property>
>>>>>>>>>> <property>
>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>> </property>
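[Note that both entries above point at the same directory; a later reply in this thread advises creating two separate directories for name and data. A hedged sketch of that shape, with illustrative paths only:]

```xml
<!-- hdfs-site.xml sketch: separate directories for metadata and blocks.
     c:\\wksp_name and c:\\wksp_data are example paths, not prescribed ones. -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>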
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>> copyFromLocal: File
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>> copyFromLocal: File
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>
>>>>>>>>>> Because you wrote both paths as local paths. And you need not
>>>>>>>>>> copy hadoop into hdfs... Hadoop is already working..
>>>>>>>>>>
>>>>>>>>>> Just check in the browser after starting your single node
>>>>>>>>>> cluster :
>>>>>>>>>>
>>>>>>>>>> localhost:50070
>>>>>>>>>>
>>>>>>>>>> then go to the "browse the filesystem" link in it..
>>>>>>>>>>
>>>>>>>>>> If there is no directory then make a directory there.
>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>> Then copy any text file there (no need to copy hadoop there),
>>>>>>>>>> because you are going to do processing on the data in that text
>>>>>>>>>> file. That's what hadoop is used for; first you need to make it
>>>>>>>>>> clear in your mind. Then and only then will you do it... otherwise
>>>>>>>>>> not possible..
>>>>>>>>>>
>>>>>>>>>> *Try this:*
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>>>> /hdfs/directory/path
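[A sketch of the usage being described, with assumed local and HDFS paths — sample.txt and /wksp are placeholders, not files from the thread:]

```
$ ./bin/hadoop dfs -mkdir /wksp
$ ./bin/hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/sample.txt /wksp
$ ./bin/hadoop dfs -ls /wksp
```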
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>
>>>>>>>>>> let me surely refer the doc and link which u sent but i need this
>>>>>>>>>> to be working ...
>>>>>>>>>> can you please help
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> MANISH DUNANI
>>>>>>>>>> -THANX
>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Regards
>>>>>>>>>> *Manish Dunani*
>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>> *skype id* : manish.dunani
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Olivier Renault
>>>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>>>> +44 7500 933 036
>>>>>>>>>> orenault@hortonworks.com
>>>>>>>>>> www.hortonworks.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest. i am stuck
i haven't found anything in the log

regards
irfan



On Fri, Sep 13, 2013 at 3:07 PM, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
>
>
>
> On Thu, Sep 12, 2013 at 12:00 PM, Irfan Sayed <ir...@gmail.com>wrote:
>
>> thanks.
>> finally it got installed :)
>>
>> further, when i try to start the namenode, it failed with following log
>>
>> C:\hdp>start_remote_hdp_services.cmd
>> Master nodes: start DFS-DC
>> 0 Master nodes successfully started.
>> 1 Master nodes failed to start.
>>
>> PSComputerName      Service             Message             Status
>> --------------      -------             -------             ------
>>                                         Connecting to re...
>>
>>
>> StartStop-HDPServices : Manually start services on Master nodes then
>> retry full
>>  cluster start.  Exiting.
>> At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
>> + if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
>>     + CategoryInfo          : NotSpecified: (:) [Write-Error],
>> WriteErrorExcep
>>    tion
>>     + FullyQualifiedErrorId :
>> Microsoft.PowerShell.Commands.WriteErrorExceptio
>>    n,StartStop-HDPServices
>>
>>
>> C:\hdp>
>>
>> i tried starting manually as well but no luck
>> anything missing in configuration ?
>>
>> regards
>>
>>
>>
>> On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> You can put the same FQDN as your NameNode for example.
>>>
>>> Thanks
>>> Olivier
>>> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> i do not have any HIVE server host, so what should i put over
>>>> here?? if i comment it out, then i guess it throws an error for that.
>>>> can i put the fqdn of the namenode for the HIVE server host ?
>>>>
>>>> will it be a really working configuration?
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Your cluster-properties.txt should look something like :
>>>>>
>>>>>
>>>>> #Log directory
>>>>> HDP_LOG_DIR=c:\hadoop\logs
>>>>>
>>>>> #Data directory
>>>>> HDP_DATA_DIR=c:\hdp\data
>>>>>
>>>>> #Hosts
>>>>> NAMENODE_HOST=yourmaster.fqdn.com
>>>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>>>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>>>>
>>>>>
>>>>> #Database host
>>>>> DB_FLAVOR=derby
>>>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>>>
>>>>>
>>>>> #Hive properties
>>>>> HIVE_DB_NAME=hive
>>>>> HIVE_DB_USERNAME=hive
>>>>> HIVE_DB_PASSWORD=hive
>>>>>
>>>>> #Oozie properties
>>>>> OOZIE_DB_NAME=oozie
>>>>> OOZIE_DB_USERNAME=oozie
>>>>> OOZIE_DB_PASSWORD=oozie
>>>>>
>>>>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
>>>>> your server names. For the time being, I suggest that you do not install
>>>>> HBase, Oozie,
>>>>>
>>>>> regards,
>>>>> Olivier
>>>>>
>>>>>
>>>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> ok.. now i made some changes and the installation went ahead
>>>>>>>> but failed at the property "HIVE_SERVER_HOST" declaration
>>>>>>>> in the cluster config file, i have commented out this property. if i
>>>>>>>> uncomment it, then what server address should i give ???
>>>>>>>>
>>>>>>>> i have only two windows machines setup.
>>>>>>>> 1: for namenode and another for datanode
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> i installed the latest java in the c:\java folder and now there is
>>>>>>>>> no error in the log file related to java
>>>>>>>>> however, now it is throwing an error about not having the cluster
>>>>>>>>> properties file.
>>>>>>>>> in fact i am running/installing hdp from the location where this
>>>>>>>>> file exists. still it is throwing the error
>>>>>>>>>
>>>>>>>>> please find the attached
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> Here's your issue (from the logs you attached earlier):
>>>>>>>>>>
>>>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>>>
>>>>>>>>>> It seems that you installed the Java prerequisite in the default
>>>>>>>>>> path, which is %PROGRAMFILES% (expands to C:\Program Files in your case).
>>>>>>>>>> HDP 1.3 does not like spaces in paths, so you need to reinstall Java under
>>>>>>>>>> c:\java\ or something similar (in a path with no spaces).
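[A quick check for the condition described above — this sketch flags a JAVA_HOME containing spaces, the symptom behind the "was unexpected at this time" message; the paths are examples, not the poster's real ones:]

```shell
# Reports whether a candidate JAVA_HOME contains a space.
check_java_home() {
  case "$1" in
    *" "*) echo "BAD: JAVA_HOME contains spaces: $1" ;;
    *)     echo "OK: $1" ;;
  esac
}

check_java_home "C:\Program Files\Java\jdk1.6.0_31"   # the failing layout
check_java_home "C:\java\jdk1.6.0_31"                 # relocated, no spaces
```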
>>>>>>>>>>
>>>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>>>> *Subject:* Re: about replication
>>>>>>>>>>
>>>>>>>>>> please find the attached.
>>>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>>>> as it is not generated
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Could you share the log files ( c:\hdp.log,
>>>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>>>>>>> well as your clusterproperties.txt ?
>>>>>>>>>>
>>>>>>>>>> Thanks,
>>>>>>>>>> Olivier
>>>>>>>>>>
>>>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> thanks. i followed the user manual for deployment and installed
>>>>>>>>>> all pre-requisites
>>>>>>>>>> i modified the command and still the issue persists. please
>>>>>>>>>> suggest
>>>>>>>>>>
>>>>>>>>>> please refer below
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>>>>> You will find the correct syntax as part of the doc.
>>>>>>>>>>
>>>>>>>>>> Happy reading
>>>>>>>>>> Olivier
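[For reference, an invocation of the shape being described might look like the sketch below — file names and paths are assumptions, and the HDP for Windows guide has the authoritative syntax. /i and /lv are standard msiexec options; HDP_LAYOUT is the property named elsewhere in this thread:]

```
:: Sketch only; replace the msi name and paths with your own.
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp\hdp.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```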
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> i referred the logs and manuals. i modified the clusterproperties
>>>>>>>>>> file and then double clicked on the msi file
>>>>>>>>>> however, it still failed.
>>>>>>>>>> further i started the installation on command line by giving
>>>>>>>>>> HDP_LAYOUT=clusterproperties file path,
>>>>>>>>>> installation went ahead and it failed for .NET framework 4.0 and
>>>>>>>>>> VC++ redistributable package dependency
>>>>>>>>>>
>>>>>>>>>> i installed both and started the installation again.
>>>>>>>>>> it failed again with the following error
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>> when i searched for the logs mentioned in the error , i never
>>>>>>>>>> found them
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Correct, you need to define the cluster configuration as part of
>>>>>>>>>> a file. You will find some information on the configuration file as part of
>>>>>>>>>> the documentation.
>>>>>>>>>>
>>>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>>>
>>>>>>>>>> You should make sure to have also installed the pre-requisites.
>>>>>>>>>>
>>>>>>>>>> Thanks
>>>>>>>>>> Olivier
>>>>>>>>>>
>>>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> thanks. sorry for the long break. actually got involved in some
>>>>>>>>>> other priorities
>>>>>>>>>> i downloaded the installer and while installing i got the
>>>>>>>>>> following error
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>> do i need to make any configuration prior to installation ??
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Here is the link
>>>>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>>>>
>>>>>>>>>> Olivier
>>>>>>>>>>
>>>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>>>>> setup first using the url :
>>>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>>>
>>>>>>>>>> i don't think i am running DN on both machines
>>>>>>>>>> please find the attached log
>>>>>>>>>>
>>>>>>>>>> hi olivier
>>>>>>>>>>
>>>>>>>>>> can you please give me the download link ?
>>>>>>>>>> let me try please
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Are you running a DN on both the machines? Could you please show
>>>>>>>>>> me your DN logs?
>>>>>>>>>>
>>>>>>>>>> Also, consider Olivier's suggestion. It's definitely a better
>>>>>>>>>> option.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Irfu,
>>>>>>>>>>
>>>>>>>>>> If you want to quickly get Hadoop running on the windows platform,
>>>>>>>>>> you may want to try our distribution for Windows. You will be able
>>>>>>>>>> to find the msi on our website.
>>>>>>>>>>
>>>>>>>>>> Regards
>>>>>>>>>> Olivier
>>>>>>>>>>
>>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>>
>>>>>>>>>> because, on windows, anyway i have to try and see how hadoop works
>>>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>>>
>>>>>>>>>> so, on windows, here is the setup:
>>>>>>>>>>
>>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>>
>>>>>>>>>> now, the exact problem is :
>>>>>>>>>> 1: the datanode is not getting started
>>>>>>>>>> 2: replication : if i put any file/folder on any datanode, it
>>>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>>> windows. Not a good idea. Anyways, could you please show me your
>>>>>>>>>> log files? I also need some additional info :
>>>>>>>>>>
>>>>>>>>>> - The exact problem which you are facing right now
>>>>>>>>>> - Your cluster summary (no. of nodes etc)
>>>>>>>>>> - Your latest configuration files
>>>>>>>>>> - Your /etc/hosts file
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> ok. thanks
>>>>>>>>>> now, i need to start with an all-windows setup first as our product
>>>>>>>>>> will be based on windows
>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>
>>>>>>>>>> the datanode is not starting . please suggest
>>>>>>>>>>
>>>>>>>>>> regards,
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> can i have a setup like this :
>>>>>>>>>> the namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu
>>>>>>>>>> etc) and the datanodes are a combination of any OS (windows, linux,
>>>>>>>>>> unix etc)
>>>>>>>>>>
>>>>>>>>>> however, my doubt is, as the file systems of the two systems
>>>>>>>>>> (win and linux) are different, datanodes of these systems can not be
>>>>>>>>>> part of a single cluster . do i have to make the windows cluster
>>>>>>>>>> and the UNIX cluster separate ?
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>>> same as Cygwin PIDs, so that may be causing the discrepancy. I
>>>>>>>>>> don't know how well Hadoop works in Cygwin as I have never tried
>>>>>>>>>> it. Work is in progress for native Windows support, however there
>>>>>>>>>> are no official releases with Windows support yet. It may be easier
>>>>>>>>>> to get familiar with a release
>>>>>>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if
>>>>>>>>>> you are new to it.
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks
>>>>>>>>>> here is what i did .
>>>>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>>>>> command
>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>
>>>>>>>>>> started dfs again with the command : "./start-dfs.sh"
>>>>>>>>>>
>>>>>>>>>> when i ran the "jps" command , it shows :
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>> $ ./jps.exe
>>>>>>>>>> 4536 Jps
>>>>>>>>>> 2076 NameNode
>>>>>>>>>>
>>>>>>>>>> however, when i open the pid file for the namenode, it is not
>>>>>>>>>> showing the pid as 2076; it shows 4560. it should show 2076
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then
>>>>>>>>>> restarting the datanode.
>>>>>>>>>>
>>>>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>>>>> already.
>>>>>>>>>>
>>>>>>>>>> -Arpit
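[Editorial note: a minimal sketch of the stale-pid-file cleanup Arpit suggests. The pid-file path is an assumption (Hadoop 1.x defaults HADOOP_PID_DIR to /tmp); adjust it for your install.]

```shell
# If the pid file exists but no process with that pid is alive, the file is
# stale and can safely be removed before restarting the daemon.
PID_FILE="${TMPDIR:-/tmp}/hadoop-$USER-datanode.pid"
if [ -f "$PID_FILE" ] && ! kill -0 "$(cat "$PID_FILE")" 2>/dev/null; then
  echo "removing stale pid file $PID_FILE"
  rm -f "$PID_FILE"
fi
# then restart the daemon, e.g.: bin/hadoop-daemon.sh start datanode
```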
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> the datanode is trying to connect to the namenode continuously but
>>>>>>>>>> fails
>>>>>>>>>>
>>>>>>>>>> when i try to run the "jps" command it says :
>>>>>>>>>>
>>>>>>>>>> $ ./jps.exe
>>>>>>>>>> 4584 NameNode
>>>>>>>>>> 4016 Jps
>>>>>>>>>>
>>>>>>>>>> and when i ran "./start-dfs.sh" then it says :
>>>>>>>>>>
>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it
>>>>>>>>>> first.
>>>>>>>>>>
>>>>>>>>>> these two outputs are contradictory
>>>>>>>>>> please find the attached logs
>>>>>>>>>>
>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>>> helpful.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> i followed the url and did the steps mentioned in it. i have
>>>>>>>>>> deployed on the windows platform
>>>>>>>>>>
>>>>>>>>>> Now, i am able to browse the url : http://localhost:50070 (name
>>>>>>>>>> node), however, i am not able to browse the url :
>>>>>>>>>> http://localhost:50030
>>>>>>>>>>
>>>>>>>>>> please refer below
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>> i have modified all the config files as mentioned and formatted
>>>>>>>>>> the hdfs file system as well
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>
>>>>>>>>>> let me follow the url which you gave for the pseudo-distributed
>>>>>>>>>> setup and then i will switch to distributed mode
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well; it is
>>>>>>>>>> required in *mapred-site.xml*.
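[Editorial note: a minimal sketch of the property placement Tariq describes, for a Hadoop 1.x pseudo-distributed setup; the host:port values are illustrative assumptions, not from the thread.]

```xml
<!-- core-site.xml: fs.default.name belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value> <!-- assumed host:port -->
  </property>
</configuration>

<!-- mapred-site.xml: mapred.job.tracker belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value> <!-- assumed host:port -->
  </property>
</configuration>
```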
>>>>>>>>>>
>>>>>>>>>> I would suggest you to do a pseudo-distributed setup first in
>>>>>>>>>> order to get yourself familiar with the process and then proceed
>>>>>>>>>> to the distributed mode. You can visit this link
>>>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>>>
>>>>>>>>>> HTH
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks tariq for the response.
>>>>>>>>>> as discussed last time, i have sent you all the config files in
>>>>>>>>>> my setup .
>>>>>>>>>> can you please go through them ?
>>>>>>>>>>
>>>>>>>>>> please let me know
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch for some time
>>>>>>>>>> because of ramzan and eid. Resuming work today.
>>>>>>>>>>
>>>>>>>>>> What's the current status?
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> First of all read the concepts .. I hope you will like it..
>>>>>>>>>>
>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>
>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> hey Tariq,
>>>>>>>>>> i am still stuck ..
>>>>>>>>>> can you please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>> the attachment got quarantined
>>>>>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>>
>>>>>>>>>> if i run the jps command on the namenode :
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>> $ ./jps.exe
>>>>>>>>>> 3164 NameNode
>>>>>>>>>> 1892 Jps
>>>>>>>>>>
>>>>>>>>>> same command on the datanode :
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>> $ ./jps.exe
>>>>>>>>>> 3848 Jps
>>>>>>>>>>
>>>>>>>>>> jps does not list any process for the datanode. however, on the
>>>>>>>>>> web browser i can see one live data node
>>>>>>>>>> please find the attached conf rar file of the namenode
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> OK, we'll start fresh. Could you please show me your latest config
>>>>>>>>>> files?
>>>>>>>>>>
>>>>>>>>>> BTW, are your daemons running fine? Use JPS to verify that.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> i have created these dirs "wksp_data" and "wksp_name" on both the
>>>>>>>>>> datanode and namenode
>>>>>>>>>> made the respective changes in the "hdfs-site.xml" file
>>>>>>>>>> formatted the namenode
>>>>>>>>>> started the dfs
>>>>>>>>>>
>>>>>>>>>> but still, i am not able to browse the file system through the web
>>>>>>>>>> browser
>>>>>>>>>> please refer below
>>>>>>>>>>
>>>>>>>>>> is anything still missing ?
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> do these dirs need to be created on all datanodes and namenodes ?
>>>>>>>>>> further, does hdfs-site.xml need to be updated on both datanodes
>>>>>>>>>> and namenodes for these new dirs?
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Create 2 directories manually corresponding to the values of the
>>>>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions
>>>>>>>>>> of these directories to 755. When you start pushing data into your
>>>>>>>>>> HDFS, data will start going inside the directory specified by
>>>>>>>>>> dfs.data.dir and the associated metadata will go inside
>>>>>>>>>> dfs.name.dir. Remember, you store data in HDFS, but it eventually
>>>>>>>>>> gets stored in your local/native FS. But you cannot see this data
>>>>>>>>>> directly on your local/native FS.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
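[Editorial note: a minimal sketch of the directory setup Tariq describes above; the base path is an illustrative assumption, so substitute the values from your own hdfs-site.xml.]

```shell
# Create the two directories backing dfs.name.dir and dfs.data.dir and give
# them 755 permissions, as described in the reply above.
BASE="$(mktemp -d)"            # assumption: stand-in for your real base path
NAME_DIR="$BASE/dfs/name"      # what dfs.name.dir should point to
DATA_DIR="$BASE/dfs/data"      # what dfs.data.dir should point to
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
ls -ld "$NAME_DIR" "$DATA_DIR"
```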
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks. ****
>>>>>>>>>>
>>>>>>>>>> however, i need this to be working on windows environment as
>>>>>>>>>> project requirement.****
>>>>>>>>>>
>>>>>>>>>> i will add/work on Linux later ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> so, now , at this stage , c:\\wksp is the HDFS file system OR do
>>>>>>>>>> i need to create it from command line ?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please suggest****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards,****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> Hello Irfan,
>>>>>>>>>>
>>>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>>>>>
>>>>>>>>>> The HDFS webUI doesn't provide us the ability to create a file or
>>>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>>>> operations like create, move, copy etc are not supported.
>>>>>>>>>>
>>>>>>>>>> These values look fine to me.
>>>>>>>>>>
>>>>>>>>>> One suggestion though. Try getting a Linux machine (if possible).
>>>>>>>>>> Or at least use a VM. I personally feel that using Hadoop on
>>>>>>>>>> windows is always messy.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> when i browse the file system , i am getting the following :
>>>>>>>>>> i haven't seen any make-directory option there
>>>>>>>>>>
>>>>>>>>>> do i need to create it from the command line ?
>>>>>>>>>> further, in the hdfs-site.xml file , i have given the following
>>>>>>>>>> entries. are they correct ?
>>>>>>>>>>
>>>>>>>>>> <property>
>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>> </property>
>>>>>>>>>> <property>
>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>> </property>
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
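[Editorial note: in the entries above, dfs.name.dir and dfs.data.dir point at the same directory. In Hadoop 1.x they should normally point at two different directories; a sketch, using the wksp_name/wksp_data names that appear later in this thread as an assumption:]

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value> <!-- metadata directory, distinct from data -->
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value> <!-- block-data directory -->
</property>
```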
>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>> copyFromLocal: File
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>> copyFromLocal: File
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>
>>>>>>>>>> Because you had written both the paths as local, and you need not
>>>>>>>>>> copy hadoop into hdfs... Hadoop is already working..
>>>>>>>>>>
>>>>>>>>>> Just check out in the browser after starting ur single node
>>>>>>>>>> cluster :
>>>>>>>>>>
>>>>>>>>>> localhost:50070
>>>>>>>>>>
>>>>>>>>>> then go for the "browse the filesystem" link in it..
>>>>>>>>>>
>>>>>>>>>> If there is no directory then make a directory there.
>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>> Then copy any text file there (no need to copy hadoop there),
>>>>>>>>>> because u are going to do processing on that data in the text file.
>>>>>>>>>> That's what hadoop is used for; first u need to make it clear in ur
>>>>>>>>>> mind. Then and then u will do it... otherwise not possible..
>>>>>>>>>>
>>>>>>>>>> *Try this:*
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>>>> /hdfs/directory/path
>>>>>>>>>>
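[Editorial note: a hedged example of the command above with a sanity check first. The copyFromLocal errors in this thread came from a wrong *local* path, so it helps to verify the source exists before copying; the paths reuse the ones quoted earlier in the thread.]

```shell
# Check the local source path before handing it to hadoop.
SRC=/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz
if [ -f "$SRC" ]; then
  ./bin/hadoop dfs -copyFromLocal "$SRC" /wksp
  ./bin/hadoop dfs -ls /wksp    # confirm the file landed in HDFS
else
  echo "local path does not exist: $SRC"
fi
```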
>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks. yes , i am a newbie.
>>>>>>>>>> however, i need a windows setup.
>>>>>>>>>>
>>>>>>>>>> let me surely refer to the doc and link which u sent, but i need
>>>>>>>>>> this to be working ...
>>>>>>>>>> can you please help
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> MANISH DUNANI
>>>>>>>>>> -THANX
>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> -- ****
>>>>>>>>>>
>>>>>>>>>> Regards****
>>>>>>>>>>
>>>>>>>>>> *Manish Dunani*****
>>>>>>>>>>
>>>>>>>>>> *Contact No* : +91 9408329137****
>>>>>>>>>>
>>>>>>>>>> *skype id* : manish.dunani****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>>  ** **
>>>>>>>>>>
>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> -- ****
>>>>>>>>>>
>>>>>>>>>> Olivier Renault****
>>>>>>>>>>
>>>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>>>> +44 7500 933 036
>>>>>>>>>> orenault@hortonworks.com
>>>>>>>>>> www.hortonworks.com****
>>>>>>>>>>
>>>>>>>>>> **** <http://hortonworks.com/products/hortonworks-sandbox/>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>>>> immediately and delete it from your system. Thank You.****<http://hortonworks.com/products/hortonworks-sandbox/>
>>>>>>>>>>
>>>>>>>>>>  ** ** <http://hortonworks.com/products/hortonworks-sandbox/>
>>>>>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest. i am stuck.
i haven't found anything in the log.

regards
irfan



On Fri, Sep 13, 2013 at 3:07 PM, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
>
>
>
> On Thu, Sep 12, 2013 at 12:00 PM, Irfan Sayed <ir...@gmail.com>wrote:
>
>> thanks.
>> finally it got installed :)
>>
>> further, when i try to start the namenode, it failed with following log
>>
>> C:\hdp>start_remote_hdp_services.cmd
>> Master nodes: start DFS-DC
>> 0 Master nodes successfully started.
>> 1 Master nodes failed to start.
>>
>> PSComputerName      Service             Message             Status
>> --------------      -------             -------             ------
>>                                         Connecting to re...
>>
>>
>> StartStop-HDPServices : Manually start services on Master nodes then
>> retry full
>>  cluster start.  Exiting.
>> At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
>> + if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
>>     + CategoryInfo          : NotSpecified: (:) [Write-Error],
>> WriteErrorExcep
>>    tion
>>     + FullyQualifiedErrorId :
>> Microsoft.PowerShell.Commands.WriteErrorExceptio
>>    n,StartStop-HDPServices
>>
>>
>> C:\hdp>
>>
>> i tried starting manually as well but no luck
>> anything missing in configuration ?
>>
>> regards
>>
>>
>>
>> On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> You can put the same FQDN as your NameNode for example.
>>>
>>> Thanks
>>> Olivier
>>> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> i do not have any HIVE server host, so what should i put over
>>>> here? if i comment the property out, i guess it throws an error.
>>>> can i put the fqdn of the namenode for the HIVE server host?
>>>>
>>>> will it be a really working configuration?
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Your cluster-properties.txt should look something like :
>>>>>
>>>>>
>>>>> #Log directory
>>>>> HDP_LOG_DIR=c:\hadoop\logs
>>>>>
>>>>> #Data directory
>>>>> HDP_DATA_DIR=c:\hdp\data
>>>>>
>>>>> #Hosts
>>>>> NAMENODE_HOST=yourmaster.fqdn.com
>>>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>>>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>>>>
>>>>>
>>>>> #Database host
>>>>> DB_FLAVOR=derby
>>>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>>>
>>>>>
>>>>> #Hive properties
>>>>> HIVE_DB_NAME=hive
>>>>> HIVE_DB_USERNAME=hive
>>>>> HIVE_DB_PASSWORD=hive
>>>>>
>>>>> #Oozie properties
>>>>> OOZIE_DB_NAME=oozie
>>>>> OOZIE_DB_USERNAME=oozie
>>>>> OOZIE_DB_PASSWORD=oozie
>>>>>
>>>>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
>>>>> your server names. For the time being, I suggest that you do not install
>>>>> HBase or Oozie.
>>>>>
>>>>> regards,
>>>>> Olivier
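[Note: the cluster-properties file above is plain key=value text with `#` comments. Much of the back-and-forth later in this thread comes from a host property left commented out or empty, which a quick pre-install check would catch. The sketch below is not part of the HDP tooling, and the exact set of mandatory keys may differ by HDP version:]

```python
# Minimal sanity check for an HDP cluster-properties file (a sketch,
# not shipped with HDP). Parses key=value lines, skipping blank lines
# and # comments, then reports required host keys that are missing
# or still empty.
REQUIRED = ["NAMENODE_HOST", "JOBTRACKER_HOST", "HIVE_SERVER_HOST",
            "OOZIE_SERVER_HOST", "TEMPLETON_HOST", "SLAVE_HOSTS"]

def parse_properties(text):
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # skip comments and blanks
        key, sep, value = line.partition("=")
        if sep:
            props[key.strip()] = value.strip()
    return props

def missing_keys(props):
    """Required keys that are absent or have an empty value."""
    return [k for k in REQUIRED if not props.get(k)]
```

[For example, a file with HIVE_SERVER_HOST commented out would be flagged before the MSI fails mid-way.]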
>>>>>
>>>>>
>>>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> ok.. now i made some changes and installation went ahead
>>>>>>>> but failed in property "HIVE_SERVER_HOST" declaration
>>>>>>>> in cluster config file, i have commented this property. if i
>>>>>>>> uncomment , then what server address will give ???
>>>>>>>>
>>>>>>>> i have only two windows machines setup.
>>>>>>>> 1: for namenode and another for datanode
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> i installed the latest java in c:\java folder and now no error in
>>>>>>>>> log file related to java
>>>>>>>>> however, now it is throwing error on not having cluster properties
>>>>>>>>> file.
>>>>>>>>> in fact i am running/installing hdp from the location where this
>>>>>>>>> file exist . still it is throwing error
>>>>>>>>>
>>>>>>>>> please find the attached
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>>>
>>>>>>>>>>  Here’s your issue (from the logs you attached earlier):
>>>>>>>>>>
>>>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>>>
>>>>>>>>>> It seems that you installed the Java prerequisite in the default
>>>>>>>>>> path, which is %PROGRAMFILES% (expands to C:\Program Files in your case).
>>>>>>>>>> HDP 1.3 does not like spaces in paths, so you need to reinstall Java under
>>>>>>>>>> c:\java\ or something similar (in a path with no spaces).
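[Note: the "Files\Java\jdk1.6.0_31 was unexpected at this time" failure is what cmd.exe prints when an unquoted %JAVA_HOME% containing a space splits at the space. A tiny pre-flight check along these lines (a sketch, not part of any HDP tool) would flag the bad path before installing:]

```python
# Check whether a JAVA_HOME value is safe for batch scripts that
# expand it unquoted -- the cause of the cmd.exe error quoted above.
# A sketch; HDP itself does not ship this helper.
def java_home_ok(path):
    """True if the path is non-empty and contains no spaces."""
    return bool(path) and " " not in path
```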
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>>>> *Subject:* Re: about replication****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please find the attached.****
>>>>>>>>>>
>>>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>>>> as it is not generated ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Could you share the log files ( c:\hdp.log,
>>>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>>>>>>>>>> well as your clusterproperties.txt ?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Thanks, ****
>>>>>>>>>>
>>>>>>>>>> Olivier****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks. i followed the user manual for deployment and installed
>>>>>>>>>> all pre-requisites
>>>>>>>>>>
>>>>>>>>>> i modified the command and still the issue persists. please
>>>>>>>>>> suggest
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please refer below ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>>>>>
>>>>>>>>>> You will find the correct syntax as part of doc. ****
>>>>>>>>>>
>>>>>>>>>> Happy reading
>>>>>>>>>> Olivier ****
>>>>>>>>>>
>>>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:*
>>>>>>>>>> ***
>>>>>>>>>>
>>>>>>>>>>  thanks. ****
>>>>>>>>>>
>>>>>>>>>> i referred to the logs and manuals. i modified the clusterproperties
>>>>>>>>>> file and then double-clicked the msi file
>>>>>>>>>>
>>>>>>>>>> however, it still failed.
>>>>>>>>>>
>>>>>>>>>> further, i started the installation on the command line by giving
>>>>>>>>>> HDP_LAYOUT=clusterproperties file path,
>>>>>>>>>>
>>>>>>>>>> installation went ahead and it failed on the .NET framework 4.0 and
>>>>>>>>>> VC++ redistributable package dependency
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> i installed both and started again the installation. ****
>>>>>>>>>>
>>>>>>>>>> failed again with following error ****
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> when i search for the logs mentioned in the error , i never found
>>>>>>>>>> that ****
>>>>>>>>>>
>>>>>>>>>> please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>> Correct, you need to define the cluster configuration as part of
>>>>>>>>>> a file. You will find some information on the configuration file as part of
>>>>>>>>>> the documentation.
>>>>>>>>>>
>>>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>>>
>>>>>>>>>> You should also make sure to have installed the prerequisites.
>>>>>>>>>>
>>>>>>>>>> Thanks
>>>>>>>>>> Olivier ****
>>>>>>>>>>
>>>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:*
>>>>>>>>>> ***
>>>>>>>>>>
>>>>>>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>>>>>>> other priorities****
>>>>>>>>>>
>>>>>>>>>> i downloaded the installer and while installing i got following
>>>>>>>>>> error ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> do i need to make any configuration prior to installation ??****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>> Here is the link ****
>>>>>>>>>>
>>>>>>>>>> http://download.hortonworks.com/products/hdp-windows/****
>>>>>>>>>>
>>>>>>>>>> Olivier ****
>>>>>>>>>>
>>>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>>  thanks.****
>>>>>>>>>>
>>>>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>>>>> setup first using the url :
>>>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>>  ****
>>>>>>>>>>
>>>>>>>>>> i don't think so i am running DN on both machine ****
>>>>>>>>>>
>>>>>>>>>> please find the attached log****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> hi olivier ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> can you please give me download link ?****
>>>>>>>>>>
>>>>>>>>>> let me try please ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Are you running DN on both the machines? Could you please show
>>>>>>>>>> me your DN logs?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Also, consider Oliver's suggestion. It's definitely a better
>>>>>>>>>> option.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>> Irfu, ****
>>>>>>>>>>
>>>>>>>>>> If you want to quickly get Hadoop running on windows platform.
>>>>>>>>>> You may want to try our distribution for Windows. You will be able to find
>>>>>>>>>> the msi on our website. ****
>>>>>>>>>>
>>>>>>>>>> Regards
>>>>>>>>>> Olivier ****
>>>>>>>>>>
>>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>>  thanks. ****
>>>>>>>>>>
>>>>>>>>>> ok. i think i need to change the plan over here ****
>>>>>>>>>>
>>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>>> works ****
>>>>>>>>>>
>>>>>>>>>> on UNIX, it is already known that ,  it is working fine. ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> so, on windows , here is the setup:****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> namenode : windows 2012 R2 ****
>>>>>>>>>>
>>>>>>>>>> datanode : windows 2012 R2 ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> now, the exact problem is :****
>>>>>>>>>>
>>>>>>>>>> 1: datanode is not getting started ****
>>>>>>>>>>
>>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>>> should get replicated to all another available datanodes ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Seriously?? You are planning to develop something using Hadoop
>>>>>>>>>> on windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>>>> also need some additional info :
>>>>>>>>>>
>>>>>>>>>> -The exact problem which you are facing right now****
>>>>>>>>>>
>>>>>>>>>> -Your cluster summary(no. of nodes etc)****
>>>>>>>>>>
>>>>>>>>>> -Your latest configuration files****
>>>>>>>>>>
>>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  ok. thanks****
>>>>>>>>>>
>>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>>> will be based on windows ****
>>>>>>>>>>
>>>>>>>>>> so, now, please tell me how to resolve the issue ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> datanode is not starting . please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards,****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>> doing that. But it is not a very wise setup.****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  please suggest****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks.****
>>>>>>>>>>
>>>>>>>>>> can i have setup like this :****
>>>>>>>>>>
>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>> etc)****
>>>>>>>>>>
>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>> unix etc )****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> however, my doubt is,  as the file systems of  both the systems
>>>>>>>>>> (win and linux ) are different ,  datanodes of these systems can not be
>>>>>>>>>> part of single cluster . i have to make windows cluster separate and UNIX
>>>>>>>>>> cluster separate ?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>>> same as Cygwin PIDs, so that may be causing the discrepancy. I don't know
>>>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>> progress for native Windows support, however there are no official releases
>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/>
>>>>>>>>>> on Linux if you are new to it.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks ****
>>>>>>>>>>
>>>>>>>>>> here is what i did .****
>>>>>>>>>>
>>>>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>>>>> command ****
>>>>>>>>>>
>>>>>>>>>> then deleted all pid files for namenodes and datanodes ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> started dfs again with command : "./start-dfs.sh"****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> when i ran the "Jps" command . it shows****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>>>>
>>>>>>>>>> $ ./jps.exe****
>>>>>>>>>>
>>>>>>>>>> 4536 Jps****
>>>>>>>>>>
>>>>>>>>>> 2076 NameNode****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> however, when i open the pid file for the namenode, it shows
>>>>>>>>>> the pid as 4560. on the contrary, it should show 2076
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Most likely there is a stale pid file. Something like
>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>> the datanode.
>>>>>>>>>>
>>>>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>>>>> already.
>>>>>>>>>>
>>>>>>>>>> -Arpit
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  datanode is trying to connect to namenode continuously but
>>>>>>>>>> fails ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> when i try to run "jps" command it says :****
>>>>>>>>>>
>>>>>>>>>> $ ./jps.exe****
>>>>>>>>>>
>>>>>>>>>> 4584 NameNode****
>>>>>>>>>>
>>>>>>>>>> 4016 Jps****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> $ ./start-dfs.sh****
>>>>>>>>>>
>>>>>>>>>> namenode running as process 3544. Stop it first.****
>>>>>>>>>>
>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.****
>>>>>>>>>>
>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it
>>>>>>>>>> first.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> both these logs are contradictory ****
>>>>>>>>>>
>>>>>>>>>> please find the attached logs ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> should i attach the conf files as well ?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>>  ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Your DN is still not running. Showing me the logs would be
>>>>>>>>>> helpful.****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  i followed the url and did the steps mention in that. i have
>>>>>>>>>> deployed on the windows platform****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node
>>>>>>>>>> )****
>>>>>>>>>>
>>>>>>>>>> however, not able to browse url : http://localhost:50030****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please refer below****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> i have modified all the config files as mentioned and formatted
>>>>>>>>>> the hdfs file system as well ****
>>>>>>>>>>
>>>>>>>>>> please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks. i followed this url :
>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> let me follow the url which you gave for pseudo distributed setup
>>>>>>>>>> and then will switch to distributed mode****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  You are welcome. Which link have you followed for the
>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well; it is
>>>>>>>>>> required in *mapred-site.xml*.
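[Note: spelled out, the split described above would look roughly like this for a pre-2.x Hadoop. The host name and ports are placeholders, not values taken from this thread:]

```xml
<!-- core-site.xml : the filesystem default belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml : the job tracker belongs here, not in hdfs-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>namenode-host:9001</value>
  </property>
</configuration>
```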
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> I would suggest you do a pseudo distributed setup first in
>>>>>>>>>> order to get yourself familiar with the process and then proceed to the
>>>>>>>>>> distributed mode. You can visit this link
>>>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> HTH****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks tariq for response. ****
>>>>>>>>>>
>>>>>>>>>> as discussed last time, i have sent you all the config files in
>>>>>>>>>> my setup . ****
>>>>>>>>>>
>>>>>>>>>> can you please go through that ?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please let me know ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  I'm sorry for being unresponsive. Was out of touch for sometime
>>>>>>>>>> because of ramzan and eid. Resuming work today.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> What's the current status?****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>> manishd207@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  First of all read the concepts ..I hope you will like it..****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  hey Tariq,****
>>>>>>>>>>
>>>>>>>>>> i am still stuck .. ****
>>>>>>>>>>
>>>>>>>>>> can you please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> irfan ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  attachment got quarantined ****
>>>>>>>>>>
>>>>>>>>>> resending in txt format. please rename it to conf.rar ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> if i run the jps command on namenode :****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>>>>
>>>>>>>>>> $ ./jps.exe****
>>>>>>>>>>
>>>>>>>>>> 3164 NameNode****
>>>>>>>>>>
>>>>>>>>>> 1892 Jps****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> same command on datanode :****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>>>>
>>>>>>>>>> $ ./jps.exe****
>>>>>>>>>>
>>>>>>>>>> 3848 Jps****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> jps does not list any process for datanode. however, on web
>>>>>>>>>> browser i can see one live data node ****
>>>>>>>>>>
>>>>>>>>>> please find the attached conf rar file of namenode ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  OK. we'll start fresh. Could you plz show me your latest config
>>>>>>>>>> files?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> BTW, are your daemons running fine?Use JPS to verify that.****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  i have created these dir "wksp_data" and "wksp_name" on both
>>>>>>>>>> datanode and namenode ****
>>>>>>>>>>
>>>>>>>>>> made the respective changes in "hdfs-site.xml" file ****
>>>>>>>>>>
>>>>>>>>>> formatted the namenode ****
>>>>>>>>>>
>>>>>>>>>> started the dfs ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> but still, not able to browse the file system through web browser
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> please refer below ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> anything still missing ?****
>>>>>>>>>>
>>>>>>>>>> please suggest ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  these dir needs to be created on all datanodes and namenodes ?**
>>>>>>>>>> **
>>>>>>>>>>
>>>>>>>>>> further,  hdfs-site.xml needs to be updated on both datanodes and
>>>>>>>>>> namenodes for these new dir?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Create 2 directories manually corresponding to the values of
>>>>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>>>>> cannot see this data directly on your local/native FS.
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:****
>>>>>>>>>>
>>>>>>>>>>  thanks. ****
>>>>>>>>>>
>>>>>>>>>> however, i need this to be working on windows environment as
>>>>>>>>>> project requirement.****
>>>>>>>>>>
>>>>>>>>>> i will add/work on Linux later ****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> so, now , at this stage , c:\\wksp is the HDFS file system OR do
>>>>>>>>>> i need to create it from command line ?****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> please suggest****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> regards,****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>>
>>>>>>>>>>  Hello Irfan,****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> HDFS webUI doesn't provide us the ability to create file or
>>>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>>>> operation like create, move, copy etc are not supported.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> These values look fine to me.****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> One suggestion though. Try getting a Linux machine(if possible).
>>>>>>>>>> Or at least use a VM. I personally feel that using Hadoop on windows is
>>>>>>>>>> always messy.****
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> ****
>>>>>>>>>>
>>>>>>>>>> Warm Regards,****
>>>>>>>>>>
>>>>>>>>>> Tariq****
>>>>>>>>>>
>>>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>>>
>>>>>>>>>> ** **
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> when i browse the file system, i am getting the following:
>>>>>>>>>> i haven't seen any make-directory option there.
>>>>>>>>>>
>>>>>>>>>> do i need to create it from the command line ?
>>>>>>>>>> further, in the hdfs-site.xml file, i have given the following
>>>>>>>>>> entries. are they correct ?
>>>>>>>>>>
>>>>>>>>>> <property>
>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>> </property>
>>>>>>>>>> <property>
>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>> </property>
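A common convention (an assumption here, not something stated in the thread) is to give dfs.name.dir and dfs.data.dir two separate directories, since one holds the namenode's metadata and the other the datanode's blocks; a sketch, with the c:\\hadoop\\... paths as placeholders:

```xml
<!-- hdfs-site.xml: separate dirs for namenode metadata and datanode blocks -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\hadoop\\name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\hadoop\\data</value>
</property>
```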
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>> copyFromLocal: File
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>> copyFromLocal: File
>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>
>>>>>>>>>> Both paths you wrote are local, and you do not need to copy the
>>>>>>>>>> Hadoop tarball into HDFS anyway. Hadoop is already working.
>>>>>>>>>>
>>>>>>>>>> Just check in a browser after starting your single-node cluster:
>>>>>>>>>>
>>>>>>>>>> localhost:50070
>>>>>>>>>>
>>>>>>>>>> then follow the "browse the filesystem" link in it.
>>>>>>>>>>
>>>>>>>>>> If there is no directory, then make a directory there.
>>>>>>>>>> That is your HDFS directory.
>>>>>>>>>> Then copy any text file there (no need to copy hadoop there),
>>>>>>>>>> because you are going to do processing on the data in that text
>>>>>>>>>> file. That is what hadoop is used for; first you need to make that
>>>>>>>>>> clear in your mind. Then, and only then, will it work; otherwise it
>>>>>>>>>> is not possible.
>>>>>>>>>>
>>>>>>>>>> *Try this:*
>>>>>>>>>>
>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>>>> /hdfs/directory/path
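The sequence Manish describes can be sketched end to end (assuming a running single-node Hadoop 1.x cluster; `/wksp` and `sample.txt` are placeholder names, so this is illustrative rather than a verified transcript):

```shell
# create the target directory in HDFS (not on the local disk)
./bin/hadoop dfs -mkdir /wksp

# copy a local text file into HDFS; the first path is local, the second is HDFS
./bin/hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/sample.txt /wksp

# verify the file is now in HDFS
./bin/hadoop dfs -ls /wksp
```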
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>> thanks. yes, i am a newbie.
>>>>>>>>>> however, i need a windows setup.
>>>>>>>>>>
>>>>>>>>>> let me surely refer to the doc and link which u sent, but i need this
>>>>>>>>>> to be working ...
>>>>>>>>>> can you please help
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> MANISH DUNANI
>>>>>>>>>> -THANX
>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Regards
>>>>>>>>>> *Manish Dunani*
>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>> *skype id* : manish.dunani
>>>>>>>>>>
>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Olivier Renault
>>>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>>>> +44 7500 933 036
>>>>>>>>>> orenault@hortonworks.com
>>>>>>>>>> www.hortonworks.com
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards



On Thu, Sep 12, 2013 at 12:00 PM, Irfan Sayed <ir...@gmail.com> wrote:

> thanks.
> finally it got installed :)
>
> further, when i try to start the namenode, it failed with the following log:
>
> C:\hdp>start_remote_hdp_services.cmd
> Master nodes: start DFS-DC
> 0 Master nodes successfully started.
> 1 Master nodes failed to start.
>
> PSComputerName      Service             Message             Status
> --------------      -------             -------             ------
>                                         Connecting to re...
>
>
> StartStop-HDPServices : Manually start services on Master nodes then retry
> full
>  cluster start.  Exiting.
> At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
> + if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
>     + CategoryInfo          : NotSpecified: (:) [Write-Error],
> WriteErrorExcep
>    tion
>     + FullyQualifiedErrorId :
> Microsoft.PowerShell.Commands.WriteErrorExceptio
>    n,StartStop-HDPServices
>
>
> C:\hdp>
>
> i tried starting manually as well, but no luck.
> is anything missing in the configuration ?
>
> regards
>
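When the remote start script fails like this, one thing to try (an assumption based on the HDP for Windows layout, not something confirmed in this thread) is starting the services locally on each node and checking individual services in the Windows service manager:

```bat
REM run from the HDP install directory on each node
cd C:\hdp
start_local_hdp_services.cmd

REM or check / start an individual service, e.g. the namenode
sc query namenode
sc start namenode
```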
>
>
> On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> You can put the same FQDN as your NameNode for example.
>>
>> Thanks
>> Olivier
>> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> i do not have any HIVE server host, so what should i put over here ?
>>> if i comment it out, then i guess it throws an error about that.
>>> can i put the fqdn of the namenode for the HIVE server host ?
>>>
>>> will that be a really working configuration ?
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Your cluster-properties.txt should look something like :
>>>>
>>>>
>>>> #Log directory
>>>> HDP_LOG_DIR=c:\hadoop\logs
>>>>
>>>> #Data directory
>>>> HDP_DATA_DIR=c:\hdp\data
>>>>
>>>> #Hosts
>>>> NAMENODE_HOST=yourmaster.fqdn.com
>>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>>>
>>>>
>>>> #Database host
>>>> DB_FLAVOR=derby
>>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>>
>>>>
>>>> #Hive properties
>>>> HIVE_DB_NAME=hive
>>>> HIVE_DB_USERNAME=hive
>>>> HIVE_DB_PASSWORD=hive
>>>>
>>>> #Oozie properties
>>>> OOZIE_DB_NAME=oozie
>>>> OOZIE_DB_USERNAME=oozie
>>>> OOZIE_DB_PASSWORD=oozie
>>>>
>>>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
>>>> your servers' names. For the time being, I suggest that you do not install
>>>> HBase or Oozie.
>>>>
>>>> regards,
>>>> Olivier
>>>>
>>>>
>>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> ok.. now i made some changes and the installation went ahead,
>>>>>>> but it failed at the "HIVE_SERVER_HOST" property declaration.
>>>>>>> in the cluster config file, i have commented this property out. if i
>>>>>>> uncomment it, then what server address should i give ???
>>>>>>>
>>>>>>> i have a setup of only two windows machines:
>>>>>>> 1: for the namenode and another for the datanode
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i installed the latest java in the c:\java folder and now there is no
>>>>>>>> java-related error in the log file.
>>>>>>>> however, it now throws an error about the cluster properties file
>>>>>>>> not being found.
>>>>>>>> in fact i am running/installing hdp from the location where this
>>>>>>>> file exists. still it throws the error.
>>>>>>>>
>>>>>>>> please find the attached
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>>
>>>>>>>>> Here’s your issue (from the logs you attached earlier):
>>>>>>>>>
>>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>>
>>>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>>>>> or something similar (in a path with no spaces).
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>>> *Subject:* Re: about replication
>>>>>>>>>
>>>>>>>>> please find the attached.
>>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>>> as it is not generated
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Could you share the log files ( c:\hdp.log,
>>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>>>>>> well as your clusterproperties.txt ?
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks. i followed the user manual for deployment and installed
>>>>>>>>> all the prerequisites.
>>>>>>>>> i modified the command and still the issue persists. please suggest.
>>>>>>>>>
>>>>>>>>> please refer below
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>>>> You will find the correct syntax as part of the doc.
>>>>>>>>>
>>>>>>>>> Happy reading
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> i referred to the logs and manuals. i modified the clusterproperties
>>>>>>>>> file and then double-clicked on the msi file.
>>>>>>>>> however, it still failed.
>>>>>>>>> further, i started the installation on the command line by giving
>>>>>>>>> HDP_LAYOUT=<clusterproperties file path>;
>>>>>>>>> installation went ahead and then failed on the .NET framework 4.0 and
>>>>>>>>> VC++ redistributable package dependencies.
>>>>>>>>>
>>>>>>>>> i installed both and started the installation again.
>>>>>>>>> it failed again with the following error:
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> when i search for the logs mentioned in the error, i never find
>>>>>>>>> them.
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>>>> file. You will find some information on the configuration file as part of
>>>>>>>>> the documentation.
>>>>>>>>>
>>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>>
>>>>>>>>> You should also make sure to have installed the prerequisites.
>>>>>>>>>
>>>>>>>>> Thanks
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks. sorry for the long break. actually got involved in some
>>>>>>>>> other priorities.
>>>>>>>>> i downloaded the installer, and while installing i got the following
>>>>>>>>> error:
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> do i need to make any configuration prior to installation ??
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Here is the link
>>>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>>>
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> i just followed the instructions to set up the pseudo-distributed
>>>>>>>>> setup first, using the url :
>>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>>
>>>>>>>>> i don't think i am running a DN on both machines.
>>>>>>>>> please find the attached log
>>>>>>>>>
>>>>>>>>> hi olivier,
>>>>>>>>>
>>>>>>>>> can you please give me the download link ?
>>>>>>>>> let me try please
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> Are you running a DN on both the machines? Could you please show
>>>>>>>>> me your DN logs?
>>>>>>>>>
>>>>>>>>> Also, consider Olivier's suggestion. It's definitely a better
>>>>>>>>> option.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Irfan,
>>>>>>>>>
>>>>>>>>> If you want to quickly get Hadoop running on the windows platform,
>>>>>>>>> you may want to try our distribution for Windows. You will be able to find
>>>>>>>>> the msi on our website.
>>>>>>>>>
>>>>>>>>> Regards
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> ok. i think i need to change the plan over here.
>>>>>>>>> let me create two environments. 1: totally windows 2: totally unix
>>>>>>>>>
>>>>>>>>> because, on windows, anyway i have to try and see how hadoop
>>>>>>>>> works;
>>>>>>>>> on unix, it is already known that it is working fine.
>>>>>>>>>
>>>>>>>>> so, on windows, here is the setup :
>>>>>>>>>
>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>
>>>>>>>>> now, the exact problems are :
>>>>>>>>> 1: the datanode is not getting started
>>>>>>>>> 2: replication : if i put any file/folder on any datanode, it
>>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>>> also need some additional info :
>>>>>>>>>
>>>>>>>>> - The exact problem which you are facing right now
>>>>>>>>> - Your cluster summary (no. of nodes, etc.)
>>>>>>>>> - Your latest configuration files
>>>>>>>>> - Your /etc/hosts file
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> ok. thanks.
>>>>>>>>> now, i need to start with an all-windows setup first, as our product
>>>>>>>>> will be based on windows.
>>>>>>>>> so, now, please tell me how to resolve the issue.
>>>>>>>>>
>>>>>>>>> the datanode is not starting. please suggest.
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> can i have a setup like this :
>>>>>>>>> the namenode will be on linux (the flavour may be RHEL, CentOS, Ubuntu, etc.)
>>>>>>>>> and the datanodes are a combination of any OS (windows, linux,
>>>>>>>>> unix, etc.)
>>>>>>>>>
>>>>>>>>> however, my doubt is : as the file systems of both systems
>>>>>>>>> (win and linux) are different, the datanodes of these systems can not be
>>>>>>>>> part of a single cluster. do i have to make the windows cluster and the
>>>>>>>>> unix cluster separate ?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> I just noticed you are on Cygwin. IIRC, Windows PIDs are not the
>>>>>>>>> same as Cygwin PIDs, so that may be causing the discrepancy. I don't know
>>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>> progress for native Windows support; however, there are no official releases
>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
>>>>>>>>> Linux if you are new to it.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks
>>>>>>>>> here is what i did :
>>>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>>>> command,
>>>>>>>>> then deleted all the pid files for the namenodes and datanodes,
>>>>>>>>>
>>>>>>>>> and started dfs again with the command : "./start-dfs.sh"
>>>>>>>>>
>>>>>>>>> when i ran the "jps" command, it shows :
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 4536 Jps
>>>>>>>>> 2076 NameNode
>>>>>>>>>
>>>>>>>>> however, when i open the pid file for the namenode, it shows the
>>>>>>>>> pid as 4560, whereas it should show 2076
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>> the datanode.
>>>>>>>>>
>>>>>>>>> I haven't read the entire thread, so you may have looked at this
>>>>>>>>> already.
>>>>>>>>>
>>>>>>>>> -Arpit
>>>>>>>>>
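Arpit's stale-pid-file suggestion can be sketched as a small cleanup; the `/tmp` location and `hadoop-*-datanode.pid` naming pattern are assumptions (check `HADOOP_PID_DIR` in hadoop-env.sh for the real location), and a scratch directory stands in for it here:

```shell
# stand-in for $HADOOP_PID_DIR (often /tmp); simulate a stale pid file
PID_DIR=$(mktemp -d)
echo 4076 > "$PID_DIR/hadoop-Administrator-datanode.pid"

# the fix: with the daemons stopped, delete any leftover pid files,
# then restart with bin/start-dfs.sh so fresh ones are written
rm -f "$PID_DIR"/hadoop-*-datanode.pid "$PID_DIR"/hadoop-*-namenode.pid

ls -A "$PID_DIR"    # empty: no stale pid left to block startup
```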
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> the datanode is trying to connect to the namenode continuously, but
>>>>>>>>> it fails
>>>>>>>>>
>>>>>>>>> when i try to run the "jps" command, it says :
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 4584 NameNode
>>>>>>>>> 4016 Jps
>>>>>>>>>
>>>>>>>>> and when i ran "./start-dfs.sh", it says :
>>>>>>>>>
>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>>>>
>>>>>>>>> these two outputs are contradictory
>>>>>>>>> please find the attached logs
>>>>>>>>>
>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>> helpful.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> i followed the url and did the steps mentioned in it. i have
>>>>>>>>> deployed on the windows platform.
>>>>>>>>>
>>>>>>>>> now, i am able to browse the url http://localhost:50070 (name node);
>>>>>>>>> however, i am not able to browse the url http://localhost:50030
>>>>>>>>>
>>>>>>>>> please refer below
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> i have modified all the config files as mentioned and formatted
>>>>>>>>> the hdfs file system as well.
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>> thanks. i followed this url :
>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>
>>>>>>>>> let me follow the url which you gave for the pseudo distributed setup
>>>>>>>>> and then will switch to distributed mode
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>>> Remove *mapred.job.tracker* as well; it belongs in *mapred-site.xml*.
>>>>>>>>>
>>>>>>>>> I would suggest you do a pseudo distributed setup first in
>>>>>>>>> order to get yourself familiar with the process and then proceed to the
>>>>>>>>> distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
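[Editorial note: Tariq's advice corresponds to config fragments along these lines. The `hdfs://localhost:9000` and `localhost:9001` values are illustrative Hadoop 1.x pseudo-distributed defaults, not values taken from Irfan's files:]

```xml
<!-- core-site.xml: the filesystem URI belongs here -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml: the JobTracker address belongs here -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```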
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> HTH
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks tariq for the response.
>>>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>>>> setup.
>>>>>>>>> can you please go through them ?
>>>>>>>>>
>>>>>>>>> please let me know
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> I'm sorry for being unresponsive. I was out of touch for some time
>>>>>>>>> because of Ramzan and Eid. Resuming work today.
>>>>>>>>>
>>>>>>>>> What's the current status?
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> First of all read the concepts. I hope you will like it.
>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>
>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> hey Tariq,
>>>>>>>>> i am still stuck ..
>>>>>>>>> can you please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> attachment got quarantined
>>>>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>>
>>>>>>>>> if i run the jps command on the namenode :
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 3164 NameNode
>>>>>>>>> 1892 Jps
>>>>>>>>>
>>>>>>>>> same command on the datanode :
>>>>>>>>>
>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 3848 Jps
>>>>>>>>>
>>>>>>>>> jps does not list any process for the datanode. however, in the web
>>>>>>>>> browser i can see one live data node
>>>>>>>>> please find the attached conf rar file of the namenode
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> OK. We'll start fresh. Could you plz show me your latest config
>>>>>>>>> files?
>>>>>>>>>
>>>>>>>>> BTW, are your daemons running fine? Use JPS to verify that.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> i have created these dirs "wksp_data" and "wksp_name" on both the
>>>>>>>>> datanode and the namenode
>>>>>>>>> made the respective changes in the "hdfs-site.xml" file
>>>>>>>>> formatted the namenode
>>>>>>>>> started the dfs
>>>>>>>>>
>>>>>>>>> but still, not able to browse the file system through the web browser
>>>>>>>>> please refer below
>>>>>>>>>
>>>>>>>>> anything still missing ?
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> do these dirs need to be created on all datanodes and namenodes ?
>>>>>>>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>>>>>>>> namenodes for these new dirs ?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> Create 2 directories manually, corresponding to the values of the
>>>>>>>>> dfs.name.dir and dfs.data.dir properties, and change the permissions of
>>>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>>>> cannot see this data directly on your local/native FS.
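[Editorial note: a sketch of that step on a Unix-style shell. The directory names below are placeholders, not the ones used in this thread:]

```shell
# Create the two local directories that will back dfs.name.dir and
# dfs.data.dir, then give them the 755 permissions suggested above.
mkdir -p /tmp/hadoop-demo/name /tmp/hadoop-demo/data
chmod 755 /tmp/hadoop-demo/name /tmp/hadoop-demo/data
ls -ld /tmp/hadoop-demo/name /tmp/hadoop-demo/data
```

The paths chosen here would then be the values of `dfs.name.dir` and `dfs.data.dir` in hdfs-site.xml.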
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> however, i need this to be working on the windows environment as a
>>>>>>>>> project requirement.
>>>>>>>>> i will add/work on Linux later
>>>>>>>>>
>>>>>>>>> so, now, at this stage, is c:\\wksp the HDFS file system OR do i
>>>>>>>>> need to create it from the command line ?
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> Hello Irfan,
>>>>>>>>>
>>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>>>>
>>>>>>>>> The HDFS webUI doesn't provide the ability to create a file or
>>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>>> operations like create, move, copy etc are not supported.
>>>>>>>>>
>>>>>>>>> These values look fine to me.
>>>>>>>>>
>>>>>>>>> One suggestion though. Try getting a Linux machine (if possible).
>>>>>>>>> Or at least use a VM. I personally feel that using Hadoop on windows is
>>>>>>>>> always messy.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> when i browse the file system, i am getting the following:
>>>>>>>>> i haven't seen any make directory option there
>>>>>>>>>
>>>>>>>>> do i need to create it from the command line ?
>>>>>>>>> further, in the hdfs-site.xml file, i have given the following
>>>>>>>>> entries. are they correct ?
>>>>>>>>>
>>>>>>>>> <property>
>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>> </property>
>>>>>>>>> <property>
>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>> </property>
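[Editorial note: a common variant keeps the metadata and block-storage directories separate rather than pointing both properties at the same path. A sketch; the "name" and "data" subdirectory names are illustrative, not from the thread:]

```xml
<!-- hdfs-site.xml: separate directories for namenode metadata and
     datanode blocks (subdirectory names are illustrative) -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp\\name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp\\data</value>
</property>
```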
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> *You are wrong at this:*
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>> copyFromLocal: File
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>> copyFromLocal: File
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>
>>>>>>>>> Both the paths you wrote are local, and you do not need to copy
>>>>>>>>> hadoop into hdfs; Hadoop is already working.
>>>>>>>>>
>>>>>>>>> Just check in the browser after starting your single node cluster:
>>>>>>>>>
>>>>>>>>> localhost:50070
>>>>>>>>>
>>>>>>>>> then go to the "browse the filesystem" link in it.
>>>>>>>>>
>>>>>>>>> If there is no directory there, then make a directory.
>>>>>>>>> That is your hdfs directory.
>>>>>>>>> Then copy any text file there (no need to copy hadoop there),
>>>>>>>>> because you are going to do processing on the data in that text
>>>>>>>>> file. That is what hadoop is used for; first you need to make that
>>>>>>>>> clear in your mind, and then you will get it working.
>>>>>>>>>
>>>>>>>>> *Try this:*
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>>> /hdfs/directory/path
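[Editorial note: a defensive version of the same pattern checks that the local file exists before invoking Hadoop, which is exactly the failure shown above. The local path is hypothetical, and the hadoop call itself is commented out here since it needs a running cluster:]

```shell
# copyFromLocal fails with "does not exist" when the *local* path is wrong,
# so verify the local file first. Path below is a hypothetical example.
LOCAL=/cygdrive/c/Users/Administrator/Desktop/sample.txt
if [ -f "$LOCAL" ]; then
  echo "ok, would copy: $LOCAL"
  # ./bin/hadoop dfs -copyFromLocal "$LOCAL" /wksp
else
  echo "local file missing: $LOCAL"
fi
```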
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks. yes, i am a newbie.
>>>>>>>>> however, i need a windows setup.
>>>>>>>>>
>>>>>>>>> let me surely refer to the doc and link which u sent, but i need this
>>>>>>>>> to be working ...
>>>>>>>>> can you please help
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> MANISH DUNANI
>>>>>>>>> -THANX
>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>> manishd207@gmail.com
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Regards
>>>>>>>>> *Manish Dunani*
>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>> *skype id* : manish.dunani
>>>>>>>>>
>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Olivier Renault
>>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>>> +44 7500 933 036
>>>>>>>>> orenault@hortonworks.com
>>>>>>>>> www.hortonworks.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards



On Thu, Sep 12, 2013 at 12:00 PM, Irfan Sayed <ir...@gmail.com> wrote:

> thanks.
> finally it got installed :)
>
> further, when i try to start the namenode, it failed with the following log
>
> C:\hdp>start_remote_hdp_services.cmd
> Master nodes: start DFS-DC
> 0 Master nodes successfully started.
> 1 Master nodes failed to start.
>
> PSComputerName      Service             Message             Status
> --------------      -------             -------             ------
>                                         Connecting to re...
>
>
> StartStop-HDPServices : Manually start services on Master nodes then retry
> full
>  cluster start.  Exiting.
> At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
> + if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
>     + CategoryInfo          : NotSpecified: (:) [Write-Error],
> WriteErrorExcep
>    tion
>     + FullyQualifiedErrorId :
> Microsoft.PowerShell.Commands.WriteErrorExceptio
>    n,StartStop-HDPServices
>
>
> C:\hdp>
>
> i tried starting manually as well but no luck
> anything missing in configuration ?
>
> regards
>
>
>
> On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> You can put the same FQDN as your NameNode for example.
>>
>> Thanks
>> Olivier
>> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> i do not have any HIVE server host, so what should i put over here ?
>>> if i comment it out, then i guess it throws an error about that
>>> can i put the fqdn of the namenode for the HIVE server host ?
>>>
>>> will it really be a working configuration?
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Your cluster-properties.txt should look something like :
>>>>
>>>>
>>>> #Log directory
>>>> HDP_LOG_DIR=c:\hadoop\logs
>>>>
>>>> #Data directory
>>>> HDP_DATA_DIR=c:\hdp\data
>>>>
>>>> #Hosts
>>>> NAMENODE_HOST=yourmaster.fqdn.com
>>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>>>
>>>>
>>>> #Database host
>>>> DB_FLAVOR=derby
>>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>>
>>>>
>>>> #Hive properties
>>>> HIVE_DB_NAME=hive
>>>> HIVE_DB_USERNAME=hive
>>>> HIVE_DB_PASSWORD=hive
>>>>
>>>> #Oozie properties
>>>> OOZIE_DB_NAME=oozie
>>>> OOZIE_DB_USERNAME=oozie
>>>> OOZIE_DB_PASSWORD=oozie
>>>>
>>>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
>>>> your server names. For the time being, I suggest that you do not install
>>>> HBase or Oozie.
>>>>
>>>> regards,
>>>> Olivier
>>>>
>>>>
>>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> ok.. now i made some changes and the installation went ahead
>>>>>>> but it failed on the "HIVE_SERVER_HOST" property declaration
>>>>>>> in the cluster config file, i have commented out this property. if i
>>>>>>> uncomment it, then what server address should i give ???
>>>>>>>
>>>>>>> i have only two windows machines setup.
>>>>>>> 1: for namenode and another for datanode
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i installed the latest java in the c:\java folder and now there is no
>>>>>>>> java-related error in the log file
>>>>>>>> however, now it is throwing an error about not having the cluster
>>>>>>>> properties file.
>>>>>>>> in fact i am running/installing hdp from the location where this
>>>>>>>> file exists. still it is throwing the error
>>>>>>>>
>>>>>>>> please find the attached
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>>
>>>>>>>>> Here’s your issue (from the logs you attached earlier):
>>>>>>>>>
>>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>>
>>>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>>>>> or something similar (in a path with no spaces).
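[Editorial note: the "Files\Java\jdk1.6.0_31 was unexpected at this time" message comes from cmd.exe word-splitting the unquoted path at the space. A small space-detection illustration; both paths below are hypothetical:]

```shell
# The installer's check breaks when JAVA_HOME contains a space.
bad_home='C:\Program Files\Java\jdk1.6.0_31'    # space -> cmd.exe mis-parses it
good_home='C:\java\jdk1.6.0_31'                 # no space -> check passes
case "$good_home" in
  *" "*) echo "JAVA_HOME contains a space: $good_home" ;;
  *)     echo "JAVA_HOME looks safe: $good_home" ;;
esac
```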
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>>> *Subject:* Re: about replication
>>>>>>>>>
>>>>>>>>> please find the attached.
>>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>>> as it is not generated
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Could you share the log files ( c:\hdp.log,
>>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>>>>>> well as your clusterproperties.txt ?
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks. i followed the user manual for deployment and installed
>>>>>>>>> all the pre-requisites
>>>>>>>>> i modified the command and still the issue persists. please suggest
>>>>>>>>>
>>>>>>>>> please refer below
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>>>> You will find the correct syntax as part of the doc.
>>>>>>>>>
>>>>>>>>> Happy reading
>>>>>>>>> Olivier
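[Editorial note: combining Olivier's `msiexec /i` hint with the HDP_LAYOUT parameter Irfan mentions elsewhere in the thread, the general shape of the command is sketched below. The msi file name, the `/lv` log flag, and the layout path are assumptions for illustration; the definitive syntax is in the HDP for Windows documentation Olivier points to:]

```
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "hdp.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```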
>>>>>>>>>
>>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> i referred to the logs and manuals. i modified the clusterproperties
>>>>>>>>> file and then double clicked on the msi file
>>>>>>>>> however, it still failed.
>>>>>>>>> further, i started the installation on the command line by giving
>>>>>>>>> HDP_LAYOUT=clusterproperties file path,
>>>>>>>>> installation went ahead and it failed for the .NET framework 4.0 and
>>>>>>>>> VC++ redistributable package dependency
>>>>>>>>>
>>>>>>>>> i installed both and started the installation again.
>>>>>>>>> it failed again with the following error
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> when i searched for the logs mentioned in the error, i never found
>>>>>>>>> them
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>>>> file. You will find some information on the configuration file as part of
>>>>>>>>> the documentation.
>>>>>>>>>
>>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>>
>>>>>>>>> You should make sure to have also installed the pre-requisites.
>>>>>>>>>
>>>>>>>>> Thanks
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks. sorry for the long break. actually got involved in some
>>>>>>>>> other priorities
>>>>>>>>> i downloaded the installer and while installing i got the following
>>>>>>>>> error
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> do i need to make any configuration prior to installation ??
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Here is the link
>>>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>>>
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> i just followed the instructions to set up the pseudo distributed
>>>>>>>>> setup first using the url :
>>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>>
>>>>>>>>>  ****
>>>>>>>>>
>>>>>>>>> i don't think so i am running DN on both machine ****
>>>>>>>>>
>>>>>>>>> please find the attached log****
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> hi olivier ****
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> can you please give me download link ?****
>>>>>>>>>
>>>>>>>>> let me try please ****
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> regards****
>>>>>>>>>
>>>>>>>>> irfan ****
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>
>>>>>>>>> Are you running DN on both the machines? Could you please show
>>>>>>>>> me your DN logs?
>>>>>>>>>
>>>>>>>>> Also, consider Olivier's suggestion. It's definitely a better
>>>>>>>>> option.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Irfu,
>>>>>>>>>
>>>>>>>>> If you want to quickly get Hadoop running on the Windows platform,
>>>>>>>>> you may want to try our distribution for Windows. You will be able to find
>>>>>>>>> the msi on our website.
>>>>>>>>>
>>>>>>>>> Regards
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>
>>>>>>>>> because, on windows, anyway i have to try and see how hadoop works
>>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>>
>>>>>>>>> so, on windows, here is the setup:
>>>>>>>>>
>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>
>>>>>>>>> now, the exact problems are :
>>>>>>>>> 1: datanode is not getting started
>>>>>>>>> 2: replication : if i put any file/folder on any datanode, it
>>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>> windows. Not a good idea. Anyways, could you please show me your log
>>>>>>>>> files? I also need some additional info :
>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>> -Your cluster summary (no. of nodes etc)
>>>>>>>>> -Your latest configuration files
>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> ok. thanks
>>>>>>>>>
>>>>>>>>> now, i need to start with an all-windows setup first as our product
>>>>>>>>> will be based on windows
>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>
>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> can i have a setup like this :
>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
>>>>>>>>> and datanodes are a combination of any OS (windows, linux,
>>>>>>>>> unix etc)
>>>>>>>>>
>>>>>>>>> however, my doubt is, as the file systems of both the systems
>>>>>>>>> (win and linux) are different, datanodes of these systems can not be
>>>>>>>>> part of a single cluster . do i have to make the windows cluster
>>>>>>>>> separate and the UNIX cluster separate ?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>> same as Cygwin PIDs so that may be causing the discrepancy. I don't know
>>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
>>>>>>>>> Linux if you are new to it.
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks
>>>>>>>>> here is what i did .
>>>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>>>> command
>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>
>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>
>>>>>>>>> when i ran the "jps" command, it shows
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 4536 Jps
>>>>>>>>> 2076 NameNode
>>>>>>>>>
>>>>>>>>> however, when i open the pid file for the namenode then it is
>>>>>>>>> showing pid as : 4560. on the contrary, it should show : 2076
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>> the datanode.
>>>>>>>>>
>>>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>>>> already.
>>>>>>>>>
>>>>>>>>> -Arpit
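[Editor's note] The stale-pid cleanup Arpit describes can be sketched as below; the pid directory and file name are illustrative assumptions (check HADOOP_PID_DIR in your environment), not the poster's actual paths:

```shell
# Simulate and clean up a stale datanode pid file; paths are assumptions.
PID_DIR=/tmp
touch "$PID_DIR/hadoop-administrator-datanode.pid"   # stand-in for a stale pid file
rm -f "$PID_DIR"/hadoop-*datanode.pid                # delete any stale pid files
ls "$PID_DIR"/hadoop-*datanode.pid 2>/dev/null || echo "stale pid files removed"
# then restart the datanode, e.g. with ./start-dfs.sh
```

Once the stale file is gone, start-dfs.sh should write a fresh pid file matching the actually running process.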
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> datanode is trying to connect to the namenode continuously but fails
>>>>>>>>>
>>>>>>>>> when i try to run the "jps" command it says :
>>>>>>>>>
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 4584 NameNode
>>>>>>>>> 4016 Jps
>>>>>>>>>
>>>>>>>>> and when i ran "./start-dfs.sh" then it says :
>>>>>>>>>
>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it
>>>>>>>>> first.
>>>>>>>>>
>>>>>>>>> both these logs are contradictory
>>>>>>>>> please find the attached logs
>>>>>>>>>
>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>> helpful.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> i followed the url and did the steps mentioned in it. i have
>>>>>>>>> deployed on the windows platform
>>>>>>>>>
>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>>
>>>>>>>>> please refer below
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> i have modified all the config files as mentioned and formatted
>>>>>>>>> the hdfs file system as well
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks. i followed this url :
>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>
>>>>>>>>> let me follow the url which you gave for the pseudo distributed setup
>>>>>>>>> and then will switch to distributed mode
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>> configuration? Your core-site.xml is empty. Remove the property
>>>>>>>>> fs.default.name from hdfs-site.xml and add it to core-site.xml.
>>>>>>>>> Remove mapred.job.tracker as well. It belongs in mapred-site.xml.
>>>>>>>>>
>>>>>>>>> I would suggest you do a pseudo distributed setup first in
>>>>>>>>> order to get yourself familiar with the process and then proceed to the
>>>>>>>>> distributed mode. You can visit this link
>>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>>
>>>>>>>>> HTH
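[Editor's note] A minimal sketch of the property placement Tariq describes; the host/port values below are placeholders for this sketch, not the poster's actual settings:

```xml
<!-- core-site.xml: fs.default.name lives here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value> <!-- placeholder namenode URI -->
  </property>
</configuration>

<!-- mapred-site.xml: mapred.job.tracker lives here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value> <!-- placeholder jobtracker address -->
  </property>
</configuration>
```

hdfs-site.xml then keeps only HDFS-specific properties such as dfs.name.dir and dfs.data.dir.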
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks tariq for the response.
>>>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>>>> setup .
>>>>>>>>> can you please go through them ?
>>>>>>>>>
>>>>>>>>> please let me know
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> I'm sorry for being unresponsive. Was out of touch for some time
>>>>>>>>> because of Ramzan and Eid. Resuming work today.
>>>>>>>>>
>>>>>>>>> What's the current status?
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> First of all read the concepts ..I hope you will like it..
>>>>>>>>>
>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>
>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> hey Tariq,
>>>>>>>>> i am still stuck ..
>>>>>>>>> can you please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> attachment got quarantined
>>>>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>>
>>>>>>>>> if i run the jps command on the namenode :
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 3164 NameNode
>>>>>>>>> 1892 Jps
>>>>>>>>>
>>>>>>>>> same command on the datanode :
>>>>>>>>>
>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 3848 Jps
>>>>>>>>>
>>>>>>>>> jps does not list any process for the datanode. however, in the web
>>>>>>>>> browser i can see one live data node
>>>>>>>>> please find the attached conf rar file of the namenode
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> OK. we'll start fresh. Could you please show me your latest config
>>>>>>>>> files?
>>>>>>>>>
>>>>>>>>> BTW, are your daemons running fine? Use jps to verify that.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> i have created the dirs "wksp_data" and "wksp_name" on both the
>>>>>>>>> datanode and the namenode
>>>>>>>>> made the respective changes in the "hdfs-site.xml" file
>>>>>>>>> formatted the namenode
>>>>>>>>> started the dfs
>>>>>>>>>
>>>>>>>>> but still, not able to browse the file system through the web browser
>>>>>>>>> please refer below
>>>>>>>>>
>>>>>>>>> anything still missing ?
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> these dirs need to be created on all datanodes and namenodes ?
>>>>>>>>> further, hdfs-site.xml needs to be updated on both datanodes and
>>>>>>>>> namenodes for these new dirs?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> Create 2 directories manually corresponding to the values of the
>>>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>>>> cannot see this data directly on your local/native FS.
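[Editor's note] The steps above can be sketched as follows; the directory paths are placeholders, not the poster's actual dfs.name.dir / dfs.data.dir values:

```shell
# Create the name/data directories and set 755 permissions (paths are assumptions).
NAME_DIR=/tmp/wksp_name
DATA_DIR=/tmp/wksp_data
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
ls -ld "$NAME_DIR" "$DATA_DIR"   # verify the directories and their permissions
```

On Windows/Cygwin the equivalent would be paths like c:\\wksp_name and c:\\wksp_data; the same values must then appear in hdfs-site.xml before the namenode is formatted.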
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> however, i need this to be working in a windows environment as a
>>>>>>>>> project requirement.
>>>>>>>>> i will add/work on Linux later
>>>>>>>>>
>>>>>>>>> so, now, at this stage, is c:\\wksp the HDFS file system OR do i
>>>>>>>>> need to create it from the command line ?
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> Hello Irfan,
>>>>>>>>>
>>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>>>>
>>>>>>>>> The HDFS webUI doesn't provide us the ability to create a file or
>>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>>> operations like create, move, copy etc are not supported.
>>>>>>>>>
>>>>>>>>> These values look fine to me.
>>>>>>>>>
>>>>>>>>> One suggestion though. Try getting a Linux machine (if possible).
>>>>>>>>> Or at least use a VM. I personally feel that using Hadoop on windows is
>>>>>>>>> always messy.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> when i browse the file system, i am getting the following :
>>>>>>>>> i haven't seen any make directory option there
>>>>>>>>>
>>>>>>>>> i need to create it from the command line ?
>>>>>>>>> further, in the hdfs-site.xml file, i have given the following
>>>>>>>>> entries. are they correct ?
>>>>>>>>>
>>>>>>>>> <property>
>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>> </property>
>>>>>>>>> <property>
>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>> </property>
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> You are wrong at this:
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>> copyFromLocal: File
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>> copyFromLocal: File
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>
>>>>>>>>> Because you wrote both paths as local paths, and you do not need to
>>>>>>>>> copy hadoop into hdfs... Hadoop is already working..
>>>>>>>>>
>>>>>>>>> Just check in the browser after starting your single node cluster :
>>>>>>>>>
>>>>>>>>> localhost:50070
>>>>>>>>>
>>>>>>>>> then go to the "browse the filesystem" link in it..
>>>>>>>>>
>>>>>>>>> If there is no directory then make a directory there.
>>>>>>>>> That is your hdfs directory.
>>>>>>>>> Then copy any text file there (no need to copy hadoop there),
>>>>>>>>> because you are going to do processing on the data in that text
>>>>>>>>> file. That's what hadoop is used for; first you need to make it
>>>>>>>>> clear in your mind. Then and only then you will do it... otherwise
>>>>>>>>> not possible..
>>>>>>>>>
>>>>>>>>> Try this:
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>>> /hdfs/directory/path
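[Editor's note] A concrete form of the template above; the local file and HDFS destination are made-up illustrations, and the hadoop invocation is shown commented out since it needs a running cluster:

```shell
# Create a small local input file, then copy it into HDFS (illustrative paths).
echo "sample input line" > /tmp/sample.txt
# With a running cluster, the copy would be:
# ./bin/hadoop dfs -copyFromLocal /tmp/sample.txt /user/administrator/input
cat /tmp/sample.txt
```

The point of the fix is that the first argument must be a local path that actually exists, and the second must be an HDFS path.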
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks. yes, i am a newbie.
>>>>>>>>> however, i need a windows setup.
>>>>>>>>>
>>>>>>>>> let me surely refer to the doc and link which u sent but i need this
>>>>>>>>> to be working ...
>>>>>>>>> can you please help
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> MANISH DUNANI
>>>>>>>>> -THANX
>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>> manishd207@gmail.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Regards
>>>>>>>>> Manish Dunani
>>>>>>>>> Contact No : +91 9408329137
>>>>>>>>> skype id : manish.dunani
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>> --
>>>>>>>>> Olivier Renault
>>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>>> +44 7500 933 036
>>>>>>>>> orenault@hortonworks.com
>>>>>>>>> www.hortonworks.com
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards



On Thu, Sep 12, 2013 at 12:00 PM, Irfan Sayed <ir...@gmail.com> wrote:

> thanks.
> finally it got installed :)
>
> further, when i tried to start the namenode, it failed with the following log
>
> C:\hdp>start_remote_hdp_services.cmd
> Master nodes: start DFS-DC
> 0 Master nodes successfully started.
> 1 Master nodes failed to start.
>
> PSComputerName      Service             Message             Status
> --------------      -------             -------             ------
>                                         Connecting to re...
>
>
> StartStop-HDPServices : Manually start services on Master nodes then retry
> full
>  cluster start.  Exiting.
> At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
> + if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
>     + CategoryInfo          : NotSpecified: (:) [Write-Error],
> WriteErrorExcep
>    tion
>     + FullyQualifiedErrorId :
> Microsoft.PowerShell.Commands.WriteErrorExceptio
>    n,StartStop-HDPServices
>
>
> C:\hdp>
>
> i tried starting it manually as well, but no luck.
> is anything missing in the configuration?
>
> regards
>
>
>
> On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> You can put the same FQDN as your NameNode for example.
>>
>> Thanks
>> Olivier
>> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> i do not have any HIVE server host, so what should i put over here?
>>> if i comment it out, then it throws an error about the commented entry.
>>> can i put the fqdn of the namenode for the HIVE server host?
>>>
>>> will that be a really working configuration?
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Your cluster-properties.txt should look something like :
>>>>
>>>>
>>>> #Log directory
>>>> HDP_LOG_DIR=c:\hadoop\logs
>>>>
>>>> #Data directory
>>>> HDP_DATA_DIR=c:\hdp\data
>>>>
>>>> #Hosts
>>>> NAMENODE_HOST=yourmaster.fqdn.com
>>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>>>
>>>>
>>>> #Database host
>>>> DB_FLAVOR=derby
>>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>>
>>>>
>>>> #Hive properties
>>>> HIVE_DB_NAME=hive
>>>> HIVE_DB_USERNAME=hive
>>>> HIVE_DB_PASSWORD=hive
>>>>
>>>> #Oozie properties
>>>> OOZIE_DB_NAME=oozie
>>>> OOZIE_DB_USERNAME=oozie
>>>> OOZIE_DB_PASSWORD=oozie
>>>>
>>>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
>>>> your server names. For the time being, I suggest that you do not install
>>>> HBase or Oozie.
>>>>
>>>> regards,
>>>> Olivier
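
Before rerunning the installer, the required host entries in clusterproperties.txt can be sanity-checked. A minimal sketch, assuming the key=value format shown above; the key list, function name, and file paths are illustrative, not part of the HDP installer:

```shell
# Hypothetical pre-flight check for an HDP clusterproperties.txt:
# every required host property must be present and non-empty.
check_cluster_properties() {
  file="$1"
  for key in NAMENODE_HOST JOBTRACKER_HOST HIVE_SERVER_HOST SLAVE_HOSTS; do
    # each key must appear as KEY=<something>
    if ! grep -q "^${key}=." "$file"; then
      echo "missing: ${key}"
      return 1
    fi
  done
  echo "ok"
}
```

Run against the sample file above it prints `ok`; with HIVE_SERVER_HOST commented out, it reports that key as missing.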
>>>>
>>>>
>>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> ok.. now i made some changes and the installation went ahead,
>>>>>>> but it failed on the "HIVE_SERVER_HOST" property declaration.
>>>>>>> in the cluster config file, i have commented out this property. if i
>>>>>>> uncomment it, then what server address should i give?
>>>>>>>
>>>>>>> i have only two windows machines set up:
>>>>>>> 1: one for the namenode and another for the datanode
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i installed the latest java in the c:\java folder and now there are no
>>>>>>>> java-related errors in the log file.
>>>>>>>> however, it now throws an error about the cluster properties
>>>>>>>> file not being found.
>>>>>>>> in fact, i am running/installing hdp from the location where this
>>>>>>>> file exists. still it throws the error.
>>>>>>>>
>>>>>>>> please find the attached
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>>
>>>>>>>>>  Here’s your issue (from the logs you attached earlier):
>>>>>>>>>
>>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>>
>>>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>>>>> or something similar (in a path with no spaces).
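
The failure above is the classic unquoted-path problem: C:\Program Files contains a space, so the batch script truncates at "Files". A small sketch of the check; the function name and paths are illustrative:

```shell
# Sketch: HDP 1.3's batch scripts choke on spaces in JAVA_HOME, so flag
# any candidate install path that contains one. Paths are illustrative.
java_home_ok() {
  case "$1" in
    *" "*) echo "bad: path contains spaces" ;;
    *)     echo "ok" ;;
  esac
}

java_home_ok 'C:\Program Files\Java\jdk1.6.0_31'   # -> bad: path contains spaces
java_home_ok 'C:\java\jdk1.6.0_31'                 # -> ok
```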
>>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>>> *Subject:* Re: about replication
>>>>>>>>>
>>>>>>>>> please find the attached.
>>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>>> as it is not generated
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Could you share the log files ( c:\hdp.log,
>>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>>>>>> well as your clusterproperties.txt ?
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks. i followed the user manual for deployment and installed
>>>>>>>>> all the prerequisites.
>>>>>>>>> i modified the command and still the issue persists. please suggest.
>>>>>>>>>
>>>>>>>>> please refer below:
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>>>> You will find the correct syntax as part of the doc.
>>>>>>>>>
>>>>>>>>> Happy reading
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> i referred the logs and manuals. i modified the clusterproperties
>>>>>>>>> file and then double-clicked on the msi file.
>>>>>>>>> however, it still failed.
>>>>>>>>> further, i started the installation on the command line, giving
>>>>>>>>> HDP_LAYOUT=<clusterproperties file path>;
>>>>>>>>> the installation went ahead and then failed on the .NET framework 4.0 and
>>>>>>>>> VC++ redistributable package dependencies.
>>>>>>>>>
>>>>>>>>> i installed both and started the installation again.
>>>>>>>>> it failed again with the following error:
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> when i searched for the logs mentioned in the error, i never found
>>>>>>>>> them.
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>>>> file. You will find some information on the configuration file as part of
>>>>>>>>> the documentation.
>>>>>>>>>
>>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>>
>>>>>>>>> You should also make sure to have installed the prerequisites.
>>>>>>>>>
>>>>>>>>> Thanks
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks. sorry for the long break. actually got involved in some
>>>>>>>>> other priorities.
>>>>>>>>> i downloaded the installer and while installing i got the following
>>>>>>>>> error:
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> do i need to make any configuration prior to installation?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Here is the link
>>>>>>>>>
>>>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>>>
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> i just followed the instructions to set up the pseudo-distributed
>>>>>>>>> setup first, using the url:
>>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>>
>>>>>>>>> i don't think i am running a DN on both machines.
>>>>>>>>> please find the attached log.
>>>>>>>>>
>>>>>>>>> hi olivier,
>>>>>>>>>
>>>>>>>>> can you please give me the download link?
>>>>>>>>> let me try please
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> Are you running DN on both the machines? Could you please show
>>>>>>>>> me your DN logs?
>>>>>>>>>
>>>>>>>>> Also, consider Olivier's suggestion. It's definitely a better
>>>>>>>>> option.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Irfan,
>>>>>>>>>
>>>>>>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>>>> msi on our website.
>>>>>>>>>
>>>>>>>>> Regards
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> ok. i think i need to change the plan over here.
>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>
>>>>>>>>> because, on windows, anyway i have to try and see how hadoop works.
>>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>>
>>>>>>>>> so, on windows, here is the setup:
>>>>>>>>>
>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>
>>>>>>>>> now, the exact problems are:
>>>>>>>>> 1: the datanode is not getting started
>>>>>>>>> 2: replication : if i put any file/folder on any datanode, it
>>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
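
On the replication point above: HDFS does not mirror whole files to every datanode; the namenode keeps dfs.replication copies of each block (3 by default). A hedged sketch of setting it for a two-datanode cluster, written to an illustrative /tmp path rather than a real conf directory:

```shell
# Sketch: HDFS keeps dfs.replication copies of each block rather than
# copying files to every datanode. The output path is illustrative.
cat > /tmp/hdfs-site-demo.xml <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>2</value>  <!-- one copy on each datanode of a 2-node cluster -->
  </property>
</configuration>
EOF
```

With two datanodes and a replication factor of 2, every block ends up on both nodes, which is the behavior described above.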
>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>>> also need some additional info:
>>>>>>>>> - The exact problem which you are facing right now
>>>>>>>>> - Your cluster summary (no. of nodes etc.)
>>>>>>>>> - Your latest configuration files
>>>>>>>>> - Your /etc/hosts file
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> ok. thanks.
>>>>>>>>> now, i need to start with an all-windows setup first, as our product
>>>>>>>>> will be based on windows.
>>>>>>>>> so, now, please tell me how to resolve the issue.
>>>>>>>>>
>>>>>>>>> the datanode is not starting. please suggest.
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> can i have a setup like this:
>>>>>>>>> the namenode will be on linux (the flavour may be RHEL, CentOS, Ubuntu etc.)
>>>>>>>>> and the datanodes are a combination of any OS (windows, linux,
>>>>>>>>> unix etc.)
>>>>>>>>>
>>>>>>>>> however, my doubt is: as the file systems of both systems
>>>>>>>>> (win and linux) are different, the datanodes of these systems cannot be
>>>>>>>>> part of a single cluster. do i have to make the windows cluster separate
>>>>>>>>> and the UNIX cluster separate?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>> same as Cygwin PIDs, so that may be causing the discrepancy. I don't know
>>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>> progress for native Windows support, however there are no official releases
>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
>>>>>>>>> Linux if you are new to it.
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> thanks. here is what i did:
>>>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>>>> command,
>>>>>>>>> then deleted all the pid files for the namenodes and datanodes,
>>>>>>>>> and started dfs again with the command "./start-dfs.sh".
>>>>>>>>>
>>>>>>>>> when i ran the "jps" command, it shows:
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 4536 Jps
>>>>>>>>> 2076 NameNode
>>>>>>>>>
>>>>>>>>> however, when i open the pid file for the namenode, it is showing the
>>>>>>>>> pid as 4560; on the contrary, it should show 2076.
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Most likely there is a stale pid file, something like
>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>> the datanode.
>>>>>>>>>
>>>>>>>>> I haven't read the entire thread, so you may have looked at this
>>>>>>>>> already.
>>>>>>>>>
>>>>>>>>> -Arpit
>>>>>>>>>
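
Arpit's stale-pid check can be sketched as: a pid file is stale when the recorded process is no longer alive, in which case it is safe to delete before restarting the datanode. The file path here is illustrative:

```shell
# Sketch of the stale-pid check. A pid file is stale when no process with
# the recorded pid exists; kill -0 probes for existence without signaling.
pid_file=/tmp/hadoop-demo-datanode.pid
echo $$ > "$pid_file"          # pretend the current shell is the daemon
if kill -0 "$(cat "$pid_file")" 2>/dev/null; then
  echo "pid file is live"
else
  echo "stale pid file - removing"
  rm -f "$pid_file"
fi
```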
>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> the datanode is trying to connect to the namenode continuously, but fails.
>>>>>>>>>
>>>>>>>>> when i try to run the "jps" command, it says:
>>>>>>>>>
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 4584 NameNode
>>>>>>>>> 4016 Jps
>>>>>>>>>
>>>>>>>>> and when i ran "./start-dfs.sh", it says:
>>>>>>>>>
>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it
>>>>>>>>> first.
>>>>>>>>>
>>>>>>>>> these two logs are contradictory.
>>>>>>>>> please find the attached logs.
>>>>>>>>>
>>>>>>>>> should i attach the conf files as well?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>> helpful.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> i followed the url and did the steps mentioned in it. i have
>>>>>>>>> deployed on the windows platform.
>>>>>>>>>
>>>>>>>>> now, i am able to browse the url http://localhost:50070 (name node);
>>>>>>>>> however, i am not able to browse the url http://localhost:50030.
>>>>>>>>>
>>>>>>>>> please refer below:
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> i have modified all the config files as mentioned and formatted
>>>>>>>>> the hdfs file system as well.
>>>>>>>>> please suggest.
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks. i followed this url:
>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>> let me follow the url which you gave for the pseudo-distributed setup,
>>>>>>>>> and then i will switch to distributed mode.
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>>> Remove *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
>>>>>>>>>
>>>>>>>>> I would suggest you do a pseudo-distributed setup first in
>>>>>>>>> order to get yourself familiar with the process and then proceed to the
>>>>>>>>> distributed mode. You can visit this link
>>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>>
>>>>>>>>> HTH
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
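
The property split Tariq describes can be sketched with heredocs. The localhost host:port values and the /tmp path are illustrative placeholders for a pseudo-distributed setup, not taken from the thread:

```shell
# Sketch: fs.default.name belongs in core-site.xml, and mapred.job.tracker
# in mapred-site.xml. Values and paths below are illustrative.
conf=/tmp/hadoop-conf-demo
mkdir -p "$conf"

cat > "$conf/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

cat > "$conf/mapred-site.xml" <<'EOF'
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF
```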
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks tariq for the response.
>>>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>>>> setup.
>>>>>>>>> can you please go through them?
>>>>>>>>>
>>>>>>>>> please let me know.
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> I'm sorry for being unresponsive. Was out of touch for some time
>>>>>>>>> because of Ramzan and Eid. Resuming work today.
>>>>>>>>>
>>>>>>>>> What's the current status?
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> First of all, read the concepts. I hope you will like it:
>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>
>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> hey Tariq,
>>>>>>>>> i am still stuck.
>>>>>>>>> can you please suggest?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> the attachment got quarantined.
>>>>>>>>> resending in txt format. please rename it to conf.rar.
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>>
>>>>>>>>> if i run the jps command on the namenode:
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 3164 NameNode
>>>>>>>>> 1892 Jps
>>>>>>>>>
>>>>>>>>> and the same command on the datanode:
>>>>>>>>>
>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 3848 Jps
>>>>>>>>>
>>>>>>>>> jps does not list any process for the datanode; however, in the web
>>>>>>>>> browser i can see one live data node.
>>>>>>>>> please find the attached conf rar file of the namenode.
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> OK. We'll start fresh. Could you plz show me your latest config
>>>>>>>>> files?
>>>>>>>>>
>>>>>>>>> BTW, are your daemons running fine? Use JPS to verify that.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> i have created the dirs "wksp_data" and "wksp_name" on both the
>>>>>>>>> datanode and the namenode,
>>>>>>>>> made the respective changes in the "hdfs-site.xml" file,
>>>>>>>>> formatted the namenode,
>>>>>>>>> and started the dfs.
>>>>>>>>>
>>>>>>>>> but still, i am not able to browse the file system through the web browser.
>>>>>>>>> please refer below:
>>>>>>>>>
>>>>>>>>> is anything still missing?
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> do these dirs need to be created on all datanodes and namenodes?
>>>>>>>>> further, does hdfs-site.xml need to be updated on both the datanodes
>>>>>>>>> and the namenodes for these new dirs?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> Create 2 directories manually, corresponding to the values of the
>>>>>>>>> dfs.name.dir and dfs.data.dir properties, and change the permissions of
>>>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>>>> cannot see this data directly on your local/native FS.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
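
The two directories and the 755 permissions Tariq describes, as a shell sketch; the /tmp paths stand in for the real dfs.name.dir and dfs.data.dir locations (on the Cygwin setup in this thread they would be under /cygdrive/c):

```shell
# Sketch: create the dfs.name.dir and dfs.data.dir directories manually
# with 755 permissions. Paths are illustrative placeholders.
mkdir -p /tmp/hdfs-demo/wksp_name /tmp/hdfs-demo/wksp_data
chmod 755 /tmp/hdfs-demo/wksp_name /tmp/hdfs-demo/wksp_data
```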
>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> however, i need this to be working in a windows environment as a
>>>>>>>>> project requirement.
>>>>>>>>> i will add/work on Linux later.
>>>>>>>>>
>>>>>>>>> so, now, at this stage, is c:\\wksp the HDFS file system, or do i
>>>>>>>>> need to create it from the command line?
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> Hello Irfan,
>>>>>>>>>
>>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>>>>
>>>>>>>>> The HDFS webUI doesn't provide the ability to create a file or
>>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>>> operations like create, move, copy etc. are not supported.
>>>>>>>>>
>>>>>>>>> These values look fine to me.
>>>>>>>>>
>>>>>>>>> One suggestion though. Try getting a Linux machine (if possible),
>>>>>>>>> or at least use a VM. I personally feel that using Hadoop on windows is
>>>>>>>>> always messy.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> when i browse the file system, i am getting the following.
>>>>>>>>> i haven't seen any "make directory" option there.
>>>>>>>>>
>>>>>>>>> do i need to create it from the command line?
>>>>>>>>> further, in the hdfs-site.xml file, i have given the following
>>>>>>>>> entries. are they correct?
>>>>>>>>>
>>>>>>>>> <property>
>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>> </property>
>>>>>>>>> <property>
>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>> </property>
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
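
One likely problem with the entries above: dfs.data.dir and dfs.name.dir point at the same directory. A sketch of keeping them separate (the wksp_name/wksp_data names follow a later message in this thread; the /tmp output path is illustrative):

```shell
# Sketch: dfs.name.dir (metadata) and dfs.data.dir (blocks) should be two
# distinct directories, not the same c:\\wksp. Output path is illustrative.
cat > /tmp/hdfs-site-fixed.xml <<'EOF'
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>c:\\wksp_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>c:\\wksp_data</value>
  </property>
</configuration>
EOF
```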
>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  *You are wrong at this:*
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>
>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>
>>>>>>>>> copyFromLocal: File
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>
>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>
>>>>>>>>> copyFromLocal: File
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>
>>>>>>>>> You gave local paths for both arguments, and you don't need to copy
>>>>>>>>> Hadoop itself into HDFS; Hadoop is already working.
>>>>>>>>>
>>>>>>>>> Just check in a browser after starting your single-node cluster:
>>>>>>>>>
>>>>>>>>> localhost:50070
>>>>>>>>>
>>>>>>>>> then follow the "Browse the filesystem" link there.
>>>>>>>>>
>>>>>>>>> If there is no directory, then make a directory there.
>>>>>>>>>
>>>>>>>>> That is your hdfs directory.
>>>>>>>>>
>>>>>>>>> Then copy any text file there (no need to copy hadoop there),
>>>>>>>>> because you are going to do processing on the data in that text
>>>>>>>>> file. That is what hadoop is used for; first you need to make that
>>>>>>>>> clear in your mind, and then it will work. Otherwise it is not possible.
>>>>>>>>>
>>>>>>>>> *Try this:*
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>
>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>>> /hdfs/directory/path
>>>>>>>>>
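The "File ... does not exist" errors above can be guarded against with a small check before invoking copyFromLocal; the sample file name below is hypothetical, and the hadoop invocation follows the Hadoop 1.x syntax used in this thread:

```shell
# Guard before copyFromLocal: "copyFromLocal: File ... does not exist"
# almost always means the *local* source path is wrong.
LOCAL="/cygdrive/c/Users/Administrator/Desktop/sample.txt"   # hypothetical local file
DEST="/wksp/sample.txt"                                      # HDFS target path
if [ -f "$LOCAL" ]; then
  ./bin/hadoop dfs -copyFromLocal "$LOCAL" "$DEST"
else
  printf 'local file not found: %s\n' "$LOCAL"
fi
```

Running it with a wrong local path prints the not-found message instead of handing a bad path to Hadoop.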
>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>  thanks. yes, i am a newbie.
>>>>>>>>>
>>>>>>>>> however, i need a windows setup.
>>>>>>>>>
>>>>>>>>> let me surely refer to the doc and link which u sent, but i need this
>>>>>>>>> to be working ...
>>>>>>>>>
>>>>>>>>> can you please help
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> MANISH DUNANI
>>>>>>>>> -THANX
>>>>>>>>> +91 9426881954, +91 8460656443
>>>>>>>>>
>>>>>>>>> manishd207@gmail.com
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>>
>>>>>>>>> Regards
>>>>>>>>>
>>>>>>>>> *Manish Dunani*
>>>>>>>>>
>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>
>>>>>>>>> *skype id* : manish.dunani
>>>>>>>>>
>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>>
>>>>>>>>> Olivier Renault
>>>>>>>>>
>>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>>> +44 7500 933 036
>>>>>>>>> orenault@hortonworks.com
>>>>>>>>> www.hortonworks.com
>>>>>>>>>
>>>>>>>>> <http://hortonworks.com/products/hortonworks-sandbox/>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards



On Thu, Sep 12, 2013 at 12:00 PM, Irfan Sayed <ir...@gmail.com> wrote:

> thanks.
> finally it got installed :)
>
> further, when i try to start the namenode, it failed with following log
>
> C:\hdp>start_remote_hdp_services.cmd
> Master nodes: start DFS-DC
> 0 Master nodes successfully started.
> 1 Master nodes failed to start.
>
> PSComputerName      Service             Message             Status
> --------------      -------             -------             ------
>                                         Connecting to re...
>
>
> StartStop-HDPServices : Manually start services on Master nodes then retry
> full
>  cluster start.  Exiting.
> At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
> + if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
>     + CategoryInfo          : NotSpecified: (:) [Write-Error],
> WriteErrorExcep
>    tion
>     + FullyQualifiedErrorId :
> Microsoft.PowerShell.Commands.WriteErrorExceptio
>    n,StartStop-HDPServices
>
>
> C:\hdp>
>
> i tried starting manually as well but no luck
> anything missing in configuration ?
>
> regards
>
>
>
> On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> You can put the same FQDN as your NameNode for example.
>>
>> Thanks
>> Olivier
>> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> i do not have any HIVE server host, so what should i put over here?
>>> if i comment it out, then i guess it throws an error about the commented property
>>> can i put the fqdn of the namenode for the HIVE server host?
>>>
>>> will it be a really working configuration?
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Your cluster-properties.txt should look something like :
>>>>
>>>>
>>>> #Log directory
>>>> HDP_LOG_DIR=c:\hadoop\logs
>>>>
>>>> #Data directory
>>>> HDP_DATA_DIR=c:\hdp\data
>>>>
>>>> #Hosts
>>>> NAMENODE_HOST=yourmaster.fqdn.com
>>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>>>
>>>>
>>>> #Database host
>>>> DB_FLAVOR=derby
>>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>>
>>>>
>>>> #Hive properties
>>>> HIVE_DB_NAME=hive
>>>> HIVE_DB_USERNAME=hive
>>>> HIVE_DB_PASSWORD=hive
>>>>
>>>> #Oozie properties
>>>> OOZIE_DB_NAME=oozie
>>>> OOZIE_DB_USERNAME=oozie
>>>> OOZIE_DB_PASSWORD=oozie
>>>>
>>>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
>>>> your server names. For the time being, I suggest that you do not install
>>>> HBase, Oozie,
>>>>
>>>> regards,
>>>> Olivier
>>>>
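The cluster-properties file above is then passed to the installer via HDP_LAYOUT (the parameter name is confirmed later in this thread); a hedged sketch of assembling that command line follows, where the msi file name and the /lv log option are assumptions that should be checked against the HDP 1.3 Windows install guide:

```shell
# Sketch: assemble the HDP-for-Windows install command line.
# Only HDP_LAYOUT is confirmed by the thread; msi/log names are illustrative.
MSI="hdp-1.3.0.0.winpkg.msi"
LAYOUT="c:\\hdp\\clusterproperties.txt"
CMD="msiexec /i $MSI /lv hdp.log HDP_LAYOUT=$LAYOUT"
printf '%s\n' "$CMD"   # run this line in a Windows admin shell, not here
```

Printing the command first makes it easy to eyeball the quoting and paths before running it on the Windows host.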
>>>>
>>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> ok.. now i made some changes and installation went ahead
>>>>>>> but failed in property "HIVE_SERVER_HOST" declaration
>>>>>>> in cluster config file, i have commented this property. if i
>>>>>>> uncomment , then what server address will give ???
>>>>>>>
>>>>>>> i have only two windows machines setup.
>>>>>>> 1: for namenode and another for datanode
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i installed the latest java in c:\java folder and now no error in
>>>>>>>> log file related to java
>>>>>>>> however, now it is throwing error on not having cluster properties
>>>>>>>> file.
>>>>>>>> in fact i am running/installing hdp from the location where this
>>>>>>>> file exist . still it is throwing error
>>>>>>>>
>>>>>>>> please find the attached
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>>
>>>>>>>>>  Here’s your issue (from the logs you attached earlier):
>>>>>>>>>
>>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>>>
>>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>>
>>>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>>>>> or something similar (in a path with no spaces).
>>>>>>>>>
>>>>>>>>>
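The space problem Ravi describes can be checked mechanically; the example JAVA_HOME value below is an assumption, so substitute your actual install path:

```shell
# HDP 1.3 on Windows fails when JAVA_HOME contains spaces
# (e.g. "C:\Program Files\Java\jdk1.6.0_31").
JAVA_HOME='C:\java\jdk1.6.0_31'   # example space-free path
case "$JAVA_HOME" in
  *' '*) printf 'JAVA_HOME contains spaces; reinstall Java under e.g. c:\\java\n' ;;
  *)     printf 'JAVA_HOME looks ok: %s\n' "$JAVA_HOME" ;;
esac
```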
>>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>>> *Subject:* Re: about replication
>>>>>>>>>
>>>>>>>>> please find the attached.
>>>>>>>>>
>>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>>> as it is not generated
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>>  Could you share the log files ( c:\hdp.log,
>>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>>>>>> well as your clusterproperties.txt ?
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>>
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>  thanks. i followed the user manual for deployment and installed
>>>>>>>>> all the pre-requisites
>>>>>>>>>
>>>>>>>>> i modified the command and still the issue persists. please suggest
>>>>>>>>>
>>>>>>>>> please refer below
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>>>>
>>>>>>>>> You will find the correct syntax as part of the doc.
>>>>>>>>>
>>>>>>>>> Happy reading
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  thanks.
>>>>>>>>>
>>>>>>>>> i referred the logs and manuals. i modified the clusterproperties
>>>>>>>>> file and then double clicked on the msi file
>>>>>>>>>
>>>>>>>>> however, it still failed.
>>>>>>>>>
>>>>>>>>> further, i started the installation on the command line by giving
>>>>>>>>> HDP_LAYOUT=clusterproperties file path,
>>>>>>>>>
>>>>>>>>> installation went ahead and it failed for the .NET framework 4.0 and
>>>>>>>>> VC++ redistributable package dependency
>>>>>>>>>
>>>>>>>>> i installed both and started the installation again.
>>>>>>>>>
>>>>>>>>> it failed again with the following error
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> when i searched for the logs mentioned in the error, i never found
>>>>>>>>> them
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>>>> file. You will find some information on the configuration file as part of
>>>>>>>>> the documentation.
>>>>>>>>>
>>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>>
>>>>>>>>> You should also make sure you have installed the prerequisites.
>>>>>>>>>
>>>>>>>>> Thanks
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>>>>>> other priorities
>>>>>>>>>
>>>>>>>>> i downloaded the installer and while installing i got the following
>>>>>>>>> error
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>> do i need to make any configuration prior to installation ??
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Here is the link
>>>>>>>>>
>>>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>>>
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  thanks.
>>>>>>>>>
>>>>>>>>> i just followed the instructions to set up the pseudo-distributed
>>>>>>>>> setup first, using the url :
>>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>>
>>>>>>>>> i don't think i am running a DN on both machines
>>>>>>>>>
>>>>>>>>> please find the attached log
>>>>>>>>>
>>>>>>>>> hi olivier
>>>>>>>>>
>>>>>>>>> can you please give me the download link ?
>>>>>>>>>
>>>>>>>>> let me try please
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  Are you running a DN on both the machines? Could you please show
>>>>>>>>> me your DN logs?
>>>>>>>>>
>>>>>>>>> Also, consider Olivier's suggestion. It's definitely a better
>>>>>>>>> option.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>>
>>>>>>>>> Tariq
>>>>>>>>>
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> Irfu,
>>>>>>>>>
>>>>>>>>> If you want to quickly get Hadoop running on the windows platform,
>>>>>>>>> you may want to try our distribution for Windows. You will be able
>>>>>>>>> to find the msi on our website.
>>>>>>>>>
>>>>>>>>> Regards
>>>>>>>>> Olivier
>>>>>>>>>
>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  thanks.
>>>>>>>>>
>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>>
>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>
>>>>>>>>> because, on windows, anyway i have to try and see how hadoop
>>>>>>>>> works
>>>>>>>>>
>>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>>
>>>>>>>>> so, on windows, here is the setup:
>>>>>>>>>
>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>>
>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>
>>>>>>>>> now, the exact problem is :
>>>>>>>>>
>>>>>>>>> 1: datanode is not getting started
>>>>>>>>>
>>>>>>>>> 2: replication : if i put any file/folder on any datanode, it
>>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>>> also need some additional info :
>>>>>>>>>
>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>>
>>>>>>>>> -Your cluster summary (no. of nodes etc)
>>>>>>>>>
>>>>>>>>> -Your latest configuration files
>>>>>>>>>
>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>>
>>>>>>>>> Tariq
>>>>>>>>>
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>  ok. thanks
>>>>>>>>>
>>>>>>>>> now, i need to start with an all-windows setup first as our product
>>>>>>>>> will be based on windows
>>>>>>>>>
>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>
>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>>
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>>
>>>>>>>>> Tariq
>>>>>>>>>
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>  please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  thanks.
>>>>>>>>>
>>>>>>>>> can i have a setup like this :
>>>>>>>>>
>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
>>>>>>>>>
>>>>>>>>> and datanodes are a combination of any OS (windows, linux,
>>>>>>>>> unix etc)
>>>>>>>>>
>>>>>>>>> however, my doubt is, as the file systems of both the systems
>>>>>>>>> (win and linux) are different, datanodes of these systems can not be
>>>>>>>>> part of a single cluster . do i have to make the windows cluster
>>>>>>>>> separate and the UNIX cluster separate ?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>> same as Cygwin PIDs, so that may be causing the discrepancy. I don't know
>>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>> progress for native Windows support, however there are no official releases
>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
>>>>>>>>> Linux if you are new to it.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  thanks
>>>>>>>>>
>>>>>>>>> here is what i did .
>>>>>>>>>
>>>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>>>> command
>>>>>>>>>
>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>
>>>>>>>>> started dfs again with the command : "./start-dfs.sh"
>>>>>>>>>
>>>>>>>>> when i ran the "jps" command, it shows
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>
>>>>>>>>> $ ./jps.exe
>>>>>>>>>
>>>>>>>>> 4536 Jps
>>>>>>>>>
>>>>>>>>> 2076 NameNode
>>>>>>>>>
>>>>>>>>> however, when i open the pid file for the namenode, it shows the
>>>>>>>>> pid as 4560; on the contrary, it should show 2076
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>>  Most likely there is a stale pid file. Something like
>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>> the datanode.
>>>>>>>>>
>>>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>>>> already.
>>>>>>>>>
>>>>>>>>> -Arpit
>>>>>>>>>
>>>>>>>>>
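Arpit's stale-pid-file suggestion can be sketched as a small cleanup loop; the pid directory and the hadoop-<user>-<daemon>.pid naming pattern are assumptions based on common Hadoop 1.x defaults, so adjust them to your setup:

```shell
# Remove pid files whose recorded process is no longer alive,
# so the daemon can be restarted cleanly afterwards.
PID_DIR="${PID_DIR:-/tmp}"                 # assumed default pid directory
for f in "$PID_DIR"/hadoop-*-datanode.pid "$PID_DIR"/hadoop-*-namenode.pid; do
  [ -e "$f" ] || continue
  pid=$(cat "$f")
  if ! kill -0 "$pid" 2>/dev/null; then    # pid not alive => file is stale
    printf 'removing stale pid file %s (pid %s)\n' "$f" "$pid"
    rm -f "$f"
  fi
done
```

`kill -0` only probes whether the process exists, so live daemons keep their pid files.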
>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>  datanode is trying to connect to the namenode continuously but fails
>>>>>>>>>
>>>>>>>>> when i try to run the "jps" command it says :
>>>>>>>>>
>>>>>>>>> $ ./jps.exe
>>>>>>>>>
>>>>>>>>> 4584 NameNode
>>>>>>>>>
>>>>>>>>> 4016 Jps
>>>>>>>>>
>>>>>>>>> and when i ran "./start-dfs.sh" it says :
>>>>>>>>>
>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>
>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>
>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>
>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it
>>>>>>>>> first.
>>>>>>>>>
>>>>>>>>> both these logs are contradictory
>>>>>>>>>
>>>>>>>>> please find the attached logs
>>>>>>>>>
>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  Your DN is still not running. Showing me the logs would be
>>>>>>>>> helpful.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>>
>>>>>>>>> Tariq
>>>>>>>>>
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  i followed the url and did the steps mention in that. i have
>>>>>>>>> deployed on the windows platform****
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node )
>>>>>>>>> ****
>>>>>>>>>
>>>>>>>>> however, not able to browse url : http://localhost:50030****
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> please refer below****
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]****
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> i have modified all the config files as mentioned and formatted
>>>>>>>>> the hdfs file system as well ****
>>>>>>>>>
>>>>>>>>> please suggest ****
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> regards****
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> ** **
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>  thanks. i followed this url :
>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>
>>>>>>>>> let me follow the url which you gave for the pseudo-distributed setup
>>>>>>>>> and then i will switch to distributed mode
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>  You are welcome. Which link have you followed for the
>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>>> Remove *mapred.job.tracker* as well. It is required in
>>>>>>>>> *mapred-site.xml*.
>>>>>>>>>
>>>>>>>>> I would suggest you do a pseudo-distributed setup first in
>>>>>>>>> order to get yourself familiar with the process, and then proceed to the
>>>>>>>>> distributed mode. You can visit this link
>>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>>
>>>>>>>>> HTH
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>>
>>>>>>>>> Tariq
>>>>>>>>>
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
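Tariq's property-placement advice maps to a minimal pseudo-distributed layout like the sketch below; the property names are standard Hadoop 1.x, but the host and port values are illustrative assumptions:

```xml
<!-- core-site.xml: the filesystem URI lives here, not in hdfs-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: the JobTracker address lives here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```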
>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>
>>>>>>>>>  thanks tariq for the response.
>>>>>>>>>
>>>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>>>> setup .
>>>>>>>>>
>>>>>>>>> can you please go through them ?
>>>>>>>>>
>>>>>>>>> please let me know
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:****
>>>>>>>>>
>>>>>>>>>  I'm sorry for being unresponsive. Was out of touch for some time
>>>>>>>>> because of Ramzan and Eid. Resuming work today.
>>>>>>>>>
>>>>>>>>> What's the current status?
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>> manishd207@gmail.com> wrote:****
>>>>>>>>>
>>>>>>>>>  First of all, read the concepts. I hope you will like it:
>>>>>>>>>
>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>> irfu.sayed@gmail.com> wrote:****
>>>>>>>>>
>>>>>>>>>  hey Tariq,
>>>>>>>>>
>>>>>>>>> i am still stuck ..
>>>>>>>>> can you please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  please suggest
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  the attachment got quarantined
>>>>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  thanks.
>>>>>>>>>
>>>>>>>>> if i run the jps command on the namenode:
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 3164 NameNode
>>>>>>>>> 1892 Jps
>>>>>>>>>
>>>>>>>>> the same command on the datanode:
>>>>>>>>>
>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>> $ ./jps.exe
>>>>>>>>> 3848 Jps
>>>>>>>>>
>>>>>>>>> jps does not list any process for the datanode. however, in the web
>>>>>>>>> browser i can see one live datanode.
>>>>>>>>> please find the attached conf rar file of the namenode.
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  OK, we'll start fresh. Could you please show me your latest config
>>>>>>>>> files?
>>>>>>>>>
>>>>>>>>> BTW, are your daemons running fine? Use jps to verify that.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  i have created the dirs "wksp_data" and "wksp_name" on both the
>>>>>>>>> datanode and the namenode,
>>>>>>>>> made the respective changes in the "hdfs-site.xml" file,
>>>>>>>>> formatted the namenode,
>>>>>>>>> and started the dfs.
>>>>>>>>>
>>>>>>>>> but still, i am not able to browse the file system through the web
>>>>>>>>> browser.
>>>>>>>>> please refer below.
>>>>>>>>>
>>>>>>>>> anything still missing?
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  do these dirs need to be created on all datanodes and namenodes?
>>>>>>>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>>>>>>>> namenodes for these new dirs?
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  Create 2 directories manually, corresponding to the values of the
>>>>>>>>> dfs.name.dir and dfs.data.dir properties, and change the permissions
>>>>>>>>> of these directories to 755. When you start pushing data into your
>>>>>>>>> HDFS, data will start going inside the directory specified by
>>>>>>>>> dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in
>>>>>>>>> your local/native FS. But you cannot see this data directly on your
>>>>>>>>> local/native FS.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
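The two steps above (create the directories, set 755) can be sketched as follows; the paths here are stand-ins, not the actual values from this thread (on Windows/Cygwin they would live under /cygdrive/c):

```shell
# Create the metadata (name) and block (data) directories described above,
# then set their permissions to 755. Placeholder paths -- substitute your
# own dfs.name.dir / dfs.data.dir values.
mkdir -p /tmp/hdfs/wksp_name /tmp/hdfs/wksp_data
chmod 755 /tmp/hdfs/wksp_name /tmp/hdfs/wksp_data
ls -ld /tmp/hdfs/wksp_name /tmp/hdfs/wksp_data
```

After this, the paths go into hdfs-site.xml and the namenode is formatted before starting the dfs.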
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  thanks.
>>>>>>>>> however, i need this to be working in a windows environment, as per
>>>>>>>>> the project requirement.
>>>>>>>>> i will add/work on Linux later.
>>>>>>>>>
>>>>>>>>> so, at this stage, is c:\\wksp the HDFS file system, OR do i need to
>>>>>>>>> create it from the command line?
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  Hello Irfan,
>>>>>>>>>
>>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>>>>
>>>>>>>>> The HDFS web UI doesn't provide the ability to create a file or
>>>>>>>>> directory. You can browse HDFS, view files, download files etc., but
>>>>>>>>> operations like create, move, copy etc. are not supported.
>>>>>>>>>
>>>>>>>>> These values look fine to me.
>>>>>>>>>
>>>>>>>>> One suggestion though: try getting a Linux machine (if possible), or
>>>>>>>>> at least use a VM. I personally feel that using Hadoop on windows is
>>>>>>>>> always messy.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  thanks.
>>>>>>>>> when i browse the file system, i am getting the following:
>>>>>>>>> i haven't seen any make-directory option there.
>>>>>>>>>
>>>>>>>>> do i need to create it from the command line?
>>>>>>>>> further, in the hdfs-site.xml file, i have given the following
>>>>>>>>> entries. are they correct?
>>>>>>>>>
>>>>>>>>> <property>
>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>> </property>
>>>>>>>>> <property>
>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>> </property>
>>>>>>>>>
>>>>>>>>> please suggest
>>>>>>>>>
>>>>>>>>> [image: Inline image 1]
>>>>>>>>>
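One likely fix, given that a later reply in this thread asks for two separate directories: dfs.name.dir and dfs.data.dir should not share a single directory. A sketch, with placeholder paths (not confirmed values):

```xml
<!-- hdfs-site.xml: point the name and data dirs at *different* directories -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```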
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>>> manishd207@gmail.com> wrote:****
>>>>>>>>>
>>>>>>>>>  *You are wrong at this:*
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>> copyFromLocal: File
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>> copyFromLocal: File
>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>
>>>>>>>>> The local path you gave does not exist, and you do not need to copy
>>>>>>>>> hadoop itself into hdfs -- Hadoop is already working.
>>>>>>>>>
>>>>>>>>> Just check in the browser, after starting your single-node cluster:
>>>>>>>>>
>>>>>>>>> localhost:50070
>>>>>>>>>
>>>>>>>>> then follow the "browse the filesystem" link there.
>>>>>>>>>
>>>>>>>>> If there is no directory, then make a directory there.
>>>>>>>>> That is your hdfs directory.
>>>>>>>>> Then copy any text file there (no need to copy hadoop itself),
>>>>>>>>> because you are going to do processing on the data in that text
>>>>>>>>> file. That's what hadoop is used for; first make that clear in your
>>>>>>>>> mind, then you will be able to do it.
>>>>>>>>>
>>>>>>>>> *Try this:*
>>>>>>>>>
>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>>> /hdfs/directory/path
>>>>>>>>>
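The failure above comes from a bad *local* source path, so it can be avoided by checking that path first. A minimal sketch with stand-in paths (the actual copy needs a running cluster, so it is shown as a comment):

```shell
# Verify the *local* source path exists before calling -copyFromLocal;
# if ls fails here, copyFromLocal will fail with "does not exist" too.
touch /tmp/sample.txt            # stand-in for the real local file
ls -l /tmp/sample.txt
# With a running cluster you would then do (illustration only):
# ./bin/hadoop dfs -copyFromLocal /tmp/sample.txt /wksp/sample.txt
```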
>>>>>>>>>
>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>>> wrote:****
>>>>>>>>>
>>>>>>>>>  thanks. yes, i am a newbie.
>>>>>>>>> however, i need a windows setup.
>>>>>>>>>
>>>>>>>>> let me surely refer to the doc and link which u sent, but i need this
>>>>>>>>> to be working ...
>>>>>>>>> can you please help
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> MANISH DUNANI
>>>>>>>>> -THANX
>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>> manishd207@gmail.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Regards
>>>>>>>>> *Manish Dunani*
>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>> *skype id* : manish.dunani
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Olivier Renault
>>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>>> +44 7500 933 036
>>>>>>>>> orenault@hortonworks.com
>>>>>>>>> www.hortonworks.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
finally it got installed :)

further, when i try to start the namenode, it fails with the following log:
C:\hdp>start_remote_hdp_services.cmd
Master nodes: start DFS-DC
0 Master nodes successfully started.
1 Master nodes failed to start.

PSComputerName      Service             Message             Status
--------------      -------             -------             ------
                                        Connecting to re...


StartStop-HDPServices : Manually start services on Master nodes then retry
full
 cluster start.  Exiting.
At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
+ if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
    + CategoryInfo          : NotSpecified: (:) [Write-Error],
WriteErrorExcep
   tion
    + FullyQualifiedErrorId :
Microsoft.PowerShell.Commands.WriteErrorExceptio
   n,StartStop-HDPServices


C:\hdp>
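One way to narrow this down is to start the HDP services on the master (DFS-DC) one at a time and see which daemon fails. The service names below are assumptions for HDP 1.3 on Windows, and the sketch only writes out the commands you would run in an elevated prompt rather than executing them:

```shell
# Build the list of manual start commands to try on the master node.
# Service names are assumptions (HDP 1.3 on Windows); commands are written
# to a file, not executed, in this sketch.
printf 'net start %s\n' namenode secondarynamenode jobtracker > /tmp/hdp_start.cmds
cat /tmp/hdp_start.cmds
```

Whichever service fails to start manually is the one whose log (under HDP_LOG_DIR) is worth reading first.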

i tried starting it manually as well, but no luck.
is anything missing in the configuration ?

regards



On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault
<or...@hortonworks.com>wrote:

> You can put the same FQDN as your NameNode for example.
>
> Thanks
> Olivier
> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> i do not have any Hive server host, so what should i put over here?
>> if i comment the property out, i guess it throws an error about that.
>> can i put the fqdn of the namenode for the Hive server host?
>>
>> will that be a really working configuration?
>>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Your cluster-properties.txt should look something like :
>>>
>>>
>>> #Log directory
>>> HDP_LOG_DIR=c:\hadoop\logs
>>>
>>> #Data directory
>>> HDP_DATA_DIR=c:\hdp\data
>>>
>>> #Hosts
>>> NAMENODE_HOST=yourmaster.fqdn.com
>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>> SLAVE_HOSTS=yourmaster.fqdn.com,youslave.fqdn.com
>>>
>>>
>>> #Database host
>>> DB_FLAVOR=derby
>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>
>>>
>>> #Hive properties
>>> HIVE_DB_NAME=hive
>>> HIVE_DB_USERNAME=hive
>>> HIVE_DB_PASSWORD=hive
>>>
>>> #Oozie properties
>>> OOZIE_DB_NAME=oozie
>>> OOZIE_DB_USERNAME=oozie
>>> OOZIE_DB_PASSWORD=oozie
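A quick way to catch the "cluster properties file" errors seen earlier in this thread is to check that every host key in clusterproperties.txt is actually set. A small sketch; the file contents are illustrative, not real hosts:

```shell
# Write an illustrative clusterproperties.txt, then verify each required
# host key has a non-empty value (HIVE_SERVER_HOST may simply reuse the
# NameNode's FQDN, as suggested above).
cat > /tmp/clusterproperties.txt <<'EOF'
NAMENODE_HOST=master.example.com
JOBTRACKER_HOST=master.example.com
HIVE_SERVER_HOST=master.example.com
SLAVE_HOSTS=master.example.com,slave.example.com
EOF
for key in NAMENODE_HOST JOBTRACKER_HOST HIVE_SERVER_HOST SLAVE_HOSTS; do
  grep -q "^${key}=." /tmp/clusterproperties.txt && echo "$key ok" || echo "$key MISSING"
done
```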
>>>
>>> You will need to replace, yourmaster.fqdn.com, yourslave.fqdn.com by
>>> your servers name. For the time being, I suggest that you do not install
>>> HBase, Oozie,
>>>
>>> regards,
>>> Olivier
>>>
>>>
>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> ok.. now i made some changes and the installation went ahead,
>>>>>> but failed at the "HIVE_SERVER_HOST" property declaration.
>>>>>> in the cluster config file, i have commented this property out. if i
>>>>>> uncomment it, then what server address should i give ???
>>>>>>
>>>>>> i have only a two-windows-machine setup:
>>>>>> 1: for the namenode, and another for the datanode
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> i installed the latest java in the c:\java folder and now there is no
>>>>>>> error in the log file related to java.
>>>>>>> however, now it is throwing an error about not having the cluster
>>>>>>> properties file,
>>>>>>> even though i am running/installing hdp from the location where this
>>>>>>> file exists. still it throws the error.
>>>>>>>
>>>>>>> please find the attached
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>
>>>>>>>>  Here's your issue (from the logs you attached earlier):
>>>>>>>>
>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>
>>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case).
>>>>>>>> HDP 1.3 does not like spaces in paths, so you need to reinstall Java
>>>>>>>> under c:\java\ or something similar (in a path with no spaces).
>>>>>>>>
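The check described above can be sketched as follows; the path is the problematic default from the quoted log, and the script only reports the problem rather than fixing it:

```shell
# A JAVA_HOME containing spaces is what breaks the HDP batch scripts:
# 'Files\Java\...' in the log is the token after the first space.
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'
case "$JAVA_HOME" in
  *' '*) echo 'JAVA_HOME contains spaces: reinstall under e.g. C:\java' ;;
  *)     echo 'JAVA_HOME ok' ;;
esac > /tmp/java_home_check.txt
cat /tmp/java_home_check.txt
```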
>>>>>>>>
>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>> *Subject:* Re: about replication
>>>>>>>>
>>>>>>>> please find the attached.
>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>> as it is not generated
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Could you share the log files ( c:\hdp.log,
>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )
>>>>>>>> as well as your clusterproperties.txt ?
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>>
>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  thanks. i followed the user manual for deployment and installed
>>>>>>>> all the pre-requisites.
>>>>>>>> i modified the command but the issue still persists. please suggest.
>>>>>>>>
>>>>>>>> please refer below
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>
>>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>>> You will find the correct syntax as part of the doc.
>>>>>>>>
>>>>>>>> Happy reading
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>  thanks.
>>>>>>>> i referred to the logs and manuals. i modified the clusterproperties
>>>>>>>> file and then double-clicked on the msi file.
>>>>>>>> however, it still failed.
>>>>>>>> further, i started the installation on the command line, giving
>>>>>>>> HDP_LAYOUT=<path to clusterproperties file>;
>>>>>>>> the installation went ahead and then failed on the .NET framework 4.0
>>>>>>>> and VC++ redistributable package dependencies.
>>>>>>>>
>>>>>>>> i installed both and started the installation again.
>>>>>>>> it failed again, with the following error:
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> when i searched for the logs mentioned in the error, i never found
>>>>>>>> them.
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
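For reference, the general shape of the command-line install mentioned in this exchange looks like the following. Only `msiexec /i` and the `HDP_LAYOUT` property are confirmed by this thread; `/lv` (verbose logging) is standard msiexec, and all paths are placeholders. The sketch prints the command rather than running it:

```shell
# Assemble the HDP-for-Windows install command line (illustration only).
# HDP_LAYOUT points at the clusterproperties file; /lv writes a verbose log.
cmd='msiexec /i hdp-1.3.0.0.winpkg.msi /lv C:\hdp\hdp.log HDP_LAYOUT=C:\hdp\clusterproperties.txt'
echo "$cmd" > /tmp/hdp_install.cmd
cat /tmp/hdp_install.cmd
```

Check the Hortonworks install guide for the authoritative flag list before running anything like this.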
>>>>>>>>
>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>
>>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>>> file. You will find some information on the configuration file as
>>>>>>>> part of the documentation:
>>>>>>>>
>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>
>>>>>>>> You should also make sure to have installed the prerequisites.
>>>>>>>>
>>>>>>>> Thanks
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:***
>>>>>>>> *
>>>>>>>>
>>>>>>>>  thanks. sorry for the long break; i actually got involved in some
>>>>>>>> other priorities.
>>>>>>>> i downloaded the installer, and while installing i got the following
>>>>>>>> error:
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> do i need to make any configuration prior to installation ??
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>
>>>>>>>> Here is the link
>>>>>>>>
>>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>>
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>  thanks.
>>>>>>>> i just followed the instructions to set up the pseudo-distributed
>>>>>>>> setup first, using the url:
>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>
>>>>>>>> i don't think i am running a DN on both machines.
>>>>>>>> please find the attached log
>>>>>>>>
>>>>>>>> hi olivier
>>>>>>>>
>>>>>>>> can you please give me the download link ?
>>>>>>>> let me try please
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  Are you running DN on both the machines? Could you please show me
>>>>>>>> your DN logs?
>>>>>>>>
>>>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Irfu,
>>>>>>>>
>>>>>>>> If you want to quickly get Hadoop running on the windows platform,
>>>>>>>> you may want to try our distribution for Windows. You will be able
>>>>>>>> to find the msi on our website.
>>>>>>>>
>>>>>>>> Regards
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>>
>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>  thanks.
>>>>>>>> ok. i think i need to change the plan over here.
>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>
>>>>>>>> because, on windows, anyway i have to try and see how hadoop works;
>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>
>>>>>>>> so, on windows, here is the setup:
>>>>>>>>
>>>>>>>> namenode : windows 2012 R2
>>>>>>>> datanode : windows 2012 R2
>>>>>>>>
>>>>>>>> now, the exact problems are:
>>>>>>>> 1: the datanode is not getting started
>>>>>>>> 2: replication : if i put any file/folder on any datanode, it
>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  Seriously?? You are planning to develop something using Hadoop on
>>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log
>>>>>>>> files? I also need some additional info:
>>>>>>>> - The exact problem which you are facing right now
>>>>>>>> - Your cluster summary (no. of nodes etc)
>>>>>>>> - Your latest configuration files
>>>>>>>> - Your /etc/hosts file
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  ok. thanks.
>>>>>>>> now, i need to start with the all-windows setup first, as our product
>>>>>>>> will be based on windows.
>>>>>>>> so, now, please tell me how to resolve the issue:
>>>>>>>> the datanode is not starting. please suggest
>>>>>>>>
>>>>>>>> regards,
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> can i have a setup like this :
>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc.)
>>>>>>>> and datanodes are a combination of any OS (windows , linux , unix
>>>>>>>> etc.)
>>>>>>>>
>>>>>>>> however, my doubt is, as the file systems of both the systems
>>>>>>>> (win and linux) are different, datanodes of these systems cannot be
>>>>>>>> part of a single cluster . do i have to make a windows cluster and a UNIX
>>>>>>>> cluster separately ?
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>> same as Cygwin PIDs so that may be causing the discrepancy. I don't know
>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
>>>>>>>> Linux if you are new to it.
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks
>>>>>>>> here is what i did .
>>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>>> command
>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>
>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>
>>>>>>>> when i ran the "jps" command , it shows
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>> $ ./jps.exe
>>>>>>>> 4536 Jps
>>>>>>>> 2076 NameNode
>>>>>>>>
>>>>>>>> however, when i open the pid file for the namenode, it is showing the
>>>>>>>> pid as : 4560. on the contrary, it should show : 2076
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>> the datanode.
>>>>>>>>
>>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>>> already.
>>>>>>>>
>>>>>>>> -Arpit
>>>>>>>>
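Arpit's suggestion above can be sketched as a small script. The pid-file layout (hadoop-*-datanode.pid under HADOOP_PID_DIR, defaulting to /tmp) is an assumption based on the stock start scripts, not something stated in the thread:

```shell
# Sketch: remove stale Hadoop datanode pid files whose process is gone.
# The file naming and location are assumptions; adjust to your install.
clean_stale_pids() {
  pid_dir=$1
  for f in "$pid_dir"/hadoop-*-datanode.pid; do
    [ -e "$f" ] || continue
    pid=$(cat "$f")
    # kill -0 only tests whether the process exists; it sends no signal.
    if ! kill -0 "$pid" 2>/dev/null; then
      echo "removing stale pid file: $f"
      rm -f "$f"
    fi
  done
}

# Usage: run before restarting the datanode, e.g.
# clean_stale_pids "${HADOOP_PID_DIR:-/tmp}"
```

Deleting only pid files whose process is gone avoids clobbering a daemon that is actually still running.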
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> datanode is trying to connect to the namenode continuously but fails
>>>>>>>>
>>>>>>>> when i try to run the "jps" command it says :
>>>>>>>> $ ./jps.exe
>>>>>>>> 4584 NameNode
>>>>>>>> 4016 Jps
>>>>>>>>
>>>>>>>> and when i ran "./start-dfs.sh" it says :
>>>>>>>>
>>>>>>>> $ ./start-dfs.sh
>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>>>
>>>>>>>> both these logs are contradictory
>>>>>>>> please find the attached logs
>>>>>>>>
>>>>>>>> should i attach the conf files as well ?
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>> helpful.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> i followed the url and did the steps mentioned in it. i have
>>>>>>>> deployed on the windows platform
>>>>>>>>
>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>
>>>>>>>> please refer below
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>>>> hdfs file system as well
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks. i followed this url :
>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>
>>>>>>>> let me follow the url which you gave for the pseudo distributed setup
>>>>>>>> and then i will switch to distributed mode
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>> Remove *mapred.job.tracker* as well. It belongs in *mapred-site.xml*.
>>>>>>>>
>>>>>>>> I would suggest you to do a pseudo distributed setup first in order
>>>>>>>> to get yourself familiar with the process and then proceed to the
>>>>>>>> distributed mode. You can visit this link
>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>
>>>>>>>> HTH
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>>
>>>>>>>> Tariq
>>>>>>>>
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
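Tariq's property split, sketched as minimal Hadoop 1.x config fragments. The `hdfs://localhost:9000` and `localhost:9001` values are illustrative pseudo-distributed defaults, not values taken from this thread:

```xml
<!-- core-site.xml : the filesystem URI belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml : the job tracker address belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```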
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks tariq for the response.
>>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>>> setup .
>>>>>>>> can you please go through them ?
>>>>>>>>
>>>>>>>> please let me know
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> I'm sorry for being unresponsive. Was out of touch for some time
>>>>>>>> because of ramzan and eid. Resuming work today.
>>>>>>>>
>>>>>>>> What's the current status?
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>
>>>>>>>> First of all read the concepts .. I hope you will like it..
>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> hey Tariq,
>>>>>>>> i am still stuck ..
>>>>>>>> can you please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> attachment got quarantined
>>>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>>
>>>>>>>> if i run the jps command on the namenode :
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>> $ ./jps.exe
>>>>>>>> 3164 NameNode
>>>>>>>> 1892 Jps
>>>>>>>>
>>>>>>>> same command on the datanode :
>>>>>>>>
>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>> $ ./jps.exe
>>>>>>>> 3848 Jps
>>>>>>>>
>>>>>>>> jps does not list any process for the datanode. however, in the web
>>>>>>>> browser i can see one live data node
>>>>>>>> please find the attached conf rar file of the namenode
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> OK. we'll start fresh. Could you please show me your latest config
>>>>>>>> files?
>>>>>>>>
>>>>>>>> BTW, are your daemons running fine? Use JPS to verify that.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> i have created the dirs "wksp_data" and "wksp_name" on both the
>>>>>>>> datanode and the namenode
>>>>>>>> made the respective changes in the "hdfs-site.xml" file
>>>>>>>> formatted the namenode
>>>>>>>> started the dfs
>>>>>>>>
>>>>>>>> but still, not able to browse the file system through the web browser
>>>>>>>> please refer below
>>>>>>>>
>>>>>>>> anything still missing ?
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> do these dirs need to be created on all datanodes and namenodes ?
>>>>>>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>>>>>>> namenodes for these new dirs?
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Create 2 directories manually corresponding to the values of
>>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>>> cannot see this data directly on your local/native FS.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
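The two steps above (create the directories, set 755) can be sketched as follows. The directory names are placeholders; use whatever values you put in hdfs-site.xml:

```shell
# Create the dfs.name.dir and dfs.data.dir directories and set their
# permissions to 755, as described above. Paths are illustrative.
NAME_DIR=./wksp_name   # dfs.name.dir (namenode metadata)
DATA_DIR=./wksp_data   # dfs.data.dir (datanode blocks)

mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
ls -ld "$NAME_DIR" "$DATA_DIR"
```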
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> however, i need this to be working on a windows environment as a
>>>>>>>> project requirement.
>>>>>>>> i will add/work on Linux later
>>>>>>>>
>>>>>>>> so, now, at this stage, is c:\\wksp the HDFS file system OR do i
>>>>>>>> need to create it from the command line ?
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards,
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Hello Irfan,
>>>>>>>>
>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>>>
>>>>>>>> The HDFS webUI doesn't provide us the ability to create a file or
>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>> operations like create, move, copy etc. are not supported.
>>>>>>>>
>>>>>>>> These values look fine to me.
>>>>>>>>
>>>>>>>> One suggestion though. Try getting a Linux machine (if possible). Or
>>>>>>>> at least use a VM. I personally feel that using Hadoop on windows is always
>>>>>>>> messy.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> when i browse the file system , i am getting the following :
>>>>>>>> i haven't seen any make directory option there
>>>>>>>>
>>>>>>>> do i need to create it from the command line ?
>>>>>>>> further, in the hdfs-site.xml file , i have given the following
>>>>>>>> entries. are they correct ?
>>>>>>>>
>>>>>>>> <property>
>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>> </property>
>>>>>>>> <property>
>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>> </property>
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
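A detail worth flagging in the entries above: dfs.name.dir and dfs.data.dir both point at the same directory (c:\\wksp). The two are normally given separate directories so namenode metadata and datanode blocks do not mix. A sketch with illustrative names:

```xml
<!-- hdfs-site.xml : keep metadata and block storage apart.
     Directory names below are illustrative. -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```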
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>
>>>>>>>> *You are wrong at this:*
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>> copyFromLocal: File
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>> copyFromLocal: File
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>
>>>>>>>> Because you wrote both the paths as local paths. And you do not need
>>>>>>>> to copy hadoop into hdfs... Hadoop is already working..
>>>>>>>>
>>>>>>>> Just check in the browser after starting your single node cluster :
>>>>>>>>
>>>>>>>> localhost:50070
>>>>>>>>
>>>>>>>> then go to the "browse the filesystem" link in it..
>>>>>>>>
>>>>>>>> If there is no directory then make a directory there.
>>>>>>>> That is your hdfs directory.
>>>>>>>> Then copy any text file there (no need to copy hadoop there), because
>>>>>>>> you are going to do processing on the data in that text file. That's
>>>>>>>> what hadoop is used for; first you need to make that clear in your
>>>>>>>> mind. Then and only then will it work... otherwise not possible..
>>>>>>>>
>>>>>>>> *Try this:*
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>> /hdfs/directory/path
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks. yes , i am a newbie.
>>>>>>>> however, i need a windows setup.
>>>>>>>>
>>>>>>>> let me surely refer to the doc and link which u sent, but i need this
>>>>>>>> to be working ...
>>>>>>>> can you please help
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> MANISH DUNANI
>>>>>>>> -THANX
>>>>>>>> +91 9426881954, +91 8460656443
>>>>>>>> manishd207@gmail.com
>>>>>>>>
>>>>>>>> --
>>>>>>>> Regards
>>>>>>>> *Manish Dunani*
>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>> *skype id* : manish.dunani
>>>>>>>>
>>>>>>>>
>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>
>>>>>>>> --
>>>>>>>> Olivier Renault
>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>> +44 7500 933 036
>>>>>>>> orenault@hortonworks.com
>>>>>>>> www.hortonworks.com
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
finally it got installed :)

further, when i try to start the namenode, it failed with following log

C:\hdp>start_remote_hdp_services.cmd
Master nodes: start DFS-DC
0 Master nodes successfully started.
1 Master nodes failed to start.

PSComputerName      Service             Message             Status
--------------      -------             -------             ------
                                        Connecting to re...


StartStop-HDPServices : Manually start services on Master nodes then retry
full cluster start.  Exiting.
At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
+ if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
    + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,StartStop-HDPServices


C:\hdp>

i tried starting manually as well but no luck
anything missing in configuration ?

regards



On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault
<or...@hortonworks.com>wrote:

> You can put the same FQDN as your NameNode for example.
>
> Thanks
> Olivier
> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> i do not have any HIVE server host, so what should i put over here?
>> if i comment it out, then i guess it throws an error about the commented
>> property. can i put the fqdn of the namenode for the HIVE server host ?
>>
>> will it be a really working configuration?
>>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Your cluster-properties.txt should look something like :
>>>
>>>
>>> #Log directory
>>> HDP_LOG_DIR=c:\hadoop\logs
>>>
>>> #Data directory
>>> HDP_DATA_DIR=c:\hdp\data
>>>
>>> #Hosts
>>> NAMENODE_HOST=yourmaster.fqdn.com
>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>>
>>>
>>> #Database host
>>> DB_FLAVOR=derby
>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>
>>>
>>> #Hive properties
>>> HIVE_DB_NAME=hive
>>> HIVE_DB_USERNAME=hive
>>> HIVE_DB_PASSWORD=hive
>>>
>>> #Oozie properties
>>> OOZIE_DB_NAME=oozie
>>> OOZIE_DB_USERNAME=oozie
>>> OOZIE_DB_PASSWORD=oozie
>>>
>>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
>>> your server names. For the time being, I suggest that you do not install
>>> HBase or Oozie.
>>>
>>> regards,
>>> Olivier
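As a quick aid, a small hypothetical helper to confirm the placeholder hostnames were actually replaced before running the installer. The function name and the clusterproperties.txt filename are illustrative, not part of HDP:

```shell
# Hypothetical sanity check: list any placeholder FQDNs left in a
# cluster properties file; prints "ok" when none remain.
check_placeholders() {
  if grep -n 'fqdn\.com' "$1"; then
    echo "placeholders remain in $1"
  else
    echo "ok"
  fi
}

# Usage (filename from the thread):
# check_placeholders clusterproperties.txt
```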
>>>
>>>
>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> ok.. now i made some changes and the installation went ahead
>>>>>> but failed at the "HIVE_SERVER_HOST" property declaration
>>>>>> in the cluster config file, i have commented out this property. if i
>>>>>> uncomment it, then what server address should i give ?
>>>>>>
>>>>>> i have only two windows machines setup.
>>>>>> 1: for namenode and another for datanode
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> i installed the latest java in the c:\java folder and now there is no
>>>>>>> java-related error in the log file
>>>>>>> however, now it is throwing an error about the missing cluster
>>>>>>> properties file.
>>>>>>> in fact i am running/installing hdp from the location where this
>>>>>>> file exists . still it is throwing the error
>>>>>>>
>>>>>>> please find the attached
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>
>>>>>>>> Here's your issue (from the logs you attached earlier):
>>>>>>>>
>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>
>>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>>>> or something similar (in a path with no spaces).
>>>>>>>>
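The failing condition can be sketched in shell terms. The JAVA_HOME value below is illustrative of the broken case, not copied from the poster's machine:

```shell
# Detect a space in JAVA_HOME -- the condition that trips up the HDP 1.3
# installer above. The example value is illustrative.
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'
case "$JAVA_HOME" in
  *' '*) java_home_ok=no  ;;  # space present: reinstall under e.g. C:\java
  *)     java_home_ok=yes ;;
esac
echo "JAVA_HOME free of spaces: $java_home_ok"
```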
>>>>>>>>
>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>> *Subject:* Re: about replication
>>>>>>>>
>>>>>>>> please find the attached.
>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>> as it is not generated
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>
>>>>>>>>  Could you share the log files ( c:\hdp.log,
>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>>>>> well as your clusterproperties.txt ?
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>>
>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  thanks. i followed the user manual for deployment and installed
>>>>>>>> all pre-requisites.
>>>>>>>>
>>>>>>>> i modified the command and still the issue persists. please suggest
>>>>>>>>
>>>>>>>> please refer below
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>
>>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>>>
>>>>>>>> You will find the correct syntax as part of the docs.
>>>>>>>>
>>>>>>>> Happy reading
>>>>>>>> Olivier
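For reference, the msiexec invocation has roughly this shape; everything here — the msi file name, log path, and property values — is a placeholder, so take the exact flags from the HDP installation guide:

```
msiexec /qn /i "hdp-1.3.0.0.winpkg.msi" /lv "c:\hdp.log" ^
    HDP_LAYOUT="c:\clusterproperties.txt" ^
    HDP_DIR="c:\hdp\hadoop" ^
    DESTROY_DATA="no"
```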
>>>>>>>>
>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>  thanks.
>>>>>>>>
>>>>>>>> i referred to the logs and manuals. i modified the clusterproperties
>>>>>>>> file and then double-clicked on the msi file.
>>>>>>>>
>>>>>>>> however, it still failed.
>>>>>>>>
>>>>>>>> further, i started the installation on the command line by giving
>>>>>>>> HDP_LAYOUT=clusterproperties file path,
>>>>>>>>
>>>>>>>> installation went ahead and it failed on the .NET framework 4.0 and
>>>>>>>> VC++ redistributable package dependency
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> i installed both and started the installation again.
>>>>>>>>
>>>>>>>> it failed again with the following error
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> when i searched for the logs mentioned in the error , i never found
>>>>>>>> them
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> irfan****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>
>>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>>> file. You will find some information on the configuration file as part of
>>>>>>>> the documentation.
>>>>>>>>
>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>
>>>>>>>> You should also make sure to have installed the prerequisites.
>>>>>>>>
>>>>>>>> Thanks
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>>>>> other priorities.
>>>>>>>>
>>>>>>>> i downloaded the installer and while installing i got the following
>>>>>>>> error
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> do i need to make any configuration prior to installation ??
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>
>>>>>>>> Here is the link ****
>>>>>>>>
>>>>>>>> http://download.hortonworks.com/products/hdp-windows/****
>>>>>>>>
>>>>>>>> Olivier ****
>>>>>>>>
>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>  thanks.****
>>>>>>>>
>>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>>> setup first using the url :
>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>> ****
>>>>>>>>
>>>>>>>>  ****
>>>>>>>>
>>>>>>>> i don't think so i am running DN on both machine ****
>>>>>>>>
>>>>>>>> please find the attached log****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> hi olivier ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> can you please give me download link ?****
>>>>>>>>
>>>>>>>> let me try please ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> irfan ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  Are you running DN on both the machines? Could you please show me
>>>>>>>> your DN logs?****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>>>> ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> Warm Regards,****
>>>>>>>>
>>>>>>>> Tariq****
>>>>>>>>
>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>
>>>>>>>> Irfu, ****
>>>>>>>>
>>>>>>>> If you want to quickly get Hadoop running on windows platform. You
>>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>>> msi on our website. ****
>>>>>>>>
>>>>>>>> Regards
>>>>>>>> Olivier ****
>>>>>>>>
>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>  thanks.
>>>>>>>>
>>>>>>>> ok. i think i need to change the plan over here.
>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>
>>>>>>>> because, on windows , anyway i have to try and see how hadoop works.
>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>
>>>>>>>> so, on windows , here is the setup:
>>>>>>>>
>>>>>>>> namenode : windows 2012 R2
>>>>>>>> datanode : windows 2012 R2
>>>>>>>>
>>>>>>>> now, the exact problems are :
>>>>>>>> 1: the datanode is not getting started
>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>> should get replicated to all other available datanodes
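On the replication point: HDFS replicates blocks automatically according to the dfs.replication setting; nothing has to be copied between datanodes by hand. The factor can be inspected or changed per file from the CLI (Hadoop 1.x syntax; the path is an example, not from the thread):

```
$ ./bin/hadoop fs -setrep -w 2 /wksp/sample.txt        # set the replication factor to 2
$ ./bin/hadoop fsck /wksp/sample.txt -files -blocks    # list the blocks and their replicas
```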
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  Seriously?? You are planning to develop something using Hadoop on
>>>>>>>> windows. Not a good idea. Anyway, could you please show me your log files? I
>>>>>>>> also need some additional info :
>>>>>>>>
>>>>>>>> -The exact problem which you are facing right now
>>>>>>>> -Your cluster summary (no. of nodes etc.)
>>>>>>>> -Your latest configuration files
>>>>>>>> -Your /etc/hosts file
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> Warm Regards,****
>>>>>>>>
>>>>>>>> Tariq****
>>>>>>>>
>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  ok. thanks
>>>>>>>>
>>>>>>>> now, i need to start with the all-windows setup first as our product
>>>>>>>> will be based on windows.
>>>>>>>> so, now, please tell me how to resolve the issue.
>>>>>>>>
>>>>>>>> the datanode is not starting . please suggest
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards,****
>>>>>>>>
>>>>>>>> irfan ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> Warm Regards,****
>>>>>>>>
>>>>>>>> Tariq****
>>>>>>>>
>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  please suggest****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> irfan****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  thanks.
>>>>>>>>
>>>>>>>> can i have a setup like this :
>>>>>>>>
>>>>>>>> the namenode will be on linux (the flavour may be RHEL, CentOS, Ubuntu etc)
>>>>>>>> and the datanodes are a combination of any OS (windows , linux , unix
>>>>>>>> etc )
>>>>>>>>
>>>>>>>> however, my doubt is : as the file systems of the two systems
>>>>>>>> (win and linux ) are different , can datanodes of these systems not be
>>>>>>>> part of a single cluster ? do i have to make the windows cluster and the UNIX
>>>>>>>> cluster separate ?
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>>>
>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>> same as Cygwin PIDs, so that may be causing the discrepancy. I don't know
>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>> progress for native Windows support, however there are no official releases
>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
>>>>>>>> Linux if you are new to it.
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  thanks
>>>>>>>>
>>>>>>>> here is what i did :
>>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>>> command,
>>>>>>>> then deleted all pid files for the namenodes and datanodes,
>>>>>>>> and started dfs again with the command : "./start-dfs.sh"
>>>>>>>>
>>>>>>>> when i ran the "jps" command , it shows
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>> $ ./jps.exe
>>>>>>>> 4536 Jps
>>>>>>>> 2076 NameNode
>>>>>>>>
>>>>>>>> however, when i open the pid file for the namenode, it is now
>>>>>>>> showing the pid as : 4560. on the contrary, it should show : 2076
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> please suggest ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>>>
>>>>>>>>  Most likely there is a stale pid file. Something like
>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>> the datanode.
>>>>>>>>
>>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>>> already.
>>>>>>>>
>>>>>>>> -Arpit
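Arpit's stale-pid-file suggestion can be sketched as a small cleanup step; the directory and file name below are illustrative (a real setup would use the actual pid directory, e.g. /tmp or $HADOOP_PID_DIR):

```shell
# Delete stale datanode pid files so start-dfs.sh stops complaining
# "datanode running as process N. Stop it first."
PID_DIR=$(mktemp -d)                                  # stand-in for the real pid directory
touch "$PID_DIR/hadoop-Administrator-datanode.pid"    # simulate a stale pid file
rm -f "$PID_DIR"/hadoop-*-datanode.pid                # remove any stale datanode pids
ls -A "$PID_DIR" | wc -l                              # -> 0, directory is clean
```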
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  the datanode is trying to connect to the namenode continuously but fails
>>>>>>>>
>>>>>>>> when i try to run the "jps" command it says :
>>>>>>>> $ ./jps.exe
>>>>>>>> 4584 NameNode
>>>>>>>> 4016 Jps
>>>>>>>>
>>>>>>>> and when i run "./start-dfs.sh" it says :
>>>>>>>>
>>>>>>>> $ ./start-dfs.sh
>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>>>
>>>>>>>> these two outputs are contradictory
>>>>>>>> please find the attached logs
>>>>>>>>
>>>>>>>> should i attach the conf files as well ?
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>>  ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  Your DN is still not running. Showing me the logs would be
>>>>>>>> helpful.****
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> Warm Regards,****
>>>>>>>>
>>>>>>>> Tariq****
>>>>>>>>
>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  i followed the url and did the steps mentioned in it. i have
>>>>>>>> deployed on the windows platform
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node )*
>>>>>>>> ***
>>>>>>>>
>>>>>>>> however, not able to browse url : http://localhost:50030****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> please refer below****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> [image: Inline image 1]****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>>>> hdfs file system as well ****
>>>>>>>>
>>>>>>>> please suggest ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  thanks. i followed this url :
>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>> ****
>>>>>>>>
>>>>>>>> let me follow the url which you gave for pseudo distributed setup
>>>>>>>> and then will switch to distributed mode****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> irfan ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  You are welcome. Which link have you followed for the
>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>> Remove *mapred.job.tracker* as well; it belongs in *mapred-site.xml*.
>>>>>>>>
>>>>>>>> I would suggest you do a pseudo-distributed setup first in order
>>>>>>>> to get yourself familiar with the process and then proceed to the
>>>>>>>> distributed mode. You can visit this link
>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>> if you need some help. Let me know if you face any issue.
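The property placement Tariq describes, for a Hadoop 1.x pseudo-distributed setup, looks like this (the localhost:9000/9001 values are the conventional examples, not taken from the thread):

```xml
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```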
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> HTH****
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> Warm Regards,****
>>>>>>>>
>>>>>>>> Tariq****
>>>>>>>>
>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  thanks tariq for response. ****
>>>>>>>>
>>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>>> setup . ****
>>>>>>>>
>>>>>>>> can you please go through that ?****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> please let me know ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> irfan ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  I'm sorry for being unresponsive. Was out of touch for sometime
>>>>>>>> because of ramzan and eid. Resuming work today.****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> What's the current status?****
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> Warm Regards,****
>>>>>>>>
>>>>>>>> Tariq****
>>>>>>>>
>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>> manishd207@gmail.com> wrote:****
>>>>>>>>
>>>>>>>>  First of all read the concepts ..I hope you will like it..****
>>>>>>>>
>>>>>>>>
>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf**
>>>>>>>> **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  please suggest ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> irfan ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  hey Tariq,****
>>>>>>>>
>>>>>>>> i am still stuck .. ****
>>>>>>>>
>>>>>>>> can you please suggest ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> irfan ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  please suggest ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  attachment got quarantined ****
>>>>>>>>
>>>>>>>> resending in txt format. please rename it to conf.rar ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  thanks.****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> if i run the jps command on namenode :****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>>
>>>>>>>> $ ./jps.exe****
>>>>>>>>
>>>>>>>> 3164 NameNode****
>>>>>>>>
>>>>>>>> 1892 Jps****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> same command on datanode :****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>>
>>>>>>>> $ ./jps.exe****
>>>>>>>>
>>>>>>>> 3848 Jps****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> jps does not list any process for datanode. however, on web browser
>>>>>>>> i can see one live data node ****
>>>>>>>>
>>>>>>>> please find the attached conf rar file of namenode ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  OK. we'll start fresh. Could you plz show me your latest config
>>>>>>>> files?****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> BTW, are your daemons running fine?Use JPS to verify that.****
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> Warm Regards,****
>>>>>>>>
>>>>>>>> Tariq****
>>>>>>>>
>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  i have created the dirs "wksp_data" and "wksp_name" on both the
>>>>>>>> datanode and the namenode,
>>>>>>>> made the respective changes in the "hdfs-site.xml" file,
>>>>>>>> formatted the namenode,
>>>>>>>> and started the dfs.
>>>>>>>>
>>>>>>>> but still, i am not able to browse the file system through the web browser
>>>>>>>>
>>>>>>>> please refer below ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> anything still missing ?****
>>>>>>>>
>>>>>>>> please suggest ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> [image: Inline image 1]****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  do these dirs need to be created on all datanodes and namenodes ?
>>>>>>>> further, does hdfs-site.xml need to be updated on both the datanodes and
>>>>>>>> namenodes for these new dirs?
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  Create 2 directories manually, corresponding to the values of the
>>>>>>>> dfs.name.dir and dfs.data.dir properties, and change the permissions of
>>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>>> cannot see this data directly on your local/native FS.
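Concretely, dfs.name.dir and dfs.data.dir should point at two different directories — the wksp_name/wksp_data pair mentioned elsewhere in the thread fits. An illustrative hdfs-site.xml fragment (paths are examples):

```xml
<!-- hdfs-site.xml: metadata and block storage must live in distinct dirs -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>c:\\wksp_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>c:\\wksp_data</value>
  </property>
</configuration>
```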
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> Warm Regards,****
>>>>>>>>
>>>>>>>> Tariq****
>>>>>>>>
>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  thanks.
>>>>>>>>
>>>>>>>> however, i need this to be working on a windows environment as a
>>>>>>>> project requirement.
>>>>>>>> i will add/work on Linux later.
>>>>>>>>
>>>>>>>> so, now , at this stage , is c:\\wksp the HDFS file system OR do i
>>>>>>>> need to create it from the command line ?
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> please suggest****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards,****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  Hello Irfan,****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> The HDFS webUI doesn't provide the ability to create a file or
>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>> operations like create, move, copy etc. are not supported.
>>>>>>>>
>>>>>>>> These values look fine to me.
>>>>>>>>
>>>>>>>> One suggestion though. Try getting a Linux machine (if possible). Or
>>>>>>>> at least use a VM. I personally feel that using Hadoop on windows is always
>>>>>>>> messy.
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> Warm Regards,****
>>>>>>>>
>>>>>>>> Tariq****
>>>>>>>>
>>>>>>>> cloudfront.blogspot.com****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  thanks.
>>>>>>>>
>>>>>>>> when i browse the file system , i am getting the following :
>>>>>>>> i haven't seen any make-directory option there.
>>>>>>>>
>>>>>>>> do i need to create it from the command line ?
>>>>>>>> further, in the hdfs-site.xml file , i have given the following
>>>>>>>> entries. are they correct ?
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> <property>
>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>> </property>
>>>>>>>> <property>
>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>> </property>
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> please suggest ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> [image: Inline image 1]****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>> manishd207@gmail.com> wrote:****
>>>>>>>>
>>>>>>>>  *You are wrong at this:*
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>> copyFromLocal: File
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>> copyFromLocal: File
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>
>>>>>>>> Because you wrote both paths as local paths, and you do not need to copy
>>>>>>>> hadoop into hdfs... Hadoop is already working..
>>>>>>>>
>>>>>>>> Just check in the browser after starting your single node cluster :
>>>>>>>>
>>>>>>>> localhost:50070
>>>>>>>>
>>>>>>>> then go to the "browse the filesystem" link in it..
>>>>>>>>
>>>>>>>> If there is no directory there, then make a directory.
>>>>>>>> That is your hdfs directory.
>>>>>>>> Then copy any text file there (no need to copy hadoop
>>>>>>>> there), because you are going to do processing on the data in that text
>>>>>>>> file. That's what hadoop is used for; first you need to make that clear in your
>>>>>>>> mind, and then you will do it... otherwise it's not possible..
>>>>>>>>
>>>>>>>> *Try this: *
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>> /hdfs/directory/path
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>>  thanks. yes , i am a newbie.
>>>>>>>> however, i need a windows setup.
>>>>>>>>
>>>>>>>> let me surely refer to the doc and link which u sent, but i need this
>>>>>>>> to be working ...
>>>>>>>> can you please help
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> regards****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>>  ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> --
>>>>>>>> MANISH DUNANI
>>>>>>>> -THANX
>>>>>>>> +91 9426881954,+91 8460656443****
>>>>>>>>
>>>>>>>> manishd207@gmail.com****
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> Regards
>>>>>>>> *Manish Dunani*
>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>> *skype id* : manish.dunani
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>
>>>>>>>>  ** **
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>  ** **
>>>>>>>>
>>>>>>>>  ** **
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>  ** **
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>  ** **
>>>>>>>>
>>>>>>>>
>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>> immediately and delete it from your system. Thank You.****
>>>>>>>>
>>>>>>>>  ** **
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> ****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> --
>>>>>>>> Olivier Renault
>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>> +44 7500 933 036
>>>>>>>> orenault@hortonworks.com
>>>>>>>> www.hortonworks.com
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
finally it got installed :)

further, when i try to start the namenode, it fails with the following log:

C:\hdp>start_remote_hdp_services.cmd
Master nodes: start DFS-DC
0 Master nodes successfully started.
1 Master nodes failed to start.

PSComputerName      Service             Message             Status
--------------      -------             -------             ------
                                        Connecting to re...


StartStop-HDPServices : Manually start services on Master nodes then retry
full cluster start.  Exiting.
At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
+ if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
    + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,StartStop-HDPServices


C:\hdp>

i tried starting the services manually as well, but no luck.
is anything missing in the configuration ?

regards
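
When the remote-start script reports a master node failure like the one above, the per-service logs under HDP_LOG_DIR (c:\hadoop\logs in the clusterproperties.txt quoted further down this thread) usually contain the actual cause. As an illustrative sketch, a small filter for such a log; the sample lines are invented for illustration, not taken from this thread:

```python
def find_failures(log_text):
    """Return lines that usually explain a failed service start.
    The keyword list is a heuristic, not exhaustive."""
    keywords = ("ERROR", "FATAL", "Exception")
    return [line for line in log_text.splitlines()
            if any(k in line for k in keywords)]

# Invented sample log lines, for illustration only:
sample = (
    "INFO  namenode.NameNode: STARTUP_MSG\n"
    "FATAL namenode.NameNode: java.io.IOException: NameNode is not formatted.\n"
)
print(find_failures(sample))
```

Running this over each log in the HDP log directory narrows the search to the lines worth reading first.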



On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault
<or...@hortonworks.com>wrote:

> You can put the same FQDN as your NameNode for example.
>
> Thanks
> Olivier
> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> i do not have any Hive server host, so what should i put over here?
>> if i comment the property out, then i guess it throws an error.
>> can i put the fqdn of the namenode for the Hive server host?
>>
>> will that really be a working configuration?
>>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Your cluster-properties.txt should look something like :
>>>
>>>
>>> #Log directory
>>> HDP_LOG_DIR=c:\hadoop\logs
>>>
>>> #Data directory
>>> HDP_DATA_DIR=c:\hdp\data
>>>
>>> #Hosts
>>> NAMENODE_HOST=yourmaster.fqdn.com
>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>>
>>>
>>> #Database host
>>> DB_FLAVOR=derby
>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>
>>>
>>> #Hive properties
>>> HIVE_DB_NAME=hive
>>> HIVE_DB_USERNAME=hive
>>> HIVE_DB_PASSWORD=hive
>>>
>>> #Oozie properties
>>> OOZIE_DB_NAME=oozie
>>> OOZIE_DB_USERNAME=oozie
>>> OOZIE_DB_PASSWORD=oozie
>>>
>>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
>>> your server names. For the time being, I suggest that you do not install
>>> HBase or Oozie.
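
Since a missing host property makes the installer fail (as seen elsewhere in this thread), a quick sanity check of clusterproperties.txt before launching the MSI can save an install cycle. A minimal sketch; the required-key list only covers the properties shown above, and the sample values are placeholders:

```python
REQUIRED_KEYS = {
    "HDP_LOG_DIR", "HDP_DATA_DIR", "NAMENODE_HOST", "JOBTRACKER_HOST",
    "HIVE_SERVER_HOST", "OOZIE_SERVER_HOST", "TEMPLETON_HOST", "SLAVE_HOSTS",
}

def missing_keys(text):
    """Parse key=value lines (ignoring blanks and # comments) and
    report which required properties are absent."""
    present = set()
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            present.add(line.split("=", 1)[0].strip())
    return sorted(REQUIRED_KEYS - present)

conf = "NAMENODE_HOST=master.example.com\n#HIVE_SERVER_HOST=commented-out\n"
print(missing_keys(conf))  # HIVE_SERVER_HOST is among the reported missing keys
```

Note how the commented-out HIVE_SERVER_HOST line counts as missing, which matches the installer's behaviour described in this thread.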
>>>
>>> regards,
>>> Olivier
>>>
>>>
>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> ok.. now i made some changes and the installation went ahead,
>>>>>> but it failed on the "HIVE_SERVER_HOST" property declaration.
>>>>>> in the cluster config file, i have commented this property out. if i
>>>>>> uncomment it, then what server address should i give?
>>>>>>
>>>>>> i have only two windows machines set up:
>>>>>> 1: for the namenode, and another for the datanode.
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> i installed the latest java in the c:\java folder and now there is no
>>>>>>> error in the log file related to java.
>>>>>>> however, now it is throwing an error about not having the cluster
>>>>>>> properties file.
>>>>>>> in fact i am running/installing hdp from the location where this
>>>>>>> file exists. still it is throwing the error.
>>>>>>>
>>>>>>> please find the attached:
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>
>>>>>>>> Here's your issue (from the logs you attached earlier):
>>>>>>>>
>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>
>>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>>>> or something similar (in a path with no spaces).
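
The rule of thumb above (no spaces anywhere in JAVA_HOME) is easy to check mechanically. A tiny illustrative sketch; the path values are examples, not taken from the logs:

```python
def java_home_ok(path: str) -> bool:
    """HDP 1.3's batch scripts break on unquoted paths, so a JAVA_HOME
    containing spaces (e.g. under C:\\Program Files) must be rejected."""
    return bool(path) and " " not in path

print(java_home_ok(r"C:\Program Files\Java\jdk1.6.0_31"))  # False
print(java_home_ok(r"C:\java\jdk1.6.0_31"))                # True
```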
>>>>>>>>
>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>> *Subject:* Re: about replication
>>>>>>>> please find the attached.
>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>> as it is not generated
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Could you share the log files ( c:\hdp.log,
>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )
>>>>>>>> as well as your clusterproperties.txt ?
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>>
>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>> thanks. i followed the user manual for deployment and installed
>>>>>>>> all the prerequisites.
>>>>>>>> i modified the command and still the issue persists. please suggest.
>>>>>>>>
>>>>>>>> please refer below:
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>>> You will find the correct syntax as part of the doc.
>>>>>>>>
>>>>>>>> Happy reading
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i referred to the logs and manuals. i modified the clusterproperties
>>>>>>>> file and then double-clicked the msi file.
>>>>>>>> however, it still failed.
>>>>>>>> further, i started the installation on the command line, giving
>>>>>>>> HDP_LAYOUT=clusterproperties file path;
>>>>>>>> the installation went ahead and failed on the .NET Framework 4.0 and
>>>>>>>> VC++ redistributable package dependency.
>>>>>>>>
>>>>>>>> i installed both and started the installation again.
>>>>>>>> it failed again with the following error:
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> when i search for the logs mentioned in the error, i cannot find
>>>>>>>> them.
>>>>>>>> please suggest.
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>>> file. You will find some information on the configuration file as part of
>>>>>>>> the documentation:
>>>>>>>>
>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>
>>>>>>>> You should also make sure to have installed the prerequisites.
>>>>>>>>
>>>>>>>> Thanks
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>> thanks. sorry for the long break. actually got involved in some
>>>>>>>> other priorities.
>>>>>>>> i downloaded the installer, and while installing i got the following
>>>>>>>> error:
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> do i need to make any configuration prior to installation ?
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Here is the link:
>>>>>>>>
>>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>>
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i just followed the instructions to set up the pseudo-distributed
>>>>>>>> setup first, using the url:
>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>
>>>>>>>> i don't think i am running a DN on both machines.
>>>>>>>> please find the attached log.
>>>>>>>>
>>>>>>>> hi olivier,
>>>>>>>>
>>>>>>>> can you please give me the download link?
>>>>>>>> let me try please.
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>>> your DN logs?
>>>>>>>>
>>>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Irfu,
>>>>>>>>
>>>>>>>> If you want to quickly get Hadoop running on the Windows platform,
>>>>>>>> you may want to try our distribution for Windows. You will be able to
>>>>>>>> find the msi on our website.
>>>>>>>>
>>>>>>>> Regards
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> ok. i think i need to change the plan over here.
>>>>>>>> let me create two environments. 1: totally Windows, 2: totally Unix.
>>>>>>>>
>>>>>>>> because, on Windows, anyway i have to try and see how hadoop works;
>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>
>>>>>>>> so, on Windows, here is the setup:
>>>>>>>>
>>>>>>>> namenode : windows 2012 R2
>>>>>>>> datanode : windows 2012 R2
>>>>>>>>
>>>>>>>> now, the exact problems are:
>>>>>>>> 1: the datanode is not getting started
>>>>>>>> 2: replication : if i put any file/folder on any datanode, it
>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>> Windows. Not a good idea. Anyways, could you plz show me your log files?
>>>>>>>> I also need some additional info:
>>>>>>>> - The exact problem which you are facing right now
>>>>>>>> - Your cluster summary (no. of nodes etc.)
>>>>>>>> - Your latest configuration files
>>>>>>>> - Your /etc/hosts file
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> ok. thanks.
>>>>>>>> now, i need to start with the all-windows setup first, as our product
>>>>>>>> will be based on windows.
>>>>>>>> so, now, please tell me how to resolve the issue.
>>>>>>>>
>>>>>>>> the datanode is not starting. please suggest.
>>>>>>>>
>>>>>>>> regards,
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> can i have a setup like this:
>>>>>>>> namenode on linux (the flavour may be RHEL, CentOS, Ubuntu, etc.)
>>>>>>>> and datanodes on a combination of OSes (windows, linux, unix, etc.)?
>>>>>>>>
>>>>>>>> however, my doubt is, as the file systems of both systems
>>>>>>>> (win and linux) are different, can datanodes of these systems not be
>>>>>>>> part of a single cluster? do i have to make the windows cluster and the
>>>>>>>> UNIX cluster separate?
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>> same as Cygwin PIDs, so that may be causing the discrepancy. I don't know
>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>> progress for native Windows support; however, there are no official
>>>>>>>> releases with Windows support yet. It may be easier to get familiar with
>>>>>>>> a release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
>>>>>>>> Linux if you are new to it.
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> here is what i did:
>>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>>> command,
>>>>>>>> then deleted all the pid files for the namenodes and datanodes,
>>>>>>>> and started dfs again with the command ./start-dfs.sh
>>>>>>>>
>>>>>>>> when i run the jps command, it shows:
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>> $ ./jps.exe
>>>>>>>> 4536 Jps
>>>>>>>> 2076 NameNode
>>>>>>>>
>>>>>>>> however, when i open the pid file for the namenode, it is showing
>>>>>>>> the pid as 4560; on the contrary, it should show 2076.
>>>>>>>>
>>>>>>>> please suggest.
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Most likely there is a stale pid file, something like
>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>> the datanode.
>>>>>>>>
>>>>>>>> I haven't read the entire thread, so you may have looked at this
>>>>>>>> already.
>>>>>>>>
>>>>>>>> -Arpit
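
The stale-pid situation above can be detected mechanically: a pid file is stale when no process with the recorded pid exists any more. A POSIX-flavoured sketch only; on Windows, os.kill has different semantics (it can terminate the target process), so this probe must not be used there as-is:

```python
import os

def is_stale(pid_file):
    """True if the pid recorded in the file no longer maps to a live
    process. On POSIX, os.kill(pid, 0) only probes; it sends no signal."""
    try:
        pid = int(open(pid_file).read().strip())
        os.kill(pid, 0)
        return False          # process is alive
    except (ValueError, ProcessLookupError, FileNotFoundError):
        return True           # bad/missing file or dead process
    except PermissionError:
        return False          # process is alive but owned by another user
```

A file reported stale can then simply be deleted before restarting the daemon, which is exactly the manual fix suggested above.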
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> the datanode is trying to connect to the namenode continuously, but
>>>>>>>> fails.
>>>>>>>>
>>>>>>>> when i try to run the jps command, it says:
>>>>>>>>
>>>>>>>> $ ./jps.exe
>>>>>>>> 4584 NameNode
>>>>>>>> 4016 Jps
>>>>>>>>
>>>>>>>> and when i run ./start-dfs.sh, it says:
>>>>>>>>
>>>>>>>> $ ./start-dfs.sh
>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>>>
>>>>>>>> these two logs are contradictory.
>>>>>>>> please find the attached logs.
>>>>>>>>
>>>>>>>> should i attach the conf files as well ?
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Your DN is still not running. Showing me the logs would be helpful.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> i followed the url and did the steps mentioned in it. i have
>>>>>>>> deployed on the windows platform.
>>>>>>>>
>>>>>>>> now, i am able to browse the url http://localhost:50070 (name node);
>>>>>>>> however, i am not able to browse the url http://localhost:50030
>>>>>>>>
>>>>>>>> please refer below:
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>>>> hdfs file system as well.
>>>>>>>> please suggest.
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks. i followed this url:
>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>
>>>>>>>> let me follow the url which you gave for the pseudo-distributed setup,
>>>>>>>> and then i will switch to distributed mode.
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>> Remove *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
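
The advice above translates into fragments like the following. The hostnames and ports are placeholder values typical of a pseudo-distributed (single-machine) Hadoop 1.x setup, not values taken from this thread:

```xml
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```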
>>>>>>>>
>>>>>>>> I would suggest you do a pseudo-distributed setup first in order
>>>>>>>> to get yourself familiar with the process, and then proceed to
>>>>>>>> distributed mode. You can visit this link
>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>
>>>>>>>> HTH
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks, tariq, for the response.
>>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>>> setup.
>>>>>>>> can you please go through them?
>>>>>>>>
>>>>>>>> please let me know.
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> I'm sorry for being unresponsive. Was out of touch for some time
>>>>>>>> because of Ramzan and Eid. Resuming work today.
>>>>>>>>
>>>>>>>> What's the current status?
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>
>>>>>>>> First of all, read the concepts. I hope you will like it:
>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> hey Tariq,
>>>>>>>> i am still stuck ..
>>>>>>>> can you please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> attachment got quarantined
>>>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>>
>>>>>>>> if i run the jps command on the namenode:
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>> $ ./jps.exe
>>>>>>>> 3164 NameNode
>>>>>>>> 1892 Jps
>>>>>>>>
>>>>>>>> same command on the datanode:
>>>>>>>>
>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>> $ ./jps.exe
>>>>>>>> 3848 Jps
>>>>>>>>
>>>>>>>> jps does not list any process for the datanode; however, in the web
>>>>>>>> browser i can see one live data node.
>>>>>>>> please find the attached conf rar file of the namenode.
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> OK. We'll start fresh. Could you plz show me your latest config
>>>>>>>> files?
>>>>>>>>
>>>>>>>> BTW, are your daemons running fine? Use JPS to verify that.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> i have created the dirs "wksp_data" and "wksp_name" on both the
>>>>>>>> datanode and namenode,
>>>>>>>> made the respective changes in the "hdfs-site.xml" file,
>>>>>>>> formatted the namenode,
>>>>>>>> and started the dfs.
>>>>>>>>
>>>>>>>> but still, i am not able to browse the file system through the web
>>>>>>>> browser.
>>>>>>>> please refer below.
>>>>>>>>
>>>>>>>> anything still missing ?
>>>>>>>> please suggest.
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> do these dirs need to be created on all datanodes and namenodes ?
>>>>>>>> further, does hdfs-site.xml need to be updated on both the datanodes
>>>>>>>> and namenodes for these new dirs ?
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Create 2 directories manually corresponding to the values of the
>>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>>> cannot see this data directly on your local/native FS.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> however, i need this to be working in a windows environment, as a
>>>>>>>> project requirement.
>>>>>>>> i will add/work on Linux later.
>>>>>>>>
>>>>>>>> so, now, at this stage, is c:\\wksp the HDFS file system, or do i
>>>>>>>> need to create it from the command line ?
>>>>>>>>
>>>>>>>> please suggest.
>>>>>>>>
>>>>>>>> regards,
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Hello Irfan,
>>>>>>>>
>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>>>
>>>>>>>> The HDFS webUI doesn't provide the ability to create a file or
>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>> operations like create, move, copy etc. are not supported.
>>>>>>>>
>>>>>>>> These values look fine to me.
>>>>>>>>
>>>>>>>> One suggestion though. Try getting a Linux machine (if possible), or
>>>>>>>> at least use a VM. I personally feel that using Hadoop on Windows is
>>>>>>>> always messy.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> when i browse the file system, i get the following;
>>>>>>>> i haven't seen any make-directory option there.
>>>>>>>>
>>>>>>>> do i need to create it from the command line ?
>>>>>>>> further, in the hdfs-site.xml file, i have given the following
>>>>>>>> entries. are they correct ?
>>>>>>>>
>>>>>>>> <property>****
>>>>>>>>
>>>>>>>>   <name>dfs.data.dir</name>****
>>>>>>>>
>>>>>>>>   <value>c:\\wksp</value>****
>>>>>>>>
>>>>>>>>   </property>****
>>>>>>>>
>>>>>>>> <property>****
>>>>>>>>
>>>>>>>>   <name>dfs.name.dir</name>****
>>>>>>>>
>>>>>>>>   <value>c:\\wksp</value>****
>>>>>>>>
>>>>>>>>   </property>****
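
[Editor's note] One issue worth flagging in the snippet above: dfs.name.dir and dfs.data.dir both point to c:\\wksp, but the NameNode's metadata directory and the DataNode's block directory should be kept separate. A minimal sketch of a checker for this (a hypothetical helper, not part of Hadoop):

```python
import xml.etree.ElementTree as ET

# Raw string keeps the literal backslashes from the mail verbatim.
SNIPPET = r"""
<configuration>
  <property><name>dfs.data.dir</name><value>c:\\wksp</value></property>
  <property><name>dfs.name.dir</name><value>c:\\wksp</value></property>
</configuration>
"""

def read_props(xml_text):
    """Parse Hadoop-style <property><name>/<value> pairs into a dict."""
    root = ET.fromstring(xml_text)
    return {p.findtext("name"): p.findtext("value") for p in root.findall("property")}

props = read_props(SNIPPET)
# NameNode metadata and DataNode blocks should live in different directories.
if props.get("dfs.name.dir") == props.get("dfs.data.dir"):
    print("warning: dfs.name.dir and dfs.data.dir point to the same path")
```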
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>
>>>>>>>> *You are wrong at this:*
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>> copyFromLocal: File
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>> copyFromLocal: File
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>
>>>>>>>> Because you wrote both paths as local, and you need not copy
>>>>>>>> hadoop into hdfs... Hadoop is already working.
>>>>>>>>
>>>>>>>> Just check in the browser after starting your single node cluster:
>>>>>>>>
>>>>>>>> localhost:50070
>>>>>>>>
>>>>>>>> then follow the "browse the filesystem" link in it.
>>>>>>>>
>>>>>>>> If there is no directory, then make a directory there.
>>>>>>>> That is your hdfs directory.
>>>>>>>> Then copy any text file there (no need to copy hadoop there),
>>>>>>>> because you are going to do processing on the data in that text
>>>>>>>> file. That is what hadoop is used for; first you need to make that
>>>>>>>> clear in your mind, and then you will do it... otherwise it's not
>>>>>>>> possible.
>>>>>>>>
>>>>>>>> *Try this:*
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>> /hdfs/directory/path
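
[Editor's note] A side note on the failed copyFromLocal calls above: when mixing Cygwin and native Windows tooling, /cygdrive/c/... paths are only meaningful to Cygwin-aware programs, while native tools expect C:\... paths. A sketch of the mapping (illustrative only, not part of Hadoop or Cygwin):

```python
def cygwin_to_windows(path):
    """Map a /cygdrive/<letter>/... path to the native <LETTER>:\\... form."""
    prefix = "/cygdrive/"
    if not path.startswith(prefix):
        return path  # already a native or relative path, leave untouched
    rest = path[len(prefix):]
    drive, _, tail = rest.partition("/")
    return drive.upper() + ":\\" + tail.replace("/", "\\")

print(cygwin_to_windows("/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar"))
```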
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks. yes, i am a newbie.
>>>>>>>> however, i need a windows setup.
>>>>>>>>
>>>>>>>> let me surely refer to the doc and link which you sent, but i need
>>>>>>>> this to be working...
>>>>>>>> can you please help
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>> --
>>>>>>>> MANISH DUNANI
>>>>>>>> -THANX
>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>> manishd207@gmail.com
>>>>>>>>
>>>>>>>> --
>>>>>>>> Regards
>>>>>>>> *Manish Dunani*
>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>> *skype id* : manish.dunani
>>>>>>>>
>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> Olivier Renault
>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>> +44 7500 933 036
>>>>>>>> orenault@hortonworks.com
>>>>>>>> www.hortonworks.com
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
finally it got installed :)

further, when i try to start the namenode, it fails with the following log:
C:\hdp>start_remote_hdp_services.cmd
Master nodes: start DFS-DC
0 Master nodes successfully started.
1 Master nodes failed to start.

PSComputerName      Service             Message             Status
--------------      -------             -------             ------
                                        Connecting to re...


StartStop-HDPServices : Manually start services on Master nodes then retry
full
 cluster start.  Exiting.
At C:\hdp\manage_remote_hdp_services.ps1:187 char:47
+ if ($mode -eq "start") { StartStop-HDPservices <<<< ($mode) }
    + CategoryInfo          : NotSpecified: (:) [Write-Error],
WriteErrorExcep
   tion
    + FullyQualifiedErrorId :
Microsoft.PowerShell.Commands.WriteErrorExceptio
   n,StartStop-HDPServices


C:\hdp>

i tried starting it manually as well, but no luck.
is anything missing in the configuration?

regards



On Wed, Sep 11, 2013 at 3:16 PM, Olivier Renault
<or...@hortonworks.com> wrote:

> You can put the same FQDN as your NameNode for example.
>
> Thanks
> Olivier
> On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> i do not have any HIVE server host; what should i put over here?
>> if i comment it out, then i guess it throws an error about commenting it.
>> can i put the fqdn of the namenode for the HIVE server host?
>>
>> will it be a really working configuration?
>>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Your cluster-properties.txt should look something like :
>>>
>>>
>>> #Log directory
>>> HDP_LOG_DIR=c:\hadoop\logs
>>>
>>> #Data directory
>>> HDP_DATA_DIR=c:\hdp\data
>>>
>>> #Hosts
>>> NAMENODE_HOST=yourmaster.fqdn.com
>>> JOBTRACKER_HOST=yourmaster.fqdn.com
>>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>>> TEMPLETON_HOST=yourmaster.fqdn.com
>>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>>
>>>
>>> #Database host
>>> DB_FLAVOR=derby
>>> DB_HOSTNAME=yourmaster.fqdn.com
>>>
>>>
>>> #Hive properties
>>> HIVE_DB_NAME=hive
>>> HIVE_DB_USERNAME=hive
>>> HIVE_DB_PASSWORD=hive
>>>
>>> #Oozie properties
>>> OOZIE_DB_NAME=oozie
>>> OOZIE_DB_USERNAME=oozie
>>> OOZIE_DB_PASSWORD=oozie
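>>>
>>> [Editor's note] Since the installer later in this thread aborts when keys in this file are missing or misspelled, a quick pre-flight check of the properties file can save a failed run. A hedged sketch (the required-key list is taken from the template above, not from any official HDP validator):

```python
REQUIRED_KEYS = {
    "HDP_LOG_DIR", "HDP_DATA_DIR", "NAMENODE_HOST", "JOBTRACKER_HOST",
    "HIVE_SERVER_HOST", "OOZIE_SERVER_HOST", "TEMPLETON_HOST", "SLAVE_HOSTS",
}

def parse_properties(text):
    """Parse KEY=value lines, skipping blanks and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

def missing_keys(text):
    """Return the required keys that the file does not define."""
    return sorted(REQUIRED_KEYS - parse_properties(text).keys())

sample = "NAMENODE_HOST=master.example.com\nSLAVE_HOSTS=master.example.com"
print(missing_keys(sample))
```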
>>>
>>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
>>> your server names. For the time being, I suggest that you do not install
>>> HBase or Oozie.
>>>
>>> regards,
>>> Olivier
>>>
>>>
>>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> ok.. now i made some changes and the installation went ahead,
>>>>>> but it failed on the "HIVE_SERVER_HOST" property declaration.
>>>>>> in the cluster config file, i have commented out this property. if i
>>>>>> uncomment it, then what server address should i give?
>>>>>>
>>>>>> i have only two windows machines set up:
>>>>>> 1: for the namenode, and another for the datanode
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> i installed the latest java in the c:\java folder and now there is no
>>>>>>> error in the log file related to java.
>>>>>>> however, now it is throwing an error about not having the cluster
>>>>>>> properties file.
>>>>>>> in fact, i am running/installing hdp from the location where this
>>>>>>> file exists. still it is throwing the error.
>>>>>>>
>>>>>>> please find the attached
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>>> ravimu@microsoft.com> wrote:
>>>>>>>
>>>>>>>> Here's your issue (from the logs you attached earlier):
>>>>>>>>
>>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>>
>>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>>>> or something similar (in a path with no spaces).
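
[Editor's note] The "was unexpected at this time" message is the classic symptom of an unquoted path containing spaces inside a Windows batch script. A small sketch of the pre-check implied here (illustrative, not part of HDP):

```python
def java_home_ok(java_home):
    """HDP 1.3's batch scripts break on a JAVA_HOME containing spaces."""
    return bool(java_home) and " " not in java_home

# e.g. the default Program Files location fails, a space-free path passes
print(java_home_ok(r"C:\Program Files\Java\jdk1.6.0_31"))  # False
print(java_home_ok(r"C:\java\jdk1.6.0_31"))                # True
```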
>>>>>>>>
>>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>>> *To:* user@hadoop.apache.org
>>>>>>>> *Subject:* Re: about replication****
>>>>>>>>
>>>>>>>> ** **
>>>>>>>>
>>>>>>>> please find the attached.
>>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>>> as it is not generated.
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Could you share the log files ( c:\hdp.log,
>>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>>>>> well as your clusterproperties.txt ?
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>>
>>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:****
>>>>>>>>
>>>>>>>> thanks. i followed the user manual for deployment and installed
>>>>>>>> all the prerequisites.
>>>>>>>> i modified the command and still the issue persists. please suggest.
>>>>>>>>
>>>>>>>> please refer below:
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>>
>>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>>> You will find the correct syntax as part of the doc.
>>>>>>>>
>>>>>>>> Happy reading
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i referred to the logs and manuals. i modified the clusterproperties
>>>>>>>> file and then double-clicked on the msi file.
>>>>>>>> however, it still failed.
>>>>>>>> further, i started the installation on the command line by giving
>>>>>>>> HDP_LAYOUT=clusterproperties file path;
>>>>>>>> the installation went ahead and it failed on the .NET framework 4.0 and
>>>>>>>> VC++ redistributable package dependency.
>>>>>>>>
>>>>>>>> i installed both and started the installation again.
>>>>>>>> it failed again with the following error:
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> when i search for the logs mentioned in the error, i never find
>>>>>>>> them.
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>>> file. You will find some information on the configuration file as part of
>>>>>>>> the documentation:
>>>>>>>>
>>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>>
>>>>>>>> You should also make sure to have installed the prerequisites.
>>>>>>>>
>>>>>>>> Thanks
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>> thanks. sorry for the long break. actually got involved in some
>>>>>>>> other priorities.
>>>>>>>> i downloaded the installer, and while installing i got the following
>>>>>>>> error:
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> do i need to make any configuration prior to installation?
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Here is the link
>>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>>
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i just followed the instructions to set up the pseudo-distributed
>>>>>>>> mode first, using this url:
>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>
>>>>>>>> i don't think i am running a DN on both machines.
>>>>>>>> please find the attached log.
>>>>>>>>
>>>>>>>> hi olivier,
>>>>>>>> can you please give me the download link?
>>>>>>>> let me try it please.
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Are you running a DN on both the machines? Could you please show me
>>>>>>>> your DN logs?
>>>>>>>>
>>>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Irfu,
>>>>>>>>
>>>>>>>> If you want to quickly get Hadoop running on the windows platform, you
>>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>>> msi on our website.
>>>>>>>>
>>>>>>>> Regards
>>>>>>>> Olivier
>>>>>>>>
>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> ok. i think i need to change the plan over here.
>>>>>>>> let me create two environments. 1: totally windows, 2: totally Unix.
>>>>>>>>
>>>>>>>> because, on windows, anyway i have to try and see how hadoop works;
>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>
>>>>>>>> so, on windows, here is the setup:
>>>>>>>>
>>>>>>>> namenode : windows 2012 R2
>>>>>>>> datanode : windows 2012 R2
>>>>>>>>
>>>>>>>> now, the exact problem is:
>>>>>>>> 1: the datanode is not getting started
>>>>>>>> 2: replication: if i put any file/folder on any datanode, it
>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>> also need some additional info:
>>>>>>>> -The exact problem which you are facing right now
>>>>>>>> -Your cluster summary (no. of nodes etc.)
>>>>>>>> -Your latest configuration files
>>>>>>>> -Your /etc/hosts file
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> ok. thanks.
>>>>>>>> now, i need to start with an all-windows setup first, as our product
>>>>>>>> will be based on windows.
>>>>>>>> so, now, please tell me how to resolve the issue:
>>>>>>>> the datanode is not starting. please suggest.
>>>>>>>>
>>>>>>>> regards,
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> It is possible. Theoretically, Hadoop doesn't stop you from doing
>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> can i have a setup like this:
>>>>>>>> the namenode will be on linux (the flavour may be RHEL, CentOS,
>>>>>>>> Ubuntu etc.)
>>>>>>>> and the datanodes are a combination of any OS (windows, linux, unix,
>>>>>>>> etc.)
>>>>>>>>
>>>>>>>> however, my doubt is, as the file systems of both the systems
>>>>>>>> (win and linux) are different, datanodes of these systems can not be
>>>>>>>> part of a single cluster. do i have to make the windows cluster and the
>>>>>>>> UNIX cluster separate?
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>> same as Cygwin PIDs, so that may be causing the discrepancy. I don't know
>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>> progress for native Windows support; however, there are no official releases
>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
>>>>>>>> Linux if you are new to it.
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks.
>>>>>>>> here is what i did:
>>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>>> command,
>>>>>>>> then deleted all pid files for namenodes and datanodes,
>>>>>>>>
>>>>>>>> and started dfs again with the command "./start-dfs.sh"
>>>>>>>>
>>>>>>>> when i ran the "jps" command, it shows:
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>> $ ./jps.exe
>>>>>>>> 4536 Jps
>>>>>>>> 2076 NameNode
>>>>>>>>
>>>>>>>> however, when i open the pid file for the namenode, it shows the pid
>>>>>>>> as 4560; on the contrary, it should show 2076.
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>> the datanode.
>>>>>>>>
>>>>>>>> I haven't read the entire thread, so you may have looked at this
>>>>>>>> already.
>>>>>>>>
>>>>>>>> -Arpit
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> the datanode is trying to connect to the namenode continuously but
>>>>>>>> fails.
>>>>>>>>
>>>>>>>> when i try to run the "jps" command, it says:
>>>>>>>> $ ./jps.exe
>>>>>>>> 4584 NameNode
>>>>>>>> 4016 Jps
>>>>>>>>
>>>>>>>> and when i ran "./start-dfs.sh", it says:
>>>>>>>>
>>>>>>>> $ ./start-dfs.sh
>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>>>
>>>>>>>> both these logs are contradictory.
>>>>>>>> please find the attached logs.
>>>>>>>>
>>>>>>>> should i attach the conf files as well?
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>> helpful.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> i followed the url and did the steps mentioned in it. i have
>>>>>>>> deployed on the windows platform.
>>>>>>>>
>>>>>>>> now, i am able to browse the url http://localhost:50070 (name node);
>>>>>>>> however, i am not able to browse the url http://localhost:50030
>>>>>>>>
>>>>>>>> please refer below:
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>>>> hdfs file system as well.
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> thanks. i followed this url:
>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>
>>>>>>>> let me follow the url which you gave for the pseudo-distributed setup
>>>>>>>> and then i will switch to distributed mode.
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>> Remove *mapred.job.tracker* as well; it is required in
>>>>>>>> *mapred-site.xml*.
>>>>>>>>
>>>>>>>> I would suggest you do a pseudo-distributed setup first in order
>>>>>>>> to get yourself familiar with the process and then proceed to the
>>>>>>>> distributed mode. You can visit this link
>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>
>>>>>>>> HTH
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  thanks tariq for response.
>>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>>> setup .
>>>>>>>> can you please go through that ?
>>>>>>>>
>>>>>>>> please let me know
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  I'm sorry for being unresponsive. Was out of touch for sometime
>>>>>>>> because of ramzan and eid. Resuming work today.
>>>>>>>>
>>>>>>>> What's the current status?
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>
>>>>>>>>  First of all read the concepts ..I hope you will like it..
>>>>>>>>
>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>
>>>>>>>>
>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  hey Tariq,
>>>>>>>> i am still stuck ..
>>>>>>>> can you please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  please suggest
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  attachment got quarantined
>>>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  thanks.
>>>>>>>>
>>>>>>>> if i run the jps command on namenode :
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>> $ ./jps.exe
>>>>>>>> 3164 NameNode
>>>>>>>> 1892 Jps
>>>>>>>>
>>>>>>>> same command on datanode :
>>>>>>>>
>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>> $ ./jps.exe
>>>>>>>> 3848 Jps
>>>>>>>>
>>>>>>>> jps does not list any process for the datanode. however, on web browser
>>>>>>>> i can see one live data node
>>>>>>>> please find the attached conf rar file of namenode
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  OK. we'll start fresh. Could you plz show me your latest config
>>>>>>>> files?
>>>>>>>>
>>>>>>>> BTW, are your daemons running fine? Use jps to verify that.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  i have created these dir "wksp_data" and "wksp_name" on both
>>>>>>>> datanode and namenode
>>>>>>>> made the respective changes in "hdfs-site.xml" file
>>>>>>>> formatted the namenode
>>>>>>>> started the dfs
>>>>>>>>
>>>>>>>> but still, not able to browse the file system through web browser
>>>>>>>> please refer below
>>>>>>>>
>>>>>>>> anything still missing ?
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  these dirs need to be created on all datanodes and namenodes ?
>>>>>>>> further,  hdfs-site.xml needs to be updated on both datanodes and
>>>>>>>> namenodes for these new dirs?
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  Create 2 directories manually corresponding to the values of the
>>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>>> cannot see this data directly on your local/native FS.
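As a sketch of the two steps above (on a Unix-like shell; the /tmp/hdfs_demo paths are placeholders for whatever you point dfs.name.dir and dfs.data.dir at):

```shell
# Placeholder locations for the two HDFS directories
NAME_DIR=/tmp/hdfs_demo/name   # would be the dfs.name.dir value
DATA_DIR=/tmp/hdfs_demo/data   # would be the dfs.data.dir value

# Create them and set permissions to 755 as suggested
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
```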
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  thanks.
>>>>>>>> however, i need this to be working on windows environment as a
>>>>>>>> project requirement.
>>>>>>>> i will add/work on Linux later
>>>>>>>>
>>>>>>>> so, now , at this stage , is c:\\wksp the HDFS file system OR do i
>>>>>>>> need to create it from command line ?
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> regards,
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  Hello Irfan,
>>>>>>>>
>>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>>>
>>>>>>>> HDFS webUI doesn't provide us the ability to create a file or
>>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>>> operations like create, move, copy etc. are not supported.
>>>>>>>>
>>>>>>>> These values look fine to me.
>>>>>>>>
>>>>>>>> One suggestion though. Try getting a Linux machine (if possible). Or
>>>>>>>> at least use a VM. I personally feel that using Hadoop on windows is always
>>>>>>>> messy.
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  thanks.
>>>>>>>> when i browse the file system , i am getting following :
>>>>>>>> i haven't seen any make directory option there
>>>>>>>>
>>>>>>>> i need to create it from command line ?
>>>>>>>> further, in the hdfs-site.xml file , i have given following
>>>>>>>> entries. are they correct ?
>>>>>>>>
>>>>>>>> <property>
>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>> </property>
>>>>>>>> <property>
>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>> </property>
>>>>>>>>
>>>>>>>> please suggest
>>>>>>>>
>>>>>>>> [image: Inline image 1]
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <
>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>
>>>>>>>>  *You are wrong at this:*
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>> copyFromLocal: File
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>> copyFromLocal: File
>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>
>>>>>>>> Because you wrote both paths as local ones, and you do not need to copy
>>>>>>>> hadoop into hdfs... Hadoop is already working..
>>>>>>>>
>>>>>>>> Just check out in browser after starting ur single node cluster :
>>>>>>>>
>>>>>>>> localhost:50070
>>>>>>>>
>>>>>>>> then go for the browse the filesystem link in it..
>>>>>>>>
>>>>>>>> If there is no directory then make a directory there.
>>>>>>>> That is your hdfs directory.
>>>>>>>> Then copy any text file there (no need to copy hadoop
>>>>>>>> there), because u are going to do processing on that data in the text
>>>>>>>> file. That's what hadoop is used for; first u need to make it clear in ur
>>>>>>>> mind. Then and then u will do it... otherwise not possible..
>>>>>>>>
>>>>>>>> *Try this:*
>>>>>>>>
>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>> $ bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>>> /hdfs/directory/path
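Spelled out as a transcript (the sample file name and the /wksp target here are hypothetical, and the commands need a running cluster):

```
$ bin/hadoop dfs -mkdir /wksp
$ bin/hadoop dfs -copyFromLocal /cygdrive/c/data/sample.txt /wksp
$ bin/hadoop dfs -ls /wksp
```

The source path must be a file that exists on the local FS; the destination is a path inside HDFS.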
>>>>>>>>
>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>  thanks. yes , i am newbie.
>>>>>>>> however, i need windows setup.
>>>>>>>>
>>>>>>>> let me surely refer the doc and link which u sent but i need this
>>>>>>>> to be working ...
>>>>>>>> can you please help
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>> --
>>>>>>>> MANISH DUNANI
>>>>>>>> -THANX
>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>> manishd207@gmail.com
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> --
>>>>>>>> Regards
>>>>>>>> *Manish Dunani*
>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>> *skype id* : manish.dunani
>>>>>>>>
>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>>> you have received this communication in error, please contact the sender
>>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>> --
>>>>>>>> Olivier Renault
>>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>>> +44 7500 933 036
>>>>>>>> orenault@hortonworks.com
>>>>>>>> www.hortonworks.com
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>
>>
>

Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
You can put the same FQDN as your NameNode for example.

Thanks
Olivier
On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:

> i do not have any HIVE server host, so what should i put over here? if i
> comment it out, then i guess it throws an error about that
> can i put the fqdn of namenode for HIVE server host ?
>
> will it be a really working configuration?
>
> please suggest
>
> regards
> irfan
>
>
>
> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Your cluster-properties.txt should look something like :
>>
>>
>> #Log directory
>> HDP_LOG_DIR=c:\hadoop\logs
>>
>> #Data directory
>> HDP_DATA_DIR=c:\hdp\data
>>
>> #Hosts
>> NAMENODE_HOST=yourmaster.fqdn.com
>> JOBTRACKER_HOST=yourmaster.fqdn.com
>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>> TEMPLETON_HOST=yourmaster.fqdn.com
>> SLAVE_HOSTS=yourmaster.fqdn.com,youslave.fqdn.com
>>
>>
>> #Database host
>> DB_FLAVOR=derby
>> DB_HOSTNAME=yourmaster.fqdn.com
>>
>>
>> #Hive properties
>> HIVE_DB_NAME=hive
>> HIVE_DB_USERNAME=hive
>> HIVE_DB_PASSWORD=hive
>>
>> #Oozie properties
>> OOZIE_DB_NAME=oozie
>> OOZIE_DB_USERNAME=oozie
>> OOZIE_DB_PASSWORD=oozie
>>
>> You will need to replace, yourmaster.fqdn.com, yourslave.fqdn.com by
>> your servers name. For the time being, I suggest that you do not install
>> HBase, Oozie,
>>
>> regards,
>> Olivier
>>
>>
>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> ok.. now i made some changes and installation went ahead
>>>>> but failed at the "HIVE_SERVER_HOST" property declaration
>>>>> in cluster config file, i have commented this property. if i uncomment
>>>>> , then what server address should i give ???
>>>>>
>>>>> i have only two windows machines setup.
>>>>> 1: for namenode and another for datanode
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> thanks.
>>>>>> i installed the latest java in c:\java folder and now no error in log
>>>>>> file related to java
>>>>>> however, now it is throwing an error about not having the cluster
>>>>>> properties file.
>>>>>> in fact i am running/installing hdp from the location where this file
>>>>>> exists . still it is throwing the error
>>>>>>
>>>>>> please find the attached
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>> ravimu@microsoft.com> wrote:
>>>>>>
>>>>>>>  Here's your issue (from the logs you attached earlier):
>>>>>>>
>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>
>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>>> or something similar (in a path with no spaces).
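An illustrative check of this failure mode (the path below is just the default install location used as an example; any path with a space trips the installer the same way):

```shell
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'
case "$JAVA_HOME" in
  *' '*) MSG='JAVA_HOME contains a space; reinstall Java under a path like C:\java' ;;
  *)     MSG='JAVA_HOME looks fine' ;;
esac
echo "$MSG"
```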
>>>>>>>
>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>> *To:* user@hadoop.apache.org
>>>>>>> *Subject:* Re: about replication
>>>>>>>
>>>>>>> please find the attached.
>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>> as it is not generated
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>>  Could you share the log files ( c:\hdp.log,
>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>>>> well as your clusterproperties.txt ?
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Olivier
>>>>>>>
>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>  thanks. i followed the user manual for deployment and installed
>>>>>>> all pre-requisites
>>>>>>> i modified the command and still the issue persists. please suggest
>>>>>>>
>>>>>>> please refer below
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>> You will find the correct syntax as part of the doc.
>>>>>>>
>>>>>>> Happy reading
>>>>>>> Olivier
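The full invocation is roughly of the following shape (file names, paths, and property values here are illustrative; check the HDP for Windows install guide for the exact flags):

```
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt" HDP_DIR="C:\hdp\hadoop" DESTROY_DATA="no"
```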
>>>>>>>
>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>  thanks.
>>>>>>> i referred the logs and manuals. i modified the clusterproperties
>>>>>>> file and then double clicked on the msi file
>>>>>>> however, it still failed.
>>>>>>> further i started the installation on command line by giving
>>>>>>> HDP_LAYOUT=clusterproperties file path,
>>>>>>> installation went ahead and it failed for .NET framework 4.0 and
>>>>>>> VC++ redistributable package dependency
>>>>>>>
>>>>>>> i installed both and started the installation again.
>>>>>>> failed again with following error
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> when i searched for the logs mentioned in the error , i never found
>>>>>>> them
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>> file. You will find some information on the configuration file as part of
>>>>>>> the documentation.
>>>>>>>
>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>
>>>>>>> You should make sure to have also installed the pre-requisites.
>>>>>>>
>>>>>>> Thanks
>>>>>>> Olivier
>>>>>>>
>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>>>> other priorities
>>>>>>> i downloaded the installer and while installing i got following
>>>>>>> error
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> do i need to make any configuration prior to installation ??
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>> Here is the link
>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>> Olivier
>>>>>>>
>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>  thanks.
>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>> setup first using the url :
>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>
>>>>>>> i don't think i am running DN on both machines
>>>>>>> please find the attached log
>>>>>>>
>>>>>>> hi olivier
>>>>>>>
>>>>>>> can you please give me download link ?
>>>>>>> let me try please
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>  Are you running DN on both the machines? Could you please show me
>>>>>>> your DN logs?
>>>>>>>
>>>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>> Irfu,
>>>>>>>
>>>>>>> If you want to quickly get Hadoop running on a windows platform, you
>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>> msi on our website.
>>>>>>>
>>>>>>> Regards
>>>>>>> Olivier
>>>>>>>
>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>  thanks.
>>>>>>> ok. i think i need to change the plan over here
>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>
>>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>
>>>>>>> so, on windows , here is the setup:
>>>>>>>
>>>>>>> namenode : windows 2012 R2
>>>>>>> datanode : windows 2012 R2
>>>>>>>
>>>>>>> now, the exact problem is :
>>>>>>> 1: datanode is not getting started
>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>> should get replicated to all other available datanodes
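On the replication point: HDFS replicates blocks automatically; the number of copies is controlled by the dfs.replication property in hdfs-site.xml (default 3). With only two datanodes, a value of 2 (illustrative fragment below) keeps every block on both nodes:

```xml
<!-- hdfs-site.xml fragment (sketch) -->
<property>
  <name>dfs.replication</name>
  <value>2</value>
</property>
```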
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>  Seriously?? You are planning to develop something using Hadoop on
>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>> also need some additional info :
>>>>>>> -The exact problem which you are facing right now
>>>>>>> -Your cluster summary (no. of nodes etc)
>>>>>>> -Your latest configuration files
>>>>>>> -Your /etc/hosts file
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>  ok. thanks
>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>> will be based on windows
>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>
>>>>>>> datanode is not starting . please suggest
>>>>>>>
>>>>>>> regards,
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>  It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>> that. But it is not a very wise setup.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>  please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>  thanks.****
>>>>>>>
>>>>>>> can i have setup like this :****
>>>>>>>
>>>>>>>  namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)****
>>>>>>>
>>>>>>> and datanodes are a combination of any OS (windows, linux, unix
>>>>>>> etc)****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> however, my doubt is: as the file systems of both the systems (win
>>>>>>> and linux) are different, datanodes of these systems cannot be part of a
>>>>>>> single cluster. do i have to make the windows cluster and the UNIX cluster
>>>>>>> separate?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>>
>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>>>>> as Cygwin PIDs so that may be causing the discrepancy. I don't know how
>>>>>>> well Hadoop works in Cygwin as I have never tried it. Work is in progress
>>>>>>> for native Windows support however there are no official releases with
>>>>>>> Windows support yet. It may be easier to get familiar with a release
>>>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>> ****
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks ****
>>>>>>>
>>>>>>> here is what i did .****
>>>>>>>
>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>> command ****
>>>>>>>
>>>>>>> then deleted all pid files for namenodes and datanodes ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> started dfs again with command : "./start-dfs.sh"****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> when i ran the "Jps" command . it shows****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>
>>>>>>> $ ./jps.exe****
>>>>>>>
>>>>>>> 4536 Jps****
>>>>>>>
>>>>>>> 2076 NameNode****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> however, when i open the pid file for the namenode, it shows the pid
>>>>>>> as 4560. on the contrary, it should show 2076****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>>
>>>>>>>  Most likely there is a stale pid file. Something like
>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>> the datanode.
>>>>>>>
>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>> already.
>>>>>>>
>>>>>>> -Arpit****
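
The stale-pid cleanup Arpit describes can be sketched as below. This uses a throwaway directory and a deliberately dead pid in place of the real setup; on Hadoop 1.x the actual files live under HADOOP_PID_DIR (by default /tmp) with names like hadoop-<user>-datanode.pid:

```shell
# Throwaway directory standing in for HADOOP_PID_DIR (/tmp by default).
PID_DIR=$(mktemp -d)

# Simulate a stale pid file: record the pid of a process that has already exited.
sh -c 'exit 0' & STALE=$!
wait "$STALE" 2>/dev/null
echo "$STALE" > "$PID_DIR/hadoop-demo-datanode.pid"

for f in "$PID_DIR"/hadoop-*-datanode.pid; do
  [ -e "$f" ] || continue
  pid=$(cat "$f")
  # kill -0 probes for existence without signalling; if no such
  # process exists, the pid file is stale and safe to delete.
  if ! kill -0 "$pid" 2>/dev/null; then
    echo "removing stale pid file: $f (pid $pid)"
    rm "$f"
  fi
done
```

After removing the stale file, restarting the datanode (e.g. via start-dfs.sh) should no longer report "datanode running as process N. Stop it first."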
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  datanode is trying to connect to namenode continuously but fails **
>>>>>>> **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> when i try to run "jps" command it says :****
>>>>>>>
>>>>>>> $ ./jps.exe****
>>>>>>>
>>>>>>> 4584 NameNode****
>>>>>>>
>>>>>>> 4016 Jps****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> and when i ran the "./start-dfs.sh" then it says :****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> $ ./start-dfs.sh****
>>>>>>>
>>>>>>> namenode running as process 3544. Stop it first.****
>>>>>>>
>>>>>>> DFS-1: datanode running as process 4076. Stop it first.****
>>>>>>>
>>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>> ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> both these logs are contradictory ****
>>>>>>>
>>>>>>> please find the attached logs ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> should i attach the conf files as well ?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>>  ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  Your DN is still not running. Showing me the logs would be helpful.
>>>>>>> ****
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  i followed the url and did the steps mention in that. i have
>>>>>>> deployed on the windows platform****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node )**
>>>>>>> **
>>>>>>>
>>>>>>> however, not able to browse url : http://localhost:50030****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please refer below****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> [image: Inline image 1]****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>>> hdfs file system as well ****
>>>>>>>
>>>>>>> please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks. i followed this url :
>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>> ****
>>>>>>>
>>>>>>> let me follow the url which you gave for pseudo distributed setup
>>>>>>> and then will switch to distributed mode****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  You are welcome. Which link have you followed for the
>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>> Move *mapred.job.tracker* out as well; it belongs in
>>>>>>> *mapred-site.xml*.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> I would suggest you to do a pseudo distributed setup first in order
>>>>>>> to get yourself familiar with the process and then proceed to the
>>>>>>> distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>if you need some help. Let me know if you face any issue.
>>>>>>> ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> HTH****
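
The property placement Tariq recommends amounts to the layout below. This is a minimal sketch: the hdfs://localhost:9000 and localhost:9001 values are the usual single-node example values, not taken from Irfan's actual setup, and the files are written to a throwaway directory rather than a real conf/ directory:

```shell
CONF=$(mktemp -d)

# fs.default.name belongs in core-site.xml ...
cat > "$CONF/core-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

# ... and mapred.job.tracker in mapred-site.xml, not in hdfs-site.xml.
cat > "$CONF/mapred-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF

grep -q fs.default.name   "$CONF/core-site.xml"   && echo "core-site ok"
grep -q mapred.job.tracker "$CONF/mapred-site.xml" && echo "mapred-site ok"
```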
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks tariq for response. ****
>>>>>>>
>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>> setup . ****
>>>>>>>
>>>>>>> can you please go through that ?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please let me know ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  I'm sorry for being unresponsive. Was out of touch for some time
>>>>>>> because of ramzan and eid. Resuming work today.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> What's the current status?****
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  First of all read the concepts ..I hope you will like it..****
>>>>>>>
>>>>>>>
>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf***
>>>>>>> *
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  hey Tariq,****
>>>>>>>
>>>>>>> i am still stuck .. ****
>>>>>>>
>>>>>>> can you please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  attachment got quarantined ****
>>>>>>>
>>>>>>> resending in txt format. please rename it to conf.rar ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> if i run the jps command on namenode :****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>
>>>>>>> $ ./jps.exe****
>>>>>>>
>>>>>>> 3164 NameNode****
>>>>>>>
>>>>>>> 1892 Jps****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> same command on datanode :****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>
>>>>>>> $ ./jps.exe****
>>>>>>>
>>>>>>> 3848 Jps****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> jps does not list any process for datanode. however, on web browser
>>>>>>> i can see one live data node ****
>>>>>>>
>>>>>>> please find the attached conf rar file of namenode ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  OK. we'll start fresh. Could you plz show me your latest config
>>>>>>> files?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> BTW, are your daemons running fine? Use jps to verify that.****
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
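
The jps check can be scripted. A sketch below uses canned output that mirrors the listings in this thread (NameNode up, DataNode missing); on a live node you would replace the OUT assignment with `OUT=$(jps)`:

```shell
# Canned sample; on a real node use:  OUT=$(jps)
OUT="2076 NameNode
4536 Jps"

# Each jps line is "<pid> <ClassName>", so match on the trailing name.
for daemon in NameNode DataNode; do
  if echo "$OUT" | grep -q " $daemon\$"; then
    echo "$daemon running"
  else
    echo "$daemon NOT running"
  fi
done
```

With the sample output above this prints "NameNode running" and "DataNode NOT running", which is exactly the situation being debugged in this thread.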
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  i have created these dir "wksp_data" and "wksp_name" on both
>>>>>>> datanode and namenode ****
>>>>>>>
>>>>>>> made the respective changes in "hdfs-site.xml" file ****
>>>>>>>
>>>>>>> formatted the namenode ****
>>>>>>>
>>>>>>> started the dfs ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> but still, not able to browse the file system through web browser **
>>>>>>> **
>>>>>>>
>>>>>>> please refer below ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> anything still missing ?****
>>>>>>>
>>>>>>> please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> [image: Inline image 1]****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  these dir needs to be created on all datanodes and namenodes ?****
>>>>>>>
>>>>>>> further,  hdfs-site.xml needs to be updated on both datanodes and
>>>>>>> namenodes for these new dir?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  Create 2 directories manually corresponding to the values of
>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>> cannot see this data directly on your local/native FS.****
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
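
The directory setup Tariq describes can be sketched as below; the wksp_name / wksp_data paths are illustrative stand-ins for whatever dfs.name.dir and dfs.data.dir point at, under a throwaway base directory:

```shell
# Illustrative locations for dfs.name.dir and dfs.data.dir.
BASE=$(mktemp -d)
NAME_DIR="$BASE/wksp_name"
DATA_DIR="$BASE/wksp_data"

# Create the two directories manually and give them 755, as suggested.
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"

ls -ld "$NAME_DIR" "$DATA_DIR"
```

Note the two properties should point at two distinct directories: metadata goes under dfs.name.dir and block data under dfs.data.dir.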
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks. ****
>>>>>>>
>>>>>>> however, i need this to be working on windows environment as project
>>>>>>> requirement.****
>>>>>>>
>>>>>>> i will add/work on Linux later ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> so, now , at this stage , c:\\wksp is the HDFS file system OR do i
>>>>>>> need to create it from command line ?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please suggest****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards,****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  Hello Irfan,****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> HDFS webUI doesn't provide us the ability to create file or
>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>> operation like create, move, copy etc are not supported.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> These values look fine to me.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> One suggestion though. Try getting a Linux machine(if possible). Or
>>>>>>> at least use a VM. I personally feel that using Hadoop on windows is always
>>>>>>> messy.****
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks.****
>>>>>>>
>>>>>>> when i browse the file system , i am getting following :****
>>>>>>>
>>>>>>> i haven't seen any make directory option there ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> i need to create it from command line ?****
>>>>>>>
>>>>>>> further, in the hdfs-site.xml file , i have given following entries.
>>>>>>> are they correct ? ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> <property>****
>>>>>>>
>>>>>>>   <name>dfs.data.dir</name>****
>>>>>>>
>>>>>>>   <value>c:\\wksp</value>****
>>>>>>>
>>>>>>>   </property>****
>>>>>>>
>>>>>>> <property>****
>>>>>>>
>>>>>>>   <name>dfs.name.dir</name>****
>>>>>>>
>>>>>>>   <value>c:\\wksp</value>****
>>>>>>>
>>>>>>>   </property>****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> [image: Inline image 1]****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  *You are wrong at this:*****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>>>>>>
>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp****
>>>>>>>
>>>>>>> copyFromLocal: File
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>> ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>>>>>>
>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp***
>>>>>>> *
>>>>>>>
>>>>>>> copyFromLocal: File
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>> ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Because you gave local paths for both arguments. Also, you need not copy
>>>>>>> hadoop into hdfs; Hadoop is already working.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Just check out in browser by after starting ur single node cluster :
>>>>>>> ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> localhost:50070****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> then go for browse the filesystem link in it..****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> If there is no directory then make directory there.****
>>>>>>>
>>>>>>> That is your hdfs directory.****
>>>>>>>
>>>>>>> Then copy any text file there (no need to copy hadoop there), because
>>>>>>> you are going to process the data in that text file. that is what hadoop is
>>>>>>> used for; first make that clear in your mind, then you will be able to do
>>>>>>> it... otherwise it is not possible..****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> *Try this: *****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2****
>>>>>>>
>>>>>>> $ .bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>> /hdfs/directory/path****
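
The "does not exist" errors above come from a wrong *local* path, not from HDFS. A quick guard before invoking copyFromLocal can be sketched as below (the sample file is hypothetical, created just for the demonstration):

```shell
# Hypothetical local file to copy into HDFS.
SRC=$(mktemp)
echo "sample data" > "$SRC"

# copyFromLocal reports "File ... does not exist" when the local
# source path is wrong, so verify it exists first.
if [ -f "$SRC" ]; then
  echo "local file ok: $SRC"
  # bin/hadoop dfs -copyFromLocal "$SRC" /wksp   # run from HADOOP_HOME
else
  echo "missing local file: $SRC" >&2
fi
```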
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks. yes , i am newbie.****
>>>>>>>
>>>>>>> however, i need windows setup.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> let me surely refer the doc and link which u sent but i need this to
>>>>>>> be working ...****
>>>>>>>
>>>>>>> can you please help****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>>  ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> --
>>>>>>> MANISH DUNANI
>>>>>>> -THANX
>>>>>>> +91 9426881954,+91 8460656443****
>>>>>>>
>>>>>>> manishd207@gmail.com****
>>>>>>>
>>>>>>>
>>>>>>> -- ****
>>>>>>>
>>>>>>> Regards****
>>>>>>>
>>>>>>> *Manish Dunani*****
>>>>>>>
>>>>>>> *Contact No* : +91 9408329137****
>>>>>>>
>>>>>>> *skype id* : manish.dunani****
>>>>>>>
>>>>>>>
>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>> you have received this communication in error, please contact the sender
>>>>>>> immediately and delete it from your system. Thank You.****
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> -- ****
>>>>>>>
>>>>>>> Olivier Renault****
>>>>>>>
>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>> +44 7500 933 036
>>>>>>> orenault@hortonworks.com
>>>>>>> www.hortonworks.com****
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>>
>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
You can put the same FQDN as your NameNode for example.

Thanks
Olivier
On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:

> i do not have any HIVE server host, then what should i put over here?? .
> if i comment it out, then i guess it throws an error about that
> can i put the fqdn of the namenode for HIVE server host ?
>
> will it be a really working configuration?
>
> please suggest
>
> regards
> irfan
>
>
>
> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Your cluster-properties.txt should look something like :
>>
>>
>> #Log directory
>> HDP_LOG_DIR=c:\hadoop\logs
>>
>> #Data directory
>> HDP_DATA_DIR=c:\hdp\data
>>
>> #Hosts
>> NAMENODE_HOST=yourmaster.fqdn.com
>> JOBTRACKER_HOST=yourmaster.fqdn.com
>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>> TEMPLETON_HOST=yourmaster.fqdn.com
>> SLAVE_HOSTS=yourmaster.fqdn.com,youslave.fqdn.com
>>
>>
>> #Database host
>> DB_FLAVOR=derby
>> DB_HOSTNAME=yourmaster.fqdn.com
>>
>>
>> #Hive properties
>> HIVE_DB_NAME=hive
>> HIVE_DB_USERNAME=hive
>> HIVE_DB_PASSWORD=hive
>>
>> #Oozie properties
>> OOZIE_DB_NAME=oozie
>> OOZIE_DB_USERNAME=oozie
>> OOZIE_DB_PASSWORD=oozie
>>
>> You will need to replace, yourmaster.fqdn.com, yourslave.fqdn.com by
>> your servers name. For the time being, I suggest that you do not install
>> HBase, Oozie,
>>
>> regards,
>> Olivier
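
A quick sanity check on such a cluster properties file can be sketched as below. The list of required host keys here is an assumption based on Olivier's sample above, and the fqdn values are the same placeholders to be replaced with real server names:

```shell
PROPS=$(mktemp)
cat > "$PROPS" <<'EOF'
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
EOF

# Every host key should be present with a non-empty value; the HDP
# installer fails when one of these declarations is missing.
for key in NAMENODE_HOST JOBTRACKER_HOST HIVE_SERVER_HOST SLAVE_HOSTS; do
  if grep -q "^$key=..*" "$PROPS"; then
    echo "$key ok"
  else
    echo "$key missing" >&2
  fi
done
```

For a two-machine setup with no dedicated Hive host, pointing HIVE_SERVER_HOST at the namenode's FQDN, as Olivier suggests, keeps the check passing.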
>>
>>
>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> ok.. now i made some changes and the installation went ahead
>>>>> but failed at the property "HIVE_SERVER_HOST" declaration
>>>>> in the cluster config file, i have commented this property out. if i
>>>>> uncomment it, then what server address should i give ???
>>>>>
>>>>> i have only two windows machines setup.
>>>>> 1: for namenode and another for datanode
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> thanks.
>>>>>> i installed the latest java in c:\java folder and now no error in log
>>>>>> file related to java
>>>>>> however, now it is throwing an error about the cluster properties
>>>>>> file not being found.
>>>>>> in fact i am running/installing hdp from the location where this file
>>>>>> exists. still it is throwing the error
>>>>>>
>>>>>> please find the attached
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>> ravimu@microsoft.com> wrote:
>>>>>>
>>>>>>>  Here’s your issue (from the logs you attached earlier):****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...****
>>>>>>>
>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.***
>>>>>>> *
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>>> or something similar (in a path with no spaces).****
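
The spaces problem Ravi points out can be checked before running the installer. A sketch (the example paths are the ones from this thread):

```shell
# A JAVA_HOME containing spaces breaks the HDP 1.3 installer scripts,
# producing errors like "Files\Java\jdk1.6.0_31 was unexpected at this time."
check_java_home() {
  case "$1" in
    *" "*) echo "BAD: '$1' contains spaces" ;;
    *)     echo "OK:  '$1'" ;;
  esac
}

check_java_home "C:/Program Files/Java/jdk1.6.0_31"
check_java_home "C:/java/jdk1.6.0_31"
```

The first call flags the default install location as bad; the second, a space-free path like the c:\java suggested above, passes.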
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>> *To:* user@hadoop.apache.org
>>>>>>> *Subject:* Re: about replication****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please find the attached.****
>>>>>>>
>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>> as it is not generated ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>
>>>>>>>  Could you share the log files ( c:\hdp.log,
>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>>>>>>> well as your clusterproperties.txt ?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Thanks, ****
>>>>>>>
>>>>>>> Olivier****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>>> ****
>>>>>>>
>>>>>>>  thanks. i followed the user manual for deployment and installed
>>>>>>> all pre-requisites ****
>>>>>>>
>>>>>>> i modified the command and still the issue persist. please suggest *
>>>>>>> ***
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please refer below ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> [image: Inline image 1]****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>
>>>>>>> The command to install it is msiexec /i msifile /...  ****
>>>>>>>
>>>>>>> You will find the correct syntax as part of doc. ****
>>>>>>>
>>>>>>> Happy reading
>>>>>>> Olivier ****
>>>>>>>
>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>>>
>>>>>>>  thanks. ****
>>>>>>>
>>>>>>> i referred the logs and manuals. i modified the clusterproperties
>>>>>>> file and then double click on the msi file ****
>>>>>>>
>>>>>>> however, it still failed.****
>>>>>>>
>>>>>>> further i started the installation on command line by giving
>>>>>>> HDP_LAYOUT=clusterproperties file path, ****
>>>>>>>
>>>>>>> installation went ahead and it failed for .NET framework 4.0 and
>>>>>>> VC++ redistributable package dependency   ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> i installed both and started again the installation. ****
>>>>>>>
>>>>>>> failed again with following error ****
>>>>>>>
>>>>>>> [image: Inline image 1]****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> when i search for the logs mentioned in the error , i never found
>>>>>>> that ****
>>>>>>>
>>>>>>> please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>
>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>> file. You will find some information on the configuration file as part of
>>>>>>> the documentation. ****
>>>>>>>
>>>>>>>
>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>> ****
>>>>>>>
>>>>>>> You should make sure to have also installed the pre requisite. ****
>>>>>>>
>>>>>>> Thanks
>>>>>>> Olivier ****
>>>>>>>
>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>> thanks. sorry for the long break. i got involved in some other
>>>>>>> priorities.
>>>>>>>
>>>>>>> i downloaded the installer and got the following error while
>>>>>>> installing:
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> do i need to do any configuration prior to installation?
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>> Here is the link:
>>>>>>>
>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>
>>>>>>> Olivier
>>>>>>>
>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>
>>>>>>> thanks.
>>>>>>>
>>>>>>> i just followed the instructions to set up the pseudo-distributed
>>>>>>> setup first, using this url:
>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>
>>>>>>> i don't think i am running a DN on both machines.
>>>>>>>
>>>>>>> please find the attached log.
>>>>>>>
>>>>>>> hi olivier,
>>>>>>>
>>>>>>> can you please give me the download link? let me try it.
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> Are you running a DN on both the machines? Could you please show me
>>>>>>> your DN logs?
>>>>>>>
>>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>> Irfan,
>>>>>>>
>>>>>>> If you want to quickly get Hadoop running on the windows platform,
>>>>>>> you may want to try our distribution for Windows. You will be able
>>>>>>> to find the msi on our website.
>>>>>>>
>>>>>>> Regards
>>>>>>> Olivier
>>>>>>>
>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>> thanks.
>>>>>>>
>>>>>>> ok. i think i need to change the plan here.
>>>>>>>
>>>>>>> let me create two environments: 1) totally windows, 2) totally unix.
>>>>>>>
>>>>>>> on windows, i still have to try and see how hadoop works, whereas on
>>>>>>> unix it is already known to work fine.
>>>>>>>
>>>>>>> so, on windows, here is the setup:
>>>>>>>
>>>>>>> namenode : windows 2012 R2
>>>>>>> datanode : windows 2012 R2
>>>>>>>
>>>>>>> now, the exact problems are:
>>>>>>> 1: the datanode is not getting started
>>>>>>> 2: replication : if i put any file/folder on any datanode, it
>>>>>>> should get replicated to all other available datanodes
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>> windows? Not a good idea. Anyway, could you please show me your log
>>>>>>> files? I also need some additional info:
>>>>>>>
>>>>>>> - The exact problem you are facing right now
>>>>>>> - Your cluster summary (no. of nodes etc.)
>>>>>>> - Your latest configuration files
>>>>>>> - Your /etc/hosts file
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> ok. thanks
>>>>>>>
>>>>>>> now, i need to start with an all-windows setup first, as our product
>>>>>>> will be based on windows.
>>>>>>>
>>>>>>> so, please tell me how to resolve the issue.
>>>>>>>
>>>>>>> the datanode is not starting. please suggest
>>>>>>>
>>>>>>> regards,
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>> that. But it is not a very wise setup.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> thanks.
>>>>>>>
>>>>>>> can i have a setup like this:
>>>>>>>
>>>>>>> the namenode on linux (the flavour may be RHEL, CentOS, Ubuntu etc.)
>>>>>>> and the datanodes a combination of any OS (windows, linux, unix etc.)
>>>>>>>
>>>>>>> however, my doubt is: as the file systems of the two systems (win
>>>>>>> and linux) are different, datanodes of these systems cannot be part
>>>>>>> of a single cluster. do i have to make the windows cluster and the
>>>>>>> UNIX cluster separate?
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>
>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>>>>> as Cygwin PIDs, so that may be causing the discrepancy. I don't know
>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>> progress for native Windows support, however there are no official
>>>>>>> releases with Windows support yet. It may be easier to get familiar
>>>>>>> with a release <https://www.apache.org/dyn/closer.cgi/hadoop/common/>
>>>>>>> on Linux if you are new to it.
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> thanks
>>>>>>>
>>>>>>> here is what i did.
>>>>>>>
>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>> command, then deleted all pid files for namenodes and datanodes.
>>>>>>>
>>>>>>> i started dfs again with the command "./start-dfs.sh"
>>>>>>>
>>>>>>> when i ran the "jps" command, it showed:
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>> $ ./jps.exe
>>>>>>> 4536 Jps
>>>>>>> 2076 NameNode
>>>>>>>
>>>>>>> however, when i open the pid file for the namenode, it shows pid
>>>>>>> 4560; it should show 2076.
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>
>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then
>>>>>>> restarting the datanode.
>>>>>>>
>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>> already.
>>>>>>>
>>>>>>> -Arpit
>>>>>>>
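Arpit's suggestion can be sketched as a small script. The pid-file directory and the `hadoop-*-datanode.pid` naming are assumptions based on the `\tmp\hadoop-*datanode.pid` path mentioned above; adjust them to your HADOOP_PID_DIR. The demo runs against a scratch directory so it touches nothing real.

```shell
# Remove datanode pid files whose recorded process is no longer alive,
# so start-dfs.sh stops refusing with "datanode running as process N".
clean_stale_pids() {
  for f in "$1"/hadoop-*-datanode.pid; do
    [ -e "$f" ] || continue
    pid=$(cat "$f")
    if ! kill -0 "$pid" 2>/dev/null; then
      echo "removing stale pid file: $f (pid $pid is gone)"
      rm -f "$f"
    fi
  done
}

# demo against a scratch directory holding a fake stale pid file
piddir=$(mktemp -d)
echo 999999 > "$piddir/hadoop-administrator-datanode.pid"
clean_stale_pids "$piddir"
```

On Cygwin the same caveat Arpit raises applies: the pid recorded by the scripts may be a Cygwin pid, not the Windows one, so `kill -0` is only meaningful inside Cygwin.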
>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> the datanode is trying to connect to the namenode continuously but
>>>>>>> fails.
>>>>>>>
>>>>>>> when i try to run the "jps" command it says:
>>>>>>>
>>>>>>> $ ./jps.exe
>>>>>>> 4584 NameNode
>>>>>>> 4016 Jps
>>>>>>>
>>>>>>> and when i ran "./start-dfs.sh" it said:
>>>>>>>
>>>>>>> $ ./start-dfs.sh
>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>>
>>>>>>> these two outputs are contradictory.
>>>>>>>
>>>>>>> please find the attached logs.
>>>>>>>
>>>>>>> should i attach the conf files as well?
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> Your DN is still not running. Showing me the logs would be helpful.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> i followed the url and did the steps mentioned in it. i have
>>>>>>> deployed on the windows platform.
>>>>>>>
>>>>>>> now, i am able to browse the url http://localhost:50070 (name node),
>>>>>>> however, i am not able to browse the url http://localhost:50030
>>>>>>>
>>>>>>> please refer below:
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>>> hdfs file system as well.
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> thanks. i followed this url:
>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>
>>>>>>> let me follow the url you gave for the pseudo-distributed setup and
>>>>>>> then switch to distributed mode.
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> You are welcome. Which link have you followed for the configuration?
>>>>>>> Your *core-site.xml* is empty. Remove the property *fs.default.name*
>>>>>>> from *hdfs-site.xml* and add it to *core-site.xml*. Remove
>>>>>>> *mapred.job.tracker* as well; it belongs in *mapred-site.xml*.
>>>>>>>
>>>>>>> I would suggest you do a pseudo-distributed setup first in order to
>>>>>>> get familiar with the process and then proceed to distributed mode.
>>>>>>> You can visit this link
>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>
>>>>>>> HTH
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
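A minimal sketch of the split Tariq describes: fs.default.name goes in core-site.xml and mapred.job.tracker in mapred-site.xml. The host/port values are placeholders for a pseudo-distributed setup, and the files are written to a scratch directory here so nothing live is overwritten.

```shell
conf=$(mktemp -d)

# fs.default.name belongs in core-site.xml
cat > "$conf/core-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

# mapred.job.tracker belongs in mapred-site.xml
cat > "$conf/mapred-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF
```

After moving the properties into the real conf directory, the namenode needs to be reformatted or at least the daemons restarted for the change to take effect.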
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> thanks tariq for the response.
>>>>>>>
>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>> setup. can you please go through them?
>>>>>>>
>>>>>>> please let me know
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> I'm sorry for being unresponsive. Was out of touch for some time
>>>>>>> because of ramzan and eid. Resuming work today.
>>>>>>>
>>>>>>> What's the current status?
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> First of all read the concepts. I hope you will like it:
>>>>>>>
>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>
>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> hey Tariq,
>>>>>>>
>>>>>>> i am still stuck. can you please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> the attachment got quarantined.
>>>>>>>
>>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> thanks.
>>>>>>>
>>>>>>> if i run the jps command on the namenode:
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>> $ ./jps.exe
>>>>>>> 3164 NameNode
>>>>>>> 1892 Jps
>>>>>>>
>>>>>>> same command on the datanode:
>>>>>>>
>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>> $ ./jps.exe
>>>>>>> 3848 Jps
>>>>>>>
>>>>>>> jps does not list any process for the datanode. however, in the web
>>>>>>> browser i can see one live data node.
>>>>>>>
>>>>>>> please find the attached conf rar file of the namenode.
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> OK, we'll start fresh. Could you please show me your latest config
>>>>>>> files?
>>>>>>>
>>>>>>> BTW, are your daemons running fine? Use jps to verify that.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> i have created the directories "wksp_data" and "wksp_name" on both
>>>>>>> the datanode and the namenode, made the respective changes in the
>>>>>>> "hdfs-site.xml" file, formatted the namenode and started the dfs.
>>>>>>>
>>>>>>> but i am still not able to browse the file system through the web
>>>>>>> browser. please refer below.
>>>>>>>
>>>>>>> is anything still missing? please suggest
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
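The hdfs-site.xml change described above presumably looks something like the following sketch, using the thread's Windows-style paths for the two new directories. It writes to a scratch file rather than the live conf directory, since this is only an illustration of the shape of the file.

```shell
# Sketch of an hdfs-site.xml with separate name/data directories
# (the c:\\wksp_* paths mirror the directories mentioned above).
site=$(mktemp)
cat > "$site" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>c:\\wksp_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>c:\\wksp_data</value>
  </property>
</configuration>
EOF
grep -c '<property>' "$site"   # -> 2
```

Note that, unlike the earlier config in this thread, dfs.name.dir and dfs.data.dir point at two different directories here; pointing both at the same directory is what the wksp_data/wksp_name split fixes.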
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> do these directories need to be created on all datanodes and
>>>>>>> namenodes? and does hdfs-site.xml need to be updated on both the
>>>>>>> datanodes and namenodes for these new directories?
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> Create 2 directories manually, corresponding to the values of the
>>>>>>> dfs.name.dir and dfs.data.dir properties, and change the permissions
>>>>>>> of these directories to 755. When you start pushing data into your
>>>>>>> HDFS, data will start going inside the directory specified by
>>>>>>> dfs.data.dir and the associated metadata will go inside
>>>>>>> dfs.name.dir. Remember, you store data in HDFS, but it eventually
>>>>>>> gets stored in your local/native FS. But you cannot see this data
>>>>>>> directly on your local/native FS.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
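The two directories and the 755 permissions Tariq describes can be set up in one go. A sketch against a scratch base directory; substitute the real dfs.name.dir and dfs.data.dir values from your hdfs-site.xml.

```shell
base=$(mktemp -d)

# one directory for the namenode metadata, one for the datanode blocks
mkdir -p "$base/dfs/name" "$base/dfs/data"
chmod 755 "$base/dfs/name" "$base/dfs/data"

ls -ld "$base/dfs/name" "$base/dfs/data"
```

The 755 mode matters because the datanode refuses to start if its data directories are group- or world-writable.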
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> thanks.
>>>>>>>
>>>>>>> however, i need this to be working on a windows environment as a
>>>>>>> project requirement. i will add/work on Linux later.
>>>>>>>
>>>>>>> so, at this stage, is c:\\wksp the HDFS file system, or do i need to
>>>>>>> create it from the command line?
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards,
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> Hello Irfan,
>>>>>>>
>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>>
>>>>>>> The HDFS webUI doesn't provide the ability to create a file or
>>>>>>> directory. You can browse HDFS, view files, download files etc., but
>>>>>>> operations like create, move, copy etc. are not supported.
>>>>>>>
>>>>>>> These values look fine to me.
>>>>>>>
>>>>>>> One suggestion though: try getting a Linux machine (if possible), or
>>>>>>> at least use a VM. I personally feel that using Hadoop on windows is
>>>>>>> always messy.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> thanks.
>>>>>>>
>>>>>>> when i browse the file system, i get the following. i haven't seen
>>>>>>> any make-directory option there. do i need to create it from the
>>>>>>> command line?
>>>>>>>
>>>>>>> further, in the hdfs-site.xml file, i have the following entries.
>>>>>>> are they correct?
>>>>>>>
>>>>>>> <property>
>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>   <value>c:\\wksp</value>
>>>>>>> </property>
>>>>>>> <property>
>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>   <value>c:\\wksp</value>
>>>>>>> </property>
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> *You are wrong at this:*
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>> copyFromLocal: File
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>> copyFromLocal: File
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>
>>>>>>> Both the paths you wrote are local, and you do not need to copy
>>>>>>> hadoop into hdfs. Hadoop is already working.
>>>>>>>
>>>>>>> Just check in the browser after starting your single node cluster:
>>>>>>>
>>>>>>> localhost:50070
>>>>>>>
>>>>>>> then follow the "browse the filesystem" link in it.
>>>>>>>
>>>>>>> If there is no directory, then make a directory there. That is your
>>>>>>> hdfs directory. Then copy any text file there (there is no need to
>>>>>>> copy hadoop there), because you are going to do the processing on the
>>>>>>> data in that text file. That is what hadoop is used for; first you
>>>>>>> need to make that clear in your mind, and then you will be able to
>>>>>>> do it.
>>>>>>>
>>>>>>> *Try this:*
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>> /hdfs/directory/path
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> thanks. yes, i am a newbie. however, i need a windows setup.
>>>>>>>
>>>>>>> let me refer to the doc and link which you sent, but i need this to
>>>>>>> be working. can you please help?
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> --
>>>>>>> MANISH DUNANI
>>>>>>> -THANX
>>>>>>> +91 9426881954, +91 8460656443
>>>>>>> manishd207@gmail.com
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Regards
>>>>>>> *Manish Dunani*
>>>>>>> *Contact No*: +91 9408329137
>>>>>>> *skype id*: manish.dunani
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>> you have received this communication in error, please contact the sender
>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Olivier Renault
>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>> +44 7500 933 036
>>>>>>> orenault@hortonworks.com
>>>>>>> www.hortonworks.com
>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>>
>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
You can put the same FQDN as your NameNode for example.

Thanks
Olivier
On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:

> i do not have any HIVE server host, so what should i put here? if i
> comment the property out, i guess it throws an error about that.
> can i put the fqdn of the namenode for the HIVE server host?
>
> will it be a really working configuration?
>
> please suggest
>
> regards
> irfan
>
>
>
> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Your cluster-properties.txt should look something like :
>>
>>
>> #Log directory
>> HDP_LOG_DIR=c:\hadoop\logs
>>
>> #Data directory
>> HDP_DATA_DIR=c:\hdp\data
>>
>> #Hosts
>> NAMENODE_HOST=yourmaster.fqdn.com
>> JOBTRACKER_HOST=yourmaster.fqdn.com
>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>> TEMPLETON_HOST=yourmaster.fqdn.com
>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>
>>
>> #Database host
>> DB_FLAVOR=derby
>> DB_HOSTNAME=yourmaster.fqdn.com
>>
>>
>> #Hive properties
>> HIVE_DB_NAME=hive
>> HIVE_DB_USERNAME=hive
>> HIVE_DB_PASSWORD=hive
>>
>> #Oozie properties
>> OOZIE_DB_NAME=oozie
>> OOZIE_DB_USERNAME=oozie
>> OOZIE_DB_PASSWORD=oozie
>>
>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com
>> with your server names. For the time being, I suggest that you do not
>> install HBase or Oozie.
>>
>> regards,
>> Olivier
>>
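Since the host entries in this file are the typo-prone part (as the `youslave`-style slip shows), a quick sanity pass over the `*_HOST` values is worth the few seconds before rerunning the installer. A hedged sketch, using the placeholder names from Olivier's example written to a scratch file:

```shell
props=$(mktemp)
cat > "$props" <<'EOF'
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
EOF

# list the distinct host names the installer will see; each should be a
# resolvable FQDN of one of your two machines
cut -d= -f2 "$props" | tr ',' '\n' | sort -u
# -> yourmaster.fqdn.com
#    yourslave.fqdn.com
```

For a two-node setup, every name in that output should resolve (e.g. via `ping` or `nslookup`) before the install is retried.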
>>
>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> ok.. now i made some changes and installation went ahead
>>>>> but failed on the property "HIVE_SERVER_HOST" declaration
>>>>> in the cluster config file, i have commented this property. if i
>>>>> uncomment it, then what server address should i give ???
>>>>>
>>>>> i have only two windows machines setup.
>>>>> 1: for namenode and another for datanode
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> thanks.
>>>>>> i installed the latest java in c:\java folder and now no error in log
>>>>>> file related to java
>>>>>> however, now it is throwing an error about the cluster properties
>>>>>> file not being found.
>>>>>> in fact i am running/installing hdp from the location where this file
>>>>>> exists . still it is throwing the error
>>>>>>
>>>>>> please find the attached
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>> ravimu@microsoft.com> wrote:
>>>>>>
>>>>>>>  Here’s your issue (from the logs you attached earlier):
>>>>>>>
>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>
>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>>> or something similar (in a path with no spaces).
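The space-in-JAVA_HOME problem described above is easy to detect up front. A small sketch, assuming a Cygwin/bash shell; the helper name and the example paths are illustrative:

```shell
# Sketch: flag a JAVA_HOME containing spaces, which the HDP 1.3 MSI
# cannot cope with. Paths shown are examples only.
java_home_ok() {
  case "$1" in
    *" "*) echo "JAVA_HOME contains a space: $1"; return 1 ;;
    *)     echo "JAVA_HOME looks fine: $1" ;;
  esac
}
```

For example, `java_home_ok "$JAVA_HOME"` would reject `C:\Program Files\Java\jdk1.6.0_31` and accept `C:\java\jdk1.6.0_31`.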
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>> *To:* user@hadoop.apache.org
>>>>>>> *Subject:* Re: about replication****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please find the attached.****
>>>>>>>
>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>> as it is not generated ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan****
>>>>>>>
>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>
>>>>>>>  Could you share the log files ( c:\hdp.log,
>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>>>>>>> well as your clusterproperties.txt ?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Thanks, ****
>>>>>>>
>>>>>>> Olivier****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>>> ****
>>>>>>>
>>>>>>>  thanks. i followed the user manual for deployment and installed
>>>>>>> all pre-requisites ****
>>>>>>>
>>>>>>> i modified the command and still the issue persists. please suggest
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please refer below ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> [image: Inline image 1]****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>
>>>>>>> The command to install it is msiexec /i msifile /...  ****
>>>>>>>
>>>>>>> You will find the correct syntax as part of doc. ****
>>>>>>>
>>>>>>> Happy reading
>>>>>>> Olivier ****
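The msiexec command Olivier refers to passes the cluster layout file as an MSI property. A hedged sketch that only assembles the command line, so it can be inspected before running on the Windows box; the MSI file name, log file name, and exact property set are assumptions — check the HDP-for-Windows install guide for the authoritative syntax:

```shell
# Sketch: build the msiexec command line for an HDP 1.3 silent install.
# File names and the property list are illustrative, not authoritative.
hdp_install_cmd() {
  msi="$1"; layout="$2"
  printf 'msiexec /qn /i "%s" /lv "hdp.winpkg.install.log" HDP_LAYOUT="%s" DESTROY_DATA=no\n' \
    "$msi" "$layout"
}
```

Usage idea: `hdp_install_cmd hdp-1.3.0.0.winpkg.msi 'C:\hdp\clusterproperties.txt'` prints the command to paste into an elevated Windows prompt.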
>>>>>>>
>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>>>
>>>>>>>  thanks. ****
>>>>>>>
>>>>>>> i referred the logs and manuals. i modified the clusterproperties
>>>>>>> file and then double click on the msi file ****
>>>>>>>
>>>>>>> however, it still failed.****
>>>>>>>
>>>>>>> further i started the installation on command line by giving
>>>>>>> HDP_LAYOUT=clusterproperties file path, ****
>>>>>>>
>>>>>>> installation went ahead and it failed for .NET framework 4.0 and
>>>>>>> VC++ redistributable package dependency   ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> i installed both and started again the installation. ****
>>>>>>>
>>>>>>> failed again with following error ****
>>>>>>>
>>>>>>> [image: Inline image 1]****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> when i search for the logs mentioned in the error , i never found
>>>>>>> that ****
>>>>>>>
>>>>>>> please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>
>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>> file. You will find some information on the configuration file as part of
>>>>>>> the documentation. ****
>>>>>>>
>>>>>>>
>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>> ****
>>>>>>>
>>>>>>> You should make sure to have also installed the pre requisite. ****
>>>>>>>
>>>>>>> Thanks
>>>>>>> Olivier ****
>>>>>>>
>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>>>
>>>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>>>> other priorities****
>>>>>>>
>>>>>>> i downloaded the installer and while installing i got following
>>>>>>> error ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> [image: Inline image 1]****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> do i need to make any configuration prior to installation ??****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>
>>>>>>> Here is the link ****
>>>>>>>
>>>>>>> http://download.hortonworks.com/products/hdp-windows/****
>>>>>>>
>>>>>>> Olivier ****
>>>>>>>
>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>  thanks.****
>>>>>>>
>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>> setup first using the url :
>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>> ****
>>>>>>>
>>>>>>>  ****
>>>>>>>
>>>>>>> i don't think i am running the DN on both machines
>>>>>>>
>>>>>>> please find the attached log****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> hi olivier ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> can you please give me download link ?****
>>>>>>>
>>>>>>> let me try please ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  Are you running DN on both the machines? Could you please show me
>>>>>>> your DN logs?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>>  Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>>
>>>>>>> Irfu, ****
>>>>>>>
>>>>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>> msi on our website.
>>>>>>>
>>>>>>> Regards
>>>>>>> Olivier ****
>>>>>>>
>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>  thanks. ****
>>>>>>>
>>>>>>> ok. i think i need to change the plan over here ****
>>>>>>>
>>>>>>>  let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>>> ****
>>>>>>>
>>>>>>> on UNIX, it is already known that ,  it is working fine. ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> so, on windows , here is the setup:****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> namenode : windows 2012 R2 ****
>>>>>>>
>>>>>>> datanode : windows 2012 R2 ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> now, the exact problem is :****
>>>>>>>
>>>>>>> 1: datanode is not getting started ****
>>>>>>>
>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>> should get replicated to all other available datanodes
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  Seriously?? You are planning to develop something using Hadoop on
>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>> also need some additional info :
>>>>>>>
>>>>>>> -The exact problem which you are facing right now****
>>>>>>>
>>>>>>> -Your cluster summary(no. of nodes etc)****
>>>>>>>
>>>>>>> -Your latest configuration files****
>>>>>>>
>>>>>>> -Your /etc/hosts file
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  ok. thanks****
>>>>>>>
>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>> will be based on windows ****
>>>>>>>
>>>>>>> so, now, please tell me how to resolve the issue ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> datanode is not starting . please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards,****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>> that. But it is not a very wise setup.****
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  please suggest****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks.****
>>>>>>>
>>>>>>> can i have setup like this :****
>>>>>>>
>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
>>>>>>>
>>>>>>> and datanodes are the combination of any OS (windows , linux , unix
>>>>>>> etc )****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> however, my doubt is,  as the file systems of  both the systems (win
>>>>>>> and linux ) are different ,  datanodes of these systems can not be part of
>>>>>>> single cluster . i have to make windows cluster separate and UNIX cluster
>>>>>>> separate ?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>>
>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>>>>> as Cygwin PIDs so that may be causing the discrepancy. I don't know how
>>>>>>> well Hadoop works in Cygwin as I have never tried it. Work is in progress
>>>>>>> for native Windows support however there are no official releases with
>>>>>>> Windows support yet. It may be easier to get familiar with a release
>>>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are
>>>>>>> new to it.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks ****
>>>>>>>
>>>>>>> here is what i did .****
>>>>>>>
>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>> command ****
>>>>>>>
>>>>>>> then deleted all pid files for namenodes and datanodes ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> started dfs again with command : "./start-dfs.sh"****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> when i ran the "Jps" command . it shows****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>
>>>>>>> $ ./jps.exe****
>>>>>>>
>>>>>>> 4536 Jps****
>>>>>>>
>>>>>>> 2076 NameNode****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> however, when i open the pid file for the namenode, it is showing
>>>>>>> the pid as 4560. on the contrary, it should show 2076
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>>
>>>>>>>  Most likely there is a stale pid file. Something like
>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>> the datanode.
>>>>>>>
>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>> already.
>>>>>>>
>>>>>>> -Arpit****
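Arpit's stale-pid-file suggestion can be scripted so that only pid files whose process is genuinely gone get deleted. A sketch, assuming bash; the directory argument and the `hadoop-<user>-<daemon>.pid` pattern are assumptions (Hadoop 1.x writes pid files under HADOOP_PID_DIR, `/tmp` by default):

```shell
# Sketch: delete a namenode/datanode pid file only when the process it
# names is no longer alive. Directory and file pattern are assumptions.
clean_stale_pids() {
  dir="$1"
  for pid_file in "$dir"/hadoop-*-namenode.pid "$dir"/hadoop-*-datanode.pid; do
    [ -e "$pid_file" ] || continue
    pid=$(cat "$pid_file")
    if ! kill -0 "$pid" 2>/dev/null; then   # signal 0 = existence check
      echo "removing stale $pid_file (pid $pid)"
      rm -f "$pid_file"
    fi
  done
}
```

E.g. `clean_stale_pids /tmp` before `./start-dfs.sh`; pid files of still-running daemons are left alone.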
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  datanode is trying to connect to namenode continuously but fails
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> when i try to run "jps" command it says :****
>>>>>>>
>>>>>>> $ ./jps.exe****
>>>>>>>
>>>>>>> 4584 NameNode****
>>>>>>>
>>>>>>> 4016 Jps****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> and when i ran the "./start-dfs.sh" then it says :****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> $ ./start-dfs.sh****
>>>>>>>
>>>>>>> namenode running as process 3544. Stop it first.****
>>>>>>>
>>>>>>> DFS-1: datanode running as process 4076. Stop it first.****
>>>>>>>
>>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> both these logs are contradictory ****
>>>>>>>
>>>>>>> please find the attached logs ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> should i attach the conf files as well ?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>>  ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  Your DN is still not running. Showing me the logs would be helpful.
>>>>>>> ****
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  i followed the url and did the steps mention in that. i have
>>>>>>> deployed on the windows platform****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>
>>>>>>> however, not able to browse url : http://localhost:50030****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please refer below****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> [image: Inline image 1]****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>>> hdfs file system as well ****
>>>>>>>
>>>>>>> please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks. i followed this url :
>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>> ****
>>>>>>>
>>>>>>> let me follow the url which you gave for pseudo distributed setup
>>>>>>> and then will switch to distributed mode****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  You are welcome. Which link have you followed for the configuration?
>>>>>>> Your *core-site.xml* is empty. Remove the property *fs.default.name*
>>>>>>> from *hdfs-site.xml* and add it to *core-site.xml*. Remove
>>>>>>> *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
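The split Tariq describes can be sketched as below — `fs.default.name` in core-site.xml, `mapred.job.tracker` in mapred-site.xml (Hadoop 1.x property names). The host:port values are placeholders, not taken from this thread:

```shell
# Sketch: write minimal Hadoop 1.x config files with each property in
# its proper file. localhost:9000/9001 are placeholder values.
cat > core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

cat > mapred-site.xml <<'EOF'
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF
```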
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>>  I would suggest you to do a pseudo distributed setup first in order
>>>>>>> to get yourself familiar with the process and then proceed to the
>>>>>>> distributed mode. You can visit this link
>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> HTH****
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks tariq for response. ****
>>>>>>>
>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>> setup . ****
>>>>>>>
>>>>>>> can you please go through that ?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please let me know ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  I'm sorry for being unresponsive. Was out of touch for sometime
>>>>>>> because of ramzan and eid. Resuming work today.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> What's the current status?****
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  First of all read the concepts ..I hope you will like it..****
>>>>>>>
>>>>>>>
>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf***
>>>>>>> *
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  hey Tariq,****
>>>>>>>
>>>>>>> i am still stuck .. ****
>>>>>>>
>>>>>>> can you please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> irfan ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  attachment got quarantined ****
>>>>>>>
>>>>>>> resending in txt format. please rename it to conf.rar ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> if i run the jps command on namenode :****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>
>>>>>>> $ ./jps.exe****
>>>>>>>
>>>>>>> 3164 NameNode****
>>>>>>>
>>>>>>> 1892 Jps****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> same command on datanode :****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>>
>>>>>>> $ ./jps.exe****
>>>>>>>
>>>>>>> 3848 Jps****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> jps does not list any process for datanode. however, on web browser
>>>>>>> i can see one live data node ****
>>>>>>>
>>>>>>> please find the attached conf rar file of namenode ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  OK. we'll start fresh. Could you plz show me your latest config
>>>>>>> files?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> BTW, are your daemons running fine? Use jps to verify that.
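Tariq's jps check can be scripted against the daemon names a pseudo-distributed HDFS should show. A sketch; the helper name is ours, and the daemon names assume Hadoop 1.x:

```shell
# Sketch: scan jps output for expected daemon names (Hadoop 1.x:
# NameNode, DataNode, SecondaryNameNode). Helper name is illustrative.
check_daemons() {
  running="$1"; shift
  missing=0
  for d in "$@"; do
    echo "$running" | grep -qw "$d" || { echo "$d is NOT running"; missing=1; }
  done
  return "$missing"
}
# usage idea: check_daemons "$(jps)" NameNode DataNode SecondaryNameNode
```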
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  i have created these dir "wksp_data" and "wksp_name" on both
>>>>>>> datanode and namenode ****
>>>>>>>
>>>>>>> made the respective changes in "hdfs-site.xml" file ****
>>>>>>>
>>>>>>> formatted the namenode ****
>>>>>>>
>>>>>>> started the dfs ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> but still, not able to browse the file system through web browser
>>>>>>>
>>>>>>> please refer below ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> anything still missing ?****
>>>>>>>
>>>>>>> please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> [image: Inline image 1]****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  these dir needs to be created on all datanodes and namenodes ?****
>>>>>>>
>>>>>>> further,  hdfs-site.xml needs to be updated on both datanodes and
>>>>>>> namenodes for these new dir?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  Create 2 directories manually corresponding to the values of
>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>> cannot see this data directly on your local/native FS.****
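The two manual steps Tariq describes (create the directories, set 755) come down to a couple of commands. A sketch — the locations under `$HOME` are placeholders for whatever your `dfs.name.dir` / `dfs.data.dir` point at:

```shell
# Sketch: create the dfs.name.dir / dfs.data.dir directories and set
# their permissions to 755, as described above. Paths are placeholders.
name_dir="$HOME/hdfs/wksp_name"
data_dir="$HOME/hdfs/wksp_data"
mkdir -p "$name_dir" "$data_dir"
chmod 755 "$name_dir" "$data_dir"
```

After this, point the two properties in hdfs-site.xml at those paths and re-format the namenode.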
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks. ****
>>>>>>>
>>>>>>> however, i need this to be working on windows environment as project
>>>>>>> requirement.****
>>>>>>>
>>>>>>> i will add/work on Linux later ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> so, now , at this stage , c:\\wksp is the HDFS file system OR do i
>>>>>>> need to create it from command line ?****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please suggest****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards,****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  Hello Irfan,****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>>  HDFS webUI doesn't provide us the ability to create a file or
>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>> operations like create, move, copy etc. are not supported.
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> These values look fine to me.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> One suggestion though. Try getting a Linux machine(if possible). Or
>>>>>>> at least use a VM. I personally feel that using Hadoop on windows is always
>>>>>>> messy.****
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> Warm Regards,****
>>>>>>>
>>>>>>> Tariq****
>>>>>>>
>>>>>>> cloudfront.blogspot.com****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks.****
>>>>>>>
>>>>>>> when i browse the file system , i am getting following :****
>>>>>>>
>>>>>>> i haven't seen any make directory option there ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> i need to create it from command line ?****
>>>>>>>
>>>>>>> further, in the hdfs-site.xml file , i have given following entries.
>>>>>>> are they correct ? ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> <property>
>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>   <value>c:\\wksp</value>
>>>>>>> </property>
>>>>>>> <property>
>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>   <value>c:\\wksp</value>
>>>>>>> </property>
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> please suggest ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> [image: Inline image 1]****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  *You are wrong at this:*
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>> copyFromLocal: File
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>> copyFromLocal: File
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Because you had given both paths as local ones, and you need not copy
>>>>>>> hadoop into hdfs... Hadoop is already working..
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Just check out in browser by after starting ur single node cluster :
>>>>>>> ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> localhost:50070****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> then go for browse the filesystem link in it..****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> If there is no directory then make directory there.****
>>>>>>>
>>>>>>> That is your hdfs directory.****
>>>>>>>
>>>>>>> Then copy any text file there (no need to copy hadoop there), because
>>>>>>> you are going to do processing on that data in the text file. That's what
>>>>>>> hadoop is used for; first you need to make it clear in your mind, and only
>>>>>>> then will you do it... otherwise it's not possible..
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> *Try this:*
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>> /hdfs/directory/path
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:****
>>>>>>>
>>>>>>>  thanks. yes , i am newbie.****
>>>>>>>
>>>>>>> however, i need windows setup.****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> let me surely refer the doc and link which u sent but i need this to
>>>>>>> be working ...****
>>>>>>>
>>>>>>> can you please help****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> regards****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>>  ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> --
>>>>>>> MANISH DUNANI
>>>>>>> -THANX
>>>>>>> +91 9426881954,+91 8460656443****
>>>>>>>
>>>>>>> manishd207@gmail.com****
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Regards
>>>>>>> *Manish Dunani*
>>>>>>> *Contact No* : +91 9408329137
>>>>>>> *skype id* : manish.dunani
>>>>>>>
>>>>>>>
>>>>>>> --
>>>>>>> Olivier Renault
>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>> +44 7500 933 036
>>>>>>> orenault@hortonworks.com
>>>>>>> www.hortonworks.com
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>>
>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
You can put the same FQDN as your NameNode for example.

Thanks
Olivier
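Olivier's suggestion — pointing HIVE_SERVER_HOST (and any other service host you have no dedicated machine for) at the NameNode's FQDN — can be sanity-checked before running the installer. Below is a small sketch, not part of the HDP tooling; the property names follow the sample cluster-properties file quoted later in this thread, and the hostnames are placeholders:

```python
# Sketch: validate the host entries in an HDP cluster-properties.txt.
# Property names follow the sample file quoted in this thread; this is
# not an official HDP tool.

REQUIRED_HOST_KEYS = [
    "NAMENODE_HOST", "JOBTRACKER_HOST", "HIVE_SERVER_HOST",
    "OOZIE_SERVER_HOST", "TEMPLETON_HOST", "SLAVE_HOSTS",
]

def parse_properties(text):
    """Parse KEY=value lines, skipping blanks and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if "=" in line:
            key, _, value = line.partition("=")
            props[key.strip()] = value.strip()
    return props

def missing_host_keys(props):
    """Return required host keys that are absent or empty."""
    return [k for k in REQUIRED_HOST_KEYS if not props.get(k)]

if __name__ == "__main__":
    sample = """
    NAMENODE_HOST=master.example.com
    JOBTRACKER_HOST=master.example.com
    SLAVE_HOSTS=master.example.com,slave.example.com
    """
    props = parse_properties(sample)
    # Reuse the NameNode FQDN for services without a dedicated host:
    for key in missing_host_keys(props):
        props[key] = props["NAMENODE_HOST"]
    print(missing_host_keys(props))  # -> []
```

Filling the missing keys with the NameNode FQDN, as above, mirrors the advice in this reply.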
On 11 Sep 2013 11:26, "Irfan Sayed" <ir...@gmail.com> wrote:

> i do not have a dedicated HIVE server host, so what should i put over here ?
> if i comment the property out, the installer throws an error about it
> can i put the fqdn of the namenode for the HIVE server host ?
>
> will it be a really working configuration?
>
> please suggest
>
> regards
> irfan
>
>
>
> On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Your cluster-properties.txt should look something like :
>>
>>
>> #Log directory
>> HDP_LOG_DIR=c:\hadoop\logs
>>
>> #Data directory
>> HDP_DATA_DIR=c:\hdp\data
>>
>> #Hosts
>> NAMENODE_HOST=yourmaster.fqdn.com
>> JOBTRACKER_HOST=yourmaster.fqdn.com
>> HIVE_SERVER_HOST=yourmaster.fqdn.com
>> OOZIE_SERVER_HOST=yourmaster.fqdn.com
>> TEMPLETON_HOST=yourmaster.fqdn.com
>> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>>
>>
>> #Database host
>> DB_FLAVOR=derby
>> DB_HOSTNAME=yourmaster.fqdn.com
>>
>>
>> #Hive properties
>> HIVE_DB_NAME=hive
>> HIVE_DB_USERNAME=hive
>> HIVE_DB_PASSWORD=hive
>>
>> #Oozie properties
>> OOZIE_DB_NAME=oozie
>> OOZIE_DB_USERNAME=oozie
>> OOZIE_DB_PASSWORD=oozie
>>
>> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
>> your servers' names. For the time being, I suggest that you do not install
>> HBase or Oozie.
>>
>> regards,
>> Olivier
>>
>>
>> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> ok.. now i made some changes and the installation went ahead
>>>>> but failed on the "HIVE_SERVER_HOST" property declaration
>>>>> in the cluster config file, i have commented this property out. if i
>>>>> uncomment it, then what server address should i give ???
>>>>>
>>>>> i have only two windows machines setup.
>>>>> 1: for namenode and another for datanode
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>
>>>>>> thanks.
>>>>>> i installed the latest java in c:\java folder and now no error in log
>>>>>> file related to java
>>>>>> however, now it is throwing an error about the missing cluster properties
>>>>>> file.
>>>>>> in fact i am running/installing hdp from the location where this file
>>>>>> exists . still it is throwing the error
>>>>>>
>>>>>> please find the attached
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>>> ravimu@microsoft.com> wrote:
>>>>>>
>>>>>>> Here's your issue (from the logs you attached earlier):
>>>>>>>
>>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>>
>>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>>> or something similar (in a path with no spaces).
>>>>>>>
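The failure mode described above — an unquoted path with spaces breaking the installer's batch expansion — can be caught up front. A minimal sketch (the paths are just examples from this thread, not fixed locations):

```python
# Sketch: reproduce the HDP 1.3 installer's complaint about JAVA_HOME.
# A path containing spaces (e.g. C:\Program Files\Java\jdk1.6.0_31)
# breaks the installer's unquoted batch expansion; check for it up front.

def java_home_ok(java_home):
    """Return True if JAVA_HOME is set and contains no spaces."""
    return bool(java_home) and " " not in java_home

if __name__ == "__main__":
    print(java_home_ok(r"C:\Program Files\Java\jdk1.6.0_31"))  # False
    print(java_home_ok(r"C:\java\jdk1.6.0_31"))                # True
```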
>>>>>>>
>>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>>> *To:* user@hadoop.apache.org
>>>>>>> *Subject:* Re: about replication
>>>>>>>
>>>>>>> please find the attached.
>>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>>> as it is not generated
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>> Could you share the log files ( c:\hdp.log,
>>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>>>> well as your clusterproperties.txt ?
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Olivier
>>>>>>>
>>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>> thanks. i followed the user manual for deployment and installed
>>>>>>> all prerequisites
>>>>>>> i modified the command and the issue still persists. please suggest
>>>>>>>
>>>>>>> please refer below
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>> The command to install it is msiexec /i msifile /...
>>>>>>> You will find the correct syntax as part of the doc.
>>>>>>>
>>>>>>> Happy reading
>>>>>>> Olivier
>>>>>>>
>>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>> thanks.
>>>>>>> i referred the logs and manuals. i modified the clusterproperties
>>>>>>> file and then double-clicked on the msi file
>>>>>>> however, it still failed.
>>>>>>>
>>>>>>> further, i started the installation on the command line by giving
>>>>>>> HDP_LAYOUT=clusterproperties file path,
>>>>>>> installation went ahead but failed on the .NET framework 4.0 and
>>>>>>> VC++ redistributable package dependency
>>>>>>>
>>>>>>> i installed both and started the installation again.
>>>>>>> it failed again with the following error
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> when i searched for the logs mentioned in the error , i never found
>>>>>>> them
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>>> file. You will find some information on the configuration file as part of
>>>>>>> the documentation.
>>>>>>>
>>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>>
>>>>>>> You should also make sure to have installed the prerequisites.
>>>>>>>
>>>>>>> Thanks
>>>>>>> Olivier
>>>>>>>
>>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>> thanks. sorry for the long break. actually got involved in some
>>>>>>> other priorities
>>>>>>> i downloaded the installer and while installing i got the following
>>>>>>> error
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> do i need to make any configuration prior to installation ??
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>> Here is the link
>>>>>>>
>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>
>>>>>>> Olivier
>>>>>>>
>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>
>>>>>>> thanks.
>>>>>>> i just followed the instructions to set up the pseudo-distributed
>>>>>>> setup first using the url :
>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>
>>>>>>> i don't think i am running a DN on both machines
>>>>>>> please find the attached log
>>>>>>>
>>>>>>> hi olivier
>>>>>>>
>>>>>>> can you please give me the download link ?
>>>>>>> let me try please
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>> your DN logs?
>>>>>>>
>>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>>
>>>>>>> Irfan,
>>>>>>>
>>>>>>> If you want to quickly get Hadoop running on the windows platform, you
>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>> msi on our website.
>>>>>>>
>>>>>>> Regards
>>>>>>> Olivier
>>>>>>>
>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>> thanks.
>>>>>>> ok. i think i need to change the plan over here
>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>
>>>>>>>
>>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>
>>>>>>> so, on windows , here is the setup:
>>>>>>>
>>>>>>> namenode : windows 2012 R2
>>>>>>> datanode : windows 2012 R2
>>>>>>>
>>>>>>> now, the exact problem is :
>>>>>>> 1: datanode is not getting started
>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>> should get replicated to all other available datanodes
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>> also need some additional info :
>>>>>>>
>>>>>>> - The exact problem which you are facing right now
>>>>>>> - Your cluster summary (no. of nodes etc)
>>>>>>> - Your latest configuration files
>>>>>>> - Your /etc/hosts file
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> ok. thanks
>>>>>>> now, i need to start with an all-windows setup first as our product
>>>>>>> will be based on windows
>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>
>>>>>>> datanode is not starting . please suggest
>>>>>>>
>>>>>>> regards,
>>>>>>> irfan
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>> that. But it is not a very wise setup.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> thanks.
>>>>>>> can i have a setup like this :
>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
>>>>>>> and datanodes are a combination of any OS (windows , linux , unix
>>>>>>> etc )
>>>>>>>
>>>>>>> however, my doubt is, as the file systems of both the systems (win
>>>>>>> and linux ) are different , datanodes of these systems can not be part of a
>>>>>>> single cluster . do i have to make the windows cluster separate and the
>>>>>>> UNIX cluster separate ?
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>
>>>>>>>
>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>>>>> as Cygwin PIDs so that may be causing the discrepancy. I don't know how
>>>>>>> well Hadoop works in Cygwin as I have never tried it. Work is in progress
>>>>>>> for native Windows support however there are no official releases with
>>>>>>> Windows support yet. It may be easier to get familiar with a release
>>>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you
>>>>>>> are new to it.
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> thanks
>>>>>>> here is what i did .
>>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>>> command
>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>
>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>
>>>>>>> when i ran the "jps" command , it shows
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>> $ ./jps.exe
>>>>>>> 4536 Jps
>>>>>>> 2076 NameNode
>>>>>>>
>>>>>>> however, when i open the pid file for the namenode, it shows the pid
>>>>>>> as 4560. on the contrary, it should show 2076
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>
>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>> the datanode.
>>>>>>>
>>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>>> already.
>>>>>>>
>>>>>>> -Arpit
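The stale-pid diagnosis above can be checked mechanically: compare the pid stored in the file with the pids that jps actually reports. A minimal sketch (the jps output and pid values are examples from this thread, not fixed data):

```python
# Sketch: detect a stale Hadoop pid file by comparing its contents with
# the pids reported by `jps`. Sample values come from this thread.

def running_pids(jps_output):
    """Extract pids from `jps` output lines like '2076 NameNode'."""
    pids = set()
    for line in jps_output.splitlines():
        parts = line.split()
        if parts and parts[0].isdigit():
            pids.add(int(parts[0]))
    return pids

def pid_file_is_stale(pid_file_contents, jps_output):
    """A pid file is stale when its pid is not among the running ones."""
    try:
        pid = int(pid_file_contents.strip())
    except ValueError:
        return True
    return pid not in running_pids(jps_output)

if __name__ == "__main__":
    jps = "4536 Jps\n2076 NameNode\n"
    print(pid_file_is_stale("4560", jps))  # True  -> delete and restart
    print(pid_file_is_stale("2076", jps))  # False
```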
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> datanode is trying to connect to the namenode continuously but fails
>>>>>>>
>>>>>>> when i try to run the "jps" command it says :
>>>>>>> $ ./jps.exe
>>>>>>> 4584 NameNode
>>>>>>> 4016 Jps
>>>>>>>
>>>>>>> and when i ran "./start-dfs.sh" it says :
>>>>>>>
>>>>>>> $ ./start-dfs.sh
>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>>
>>>>>>> both these outputs are contradictory
>>>>>>> please find the attached logs
>>>>>>>
>>>>>>> should i attach the conf files as well ?
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> Your DN is still not running. Showing me the logs would be helpful.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> i followed the url and did the steps mentioned in it. i have
>>>>>>> deployed on the windows platform
>>>>>>>
>>>>>>> Now, i am able to browse the url : http://localhost:50070 (name node )
>>>>>>> however, not able to browse the url : http://localhost:50030
>>>>>>>
>>>>>>> please refer below
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>>> hdfs file system as well
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> thanks. i followed this url :
>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>
>>>>>>> let me follow the url which you gave for the pseudo-distributed setup
>>>>>>> and then i will switch to distributed mode
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> You are welcome. Which link have you followed for the
>>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>> Remove *mapred.job.tracker* as well. It is required in
>>>>>>> *mapred-site.xml*.
>>>>>>>
>>>>>>> I would suggest you do a pseudo-distributed setup first in order
>>>>>>> to get yourself familiar with the process and then proceed to the
>>>>>>> distributed mode. You can visit this link
>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>
>>>>>>> HTH
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
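Tariq's point about property placement can be illustrated with minimal config fragments. These are a sketch only — the hostnames and ports are illustrative placeholders, not values from this thread: `fs.default.name` belongs in core-site.xml and `mapred.job.tracker` in mapred-site.xml.

```xml
<!-- core-site.xml (illustrative values) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml (illustrative values) -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```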
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> thanks tariq for the response.
>>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>>> setup .
>>>>>>> can you please go through them ?
>>>>>>>
>>>>>>> please let me know
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> I'm sorry for being unresponsive. Was out of touch for some time
>>>>>>> because of ramzan and eid. Resuming work today.
>>>>>>>
>>>>>>> What's the current status?
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> First of all read the concepts .. I hope you will like it..
>>>>>>>
>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>
>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> hey Tariq,
>>>>>>> i am still stuck ..
>>>>>>> can you please suggest
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> attachment got quarantined
>>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> thanks.
>>>>>>>
>>>>>>> if i run the jps command on the namenode :
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>> $ ./jps.exe
>>>>>>> 3164 NameNode
>>>>>>> 1892 Jps
>>>>>>>
>>>>>>> same command on the datanode :
>>>>>>>
>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>> $ ./jps.exe
>>>>>>> 3848 Jps
>>>>>>>
>>>>>>> jps does not list any process for the datanode. however, in a web
>>>>>>> browser i can see one live data node
>>>>>>> please find the attached conf rar file of the namenode
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> OK. we'll start fresh. Could you plz show me your latest config
>>>>>>> files?
>>>>>>>
>>>>>>> BTW, are your daemons running fine? Use JPS to verify that.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> i have created the dirs "wksp_data" and "wksp_name" on both the
>>>>>>> datanode and the namenode
>>>>>>> made the respective changes in the "hdfs-site.xml" file
>>>>>>> formatted the namenode
>>>>>>> started the dfs
>>>>>>>
>>>>>>> but still, i am not able to browse the file system through the web
>>>>>>> browser
>>>>>>> please refer below
>>>>>>>
>>>>>>> anything still missing ?
>>>>>>> please suggest
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> do these dirs need to be created on all datanodes and namenodes ?
>>>>>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>>>>>> namenodes for these new dirs?
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> Create 2 directories manually corresponding to the values of the
>>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>>> cannot see this data directly on your local/native FS.
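Following that advice, and noting that the hdfs-site.xml quoted later in this thread points both properties at the same c:\\wksp directory, a minimal fragment with two separate directories might look like this. It is a sketch only — the directory names mirror the wksp_name/wksp_data dirs mentioned in this thread, and the paths are illustrative:

```xml
<!-- hdfs-site.xml: dfs.name.dir and dfs.data.dir should point at two
     different directories (both created manually, permissions 755) -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>c:\\wksp_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>c:\\wksp_data</value>
  </property>
</configuration>
```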
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> thanks.
>>>>>>> however, i need this to be working in a windows environment as a
>>>>>>> project requirement.
>>>>>>> i will add/work on Linux later
>>>>>>>
>>>>>>> so, now , at this stage , is c:\\wksp the HDFS file system OR do i
>>>>>>> need to create it from the command line ?
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> regards,
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> Hello Irfan,
>>>>>>>
>>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>>
>>>>>>> HDFS webUI doesn't provide us the ability to create a file or
>>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>>> operations like create, move, copy etc are not supported.
>>>>>>>
>>>>>>> These values look fine to me.
>>>>>>>
>>>>>>> One suggestion though. Try getting a Linux machine (if possible). Or
>>>>>>> at least use a VM. I personally feel that using Hadoop on windows is always
>>>>>>> messy.
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>> thanks.
>>>>>>> when i browse the file system , i am getting the following :
>>>>>>> i haven't seen any make directory option there
>>>>>>>
>>>>>>> i need to create it from the command line ?
>>>>>>> further, in the hdfs-site.xml file , i have given the following
>>>>>>> entries. are they correct ?
>>>>>>>
>>>>>>> <property>
>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>   <value>c:\\wksp</value>
>>>>>>> </property>
>>>>>>> <property>
>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>   <value>c:\\wksp</value>
>>>>>>> </property>
>>>>>>>
>>>>>>> please suggest
>>>>>>>
>>>>>>> [image: Inline image 1]
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>
>>>>>>>  *You are wrong at this:*****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>>>>>>
>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp****
>>>>>>>
>>>>>>> copyFromLocal: File
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>> ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>>>>>>
>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp***
>>>>>>> *
>>>>>>>
>>>>>>> copyFromLocal: File
>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>> ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Because,You had wrote both the paths local and You need not to copy
>>>>>>> hadoop into hdfs...Hadoop is already working..****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Just check out in browser by after starting ur single node cluster :
>>>>>>> ****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> localhost:50070****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> then go for browse the filesystem link in it..****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> If there is no directory then make directory there.****
>>>>>>>
>>>>>>> That is your hdfs directory.****
>>>>>>>
>>>>>>> Then copy any text file there(no need to copy hadoop there).beacause
>>>>>>> u are going to do processing on that data in text file.That's why hadoop is
>>>>>>> used for ,first u need to make it clear in ur mind.Then and then u will do
>>>>>>> it...otherwise not possible..****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> *Try this: *****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2****
>>>>>>>
>>>>>>> $ .bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>>> /hdfs/directory/path****
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> ** **
>>>>>>>
>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>> thanks. yes, i am a newbie.
>>>>>>> however, i need a windows setup.
>>>>>>>
>>>>>>> let me surely refer to the doc and link which u sent, but i need this
>>>>>>> to be working...
>>>>>>> can you please help
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>> --
>>>>>>> MANISH DUNANI
>>>>>>> -THANX
>>>>>>> +91 9426881954,+91 8460656443
>>>>>>> manishd207@gmail.com
>>>>>>>
>>>>>>> --
>>>>>>> Regards
>>>>>>> *Manish Dunani*
>>>>>>> *Contact No* : +91 9408329137
>>>>>>> *skype id* : manish.dunani
>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>>> entity to which it is addressed and may contain information that is
>>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>>> you have received this communication in error, please contact the sender
>>>>>>> immediately and delete it from your system. Thank You.
>>>>>>> --
>>>>>>> Olivier Renault
>>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>>> +44 7500 933 036
>>>>>>> orenault@hortonworks.com
>>>>>>> www.hortonworks.com


Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
i do not have any Hive server host; what should i put over here?
if i comment it out, then i guess it throws an error about that.
can i put the fqdn of the namenode as the Hive server host?

will it be a really working configuration?

please suggest

regards
irfan



On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault
<or...@hortonworks.com> wrote:

> Your cluster-properties.txt should look something like :
>
> #Log directory
> HDP_LOG_DIR=c:\hadoop\logs
>
> #Data directory
> HDP_DATA_DIR=c:\hdp\data
>
> #Hosts
> NAMENODE_HOST=yourmaster.fqdn.com
> JOBTRACKER_HOST=yourmaster.fqdn.com
> HIVE_SERVER_HOST=yourmaster.fqdn.com
> OOZIE_SERVER_HOST=yourmaster.fqdn.com
> TEMPLETON_HOST=yourmaster.fqdn.com
> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>
> #Database host
> DB_FLAVOR=derby
> DB_HOSTNAME=yourmaster.fqdn.com
>
>
> #Hive properties
> HIVE_DB_NAME=hive
> HIVE_DB_USERNAME=hive
> HIVE_DB_PASSWORD=hive
>
> #Oozie properties
> OOZIE_DB_NAME=oozie
> OOZIE_DB_USERNAME=oozie
> OOZIE_DB_PASSWORD=oozie
>
> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
> your server names. For the time being, I suggest that you do not install
> HBase or Oozie.
>
> regards,
> Olivier
>
>
> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>
>>>> ok. now i made some changes and the installation went ahead,
>>>> but it failed at the "HIVE_SERVER_HOST" property declaration.
>>>> in the cluster config file, i have commented out this property. if i
>>>> uncomment it, then what server address should i give?
>>>>
>>>> i have only two windows machines set up:
>>>> 1: for the namenode and another for the datanode
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> thanks.
>>>>> i installed the latest java in the c:\java folder and now there is no
>>>>> java-related error in the log file.
>>>>> however, now it is throwing an error about the cluster properties file
>>>>> being missing, even though i am running the hdp installer from the
>>>>> location where this file exists.
>>>>>
>>>>> please find the attached
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>> ravimu@microsoft.com> wrote:
>>>>>
>>>>>>  Here's your issue (from the logs you attached earlier):
>>>>>>
>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>
>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>> or something similar (in a path with no spaces).
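[Editor's note: the check the installer is doing can be reproduced outside it. A minimal sketch; the path below is the one reported in the log above:]

```shell
# Sketch: detect a space in JAVA_HOME, the condition that trips the
# HDP 1.3 installer's JAVA_HOME check.
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'
case "$JAVA_HOME" in
  *' '*) echo "JAVA_HOME contains a space: reinstall Java under a space-free path such as C:\java" ;;
  *)     echo "JAVA_HOME looks fine" ;;
esac
```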
>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>> *To:* user@hadoop.apache.org
>>>>>> *Subject:* Re: about replication
>>>>>>
>>>>>> please find the attached.
>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>> as it is not generated
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>
>>>>>>  Could you share the log files ( c:\hdp.log,
>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>>>>>> well as your clusterproperties.txt ?****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> Thanks, ****
>>>>>>
>>>>>> Olivier****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:*
>>>>>> ***
>>>>>>
>>>>>> thanks. i followed the user manual for deployment and installed all
>>>>>> the prerequisites.
>>>>>> i modified the command and still the issue persists. please suggest.
>>>>>>
>>>>>> please refer below
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>> The command to install it is msiexec /i msifile /...
>>>>>> You will find the correct syntax as part of the doc.
>>>>>>
>>>>>> Happy reading
>>>>>> Olivier
>>>>>>
>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>> thanks.
>>>>>> i referred to the logs and manuals. i modified the clusterproperties
>>>>>> file and then double-clicked on the msi file.
>>>>>> however, it still failed.
>>>>>> further, i started the installation on the command line, giving
>>>>>> HDP_LAYOUT=<clusterproperties file path>;
>>>>>> the installation went ahead and then failed on the .NET Framework 4.0
>>>>>> and VC++ redistributable package dependencies.
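[Editor's note: the command line being described might look like the sketch below; the msi name, log name and layout path are illustrative, and the exact switches are in the HDP for Windows install doc referenced later in the thread:]

```
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "hdp.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```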
>>>>>> i installed both and started the installation again.
>>>>>> it failed again with the following error:
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> when i searched for the logs mentioned in the error, i never found
>>>>>> them.
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>> file. You will find some information on the configuration file as part
>>>>>> of the documentation.
>>>>>>
>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>
>>>>>> You should also make sure to have installed the prerequisites.
>>>>>>
>>>>>> Thanks
>>>>>> Olivier
>>>>>>
>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>>
>>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>>> other priorities****
>>>>>>
>>>>>> i downloaded the installer and while installing i got following error
>>>>>> ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> [image: Inline image 1]****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> do i need to make any configuration prior to installation ??****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> irfan ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>> Here is the link
>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>
>>>>>> Olivier
>>>>>>
>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>> thanks.
>>>>>> i just followed the instructions to do the pseudo-distributed setup
>>>>>> first, using the url:
>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>
>>>>>> i don't think i am running a DN on both machines.
>>>>>> please find the attached log
>>>>>>
>>>>>> hi olivier
>>>>>>
>>>>>> can you please give me the download link ?
>>>>>> let me try please
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>> your DN logs?
>>>>>>
>>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>> Irfu,
>>>>>>
>>>>>> If you want to quickly get Hadoop running on a windows platform, you
>>>>>> may want to try our distribution for Windows. You will be able to find
>>>>>> the msi on our website.
>>>>>>
>>>>>> Regards
>>>>>> Olivier
>>>>>>
>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>> thanks.
>>>>>> ok. i think i need to change the plan over here.
>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>
>>>>>> because, on windows, anyway i have to try and see how hadoop works;
>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>
>>>>>> so, on windows, here is the setup:
>>>>>>
>>>>>> namenode : windows 2012 R2
>>>>>> datanode : windows 2012 R2
>>>>>>
>>>>>> now, the exact problem is :
>>>>>> 1: the datanode is not getting started
>>>>>> 2: replication : if i put any file/folder on any datanode, it should
>>>>>> get replicated to all other available datanodes
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>> windows. Not a good idea. Anyways, could you plz show me your log
>>>>>> files? I also need some additional info:
>>>>>> - The exact problem which you are facing right now
>>>>>> - Your cluster summary (no. of nodes etc)
>>>>>> - Your latest configuration files
>>>>>> - Your /etc/hosts file
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  ok. thanks****
>>>>>>
>>>>>> now, i need to start with all windows setup first as our product will
>>>>>> be based on windows ****
>>>>>>
>>>>>> so, now, please tell me how to resolve the issue ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> datanode is not starting . please suggest ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards,****
>>>>>>
>>>>>> irfan ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>> that. But it is not a very wise setup.****
>>>>>>
>>>>>>
>>>>>> ****
>>>>>>
>>>>>> Warm Regards,****
>>>>>>
>>>>>> Tariq****
>>>>>>
>>>>>> cloudfront.blogspot.com****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  please suggest****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> irfan****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks.****
>>>>>>
>>>>>> can i have setup like this :****
>>>>>>
>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)**
>>>>>> **
>>>>>>
>>>>>> and datanodes are the combination of any OS (windows , linux , unix
>>>>>> etc )****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> however, my doubt is,  as the file systems of  both the systems (win
>>>>>> and linux ) are different ,  datanodes of these systems can not be part of
>>>>>> single cluster . i have to make windows cluster separate and UNIX cluster
>>>>>> separate ?****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>
>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>>>> as Cygwin PIDs so that may be causing the discrepancy. I don't know how
>>>>>> well Hadoop works in Cygwin as I have never tried it. Work is in progress
>>>>>> for native Windows support however there are no official releases with
>>>>>> Windows support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
>>>>>> ****
>>>>>>
>>>>>>
>>>>>>
>>>>>> ****
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks ****
>>>>>>
>>>>>> here is what i did .****
>>>>>>
>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh command
>>>>>> ****
>>>>>>
>>>>>> then deleted all pid files for namenodes and datanodes ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> started dfs again with command : "./start-dfs.sh"****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> when i ran the "Jps" command . it shows****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>
>>>>>> $ ./jps.exe****
>>>>>>
>>>>>> 4536 Jps****
>>>>>>
>>>>>> 2076 NameNode****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> however, when i open the pid file for namenode then it is not showing
>>>>>> pid as : 4560. on the contrary, it shud show : 2076****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> please suggest ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>
>>>>>>  Most likely there is a stale pid file. Something like
>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>> the datanode.
>>>>>>
>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>> already.
>>>>>>
>>>>>> -Arpit****
>>>>>>
>>>>>>
>>>>>>
>>>>>> ****
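[Editor's note: the pid cleanup being suggested can be sketched as below; /tmp stands in for the actual pid directory (HADOOP_PID_DIR), which varies by installation:]

```shell
# Sketch: remove stale Hadoop datanode pid files before restarting the
# daemon. The directory and file-name pattern are illustrative.
PID_DIR=/tmp
rm -f "$PID_DIR"/hadoop-*-datanode.pid
echo "stale datanode pid files removed from $PID_DIR"
```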
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  datanode is trying to connect to namenode continuously but fails ***
>>>>>> *
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> when i try to run "jps" command it says :****
>>>>>>
>>>>>> $ ./jps.exe****
>>>>>>
>>>>>> 4584 NameNode****
>>>>>>
>>>>>> 4016 Jps****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> and when i ran the "./start-dfs.sh" then it says :****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> $ ./start-dfs.sh****
>>>>>>
>>>>>> namenode running as process 3544. Stop it first.****
>>>>>>
>>>>>> DFS-1: datanode running as process 4076. Stop it first.****
>>>>>>
>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.*
>>>>>> ***
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> both these logs are contradictory ****
>>>>>>
>>>>>> please find the attached logs ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> should i attach the conf files as well ?****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>>  ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  Your DN is still not running. Showing me the logs would be helpful.*
>>>>>> ***
>>>>>>
>>>>>>
>>>>>> ****
>>>>>>
>>>>>> Warm Regards,****
>>>>>>
>>>>>> Tariq****
>>>>>>
>>>>>> cloudfront.blogspot.com****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  i followed the url and did the steps mention in that. i have
>>>>>> deployed on the windows platform****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> Now, i am able to browse url : http://localhost:50070 (name node )***
>>>>>> *
>>>>>>
>>>>>> however, not able to browse url : http://localhost:50030****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> please refer below****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> [image: Inline image 1]****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>> hdfs file system as well ****
>>>>>>
>>>>>> please suggest ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks. i followed this url :
>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>> ****
>>>>>>
>>>>>> let me follow the url which you gave for pseudo distributed setup and
>>>>>> then will switch to distributed mode****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> irfan ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  You are welcome. Which link have you followed for the
>>>>>> configuration?Your *core-site.xml* is empty. Remove the property *
>>>>>> fs.default.name *from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>> Remove *mapred.job.tracker* as well. It is required in *
>>>>>> mapred-site.xml*.****
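[Editor's note: a sketch of where those two properties would land; the host/port values below are illustrative single-node defaults, not taken from the thread:]

```xml
<!-- core-site.xml: filesystem endpoint (illustrative value) -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml: jobtracker endpoint (illustrative value) -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```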
>>>>>>
>>>>>> I would suggest you do a pseudo-distributed setup first in order to
>>>>>> get yourself familiar with the process, and then proceed to the
>>>>>> distributed mode. You can visit this link
>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>
>>>>>> HTH
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks tariq for response. ****
>>>>>>
>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>> setup . ****
>>>>>>
>>>>>> can you please go through that ?****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> please let me know ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> irfan ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  I'm sorry for being unresponsive. Was out of touch for sometime
>>>>>> because of ramzan and eid. Resuming work today.****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> What's the current status?****
>>>>>>
>>>>>>
>>>>>> ****
>>>>>>
>>>>>> Warm Regards,****
>>>>>>
>>>>>> Tariq****
>>>>>>
>>>>>> cloudfront.blogspot.com****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  First of all read the concepts ..I hope you will like it..****
>>>>>>
>>>>>>
>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  please suggest ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> irfan ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  hey Tariq,****
>>>>>>
>>>>>> i am still stuck .. ****
>>>>>>
>>>>>> can you please suggest ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> irfan ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  please suggest ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  attachment got quarantined ****
>>>>>>
>>>>>> resending in txt format. please rename it to conf.rar ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks.****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> if i run the jps command on namenode :****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>
>>>>>> $ ./jps.exe****
>>>>>>
>>>>>> 3164 NameNode****
>>>>>>
>>>>>> 1892 Jps****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> same command on datanode :****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>>
>>>>>> $ ./jps.exe****
>>>>>>
>>>>>> 3848 Jps****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> jps does not list any process for datanode. however, on web browser i
>>>>>> can see one live data node ****
>>>>>>
>>>>>> please find the attached conf rar file of namenode ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> regards****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  OK. we'll start fresh. Could you plz show me your latest config
>>>>>> files?****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> BTW, are your daemons running fine?Use JPS to verify that.****
>>>>>>
>>>>>>
>>>>>> ****
>>>>>>
>>>>>> Warm Regards,****
>>>>>>
>>>>>> Tariq****
>>>>>>
>>>>>> cloudfront.blogspot.com****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  i have created these dir "wksp_data" and "wksp_name" on both
>>>>>> datanode and namenode ****
>>>>>>
>>>>>> made the respective changes in "hdfs-site.xml" file ****
>>>>>>
>>>>>> formatted the namenode ****
>>>>>>
>>>>>> started the dfs ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> but still, not able to browse the file system through web browser ***
>>>>>> *
>>>>>>
>>>>>> please refer below ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> anything still missing ?****
>>>>>>
>>>>>> please suggest ****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> [image: Inline image 1]****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  do these dirs need to be created on all datanodes and namenodes ?
>>>>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>>>>> namenodes for these new dirs?
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  Create 2 directories manually, corresponding to the values of the
>>>>>> dfs.name.dir and dfs.data.dir properties, and change the permissions of
>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>> cannot see this data directly on your local/native FS.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>> however, i need this to be working on a windows environment as a
>>>>>> project requirement.
>>>>>> i will add/work on Linux later.
>>>>>>
>>>>>> so, now, at this stage, is c:\\wksp the HDFS file system, or do i
>>>>>> need to create it from the command line ?
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards,
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  Hello Irfan,
>>>>>>
>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>
>>>>>> The HDFS web UI doesn't provide the ability to create a file or
>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>> operations like create, move, copy etc are not supported.
>>>>>>
>>>>>> These values look fine to me.
>>>>>>
>>>>>> One suggestion though. Try getting a Linux machine (if possible), or
>>>>>> at least use a VM. I personally feel that using Hadoop on windows is
>>>>>> always messy.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>> when i browse the file system , i am getting the following :
>>>>>> i haven't seen any make-directory option there.
>>>>>>
>>>>>> do i need to create it from the command line ?
>>>>>> further, in the hdfs-site.xml file , i have given the following
>>>>>> entries. are they correct ?
>>>>>>
>>>>>> <property>
>>>>>>   <name>dfs.data.dir</name>
>>>>>>   <value>c:\\wksp</value>
>>>>>> </property>
>>>>>> <property>
>>>>>>   <name>dfs.name.dir</name>
>>>>>>   <value>c:\\wksp</value>
>>>>>> </property>
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
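For reference, splitting the two properties into separate directories, as suggested earlier in the thread, would look roughly like this. This is only a sketch: the directory names follow the wksp_name/wksp_data convention mentioned elsewhere in the thread and are illustrative.

```xml
<!-- hdfs-site.xml sketch: separate dirs for metadata and block data.
     Paths are illustrative placeholders. -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```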
>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  *You are wrong at this:*
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>> copyFromLocal: File
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>> copyFromLocal: File
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>
>>>>>> Because you wrote both paths as local ones. And you do not need to copy
>>>>>> hadoop into hdfs... Hadoop is already working.
>>>>>>
>>>>>> Just check in the browser after starting your single node cluster :
>>>>>>
>>>>>> localhost:50070
>>>>>>
>>>>>> then follow the "browse the filesystem" link in it.
>>>>>>
>>>>>> If there is no directory there, then make a directory.
>>>>>> That is your hdfs directory.
>>>>>> Then copy any text file there (no need to copy hadoop there), because
>>>>>> you are going to do processing on the data in that text file. That is
>>>>>> what hadoop is used for; first you need to make that clear in your mind.
>>>>>> Then, and only then, will it work... otherwise not.
>>>>>>
>>>>>> *Try this: *
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>> /hdfs/directory/path
>>>>>>
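The error above boils down to a non-existent local source path. A small dry-run sketch of the same check: the paths are illustrative assumptions, and the real hadoop invocation is shown in a comment rather than executed.

```shell
# Dry-run sketch: verify the local source exists before calling
# "hadoop dfs -copyFromLocal". Paths are illustrative; the real call
# appears only in a comment because hadoop is not assumed installed here.
SRC=/tmp/sample.txt
DST=/wksp

echo "some sample data" > "$SRC"   # stand-in for a real local file

if [ -f "$SRC" ]; then
  # real invocation would be: ./bin/hadoop dfs -copyFromLocal "$SRC" "$DST"
  echo "would run: hadoop dfs -copyFromLocal $SRC $DST"
else
  echo "copyFromLocal would fail: $SRC does not exist"
fi
```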
>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks. yes , i am a newbie.
>>>>>> however, i need a windows setup.
>>>>>>
>>>>>> let me surely refer to the doc and link which u sent, but i need this
>>>>>> to be working ...
>>>>>> can you please help
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> --
>>>>>> MANISH DUNANI
>>>>>> -THANX
>>>>>> +91 9426881954,+91 8460656443
>>>>>> manishd207@gmail.com
>>>>>>
>>>>>> --
>>>>>> Regards
>>>>>> *Manish Dunani*
>>>>>> *Contact No* : +91 9408329137
>>>>>> *skype id* : manish.dunani
>>>>>>
>>>>>> CONFIDENTIALITY NOTICE
>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>> entity to which it is addressed and may contain information that is
>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>> you have received this communication in error, please contact the sender
>>>>>> immediately and delete it from your system. Thank You.
>>>>>>
>>>>>> --
>>>>>> Olivier Renault
>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>> +44 7500 933 036
>>>>>> orenault@hortonworks.com
>>>>>> www.hortonworks.com
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
i do not have any HIVE server host, so what should i put over here??
if i comment it out, then i guess it throws an error about that.
can i put the fqdn of the namenode for the HIVE server host ?

will it be a really working configuration?

please suggest

regards
irfan



On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault
<or...@hortonworks.com>wrote:

> Your cluster-properties.txt should look something like :
>
> #Log directory
> HDP_LOG_DIR=c:\hadoop\logs
>
> #Data directory
> HDP_DATA_DIR=c:\hdp\data
>
> #Hosts
> NAMENODE_HOST=yourmaster.fqdn.com
> JOBTRACKER_HOST=yourmaster.fqdn.com
> HIVE_SERVER_HOST=yourmaster.fqdn.com
> OOZIE_SERVER_HOST=yourmaster.fqdn.com
> TEMPLETON_HOST=yourmaster.fqdn.com
> SLAVE_HOSTS=yourmaster.fqdn.com,youslave.fqdn.com
>
> #Database host
> DB_FLAVOR=derby
> DB_HOSTNAME=yourmaster.fqdn.com
>
>
> #Hive properties
> HIVE_DB_NAME=hive
> HIVE_DB_USERNAME=hive
> HIVE_DB_PASSWORD=hive
>
> #Oozie properties
> OOZIE_DB_NAME=oozie
> OOZIE_DB_USERNAME=oozie
> OOZIE_DB_PASSWORD=oozie
>
> You will need to replace, yourmaster.fqdn.com, yourslave.fqdn.com by your
> servers name. For the time being, I suggest that you do not install HBase,
> Oozie,
>
> regards,
> Olivier
>
>
> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>
>>>> ok.. now i made some changes and the installation went ahead,
>>>> but failed at the "HIVE_SERVER_HOST" property declaration.
>>>> in the cluster config file, i have commented out this property. if i
>>>> uncomment it, then what server address should i give ???
>>>>
>>>> i have only a two-machine windows setup:
>>>> 1: for the namenode, and another for the datanode
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> thanks.
>>>>> i installed the latest java in the c:\java folder and now there is no
>>>>> error in the log file related to java.
>>>>> however, now it is throwing an error about not having the cluster
>>>>> properties file.
>>>>> in fact, i am running/installing hdp from the location where this file
>>>>> exists. still, it is throwing the error.
>>>>>
>>>>> please find the attached
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>> ravimu@microsoft.com> wrote:
>>>>>
>>>>>>  Here's your issue (from the logs you attached earlier):
>>>>>>
>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>
>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>> or something similar (in a path with no spaces).
>>>>>>
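A quick way to catch this class of failure up front is to check the install path for spaces before running the installer. A minimal sketch, where the example value mirrors the failing log and the check itself is an illustrative assumption:

```shell
# Sketch: flag an installation path containing spaces, which HDP 1.3
# on Windows cannot handle. The example value mirrors the failing log.
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'

case "$JAVA_HOME" in
  *' '*) echo "JAVA_HOME contains spaces: reinstall Java under a path like c:\\java" ;;
  *)     echo "JAVA_HOME looks safe: $JAVA_HOME" ;;
esac
```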
>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>> *To:* user@hadoop.apache.org
>>>>>> *Subject:* Re: about replication
>>>>>>
>>>>>>
>>>>>> please find the attached.
>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>> as it was not generated.
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>
>>>>>>  Could you share the log files ( c:\hdp.log,
>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )
>>>>>> as well as your clusterproperties.txt ?
>>>>>>
>>>>>> Thanks,
>>>>>> Olivier
>>>>>>
>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:*
>>>>>> ***
>>>>>>
>>>>>>  thanks. i followed the user manual for deployment and installed all
>>>>>> the pre-requisites.
>>>>>> i modified the command and still the issue persists. please suggest.
>>>>>>
>>>>>> please refer below
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>
>>>>>> The command to install it is msiexec /i msifile /...
>>>>>> You will find the correct syntax as part of the doc.
>>>>>>
>>>>>> Happy reading
>>>>>> Olivier
>>>>>>
>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>> i referred to the logs and manuals. i modified the clusterproperties
>>>>>> file and then double-clicked on the msi file.
>>>>>> however, it still failed.
>>>>>> further, i started the installation on the command line, giving
>>>>>> HDP_LAYOUT=clusterproperties file path.
>>>>>> the installation went ahead and it failed on the .NET framework 4.0 and
>>>>>> VC++ redistributable package dependency.
>>>>>>
>>>>>> i installed both and started the installation again.
>>>>>> it failed again with the following error :
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> when i search for the logs mentioned in the error , i never find them.
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>
>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>> file. You will find some information on the configuration file as part of
>>>>>> the documentation :
>>>>>>
>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>
>>>>>> You should make sure you have also installed the prerequisites.
>>>>>>
>>>>>> Thanks
>>>>>> Olivier
>>>>>>
>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>>
>>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>>> other priorities.
>>>>>> i downloaded the installer and while installing i got the following
>>>>>> error :
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> do i need to make any configuration prior to installation ??
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>
>>>>>> Here is the link :
>>>>>>
>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>
>>>>>> Olivier
>>>>>>
>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>> i just followed the instructions to set up the pseudo-distributed
>>>>>> setup first, using the url :
>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>
>>>>>> i don't think i am running a DN on both machines.
>>>>>> please find the attached log
>>>>>>
>>>>>> hi olivier
>>>>>>
>>>>>> can you please give me the download link ?
>>>>>> let me try please
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  Are you running a DN on both the machines? Could you please show me
>>>>>> your DN logs?
>>>>>>
>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>
>>>>>> Irfu,
>>>>>>
>>>>>> If you want to quickly get Hadoop running on the windows platform, you
>>>>>> may want to try our distribution for Windows. You will be able to find
>>>>>> the msi on our website.
>>>>>>
>>>>>> Regards
>>>>>> Olivier
>>>>>>
>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>> ok. i think i need to change the plan over here.
>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>
>>>>>> because, on windows , anyway i have to try and see how hadoop works.
>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>
>>>>>> so, on windows , here is the setup:
>>>>>>
>>>>>> namenode : windows 2012 R2
>>>>>> datanode : windows 2012 R2
>>>>>>
>>>>>> now, the exact problem is :
>>>>>> 1: the datanode is not getting started
>>>>>> 2: replication : if i put any file/folder on any datanode , it should
>>>>>> get replicated to all other available datanodes
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  Seriously?? You are planning to develop something using Hadoop on
>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files?
>>>>>> I also need some additional info :
>>>>>>
>>>>>> -The exact problem which you are facing right now
>>>>>> -Your cluster summary (no. of nodes etc)
>>>>>> -Your latest configuration files
>>>>>> -Your /etc/hosts file
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  ok. thanks.
>>>>>> now, i need to start with an all-windows setup first, as our product
>>>>>> will be based on windows.
>>>>>> so, now, please tell me how to resolve the issue.
>>>>>>
>>>>>> the datanode is not starting . please suggest
>>>>>>
>>>>>> regards,
>>>>>> irfan
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>> that. But it is not a very wise setup.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>> can i have a setup like this :
>>>>>> the namenode will be on linux (the flavour may be RHEL, CentOS, Ubuntu
>>>>>> etc) and the datanodes are a combination of any OS (windows , linux ,
>>>>>> unix etc )
>>>>>>
>>>>>> however, my doubt is, as the file systems of both the systems (win
>>>>>> and linux ) are different , can datanodes of these systems not be part
>>>>>> of a single cluster ? do i have to make a separate windows cluster and a
>>>>>> separate UNIX cluster ?
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>
>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>>>> as Cygwin PIDs, so that may be causing the discrepancy. I don't know how
>>>>>> well Hadoop works in Cygwin as I have never tried it. Work is in progress
>>>>>> for native Windows support, however there are no official releases with
>>>>>> Windows support yet. It may be easier to get familiar with a release
>>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you
>>>>>> are new to it.
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks
>>>>>> here is what i did .
>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>> command, then deleted all the pid files for the namenodes and datanodes.
>>>>>>
>>>>>> started dfs again with the command : "./start-dfs.sh"
>>>>>>
>>>>>> when i ran the "jps" command , it shows :
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>> $ ./jps.exe
>>>>>> 4536 Jps
>>>>>> 2076 NameNode
>>>>>>
>>>>>> however, when i open the pid file for the namenode, it shows the pid
>>>>>> as 4560. on the contrary, it should show 2076.
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>
>>>>>>  Most likely there is a stale pid file. Something like
>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>> the datanode.
>>>>>>
>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>> already.
>>>>>>
>>>>>> -Arpit
>>>>>>
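Arpit's suggestion can be sketched as a small script. The pid-file path is an assumption (Hadoop's actual location depends on HADOOP_PID_DIR), and `kill -0` is used only to test whether the recorded pid still belongs to a live process:

```shell
# Sketch: delete a datanode pid file if the process it names is gone.
# The path is an assumption; adjust it to your HADOOP_PID_DIR.
PID_FILE=/tmp/hadoop-Administrator-datanode.pid

echo 999999 > "$PID_FILE"   # stand-in for a stale entry left by a dead daemon

# kill -0 sends no signal; it only checks whether the pid exists.
if [ -f "$PID_FILE" ] && ! kill -0 "$(cat "$PID_FILE")" 2>/dev/null; then
  rm -f "$PID_FILE"
  echo "removed stale pid file: $PID_FILE"
fi
```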
>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  the datanode is trying to connect to the namenode continuously, but
>>>>>> fails.
>>>>>>
>>>>>> when i try to run the "jps" command it says :
>>>>>>
>>>>>> $ ./jps.exe
>>>>>> 4584 NameNode
>>>>>> 4016 Jps
>>>>>>
>>>>>> and when i ran "./start-dfs.sh" it says :
>>>>>>
>>>>>> $ ./start-dfs.sh
>>>>>> namenode running as process 3544. Stop it first.
>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>
>>>>>> both these logs are contradictory.
>>>>>> please find the attached logs.
>>>>>>
>>>>>> should i attach the conf files as well ?
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  Your DN is still not running. Showing me the logs would be helpful.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  i followed the url and did the steps mentioned in it. i have
>>>>>> deployed on the windows platform.
>>>>>>
>>>>>> Now, i am able to browse the url : http://localhost:50070 (name node).
>>>>>> however, i am not able to browse the url : http://localhost:50030
>>>>>>
>>>>>> please refer below
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>> hdfs file system as well.
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks. i followed this url :
>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>
>>>>>> let me follow the url which you gave for the pseudo-distributed setup,
>>>>>> and then i will switch to distributed mode.
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  You are welcome. Which link have you followed for the
>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>> Remove *mapred.job.tracker* as well. It belongs in *mapred-site.xml*.
>>>>>>
>>>>>> I would suggest you do a pseudo-distributed setup first in order
>>>>>> to get yourself familiar with the process, and then proceed to the
>>>>>> distributed mode. You can visit this link
>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>
>>>>>> HTH
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
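Tariq's split of the properties would look roughly like this. This is only a sketch: the host/port values are illustrative placeholders, not taken from the thread.

```xml
<!-- core-site.xml (sketch; host/port are illustrative) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml (sketch; host/port are illustrative) -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```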
>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks, tariq, for the response.
>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>> setup .
>>>>>> can you please go through them ?
>>>>>>
>>>>>> please let me know
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  I'm sorry for being unresponsive. Was out of touch for sometime
>>>>>> because of ramzan and eid. Resuming work today.
>>>>>>
>>>>>> What's the current status?
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  First of all read the concepts ..I hope you will like it..****
>>>>>>
>>>>>>
>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf****
>>>>>>
>>>>>> ** **
>>>>>>
>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> hey Tariq,
>>>>>> i am still stuck.
>>>>>> can you please suggest?
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> attachment got quarantined
>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>
>>>>>> thanks.
>>>>>>
>>>>>> if i run the jps command on the namenode:
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>> $ ./jps.exe
>>>>>> 3164 NameNode
>>>>>> 1892 Jps
>>>>>>
>>>>>> same command on the datanode:
>>>>>>
>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>> $ ./jps.exe
>>>>>> 3848 Jps
>>>>>>
>>>>>> jps does not list any process for the datanode. however, on the web
>>>>>> browser i can see one live data node.
>>>>>> please find the attached conf rar file of the namenode.
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> OK, we'll start fresh. Could you please show me your latest config
>>>>>> files?
>>>>>>
>>>>>> BTW, are your daemons running fine? Use jps to verify that.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> i have created the directories "wksp_data" and "wksp_name" on both
>>>>>> the datanode and the namenode,
>>>>>> made the respective changes in the "hdfs-site.xml" file,
>>>>>> formatted the namenode,
>>>>>> and started the dfs.
>>>>>>
>>>>>> but still, i am not able to browse the file system through the web
>>>>>> browser. please refer below.
>>>>>>
>>>>>> anything still missing? please suggest
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> do these directories need to be created on all datanodes and
>>>>>> namenodes?
>>>>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>>>>> namenodes for these new directories?
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> Create 2 directories manually, corresponding to the values of the
>>>>>> dfs.name.dir and dfs.data.dir properties, and change the permissions
>>>>>> of these directories to 755. When you start pushing data into your
>>>>>> HDFS, data will start going inside the directory specified by
>>>>>> dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>> Remember, you store data in HDFS, but it eventually gets stored in
>>>>>> your local/native FS. But you cannot see this data directly on your
>>>>>> local/native FS.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
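[Editor's note: Tariq's two-directory setup above, as commands. Relative paths are used here for illustration only; they should be whatever dfs.name.dir and dfs.data.dir point to on the actual machines.]

```shell
# Create the metadata dir and the block-data dir, then set 755 as advised.
mkdir -p wksp_name wksp_data
chmod 755 wksp_name wksp_data
ls -ld wksp_name wksp_data
```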
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> thanks.
>>>>>> however, i need this to be working on a windows environment as a
>>>>>> project requirement. i will add/work on Linux later.
>>>>>>
>>>>>> so, at this stage, is c:\\wksp the HDFS file system, or do i need to
>>>>>> create it from the command line?
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards,
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> Hello Irfan,
>>>>>>
>>>>>> Sorry for being unresponsive. Got stuck with some important work.
>>>>>>
>>>>>> The HDFS web UI doesn't provide the ability to create a file or
>>>>>> directory. You can browse HDFS, view files, download files, etc., but
>>>>>> operations like create, move, copy, etc. are not supported.
>>>>>>
>>>>>> These values look fine to me.
>>>>>>
>>>>>> One suggestion though: try getting a Linux machine (if possible), or
>>>>>> at least use a VM. I personally feel that using Hadoop on windows is
>>>>>> always messy.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> thanks.
>>>>>> when i browse the file system, i am getting the following:
>>>>>> i haven't seen any make-directory option there.
>>>>>>
>>>>>> do i need to create it from the command line?
>>>>>> further, in the hdfs-site.xml file, i have given the following
>>>>>> entries. are they correct?
>>>>>>
>>>>>> <property>
>>>>>>   <name>dfs.data.dir</name>
>>>>>>   <value>c:\\wksp</value>
>>>>>> </property>
>>>>>> <property>
>>>>>>   <name>dfs.name.dir</name>
>>>>>>   <value>c:\\wksp</value>
>>>>>> </property>
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> *You are wrong at this:*
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>> copyFromLocal: File
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>> copyFromLocal: File
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>
>>>>>> Because you wrote both paths as local paths. And you do not need to
>>>>>> copy hadoop into hdfs; hadoop is already working.
>>>>>>
>>>>>> Just check in the browser after starting your single node cluster:
>>>>>>
>>>>>> localhost:50070
>>>>>>
>>>>>> then follow the "browse the filesystem" link in it.
>>>>>>
>>>>>> If there is no directory, then make a directory there. That is your
>>>>>> hdfs directory. Then copy any text file there (no need to copy hadoop
>>>>>> there), because you are going to do processing on the data in that
>>>>>> text file. That's what hadoop is used for; first you need to make
>>>>>> that clear in your mind. Then and only then you will get it
>>>>>> working... otherwise not possible.
>>>>>>
>>>>>> *Try this:*
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>> /hdfs/directory/path
>>>>>>
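[Editor's note: the two failures above boil down to the source path not existing locally. A quick pre-check can be sketched as below; the path is the one from the error output, and the hadoop command is only safe to run once this passes.]

```shell
# Check the local source before attempting copyFromLocal.
SRC=/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar
if [ -e "$SRC" ]; then
  echo "source exists; safe to run ./bin/hadoop dfs -copyFromLocal $SRC /wksp"
else
  echo "source missing: $SRC (this is why copyFromLocal failed)"
fi
```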
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> thanks. yes, i am a newbie.
>>>>>> however, i need a windows setup.
>>>>>>
>>>>>> let me surely refer to the doc and link which you sent, but i need
>>>>>> this to be working...
>>>>>> can you please help?
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> --
>>>>>> MANISH DUNANI
>>>>>> -THANX
>>>>>> +91 9426881954,+91 8460656443
>>>>>> manishd207@gmail.com
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Regards
>>>>>> *Manish Dunani*
>>>>>> *Contact No* : +91 9408329137
>>>>>> *skype id* : manish.dunani
>>>>>>
>>>>>>
>>>>>> CONFIDENTIALITY NOTICE
>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>> entity to which it is addressed and may contain information that is
>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>> you have received this communication in error, please contact the sender
>>>>>> immediately and delete it from your system. Thank You.
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Olivier Renault
>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>> +44 7500 933 036
>>>>>> orenault@hortonworks.com
>>>>>> www.hortonworks.com
>>>>>> <http://hortonworks.com/products/hortonworks-sandbox/>
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
i do not have any HIVE server host, so what should i put here?
if i comment the property out, i guess it throws an error about that.
can i put the fqdn of the namenode for the HIVE server host?

will that be a really working configuration?

please suggest

regards
irfan



On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault
<or...@hortonworks.com>wrote:

> Your cluster-properties.txt should look something like :
>
> #Log directory
> HDP_LOG_DIR=c:\hadoop\logs
>
> #Data directory
> HDP_DATA_DIR=c:\hdp\data
>
> #Hosts
> NAMENODE_HOST=yourmaster.fqdn.com
> JOBTRACKER_HOST=yourmaster.fqdn.com
> HIVE_SERVER_HOST=yourmaster.fqdn.com
> OOZIE_SERVER_HOST=yourmaster.fqdn.com
> TEMPLETON_HOST=yourmaster.fqdn.com
> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>
> #Database host
> DB_FLAVOR=derby
> DB_HOSTNAME=yourmaster.fqdn.com
>
>
> #Hive properties
> HIVE_DB_NAME=hive
> HIVE_DB_USERNAME=hive
> HIVE_DB_PASSWORD=hive
>
> #Oozie properties
> OOZIE_DB_NAME=oozie
> OOZIE_DB_USERNAME=oozie
> OOZIE_DB_PASSWORD=oozie
>
> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
> your server names. For the time being, I suggest that you do not install
> HBase or Oozie.
>
> regards,
> Olivier
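[Editor's note: a quick sanity check on the cluster-properties file Olivier describes can be sketched as below. The filename and the subset of keys are taken from this thread; the check itself is an illustration, not part of the official installer.]

```shell
# Write a trimmed example file, then verify the host keys are filled in.
cat > clusterproperties.txt <<'EOF'
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
DB_FLAVOR=derby
DB_HOSTNAME=yourmaster.fqdn.com
EOF

# every host key must have a non-empty value (no commented-out properties)
for key in NAMENODE_HOST JOBTRACKER_HOST HIVE_SERVER_HOST SLAVE_HOSTS; do
  grep -q "^${key}=." clusterproperties.txt && echo "$key set"
done
```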
>
>
> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>
>>>> ok. now i made some changes and the installation went ahead,
>>>> but it failed at the "HIVE_SERVER_HOST" property declaration.
>>>> in the cluster config file, i have commented this property out. if i
>>>> uncomment it, then what server address should i give?
>>>>
>>>> i have only a two-windows-machine setup:
>>>> one for the namenode and another for the datanode.
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> thanks.
>>>>> i installed the latest java in the c:\java folder and now there is no
>>>>> error in the log file related to java.
>>>>> however, now it is throwing an error about not having the cluster
>>>>> properties file. in fact, i am running/installing hdp from the
>>>>> location where this file exists. still it throws the error.
>>>>>
>>>>> please find the attached
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>> ravimu@microsoft.com> wrote:
>>>>>
>>>>>> Here's your issue (from the logs you attached earlier):
>>>>>>
>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>
>>>>>> It seems that you installed the Java prerequisite in the default
>>>>>> path, which is %PROGRAMFILES% (expands to C:\Program Files in your
>>>>>> case). HDP 1.3 does not like spaces in paths, so you need to
>>>>>> reinstall Java under c:\java\ or something similar (in a path with
>>>>>> no spaces).
>>>>>>
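[Editor's note: the failure mode Ravi describes can be reproduced with a simple check. The JAVA_HOME value below is the one from irfan's log; the check is an illustration.]

```shell
# Detect spaces in JAVA_HOME, the root cause of the CAQuietExec error above.
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'
case "$JAVA_HOME" in
  *' '*) echo "JAVA_HOME contains spaces; reinstall Java under e.g. c:\java" ;;
  *)     echo "JAVA_HOME looks ok" ;;
esac
```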
>>>>>>
>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>> *To:* user@hadoop.apache.org
>>>>>> *Subject:* Re: about replication
>>>>>>
>>>>>> please find the attached.
>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>> as it is not generated
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>> Could you share the log files ( c:\hdp.log,
>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )
>>>>>> as well as your clusterproperties.txt ?
>>>>>>
>>>>>> Thanks,
>>>>>> Olivier
>>>>>>
>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>>
>>>>>> thanks. i followed the user manual for deployment and installed all
>>>>>> pre-requisites.
>>>>>> i modified the command and still the issue persists. please suggest.
>>>>>>
>>>>>> please refer below
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>> The command to install it is msiexec /i msifile /...
>>>>>> You will find the correct syntax as part of the doc.
>>>>>>
>>>>>> Happy reading
>>>>>> Olivier
>>>>>>
>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>>
>>>>>> thanks.
>>>>>> i referred to the logs and manuals. i modified the clusterproperties
>>>>>> file and then double-clicked the msi file.
>>>>>> however, it still failed.
>>>>>> further, i started the installation on the command line, giving
>>>>>> HDP_LAYOUT=clusterproperties file path;
>>>>>> the installation went ahead and then failed on the .NET framework 4.0
>>>>>> and VC++ redistributable package dependencies.
>>>>>>
>>>>>> i installed both and started the installation again.
>>>>>> it failed again with the following error:
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> when i searched for the logs mentioned in the error, i never found
>>>>>> them.
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>> file. You will find some information on the configuration file as
>>>>>> part of the documentation:
>>>>>>
>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>
>>>>>> You should also make sure to have installed the prerequisites.
>>>>>>
>>>>>> Thanks
>>>>>> Olivier
>>>>>>
>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>>
>>>>>> thanks. sorry for the long break. actually got involved in some
>>>>>> other priorities.
>>>>>> i downloaded the installer and while installing i got the following
>>>>>> error:
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> do i need to make any configuration prior to installation?
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>> Here is the link
>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>
>>>>>> Olivier
>>>>>>
>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>> thanks.
>>>>>> i just followed the instructions to set up the pseudo distributed
>>>>>> setup first, using the url:
>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>
>>>>>> i don't think i am running a DN on both machines.
>>>>>> please find the attached log.
>>>>>>
>>>>>> hi olivier
>>>>>>
>>>>>> can you please give me the download link?
>>>>>> let me try please
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>
>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>> your DN logs?
>>>>>>
>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>> Irfan,
>>>>>>
>>>>>> If you want to quickly get Hadoop running on a windows platform, you
>>>>>> may want to try our distribution for Windows. You will be able to
>>>>>> find the msi on our website.
>>>>>>
>>>>>> Regards
>>>>>> Olivier
>>>>>>
>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>> thanks.
>>>>>> ok. i think i need to change the plan over here.
>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>
>>>>>> because, on windows, anyway i have to try and see how hadoop works;
>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>
>>>>>> so, on windows, here is the setup:
>>>>>>
>>>>>> namenode : windows 2012 R2
>>>>>> datanode : windows 2012 R2
>>>>>>
>>>>>> now, the exact problem is:
>>>>>> 1: the datanode is not getting started
>>>>>> 2: replication: if i put any file/folder on any datanode, it should
>>>>>> get replicated to all other available datanodes
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>
>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>> windows. Not a good idea. Anyways, could you plz show me your log
>>>>>> files? I also need some additional info:
>>>>>> - The exact problem which you are facing right now
>>>>>> - Your cluster summary (no. of nodes etc)
>>>>>> - Your latest configuration files
>>>>>> - Your /etc/hosts file
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> ok. thanks
>>>>>> now, i need to start with the all-windows setup first as our product
>>>>>> will be based on windows.
>>>>>> so, now, please tell me how to resolve the issue.
>>>>>>
>>>>>> the datanode is not starting. please suggest
>>>>>>
>>>>>> regards,
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>> that. But it is not a very wise setup.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> thanks.
>>>>>> can i have a setup like this:
>>>>>> the namenode on linux (the flavour may be RHEL, CentOS, Ubuntu, etc.)
>>>>>> and the datanodes a combination of any OS (windows, linux, unix, etc.)
>>>>>>
>>>>>> however, my doubt is: as the file systems of both systems (win and
>>>>>> linux) are different, datanodes of these systems cannot be part of a
>>>>>> single cluster. do i have to make the windows cluster separate and
>>>>>> the UNIX cluster separate?
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>
>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>>>> as Cygwin PIDs, so that may be causing the discrepancy. I don't know
>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>> progress for native Windows support, however there are no official
>>>>>> releases with Windows support yet. It may be easier to get familiar
>>>>>> with a release
>>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/>
>>>>>> on Linux if you are new to it.
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>
>>>>>> thanks
>>>>>> here is what i did:
>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>>>> command,
>>>>>> then deleted all pid files for namenodes and datanodes,
>>>>>>
>>>>>> and started dfs again with the command "./start-dfs.sh"
>>>>>>
>>>>>> when i ran the "jps" command, it shows:
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>> $ ./jps.exe
>>>>>> 4536 Jps
>>>>>> 2076 NameNode
>>>>>>
>>>>>> however, when i open the pid file for the namenode, it does not match:
>>>>>> it shows 4560, whereas it should show 2076.
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>
>>>>>> Most likely there is a stale pid file. Something like
>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then
>>>>>> restarting the datanode.
>>>>>>
>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>> already.
>>>>>>
>>>>>> -Arpit
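[Editor's note: Arpit's stale-pid fix can be sketched as below. The file name is a stand-in for the real \tmp\hadoop-*datanode.pid left behind by an unclean shutdown.]

```shell
# Simulate and clear a stale datanode pid file before restarting.
PID_FILE=./hadoop-demo-datanode.pid
echo 4076 > "$PID_FILE"    # stale pid from the previous run
rm -f "$PID_FILE"          # delete it, then restart the datanode
test ! -e "$PID_FILE" && echo "stale pid cleared"
```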
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> the datanode is trying to connect to the namenode continuously but
>>>>>> fails
>>>>>>
>>>>>> when i try to run the "jps" command it says:
>>>>>>
>>>>>> $ ./jps.exe
>>>>>> 4584 NameNode
>>>>>> 4016 Jps
>>>>>>
>>>>>> and when i run "./start-dfs.sh" it says:
>>>>>>
>>>>>> $ ./start-dfs.sh
>>>>>> namenode running as process 3544. Stop it first.
>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>
>>>>>> these two logs are contradictory.
>>>>>> please find the attached logs.
>>>>>>
>>>>>> should i attach the conf files as well?
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>
>>>>>>  Your DN is still not running. Showing me the logs would be helpful.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  i followed the url and did the steps mentioned in it. i have
>>>>>> deployed on the windows platform.
>>>>>>
>>>>>> now, i am able to browse the url : http://localhost:50070 (name node),
>>>>>> however, i am not able to browse the url : http://localhost:50030
>>>>>>
>>>>>> please refer below :
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>> hdfs file system as well.
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks. i followed this url :
>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>
>>>>>> let me follow the url which you gave for the pseudo-distributed setup and
>>>>>> then switch to distributed mode.
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  You are welcome. Which link have you followed for the
>>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>> Remove *mapred.job.tracker* as well; it belongs in *mapred-site.xml*.
>>>>>>
>>>>>> I would suggest you do a pseudo-distributed setup first in order
>>>>>> to get familiar with the process and then proceed to the
>>>>>> distributed mode. You can visit this link
>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>
>>>>>> HTH
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
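Following that advice, the two files would look roughly like this for a pseudo-distributed setup. The host/port values are the usual single-node placeholders, not taken from Irfan's setup; substitute your own:

```xml
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```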
>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks tariq for the response.
>>>>>> as discussed last time, i have sent you all the config files in my setup.
>>>>>> can you please go through them ?
>>>>>>
>>>>>> please let me know
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  I'm sorry for being unresponsive. I was out of touch for some time
>>>>>> because of Ramzan and Eid. Resuming work today.
>>>>>>
>>>>>> What's the current status?
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  First of all, read the concepts. I hope you will like it:
>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>
>>>>>>
>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  hey Tariq,
>>>>>> i am still stuck ..
>>>>>> can you please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  please suggest
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  attachment got quarantined
>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>>
>>>>>> if i run the jps command on the namenode :
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>> $ ./jps.exe
>>>>>> 3164 NameNode
>>>>>> 1892 Jps
>>>>>>
>>>>>> same command on the datanode :
>>>>>>
>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>> $ ./jps.exe
>>>>>> 3848 Jps
>>>>>>
>>>>>> jps does not list any process for the datanode. however, on the web
>>>>>> browser i can see one live data node.
>>>>>> please find the attached conf rar file of the namenode.
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  OK, we'll start fresh. Could you please show me your latest config
>>>>>> files?
>>>>>>
>>>>>> BTW, are your daemons running fine? Use jps to verify that.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  i have created the dirs "wksp_data" and "wksp_name" on both the
>>>>>> datanode and the namenode,
>>>>>> made the respective changes in the "hdfs-site.xml" file,
>>>>>> formatted the namenode, and
>>>>>> started the dfs.
>>>>>>
>>>>>> but still, i am not able to browse the file system through the web browser.
>>>>>> please refer below.
>>>>>>
>>>>>> anything still missing ?
>>>>>> please suggest
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  do these dirs need to be created on all datanodes and namenodes ?
>>>>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>>>>> namenodes for these new dirs ?
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  Create 2 directories manually, corresponding to the values of the
>>>>>> dfs.name.dir and dfs.data.dir properties, and change the permissions of
>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>> in HDFS, but it eventually gets stored on your local/native FS. But you
>>>>>> cannot see this data directly on your local/native FS.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
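The two steps above translate to something like the following on a Unix-ish shell (Cygwin included). The paths are examples only; use the same values you put in hdfs-site.xml:

```shell
# Example local paths; substitute your dfs.name.dir / dfs.data.dir values
mkdir -p /tmp/hdfs-demo/wksp_name /tmp/hdfs-demo/wksp_data
chmod 755 /tmp/hdfs-demo/wksp_name /tmp/hdfs-demo/wksp_data

# confirm the rwxr-xr-x permission bits
ls -ld /tmp/hdfs-demo/wksp_name /tmp/hdfs-demo/wksp_data
```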
>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>> however, i need this to be working in a windows environment, as that is a
>>>>>> project requirement.
>>>>>> i will add/work on Linux later.
>>>>>>
>>>>>> so, now, at this stage, is c:\\wksp the HDFS file system, or do i
>>>>>> need to create it from the command line ?
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards,
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  Hello Irfan,
>>>>>>
>>>>>> Sorry for being unresponsive. Got stuck with some important work.
>>>>>>
>>>>>> The HDFS web UI doesn't provide the ability to create a file or
>>>>>> directory. You can browse HDFS, view files, download files etc., but
>>>>>> operations like create, move, copy etc. are not supported.
>>>>>>
>>>>>> These values look fine to me.
>>>>>>
>>>>>> One suggestion though: try getting a Linux machine (if possible), or
>>>>>> at least use a VM. I personally feel that using Hadoop on windows is
>>>>>> always messy.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>> when i browse the file system, i get the following.
>>>>>> i haven't seen any make-directory option there.
>>>>>>
>>>>>> do i need to create it from the command line ?
>>>>>> further, in the hdfs-site.xml file, i have given the following entries.
>>>>>> are they correct ?
>>>>>>
>>>>>> <property>
>>>>>>   <name>dfs.data.dir</name>
>>>>>>   <value>c:\\wksp</value>
>>>>>> </property>
>>>>>> <property>
>>>>>>   <name>dfs.name.dir</name>
>>>>>>   <value>c:\\wksp</value>
>>>>>> </property>
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>>
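One likely problem in the snippet above: dfs.name.dir and dfs.data.dir both point at the same c:\\wksp. They should be two distinct directories; a sketch, with example names:

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```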
>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  *You are wrong at this:*
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>> copyFromLocal: File
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>> copyFromLocal: File
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>
>>>>>> Because you had written both the paths as local ones. And you do not need
>>>>>> to copy hadoop into hdfs; hadoop is already working.
>>>>>>
>>>>>> Just check in the browser after starting your single-node cluster :
>>>>>>
>>>>>> localhost:50070
>>>>>>
>>>>>> then go to the "browse the filesystem" link in it.
>>>>>>
>>>>>> If there is no directory there, then make a directory.
>>>>>> That is your hdfs directory.
>>>>>> Then copy any text file there (no need to copy hadoop there), because
>>>>>> you are going to do processing on the data in that text file. That is what
>>>>>> hadoop is used for; first you need to make that clear in your mind, and
>>>>>> then you can do it... otherwise it is not possible.
>>>>>>
>>>>>> *Try this:*
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>> /hdfs/directory/path
>>>>>>
>>>>>>
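The two failed commands above are a local-path problem: copyFromLocal's first argument must be an existing local file. A quick sanity check before copying; the file name is made up, and the hadoop calls are shown commented out because they need a running cluster:

```shell
LOCAL_FILE=/tmp/demo-input.txt
echo "some text to process" > "$LOCAL_FILE"

# verify the local source exists before blaming hadoop
test -f "$LOCAL_FILE" && echo "local file ok"

# then on the cluster (hadoop 1.x syntax, as used in this thread):
# ./bin/hadoop dfs -mkdir /wksp
# ./bin/hadoop dfs -copyFromLocal "$LOCAL_FILE" /wksp
# ./bin/hadoop dfs -ls /wksp
```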
>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks. yes, i am a newbie.
>>>>>> however, i need a windows setup.
>>>>>>
>>>>>> let me surely refer to the doc and link which you sent, but i need this to
>>>>>> be working ...
>>>>>> can you please help
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> MANISH DUNANI
>>>>>> -THANX
>>>>>> +91 9426881954,+91 8460656443
>>>>>> manishd207@gmail.com
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Regards
>>>>>> *Manish Dunani*
>>>>>> *Contact No* : +91 9408329137
>>>>>> *skype id* : manish.dunani
>>>>>>
>>>>>>
>>>>>>
>>>>>> CONFIDENTIALITY NOTICE
>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>> entity to which it is addressed and may contain information that is
>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>> you have received this communication in error, please contact the sender
>>>>>> immediately and delete it from your system. Thank You.
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Olivier Renault
>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>> +44 7500 933 036
>>>>>> orenault@hortonworks.com
>>>>>> www.hortonworks.com
>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
i do not have any HIVE server host, so what should i put over here ?
if i comment it out, then i guess it throws an error about that.
can i put the fqdn of the namenode as the HIVE server host ?

will it be a really working configuration ?

please suggest

regards
irfan



On Tue, Sep 10, 2013 at 5:09 PM, Olivier Renault
<or...@hortonworks.com>wrote:

> Your cluster-properties.txt should look something like :
>
> #Log directory
> HDP_LOG_DIR=c:\hadoop\logs
>
> #Data directory
> HDP_DATA_DIR=c:\hdp\data
>
> #Hosts
> NAMENODE_HOST=yourmaster.fqdn.com
> JOBTRACKER_HOST=yourmaster.fqdn.com
> HIVE_SERVER_HOST=yourmaster.fqdn.com
> OOZIE_SERVER_HOST=yourmaster.fqdn.com
> TEMPLETON_HOST=yourmaster.fqdn.com
> SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com
>
> #Database host
> DB_FLAVOR=derby
> DB_HOSTNAME=yourmaster.fqdn.com
>
>
> #Hive properties
> HIVE_DB_NAME=hive
> HIVE_DB_USERNAME=hive
> HIVE_DB_PASSWORD=hive
>
> #Oozie properties
> OOZIE_DB_NAME=oozie
> OOZIE_DB_USERNAME=oozie
> OOZIE_DB_PASSWORD=oozie
>
> You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with
> your servers' names. For the time being, I suggest that you do not install
> HBase or Oozie.
>
> regards,
> Olivier
>
>
> On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:
>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>>
>>>> ok.. now i made some changes and the installation went ahead,
>>>> but it failed on the "HIVE_SERVER_HOST" property declaration.
>>>> in the cluster config file, i have commented this property out. if i
>>>> uncomment it, then what server address should i give ???
>>>>
>>>> i have only two windows machines setup.
>>>> 1: for namenode and another for datanode
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>
>>>>> thanks.
>>>>> i installed the latest java in the c:\java folder and now there is no
>>>>> error in the log file related to java.
>>>>> however, now it is throwing an error about a missing cluster properties
>>>>> file.
>>>>> in fact i am running/installing hdp from the location where this file
>>>>> exists. still it is throwing the error
>>>>>
>>>>> please find the attached
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>>> ravimu@microsoft.com> wrote:
>>>>>
>>>>>>  Here's your issue (from the logs you attached earlier):
>>>>>>
>>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>>
>>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>>> or something similar (in a path with no spaces).
>>>>>>
>>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>>> *To:* user@hadoop.apache.org
>>>>>> *Subject:* Re: about replication
>>>>>>
>>>>>>
>>>>>> please find the attached.
>>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>>> as it is not generated
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>
>>>>>>  Could you share the log files ( c:\hdp.log,
>>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>>> well as your clusterproperties.txt ?
>>>>>>
>>>>>> Thanks,
>>>>>> Olivier
>>>>>>
>>>>>>
>>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>>
>>>>>>  thanks. i followed the user manual for deployment and installed all
>>>>>> the pre-requisites.
>>>>>> i modified the command and still the issue persists. please suggest
>>>>>>
>>>>>> please refer below :
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>
>>>>>> The command to install it is msiexec /i msifile /...
>>>>>> You will find the correct syntax as part of the doc.
>>>>>>
>>>>>> Happy reading
>>>>>> Olivier
>>>>>>
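For reference, the invocation described in the doc pairs the msi with the cluster properties file via HDP_LAYOUT (the property Irfan uses later in this thread). The paths below are examples to adapt, and the exact flags should be double-checked against the HDP for Windows manual:

```
msiexec /i "C:\hdp\hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp\hdp.install.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```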
>>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>> i referred to the logs and manuals. i modified the clusterproperties
>>>>>> file and then double-clicked the msi file.
>>>>>> however, it still failed.
>>>>>> further, i started the installation on the command line by giving
>>>>>> HDP_LAYOUT=clusterproperties file path;
>>>>>> the installation went ahead and then failed on the .NET framework 4.0 and
>>>>>> VC++ redistributable package dependency.
>>>>>>
>>>>>> i installed both and started the installation again.
>>>>>> it failed again with the following error :
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> when i searched for the logs mentioned in the error, i never found
>>>>>> them.
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>
>>>>>> Correct, you need to define the cluster configuration as part of a
>>>>>> file. You will find some information on the configuration file as part of
>>>>>> the documentation.
>>>>>>
>>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>>
>>>>>> You should also make sure to have installed the prerequisites.
>>>>>>
>>>>>> Thanks
>>>>>> Olivier
>>>>>>
>>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>>
>>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>>> other priorities.
>>>>>> i downloaded the installer and while installing i got the following error :
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> do i need to make any configuration prior to installation ??
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>
>>>>>> Here is the link
>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>> Olivier
>>>>>>
>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>>  thanks.
>>>>>> i just followed the instructions to set up the pseudo-distributed
>>>>>> setup first, using the url :
>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>
>>>>>> i don't think i am running a DN on both machines.
>>>>>> please find the attached log
>>>>>>
>>>>>> hi olivier
>>>>>> can you please give me the download link ?
>>>>>> let me try please
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  Are you running DN on both the machines? Could you please show me
>>>>>> your DN logs?
>>>>>>
>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:****
>>>>>>
>>>>>> Irfan,
>>>>>>
>>>>>> If you want to quickly get Hadoop running on the windows platform, you
>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>> msi on our website.
>>>>>>
>>>>>> Regards
>>>>>> Olivier
>>>>>>
>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>> ok. i think i need to change the plan over here.
>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>
>>>>>> because, on windows, anyway i have to try and see how hadoop works.
>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>
>>>>>> so, on windows, here is the setup :
>>>>>>
>>>>>> namenode : windows 2012 R2
>>>>>> datanode : windows 2012 R2
>>>>>>
>>>>>> now, the exact problem is :
>>>>>> 1: the datanode is not getting started
>>>>>> 2: replication : if i put any file/folder on any datanode, it should
>>>>>> get replicated to all other available datanodes
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  Seriously?? You are planning to develop something using Hadoop on
>>>>>> windows. Not a good idea. Anyways, could you please show me your log
>>>>>> files? I also need some additional info :
>>>>>> - The exact problem which you are facing right now
>>>>>> - Your cluster summary (no. of nodes etc.)
>>>>>> - Your latest configuration files
>>>>>> - Your /etc/hosts file
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  ok. thanks.
>>>>>> now, i need to start with the all-windows setup first, as our product
>>>>>> will be based on windows.
>>>>>> so, now, please tell me how to resolve the issue.
>>>>>>
>>>>>> the datanode is not starting. please suggest
>>>>>>
>>>>>> regards,
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  It is possible. Theoretically, Hadoop doesn't stop you from doing
>>>>>> that. But it is not a very wise setup.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:****
>>>>>>
>>>>>>  thanks.
>>>>>> can i have a setup like this :
>>>>>> the namenode will be on linux (the flavour may be RHEL, CentOS, Ubuntu etc.)
>>>>>> and the datanodes are a combination of any OS (windows, linux, unix etc.)
>>>>>>
>>>>>> however, my doubt is, as the file systems of the two (win
>>>>>> and linux) are different, datanodes of these systems can not be part of a
>>>>>> single cluster. do i have to make the windows cluster and the UNIX cluster
>>>>>> separate ?
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>>
>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>>>> as Cygwin PIDs so that may be causing the discrepancy. I don't know how
>>>>>> well Hadoop works in Cygwin as I have never tried it. Work is in progress
>>>>>> for native Windows support, however there are no official releases with
>>>>>> Windows support yet. It may be easier to get familiar with a release
>>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you
>>>>>> are new to it.
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  thanks
>>>>>> here is what i did .
>>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command
>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>
>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>
>>>>>> when i ran the "jps" command, it shows
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>> $ ./jps.exe
>>>>>> 4536 Jps
>>>>>> 2076 NameNode
>>>>>>
>>>>>> however, when i open the pid file for the namenode, it is showing the
>>>>>> pid as : 4560. on the contrary, it shud show : 2076
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>
>>>>>>  Most likely there is a stale pid file. Something like
>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>> the datanode.
>>>>>>
>>>>>> I haven't read the entire thread so you may have looked at this
>>>>>> already.
>>>>>>
>>>>>> -Arpit
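[The cleanup Arpit describes can be sketched as the following command sequence. This is only a sketch: the pid file location is an assumption (it is governed by HADOOP_PID_DIR and commonly defaults to /tmp), so adjust the paths to your installation.]

```
# stop the daemons so nothing is still using the pid files
./stop-dfs.sh
# remove any stale pid files left behind by an unclean shutdown
rm -f /tmp/hadoop-*-namenode.pid /tmp/hadoop-*-datanode.pid
# start again; fresh pid files are written on startup
./start-dfs.sh
```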
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  datanode is trying to connect to namenode continuously but fails
>>>>>>
>>>>>> when i try to run the "jps" command it says :
>>>>>>
>>>>>> $ ./jps.exe
>>>>>> 4584 NameNode
>>>>>> 4016 Jps
>>>>>>
>>>>>> and when i ran "./start-dfs.sh" then it says :
>>>>>>
>>>>>> $ ./start-dfs.sh
>>>>>> namenode running as process 3544. Stop it first.
>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>>
>>>>>> both these logs are contradictory
>>>>>> please find the attached logs
>>>>>>
>>>>>> should i attach the conf files as well ?
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  Your DN is still not running. Showing me the logs would be helpful.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  i followed the url and did the steps mentioned in it. i have
>>>>>> deployed on the windows platform
>>>>>>
>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>
>>>>>> please refer below
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> i have modified all the config files as mentioned and formatted the
>>>>>> hdfs file system as well
>>>>>> please suggest
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  thanks. i followed this url :
>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>
>>>>>> let me follow the url which you gave for the pseudo distributed setup
>>>>>> and then i will switch to distributed mode
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  You are welcome. Which link have you followed for the configuration?
>>>>>> Your *core-site.xml* is empty. Remove the property *fs.default.name*
>>>>>> from *hdfs-site.xml* and add it to *core-site.xml*. Remove
>>>>>> *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
>>>>>>
>>>>>> I would suggest you to do a pseudo distributed setup first in order
>>>>>> to get yourself familiar with the process and then proceed to the
>>>>>> distributed mode. You can visit this link
>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>
>>>>>> HTH
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
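[The property split Tariq describes might look like the following sketch. The host and port values are placeholders, not taken from this thread.]

```xml
<!-- core-site.xml: fs.default.name belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: mapred.job.tracker belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>jobtracker-host:9001</value>
  </property>
</configuration>
```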
>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  thanks tariq for response.
>>>>>> as discussed last time, i have sent you all the config files in my
>>>>>> setup .
>>>>>> can you please go through that ?
>>>>>>
>>>>>> please let me know
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  I'm sorry for being unresponsive. Was out of touch for some time
>>>>>> because of Ramzan and Eid. Resuming work today.
>>>>>>
>>>>>> What's the current status?
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  First of all read the concepts .. I hope you will like it..
>>>>>>
>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>
>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  hey Tariq,
>>>>>> i am still stuck ..
>>>>>> can you please suggest
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  please suggest
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  attachment got quarantined
>>>>>> resending in txt format. please rename it to conf.rar
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  thanks.
>>>>>>
>>>>>> if i run the jps command on the namenode :
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>> $ ./jps.exe
>>>>>> 3164 NameNode
>>>>>> 1892 Jps
>>>>>>
>>>>>> same command on the datanode :
>>>>>>
>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>> $ ./jps.exe
>>>>>> 3848 Jps
>>>>>>
>>>>>> jps does not list any process for the datanode. however, on the web
>>>>>> browser i can see one live data node
>>>>>> please find the attached conf rar file of the namenode
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  OK. we'll start fresh. Could you please show me your latest config
>>>>>> files?
>>>>>>
>>>>>> BTW, are your daemons running fine? Use jps to verify that.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  i have created these dirs "wksp_data" and "wksp_name" on both the
>>>>>> datanode and the namenode
>>>>>> made the respective changes in the "hdfs-site.xml" file
>>>>>> formatted the namenode
>>>>>> started the dfs
>>>>>>
>>>>>> but still, not able to browse the file system through the web browser
>>>>>> please refer below
>>>>>>
>>>>>> anything still missing ?
>>>>>> please suggest
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  these dirs need to be created on all datanodes and namenodes ?
>>>>>> further, hdfs-site.xml needs to be updated on both datanodes and
>>>>>> namenodes for these new dirs ?
>>>>>>
>>>>>> regards
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  Create 2 directories manually corresponding to the values of the
>>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>>> cannot see this data directly on your local/native FS.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
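[For instance, a minimal hdfs-site.xml along these lines might be the following sketch. The directory paths are illustrative; the point is that dfs.name.dir and dfs.data.dir are two distinct directories with 755 permissions.]

```xml
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <!-- namenode metadata lives here -->
    <value>/hadoop/wksp_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <!-- datanode block data lives here -->
    <value>/hadoop/wksp_data</value>
  </property>
</configuration>
```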
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  thanks.
>>>>>> however, i need this to be working on the windows environment as a
>>>>>> project requirement.
>>>>>> i will add/work on Linux later
>>>>>>
>>>>>> so, now, at this stage, is c:\\wksp the HDFS file system OR do i
>>>>>> need to create it from the command line ?
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> regards,
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  Hello Irfan,
>>>>>>
>>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>>
>>>>>> HDFS webUI doesn't provide us the ability to create a file or
>>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>>> operations like create, move, copy etc. are not supported.
>>>>>>
>>>>>> These values look fine to me.
>>>>>>
>>>>>> One suggestion though. Try getting a Linux machine (if possible). Or
>>>>>> at least use a VM. I personally feel that using Hadoop on windows is
>>>>>> always messy.
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  thanks.
>>>>>> when i browse the file system , i am getting following :
>>>>>> i haven't seen any make directory option there
>>>>>>
>>>>>> i need to create it from command line ?
>>>>>> further, in the hdfs-site.xml file , i have given following entries.
>>>>>> are they correct ?
>>>>>>
>>>>>> <property>
>>>>>>   <name>dfs.data.dir</name>
>>>>>>   <value>c:\\wksp</value>
>>>>>> </property>
>>>>>> <property>
>>>>>>   <name>dfs.name.dir</name>
>>>>>>   <value>c:\\wksp</value>
>>>>>> </property>
>>>>>>
>>>>>> please suggest
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  *You are wrong at this:*
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>> copyFromLocal: File
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>> copyFromLocal: File
>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>
>>>>>> Because you gave both paths as local ones, and you do not need to copy
>>>>>> hadoop into hdfs... Hadoop is already working..
>>>>>>
>>>>>> Just check out in the browser after starting ur single node cluster :
>>>>>>
>>>>>> localhost:50070
>>>>>>
>>>>>> then go for the "browse the filesystem" link in it..
>>>>>>
>>>>>> If there is no directory then make a directory there.
>>>>>> That is your hdfs directory.
>>>>>> Then copy any text file there (no need to copy hadoop there), because
>>>>>> u are going to do processing on that data in the text file. That's what
>>>>>> hadoop is used for; first u need to make it clear in ur mind. Then and
>>>>>> only then u will do it... otherwise not possible..
>>>>>>
>>>>>> *Try this:*
>>>>>>
>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>> $ bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>>> /hdfs/directory/path
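[Concretely, the sequence manish describes (create an HDFS directory, then copy a local text file into it) might look like this on Hadoop 1.x; the paths are illustrative.]

```
$ bin/hadoop dfs -mkdir /wksp
$ bin/hadoop dfs -copyFromLocal /cygdrive/c/data/sample.txt /wksp
$ bin/hadoop dfs -ls /wksp
```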
>>>>>>
>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>>  thanks. yes , i am a newbie.
>>>>>> however, i need a windows setup.
>>>>>>
>>>>>> let me surely refer the doc and link which u sent but i need this to
>>>>>> be working ...
>>>>>> can you please help
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>> --
>>>>>> MANISH DUNANI
>>>>>> -THANX
>>>>>> +91 9426881954, +91 8460656443
>>>>>> manishd207@gmail.com
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Regards
>>>>>> *Manish Dunani*
>>>>>> *Contact No* : +91 9408329137
>>>>>> *skype id* : manish.dunani
>>>>>>
>>>>>>
>>>>>> CONFIDENTIALITY NOTICE
>>>>>> NOTICE: This message is intended for the use of the individual or
>>>>>> entity to which it is addressed and may contain information that is
>>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>>> notified that any printing, copying, dissemination, distribution,
>>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>>> you have received this communication in error, please contact the sender
>>>>>> immediately and delete it from your system. Thank You.
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Olivier Renault
>>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>>> +44 7500 933 036
>>>>>> orenault@hortonworks.com
>>>>>> www.hortonworks.com
>>>>>> <http://hortonworks.com/products/hortonworks-sandbox/>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>
>

Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Your cluster-properties.txt should look something like this:

#Log directory
HDP_LOG_DIR=c:\hadoop\logs

#Data directory
HDP_DATA_DIR=c:\hdp\data

#Hosts
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
OOZIE_SERVER_HOST=yourmaster.fqdn.com
TEMPLETON_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com

#Database host
DB_FLAVOR=derby
DB_HOSTNAME=yourmaster.fqdn.com


#Hive properties
HIVE_DB_NAME=hive
HIVE_DB_USERNAME=hive
HIVE_DB_PASSWORD=hive

#Oozie properties
OOZIE_DB_NAME=oozie
OOZIE_DB_USERNAME=oozie
OOZIE_DB_PASSWORD=oozie

You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with your
servers' names. For the time being, I suggest that you do not install HBase or
Oozie.

regards,
Olivier
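[For reference, an install command line along the lines this thread converged on would be the sketch below. The msi file name and the paths are illustrative assumptions; check the HDP 1.3 for Windows installation guide for the exact properties the installer supports.]

```
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "c:\hdp\hdp-install.log" HDP_LAYOUT="c:\hdp\clusterproperties.txt"
```

Running it from an elevated command prompt and keeping the /lv log path handy makes install failures much easier to diagnose.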

On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
> irfan
>
>
>
> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com> wrote:
>>
>>> ok.. now i made some changes and installation went ahead
>>> but failed in property "HIVE_SERVER_HOST" declaration
>>> in the cluster config file, i have commented this property. if i uncomment it,
>>> then what server address should i give ???
>>>
>>> i have only two windows machines setup.
>>> 1: for namenode and another for datanode
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com> wrote:
>>>
>>>> thanks.
>>>> i installed the latest java in c:\java folder and now no error in log
>>>> file related to java
>>>> however, now it is throwing error on not having cluster properties
>>>> file.
>>>> in fact i am running/installing hdp from the location where this file
>>>> exist . still it is throwing error
>>>>
>>>> please find the attached
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>> ravimu@microsoft.com> wrote:
>>>>
>>>>>  Here's your issue (from the logs you attached earlier):
>>>>>
>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>
>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>> or something similar (in a path with no spaces).
>>>>>
>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>> *To:* user@hadoop.apache.org
>>>>> *Subject:* Re: about replication
>>>>>
>>>>> please find the attached.
>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>> as it is not generated
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>>  Could you share the log files ( c:\hdp.log,
>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )
>>>>> as well as your clusterproperties.txt ?
>>>>>
>>>>> Thanks,
>>>>> Olivier
>>>>>
>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>
>>>>>  thanks. i followed the user manual for deployment and installed all
>>>>> pre-requisites
>>>>> i modified the command and still the issue persists. please suggest
>>>>>
>>>>> please refer below
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> The command to install it is msiexec /i msifile /...
>>>>> You will find the correct syntax as part of the doc.
>>>>>
>>>>> Happy reading
>>>>> Olivier
>>>>>
>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>  thanks.
>>>>> i referred the logs and manuals. i modified the clusterproperties file
>>>>> and then double clicked on the msi file
>>>>> however, it still failed.
>>>>> further, i started the installation on the command line by giving
>>>>> HDP_LAYOUT=clusterproperties file path,
>>>>> installation went ahead and it failed for the .NET framework 4.0 and VC++
>>>>> redistributable package dependency
>>>>>
>>>>> i installed both and started the installation again.
>>>>> it failed again with the following error
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> when i search for the logs mentioned in the error , i never found them
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> Correct, you need to define the cluster configuration as part of a
>>>>> file. You will find some information on the configuration file as part of
>>>>> the documentation.
>>>>>
>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>
>>>>> You should make sure to have also installed the prerequisites.
>>>>>
>>>>> Thanks
>>>>> Olivier
>>>>>
>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>> other priorities
>>>>> i downloaded the installer and while installing i got the following error
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> do i need to make any configuration prior to installation ??
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> Here is the link
>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>
>>>>> Olivier
>>>>>
>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>  thanks.
>>>>> i just followed the instructions to do the pseudo distributed setup
>>>>> first using the url :
>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>
>>>>> i don't think i am running DN on both machines
>>>>> please find the attached log
>>>>>
>>>>> hi olivier
>>>>>
>>>>> can you please give me the download link ?
>>>>> let me try please
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>  Are you running DN on both the machines? Could you please show me
>>>>> your DN logs?
>>>>>
>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> Irfu,
>>>>>
>>>>> If you want to quickly get Hadoop running on the windows platform, you
>>>>> may want to try our distribution for Windows. You will be able to find
>>>>> the msi on our website.
>>>>>
>>>>> Regards
>>>>> Olivier
>>>>>
>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>> thanks.
>>>>> ok. i think i need to change the plan over here
>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>
>>>>> because, on windows, anyway i have to try and see how hadoop works
>>>>> on UNIX, it is already known that it is working fine.
>>>>>
>>>>> so, on windows, here is the setup:
>>>>>
>>>>> namenode : windows 2012 R2
>>>>> datanode : windows 2012 R2
>>>>>
>>>>> now, the exact problem is :
>>>>> 1: datanode is not getting started
>>>>> 2: replication : if i put any file/folder on any datanode, it should
>>>>> get replicated to all other available datanodes
>>>>>
>>>>> regards
>>>>>
>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>> windows. Not a good idea. Anyways, could you please show me your log
>>>>> files? I also need some additional info :
>>>>>
>>>>> - The exact problem which you are facing right now
>>>>> - Your cluster summary (no. of nodes etc)
>>>>> - Your latest configuration files
>>>>> - Your /etc/hosts file
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> ok. thanks
>>>>> now, i need to start with all windows setup first as our product will
>>>>> be based on windows
>>>>> so, now, please tell me how to resolve the issue
>>>>>
>>>>> datanode is not starting . please suggest
>>>>>
>>>>> regards,
>>>>> irfan
>>>>>
>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>> that. But it is not a very wise setup.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>> can i have setup like this :
>>>>> namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
>>>>> and datanodes are a combination of any OS (windows, linux, unix etc)
>>>>>
>>>>> however, my doubt is, as the file systems of both the systems (win
>>>>> and linux) are different, datanodes of these systems can not be part of
>>>>> a single cluster . i have to make the windows cluster separate and the
>>>>> UNIX cluster separate ?
>>>>>
>>>>> regards
>>>>>
>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>> aagarwal@hortonworks.com> wrote:
>>>>>
>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>>> as Cygwin PIDs so that may be causing the discrepancy. I don't know how
>>>>> well Hadoop works in Cygwin as I have never tried it. Work is in progress
>>>>> for native Windows support however there are no official releases with
>>>>> Windows support yet. It may be easier to get familiar with a release
>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you
>>>>> are new to it.
>>>>>
>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks
>>>>> here is what i did .
>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command
>>>>> then deleted all pid files for namenodes and datanodes
>>>>>
>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>
>>>>> when i ran the "jps" command , it shows
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>> $ ./jps.exe
>>>>> 4536 Jps
>>>>> 2076 NameNode
>>>>>
>>>>> however, when i open the pid file for the namenode, it shows the
>>>>> pid as : 4560. on the contrary, it should show : 2076
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>>
>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>> aagarwal@hortonworks.com> wrote:
>>>>>
>>>>> Most likely there is a stale pid file. Something like
>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>> the datanode.
>>>>>
>>>>> I haven't read the entire thread so you may have looked at this
>>>>> already.
>>>>>
>>>>> -Arpit
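[Editor's note] The stale-pid check suggested above can be sketched as a small shell snippet. The pid-file path used in the example is hypothetical; real Hadoop pid files live under $HADOOP_PID_DIR (often /tmp) with names like hadoop-<user>-datanode.pid:

```shell
# clean_stale_pid: delete a Hadoop pid file when the process it names is gone.
# "kill -0" sends no signal; it only checks whether the pid is alive.
clean_stale_pid() {
  pid_file=$1
  [ -f "$pid_file" ] || { echo "no pid file at $pid_file"; return 0; }
  pid=$(cat "$pid_file")
  if kill -0 "$pid" 2>/dev/null; then
    echo "process $pid is alive; keeping $pid_file"
  else
    echo "process $pid is gone; removing stale $pid_file"
    rm -f "$pid_file"
  fi
}

# Example with a hypothetical datanode pid-file path:
clean_stale_pid /tmp/hadoop-hadoop-datanode.pid
```

After removing a stale pid file, restarting the daemon (./start-dfs.sh) should write a fresh pid.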
>>>>>
>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> datanode is trying to connect to namenode continuously but fails
>>>>>
>>>>> when i try to run the "jps" command it says :
>>>>> $ ./jps.exe
>>>>> 4584 NameNode
>>>>> 4016 Jps
>>>>>
>>>>> and when i ran "./start-dfs.sh" then it says :
>>>>>
>>>>> $ ./start-dfs.sh
>>>>> namenode running as process 3544. Stop it first.
>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>
>>>>> both these logs are contradictory
>>>>> please find the attached logs
>>>>>
>>>>> should i attach the conf files as well ?
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Your DN is still not running. Showing me the logs would be helpful.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> i followed the url and did the steps mentioned in that. i have
>>>>> deployed on the windows platform
>>>>>
>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>> however, not able to browse url : http://localhost:50030
>>>>>
>>>>> please refer below
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> i have modified all the config files as mentioned and formatted the
>>>>> hdfs file system as well
>>>>> please suggest
>>>>>
>>>>> regards
>>>>>
>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks. i followed this url :
>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>> let me follow the url which you gave for the pseudo distributed setup
>>>>> and then will switch to distributed mode
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> You are welcome. Which link have you followed for the
>>>>> configuration? Your core-site.xml is empty. Remove the property
>>>>> fs.default.name from hdfs-site.xml and add it to core-site.xml.
>>>>> Remove mapred.job.tracker as well. It is required in
>>>>> mapred-site.xml.
>>>>>
>>>>> I would suggest you to do a pseudo distributed setup first in order to
>>>>> get yourself familiar with the process and then proceed to the distributed
>>>>> mode. You can visit this link
>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>> if you need some help. Let me know if you face any issue.
>>>>>
>>>>> HTH
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks tariq for the response.
>>>>> as discussed last time, i have sent you all the config files in my
>>>>> setup .
>>>>> can you please go through that ?
>>>>>
>>>>> please let me know
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> I'm sorry for being unresponsive. Was out of touch for some time
>>>>> because of ramzan and eid. Resuming work today.
>>>>>
>>>>> What's the current status?
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> First of all read the concepts ..I hope you will like it..
>>>>>
>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>
>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> hey Tariq,
>>>>> i am still stuck ..
>>>>> can you please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> the attachment got quarantined
>>>>> resending in txt format. please rename it to conf.rar
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>>
>>>>> if i run the jps command on the namenode :
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>> $ ./jps.exe
>>>>> 3164 NameNode
>>>>> 1892 Jps
>>>>>
>>>>> the same command on the datanode :
>>>>>
>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>> $ ./jps.exe
>>>>> 3848 Jps
>>>>>
>>>>> jps does not list any process for the datanode. however, on the web
>>>>> browser i can see one live data node
>>>>> please find the attached conf rar file of the namenode
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> OK. we'll start fresh. Could you please show me your latest config
>>>>> files?
>>>>>
>>>>> BTW, are your daemons running fine? Use JPS to verify that.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> i have created these dirs "wksp_data" and "wksp_name" on both the
>>>>> datanode and the namenode
>>>>> made the respective changes in the "hdfs-site.xml" file
>>>>> formatted the namenode
>>>>> started the dfs
>>>>>
>>>>> but still, not able to browse the file system through the web browser
>>>>> please refer below
>>>>>
>>>>> anything still missing ?
>>>>> please suggest
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> these dirs need to be created on all datanodes and namenodes ?
>>>>> further, hdfs-site.xml needs to be updated on both datanodes and
>>>>> namenodes for these new dirs?
>>>>>
>>>>> regards
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Create 2 directories manually corresponding to the values of the
>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>> in HDFS, but it eventually gets stored on your local/native FS. But you
>>>>> cannot see this data directly on your local/native FS.
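[Editor's note] The preparation described above amounts to two commands. The paths below are illustrative (echoing the wksp_name / wksp_data naming used elsewhere in the thread), not prescribed by Hadoop:

```shell
# Create the local directories that will back dfs.name.dir and dfs.data.dir
# and set their permissions to 755, as advised above.
prepare_dfs_dirs() {
  mkdir -p "$1" "$2"   # $1 = dfs.name.dir path, $2 = dfs.data.dir path
  chmod 755 "$1" "$2"
}

# Hypothetical paths; substitute your own space-free locations:
prepare_dfs_dirs /tmp/wksp_name /tmp/wksp_data
```

These must exist before formatting the namenode and starting the daemons, and hdfs-site.xml must point at the same paths.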
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>> however, i need this to be working on a windows environment as a
>>>>> project requirement.
>>>>> i will add/work on Linux later
>>>>>
>>>>> so, now, at this stage, is c:\\wksp the HDFS file system OR do i
>>>>> need to create it from the command line ?
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards,
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Hello Irfan,
>>>>>
>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>
>>>>> The HDFS webUI doesn't provide us the ability to create a file or
>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>> operations like create, move, copy etc are not supported.
>>>>>
>>>>> These values look fine to me.
>>>>>
>>>>> One suggestion though. Try getting a Linux machine (if possible). Or at
>>>>> least use a VM. I personally feel that using Hadoop on windows is always
>>>>> messy.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>> when i browse the file system , i am getting following :
>>>>> i haven't seen any make directory option there
>>>>>
>>>>> i need to create it from command line ?
>>>>> further, in the hdfs-site.xml file , i have given following entries.
>>>>> are they correct ?
>>>>>
>>>>> <property>
>>>>>   <name>dfs.data.dir</name>
>>>>>   <value>c:\\wksp</value>
>>>>> </property>
>>>>> <property>
>>>>>   <name>dfs.name.dir</name>
>>>>>   <value>c:\\wksp</value>
>>>>> </property>
>>>>>
>>>>> please suggest
>>>>>
>>>>> [image: Inline image 1]
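[Editor's note] One issue with the snippet above: the same path c:\\wksp is used for both dfs.name.dir and dfs.data.dir, whereas the advice elsewhere in the thread is to create two separate directories. A safer sketch (directory names are illustrative, mirroring the wksp_name / wksp_data naming that appears later):

```xml
<!-- hdfs-site.xml: illustrative values only. Use two distinct,
     space-free local paths and create them before formatting. -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```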
>>>>>
>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> You are wrong at this:
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>> $ ./hadoop dfs -copyFromLocal
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>> copyFromLocal: File
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>> $ ./hadoop dfs -copyFromLocal
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>> copyFromLocal: File
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>
>>>>> Because you had written both the paths as local, and you do not need to
>>>>> copy hadoop into hdfs... Hadoop is already working..
>>>>>
>>>>> Just check out in the browser after starting ur single node cluster :
>>>>>
>>>>> localhost:50070
>>>>>
>>>>> then go for the "browse the filesystem" link in it..
>>>>>
>>>>> If there is no directory then make a directory there.
>>>>> That is your hdfs directory.
>>>>> Then copy any text file there (no need to copy hadoop there), because u
>>>>> are going to do processing on that data in the text file. That's what
>>>>> hadoop is used for; first u need to make it clear in ur mind. Then and
>>>>> only then u will do it... otherwise not possible..
>>>>>
>>>>> Try this:
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>> /hdfs/directory/path
>>>>>
>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks. yes , i am a newbie.
>>>>> however, i need a windows setup.
>>>>>
>>>>> let me surely refer the doc and link which u sent but i need this to
>>>>> be working ...
>>>>> can you please help
>>>>>
>>>>> regards
>>>>>
>>>>> --
>>>>> MANISH DUNANI
>>>>> -THANX
>>>>> +91 9426881954,+91 8460656443
>>>>> manishd207@gmail.com
>>>>>
>>>>> --
>>>>> Regards
>>>>> Manish Dunani
>>>>> Contact No : +91 9408329137
>>>>> skype id : manish.dunani
>>>>>
>>>>> CONFIDENTIALITY NOTICE
>>>>> NOTICE: This message is intended for the use of the individual or
>>>>> entity to which it is addressed and may contain information that is
>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>> notified that any printing, copying, dissemination, distribution,
>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>> you have received this communication in error, please contact the sender
>>>>> immediately and delete it from your system. Thank You.
>>>>>
>>>>> --
>>>>> Olivier Renault
>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>> +44 7500 933 036
>>>>> orenault@hortonworks.com
>>>>> www.hortonworks.com
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Your cluster-properties.txt should look something like :

#Log directory
HDP_LOG_DIR=c:\hadoop\logs

#Data directory
HDP_DATA_DIR=c:\hdp\data

#Hosts
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
OOZIE_SERVER_HOST=yourmaster.fqdn.com
TEMPLETON_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com

#Database host
DB_FLAVOR=derby
DB_HOSTNAME=yourmaster.fqdn.com


#Hive properties
HIVE_DB_NAME=hive
HIVE_DB_USERNAME=hive
HIVE_DB_PASSWORD=hive

#Oozie properties
OOZIE_DB_NAME=oozie
OOZIE_DB_USERNAME=oozie
OOZIE_DB_PASSWORD=oozie

You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with your
server names. For the time being, I suggest that you do not install HBase or
Oozie.

regards,
Olivier
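[Editor's note] Once clusterproperties.txt is filled in, the MSI is launched from the command line with the HDP_LAYOUT property pointing at it, as described later in the thread. A rough sketch — the msi file name and log path are assumptions; check the HDP for Windows install guide linked in the thread for the exact syntax:

```
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```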

On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
> irfan
>
>
>
> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>
>>> ok.. now i made some changes and installation went ahead
>>> but failed in property "HIVE_SERVER_HOST" declaration
>>> in cluster config file, i have commented this property. if i uncomment ,
>>> then what server address will give ???
>>>
>>> i have only two windows machines setup.
>>> 1: for namenode and another for datanode
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>
>>>> thanks.
>>>> i installed the latest java in c:\java folder and now no error in log
>>>> file related to java
>>>> however, now it is throwing error on not having cluster properties
>>>> file.
>>>> in fact i am running/installing hdp from the location where this file
>>>> exist . still it is throwing error
>>>>
>>>> please find the attached
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>> ravimu@microsoft.com> wrote:
>>>>
>>>>> Here's your issue (from the logs you attached earlier):
>>>>>
>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>
>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>> or something similar (in a path with no spaces).
>>>>>
>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>> *To:* user@hadoop.apache.org
>>>>> *Subject:* Re: about replication
>>>>>
>>>>> please find the attached.
>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>> as it is not generated
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> Could you share the log files ( c:\hdp.log,
>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>>>> well as your clusterproperties.txt ?
>>>>>
>>>>> Thanks,
>>>>> Olivier
>>>>>
>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>
>>>>> thanks. i followed the user manual for deployment and installed all
>>>>> pre-requisites
>>>>> i modified the command and still the issue persists. please suggest
>>>>>
>>>>> please refer below
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> The command to install it is msiexec /i msifile /...
>>>>> You will find the correct syntax as part of the doc.
>>>>>
>>>>> Happy reading
>>>>> Olivier
>>>>>
>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>> thanks.
>>>>> i referred the logs and manuals. i modified the clusterproperties file
>>>>> and then double clicked on the msi file
>>>>> however, it still failed.
>>>>> further, i started the installation on the command line by giving
>>>>> HDP_LAYOUT=<clusterproperties file path>,
>>>>> installation went ahead and it failed for the .NET framework 4.0 and
>>>>> VC++ redistributable package dependency
>>>>>
>>>>> i installed both and started the installation again.
>>>>> it failed again with the following error
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> when i search for the logs mentioned in the error , i never found them
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> Correct, you need to define the cluster configuration as part of a
>>>>> file. You will find some information on the configuration file as part of
>>>>> the documentation.
>>>>>
>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>
>>>>> You should make sure to have also installed the prerequisites.
>>>>>
>>>>> Thanks
>>>>> Olivier
>>>>>
>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>> thanks. sorry for the long break. actually got involved in some
>>>>> other priorities
>>>>> i downloaded the installer and while installing i got the following error
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> do i need to make any configuration prior to installation ??
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> Here is the link
>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>
>>>>> Olivier
>>>>>
>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>> thanks.
>>>>> i just followed the instructions to setup the pseudo distributed setup
>>>>> first using the url :
>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>
>>>>> i don't think so i am running DN on both machine
>>>>> please find the attached log
>>>>>
>>>>> hi olivier
>>>>>
>>>>> can you please give me download link ?
>>>>> let me try please
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>  Are you running DN on both the machines? Could you please show me
>>>>> your DN logs?****
>>>>>
>>>>> ** **
>>>>>
>>>>> Also, consider Oliver's suggestion. It's definitely a better option.**
>>>>> **
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> Warm Regards,****
>>>>>
>>>>> Tariq****
>>>>>
>>>>> cloudfront.blogspot.com****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> Irfu,
>>>>>
>>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>>> may want to try our distribution for Windows. You will be able to find
>>>>> the msi on our website.
>>>>>
>>>>> Regards
>>>>> Olivier
>>>>>
>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>> thanks.
>>>>> ok. i think i need to change the plan over here
>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>
>>>>> because, on windows, anyway i have to try and see how hadoop works
>>>>> on UNIX, it is already known that it is working fine.
>>>>>
>>>>> so, on windows, here is the setup:
>>>>>
>>>>> namenode : windows 2012 R2
>>>>> datanode : windows 2012 R2
>>>>>
>>>>> now, the exact problem is :
>>>>> 1: datanode is not getting started
>>>>> 2: replication : if i put any file/folder on any datanode, it should
>>>>> get replicated to all other available datanodes
>>>>>
>>>>> regards
>>>>>
>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>> windows. Not a good idea. Anyways, could you please show me your log
>>>>> files? I also need some additional info :
>>>>>
>>>>> - The exact problem which you are facing right now
>>>>> - Your cluster summary (no. of nodes etc)
>>>>> - Your latest configuration files
>>>>> - Your /etc/hosts file
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> ok. thanks
>>>>> now, i need to start with all windows setup first as our product will
>>>>> be based on windows
>>>>> so, now, please tell me how to resolve the issue
>>>>>
>>>>> datanode is not starting . please suggest
>>>>>
>>>>> regards,
>>>>> irfan
>>>>>
>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>> that. But it is not a very wise setup.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>> can i have setup like this :
>>>>> namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
>>>>> and datanodes are the combination of any OS (windows, linux, unix etc)
>>>>>
>>>>> however, my doubt is, as the file systems of both the systems (win
>>>>> and linux) are different, datanodes of these systems can not be part of
>>>>> a single cluster. i have to make windows cluster separate and UNIX
>>>>> cluster separate?
>>>>>
>>>>> regards
>>>>>
>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>> aagarwal@hortonworks.com> wrote:
>>>>>
>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>>> as Cygwin PIDs, so that may be causing the discrepancy. I don't know how
>>>>> well Hadoop works in Cygwin as I have never tried it. Work is in progress
>>>>> for native Windows support, however there are no official releases with
>>>>> Windows support yet. It may be easier to get familiar with a
>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux
>>>>> if you are new to it.
>>>>>
>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks
>>>>> here is what i did .
>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh command
>>>>> then deleted all pid files for namenodes and datanodes
>>>>>
>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>
>>>>> when i ran the "jps" command, it shows
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>> $ ./jps.exe
>>>>> 4536 Jps
>>>>> 2076 NameNode
>>>>>
>>>>> however, when i open the pid file for namenode, it is showing the pid
>>>>> as : 4560. on the contrary, it shud show : 2076
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>>
>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>> aagarwal@hortonworks.com> wrote:
>>>>>
>>>>> Most likely there is a stale pid file. Something like
>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>> the datanode.
>>>>>
>>>>> I haven't read the entire thread so you may have looked at this
>>>>> already.
>>>>>
>>>>> -Arpit
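The stale-pid check Arpit describes can be scripted. A minimal sketch, assuming Python and POSIX process semantics (it behaves differently under Cygwin, where Windows and Cygwin PIDs differ, as noted elsewhere in this thread); `pid_is_running` is a hypothetical helper, not part of Hadoop's scripts:

```python
import os

def pid_is_running(pid_file):
    """Return True if the PID recorded in pid_file maps to a live process.

    A stale pid file (daemon died, file left behind) makes start-dfs.sh
    print "running as process N. Stop it first." even though jps shows
    no daemon; deleting the stale file lets the daemon start again.
    """
    try:
        with open(pid_file) as f:
            pid = int(f.read().strip())
    except (OSError, ValueError):
        return False  # missing or malformed pid file: nothing is running
    try:
        os.kill(pid, 0)  # signal 0 checks existence without sending anything
        return True
    except ProcessLookupError:
        return False     # stale entry: safe to delete the file and restart
    except PermissionError:
        return True      # process exists but belongs to another user
```

A pid file that fails this check (e.g. the \tmp\hadoop-*datanode.pid mentioned above) can be deleted before re-running start-dfs.sh.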
>>>>>
>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> datanode is trying to connect to namenode continuously but fails
>>>>>
>>>>> when i try to run "jps" command it says :
>>>>> $ ./jps.exe
>>>>> 4584 NameNode
>>>>> 4016 Jps
>>>>>
>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>
>>>>> $ ./start-dfs.sh
>>>>> namenode running as process 3544. Stop it first.
>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>
>>>>> both these logs are contradictory
>>>>> please find the attached logs
>>>>>
>>>>> should i attach the conf files as well ?
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Your DN is still not running. Showing me the logs would be helpful.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> i followed the url and did the steps mentioned in that. i have
>>>>> deployed on the windows platform
>>>>>
>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>> however, not able to browse url : http://localhost:50030
>>>>>
>>>>> please refer below
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> i have modified all the config files as mentioned and formatted the
>>>>> hdfs file system as well
>>>>> please suggest
>>>>>
>>>>> regards
>>>>>
>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks. i followed this url :
>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>
>>>>> let me follow the url which you gave for pseudo distributed setup and
>>>>> then will switch to distributed mode
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> You are welcome. Which link have you followed for the
>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>> Remove *mapred.job.tracker* as well. It is required in
>>>>> *mapred-site.xml*.
>>>>>
>>>>> I would suggest you to do a pseudo distributed setup first in order to
>>>>> get yourself familiar with the process and then proceed to the
>>>>> distributed mode. You can visit this link
>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>> if you need some help. Let me know if you face any issue.
>>>>>
>>>>> HTH
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
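The move Tariq describes leaves each property in its conventional Hadoop 1.x file. A sketch with illustrative values (hdfs://localhost:9000 and localhost:9001 are assumed single-node defaults, not values taken from this thread):

```xml
<!-- core-site.xml: the filesystem URI belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

```xml
<!-- mapred-site.xml: the JobTracker address belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```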
>>>>>
>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks tariq for response.
>>>>> as discussed last time, i have sent you all the config files in my
>>>>> setup .
>>>>> can you please go through that ?
>>>>>
>>>>> please let me know
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> I'm sorry for being unresponsive. Was out of touch for some time
>>>>> because of ramzan and eid. Resuming work today.
>>>>>
>>>>> What's the current status?
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> First of all read the concepts .. I hope you will like it..
>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>
>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> hey Tariq,
>>>>> i am still stuck ..
>>>>> can you please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> attachment got quarantined
>>>>> resending in txt format. please rename it to conf.rar
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>>
>>>>> if i run the jps command on namenode :
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>> $ ./jps.exe
>>>>> 3164 NameNode
>>>>> 1892 Jps
>>>>>
>>>>> same command on datanode :
>>>>>
>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>> $ ./jps.exe
>>>>> 3848 Jps
>>>>>
>>>>> jps does not list any process for datanode. however, on web browser i
>>>>> can see one live data node
>>>>> please find the attached conf rar file of namenode
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> OK. we'll start fresh. Could you please show me your latest config
>>>>> files?
>>>>>
>>>>> BTW, are your daemons running fine? Use JPS to verify that.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> i have created these dir "wksp_data" and "wksp_name" on both
>>>>> datanode and namenode
>>>>> made the respective changes in "hdfs-site.xml" file
>>>>> formatted the namenode
>>>>> started the dfs
>>>>>
>>>>> but still, not able to browse the file system through web browser
>>>>> please refer below
>>>>>
>>>>> anything still missing ?
>>>>> please suggest
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> these dir needs to be created on all datanodes and namenodes ?
>>>>> further, hdfs-site.xml needs to be updated on both datanodes and
>>>>> namenodes for these new dir?
>>>>>
>>>>> regards
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Create 2 directories manually corresponding to the values of
>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>> these directories to 755. When you start pushing data into your HDFS,
>>>>> data will start going inside the directory specified by dfs.data.dir
>>>>> and the associated metadata will go inside dfs.name.dir. Remember, you
>>>>> store data in HDFS, but it eventually gets stored in your local/native
>>>>> FS. But you cannot see this data directly on your local/native FS.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
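Applied to the hdfs-site.xml entries quoted further down in this thread, Tariq's advice implies two distinct directories rather than one shared path. A sketch (the directory names are illustrative, echoing the wksp_name/wksp_data dirs mentioned above):

```xml
<configuration>
  <property>
    <name>dfs.name.dir</name>   <!-- namenode metadata -->
    <value>c:\wksp_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>   <!-- datanode blocks -->
    <value>c:\wksp_data</value>
  </property>
</configuration>
```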
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>> however, i need this to be working on windows environment as project
>>>>> requirement.
>>>>> i will add/work on Linux later
>>>>>
>>>>> so, now, at this stage, c:\\wksp is the HDFS file system OR do i
>>>>> need to create it from command line ?
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards,
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Hello Irfan,
>>>>>
>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>
>>>>> HDFS webUI doesn't provide us the ability to create a file or
>>>>> directory. You can browse HDFS, view files, download files etc. But
>>>>> operations like create, move, copy etc are not supported.
>>>>>
>>>>> These values look fine to me.
>>>>>
>>>>> One suggestion though. Try getting a Linux machine (if possible). Or
>>>>> at least use a VM. I personally feel that using Hadoop on windows is
>>>>> always messy.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>> when i browse the file system , i am getting following :
>>>>> i haven't seen any make directory option there
>>>>>
>>>>> i need to create it from command line ?
>>>>> further, in the hdfs-site.xml file , i have given following entries.
>>>>> are they correct ?
>>>>>
>>>>> <property>
>>>>>   <name>dfs.data.dir</name>
>>>>>   <value>c:\\wksp</value>
>>>>> </property>
>>>>> <property>
>>>>>   <name>dfs.name.dir</name>
>>>>>   <value>c:\\wksp</value>
>>>>> </property>
>>>>>
>>>>> please suggest
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> *You are wrong at this:*
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>> $ ./hadoop dfs -copyFromLocal
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>> copyFromLocal: File
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>> $ ./hadoop dfs -copyFromLocal
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>> copyFromLocal: File
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>
>>>>> Because you had written both paths as local, and you need not copy
>>>>> hadoop into hdfs... Hadoop is already working..
>>>>>
>>>>> Just check out in browser after starting ur single node cluster :
>>>>>
>>>>> localhost:50070
>>>>>
>>>>> then go for the "browse the filesystem" link in it..
>>>>>
>>>>> If there is no directory then make a directory there.
>>>>> That is your hdfs directory.
>>>>> Then copy any text file there (no need to copy hadoop there), because u
>>>>> are going to do processing on that data in the text file. That's what
>>>>> hadoop is used for; first u need to make it clear in ur mind. Then and
>>>>> then u will do it... otherwise not possible..
>>>>>
>>>>> *Try this:*
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>> /hdfs/directory/path
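The "does not exist" failures above are local-path errors: -copyFromLocal's first argument must exist on the local filesystem before Hadoop is even involved. A hypothetical wrapper that fails fast with a clearer message (`copy_from_local` is not a real Hadoop helper; the CLI invocation simply mirrors the command shown above):

```python
import os
import subprocess

def copy_from_local(local_path, hdfs_dir, hadoop_bin="./bin/hadoop"):
    """Check the local source, then shell out to `hadoop dfs -copyFromLocal`."""
    if not os.path.isfile(local_path):
        # This is the condition behind the "File ... does not exist." errors.
        raise FileNotFoundError("local source not found: " + local_path)
    cmd = [hadoop_bin, "dfs", "-copyFromLocal", local_path, hdfs_dir]
    return subprocess.run(cmd, check=True)
```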
>>>>>
>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks. yes , i am newbie.
>>>>> however, i need windows setup.
>>>>>
>>>>> let me surely refer the doc and link which u sent but i need this to
>>>>> be working ...
>>>>> can you please help
>>>>>
>>>>> regards
>>>>>
>>>>> --
>>>>> MANISH DUNANI
>>>>> -THANX
>>>>> +91 9426881954,+91 8460656443
>>>>> manishd207@gmail.com
>>>>>
>>>>>
>>>>> --
>>>>> Regards
>>>>> *Manish Dunani*
>>>>> *Contact No* : +91 9408329137
>>>>> *skype id* : manish.dunani
>>>>>
>>>>>
>>>>> CONFIDENTIALITY NOTICE
>>>>> NOTICE: This message is intended for the use of the individual or
>>>>> entity to which it is addressed and may contain information that is
>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>> notified that any printing, copying, dissemination, distribution,
>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>> you have received this communication in error, please contact the sender
>>>>> immediately and delete it from your system. Thank You.
>>>>>
>>>>>
>>>>> --
>>>>> Olivier Renault
>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>> +44 7500 933 036
>>>>> orenault@hortonworks.com
>>>>> www.hortonworks.com
>>>>>
>>>>
>>>>
>>>
>>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Your cluster-properties.txt should look something like :

#Log directory
HDP_LOG_DIR=c:\hadoop\logs

#Data directory
HDP_DATA_DIR=c:\hdp\data

#Hosts
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
OOZIE_SERVER_HOST=yourmaster.fqdn.com
TEMPLETON_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com

#Database host
DB_FLAVOR=derby
DB_HOSTNAME=yourmaster.fqdn.com


#Hive properties
HIVE_DB_NAME=hive
HIVE_DB_USERNAME=hive
HIVE_DB_PASSWORD=hive

#Oozie properties
OOZIE_DB_NAME=oozie
OOZIE_DB_USERNAME=oozie
OOZIE_DB_PASSWORD=oozie

You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with your
server names. For the time being, I suggest that you do not install HBase or
Oozie.

regards,
Olivier
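Since a malformed or missing clusterproperties.txt is what the installer keeps tripping over in this thread, a quick sanity check of the key=value format is easy to script. A hypothetical Python helper (the "required" key list is an assumption drawn from the sample above, not an official minimum):

```python
def parse_cluster_properties(text):
    """Parse clusterproperties.txt-style key=value lines into a dict."""
    props = {}
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):  # skip blanks and comments
            continue
        key, sep, value = line.partition("=")
        if sep:  # ignore lines without '=' rather than guessing
            props[key.strip()] = value.strip()
    return props

def missing_required(props, required=("HDP_LOG_DIR", "HDP_DATA_DIR",
                                      "NAMENODE_HOST", "SLAVE_HOSTS")):
    """Return required keys (an assumed minimal set) absent from props."""
    return [k for k in required if k not in props]
```

Running the parser over the sample above and printing `missing_required` before launching msiexec would surface a commented-out host line immediately.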

On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
> irfan
>
>
>
> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>
>>> ok.. now i made some changes and installation went ahead
>>> but failed in the property "HIVE_SERVER_HOST" declaration
>>> in the cluster config file, i have commented this property. if i uncomment,
>>> then what server address should i give ???
>>>
>>> i have only two windows machines setup.
>>> 1: for namenode and another for datanode
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>
>>>> thanks.
>>>> i installed the latest java in c:\java folder and now no error in log
>>>> file related to java
>>>> however, now it is throwing an error about not having the cluster
>>>> properties file.
>>>> in fact i am running/installing hdp from the location where this file
>>>> exists. still it is throwing the error
>>>>
>>>> please find the attached
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>> ravimu@microsoft.com> wrote:
>>>>
>>>>> Here's your issue (from the logs you attached earlier):
>>>>>
>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>
>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>> or something similar (in a path with no spaces).
>>>>>
>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>> *To:* user@hadoop.apache.org
>>>>> *Subject:* Re: about replication
>>>>>
>>>>> please find the attached.
>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>> as it is not generated
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:****
>>>>>
>>>>>  Could you share the log files ( c:\hdp.log,
>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>>>>> well as your clusterproperties.txt ?****
>>>>>
>>>>> ** **
>>>>>
>>>>> Thanks, ****
>>>>>
>>>>> Olivier****
>>>>>
>>>>> ** **
>>>>>
>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:**
>>>>> **
>>>>>
>>>>>  thanks. i followed the user manual for deployment and installed all
>>>>> pre-requisites ****
>>>>>
>>>>> i modified the command and still the issue persist. please suggest ***
>>>>> *
>>>>>
>>>>> ** **
>>>>>
>>>>> please refer below ****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> [image: Inline image 1]****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> irfan ****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> The command to install it is msiexec /i msifile /...
>>>>> You will find the correct syntax as part of the doc.
>>>>>
>>>>> Happy reading
>>>>> Olivier
>>>>>
>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>> thanks.
>>>>> i referred the logs and manuals. i modified the clusterproperties file
>>>>> and then double clicked on the msi file
>>>>> however, it still failed.
>>>>> further i started the installation on the command line by giving
>>>>> HDP_LAYOUT=clusterproperties file path,
>>>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>>>> redistributable package dependency
>>>>>
>>>>> i installed both and started the installation again.
>>>>> it failed again with the following error
>>>>> [image: Inline image 1]
>>>>>
>>>>> when i searched for the logs mentioned in the error, i never found them
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> Correct, you need to define the cluster configuration as part of a
>>>>> file. You will find some information on the configuration file as part of
>>>>> the documentation.
>>>>>
>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>
>>>>> You should also make sure to have installed the prerequisites.
>>>>>
>>>>> Thanks
>>>>> Olivier
>>>>>
>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>
>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>> other priorities****
>>>>>
>>>>> i downloaded the installer and while installing i got following error
>>>>> ****
>>>>>
>>>>> ** **
>>>>>
>>>>> [image: Inline image 1]****
>>>>>
>>>>> ** **
>>>>>
>>>>> do i need to make any configuration prior to installation ??****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> irfan ****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> Here is the link:
>>>>>
>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>
>>>>> Olivier
>>>>>
>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>> thanks.
>>>>>
>>>>> i just followed the instructions to set up the pseudo-distributed setup
>>>>> first, using the url:
>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>
>>>>> i don't think i am running a DN on both machines.
>>>>>
>>>>> please find the attached log
>>>>>
>>>>> hi olivier,
>>>>>
>>>>> can you please give me the download link?
>>>>> let me try please
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Are you running a DN on both the machines? Could you please show me
>>>>> your DN logs?
>>>>>
>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> Irfan,
>>>>>
>>>>> If you want to quickly get Hadoop running on the Windows platform, you may
>>>>> want to try our distribution for Windows. You will be able to find the msi
>>>>> on our website.
>>>>>
>>>>> Regards
>>>>> Olivier
>>>>>
>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>> thanks.
>>>>>
>>>>> ok. i think i need to change the plan over here.
>>>>> let me create two environments. 1: totally Windows 2: totally Unix
>>>>>
>>>>> because, on Windows, anyway i have to try and see how hadoop works;
>>>>> on Unix, it is already known that it is working fine.
>>>>>
>>>>> so, on Windows, here is the setup:
>>>>>
>>>>> namenode : Windows 2012 R2
>>>>> datanode : Windows 2012 R2
>>>>>
>>>>> now, the exact problem is:
>>>>>
>>>>> 1: datanode is not getting started
>>>>> 2: replication: if i put any file/folder on any datanode, it should
>>>>> get replicated to all other available datanodes
>>>>>
>>>>> regards
>>>>>
>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>> Windows? Not a good idea. Anyways, could you please show me your log files? I
>>>>> also need some additional info:
>>>>>
>>>>> - The exact problem which you are facing right now
>>>>> - Your cluster summary (no. of nodes etc.)
>>>>> - Your latest configuration files
>>>>> - Your /etc/hosts file
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> ok. thanks
>>>>>
>>>>> now, i need to start with an all-Windows setup first, as our product will
>>>>> be based on Windows.
>>>>> so, now, please tell me how to resolve the issue.
>>>>>
>>>>> the datanode is not starting. please suggest
>>>>>
>>>>> regards,
>>>>> irfan
>>>>>
>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> It is possible. Theoretically, Hadoop doesn't stop you from doing
>>>>> that. But it is not a very wise setup.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>>
>>>>> can i have a setup like this:
>>>>>
>>>>> the namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc.),
>>>>> and the datanodes are a combination of any OS (windows, linux, unix etc.)
>>>>>
>>>>> however, my doubt is: as the file systems of both the systems (win
>>>>> and linux) are different, datanodes of these systems can not be part of a
>>>>> single cluster. do i have to make the windows cluster and the unix cluster
>>>>> separate?
>>>>>
>>>>> regards
>>>>>
>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>> aagarwal@hortonworks.com> wrote:
>>>>>
>>>>> I just noticed you are on Cygwin. IIRC, Windows PIDs are not the same
>>>>> as Cygwin PIDs, so that may be causing the discrepancy. I don't know how
>>>>> well Hadoop works in Cygwin, as I have never tried it. Work is in progress
>>>>> for native Windows support; however, there are no official releases with
>>>>> Windows support yet. It may be easier to get familiar with a release
>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>
>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>>
>>>>> here is what i did:
>>>>>
>>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command,
>>>>> then deleted all pid files for namenodes and datanodes,
>>>>> and started dfs again with the command "./start-dfs.sh".
>>>>>
>>>>> when i ran the "jps" command, it shows:
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>> $ ./jps.exe
>>>>> 4536 Jps
>>>>> 2076 NameNode
>>>>>
>>>>> however, when i open the pid file for the namenode, it shows
>>>>> pid 4560; on the contrary, it should show 2076.
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>>
>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>> aagarwal@hortonworks.com> wrote:
>>>>>
>>>>> Most likely there is a stale pid file, something like
>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>> the datanode.
>>>>>
>>>>> I haven't read the entire thread, so you may have looked at this
>>>>> already.
>>>>>
>>>>> -Arpit
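
[Editor's note: Arpit's suggestion can be sketched as a small shell helper. The helper name and the example pid-file path are mine, not from the thread; on Cygwin the recorded pid may be a Windows pid rather than a Cygwin one, so the liveness check below is only meaningful for Cygwin-visible processes.]

```shell
# remove_stale_pid FILE: delete FILE if the pid recorded in it is not a
# live process, i.e. automate "delete the stale pid file by hand".
remove_stale_pid() {
  pid=$(cat "$1" 2>/dev/null) || return 0   # no pid file: nothing to do
  if ! kill -0 "$pid" 2>/dev/null; then     # no such process: file is stale
    rm -f "$1"
  fi
}

# Example (hypothetical path): clean up a datanode pid file left in /tmp.
remove_stale_pid /tmp/hadoop-Administrator-datanode.pid
```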
>>>>>
>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> the datanode is trying to connect to the namenode continuously, but fails.
>>>>>
>>>>> when i try to run the "jps" command, it says:
>>>>>
>>>>> $ ./jps.exe
>>>>> 4584 NameNode
>>>>> 4016 Jps
>>>>>
>>>>> and when i ran "./start-dfs.sh", it says:
>>>>>
>>>>> $ ./start-dfs.sh
>>>>> namenode running as process 3544. Stop it first.
>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>
>>>>> these two outputs are contradictory.
>>>>> please find the attached logs.
>>>>>
>>>>> should i attach the conf files as well?
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Your DN is still not running. Showing me the logs would be helpful.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> i followed the url and did the steps mentioned in it. i have
>>>>> deployed on the windows platform.
>>>>>
>>>>> now, i am able to browse the url http://localhost:50070 (name node);
>>>>> however, i am not able to browse the url http://localhost:50030.
>>>>>
>>>>> please refer below:
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> i have modified all the config files as mentioned and formatted the
>>>>> hdfs file system as well.
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>>
>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks. i followed this url:
>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>
>>>>> let me follow the url which you gave for the pseudo-distributed setup, and
>>>>> then i will switch to distributed mode.
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> You are welcome. Which link have you followed for the
>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>> Remove *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
>>>>>
>>>>> I would suggest you do a pseudo-distributed setup first in order to
>>>>> get yourself familiar with the process, and then proceed to the distributed
>>>>> mode. You can visit this link
>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>> if you need some help. Let me know if you face any issue.
>>>>>
>>>>> HTH
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
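
[Editor's note: Tariq's advice above amounts to config fragments like these. The host and port values are illustrative assumptions for a Hadoop 1.x pseudo-distributed setup, not values taken from the thread.]

```xml
<!-- core-site.xml: fs.default.name belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: mapred.job.tracker belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```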
>>>>>
>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks, tariq, for the response.
>>>>>
>>>>> as discussed last time, i have sent you all the config files in my
>>>>> setup.
>>>>> can you please go through them?
>>>>>
>>>>> please let me know
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> I'm sorry for being unresponsive. Was out of touch for some time
>>>>> because of Ramzan and Eid. Resuming work today.
>>>>>
>>>>> What's the current status?
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> First of all, read the concepts. I hope you will like it:
>>>>>
>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>
>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> hey Tariq,
>>>>>
>>>>> i am still stuck.
>>>>> can you please suggest?
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> the attachment got quarantined.
>>>>> resending in txt format. please rename it to conf.rar.
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>>
>>>>> if i run the jps command on the namenode:
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>> $ ./jps.exe
>>>>> 3164 NameNode
>>>>> 1892 Jps
>>>>>
>>>>> same command on the datanode:
>>>>>
>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>> $ ./jps.exe
>>>>> 3848 Jps
>>>>>
>>>>> jps does not list any process for the datanode; however, in the web browser i
>>>>> can see one live data node.
>>>>>
>>>>> please find the attached conf rar file of the namenode.
>>>>>
>>>>> regards
>>>>>
>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> OK, we'll start fresh. Could you please show me your latest config
>>>>> files?
>>>>>
>>>>> BTW, are your daemons running fine? Use jps to verify that.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> i have created the dirs "wksp_data" and "wksp_name" on both the
>>>>> datanode and the namenode,
>>>>> made the respective changes in the "hdfs-site.xml" file,
>>>>> formatted the namenode,
>>>>> and started the dfs.
>>>>>
>>>>> but still, i am not able to browse the file system through the web browser.
>>>>> please refer below.
>>>>>
>>>>> anything still missing?
>>>>> please suggest
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> do these dirs need to be created on all datanodes and namenodes?
>>>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>>>> namenodes for these new dirs?
>>>>>
>>>>> regards
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Create 2 directories manually, corresponding to the values of the
>>>>> dfs.name.dir and dfs.data.dir properties, and change the permissions of
>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>> will start going inside the directory specified by dfs.data.dir, and the
>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>> cannot see this data directly on your local/native FS.
>>>>>
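
[Editor's note: for instance, hdfs-site.xml would name two distinct local directories. The directory names below follow the wksp_name/wksp_data naming used later in the thread; treat the exact paths as illustrative assumptions.]

```xml
<configuration>
  <property>
    <!-- namenode metadata directory -->
    <name>dfs.name.dir</name>
    <value>c:\\wksp_name</value>
  </property>
  <property>
    <!-- datanode block storage directory -->
    <name>dfs.data.dir</name>
    <value>c:\\wksp_data</value>
  </property>
</configuration>
```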
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>>
>>>>> however, i need this to work on a windows environment as a project
>>>>> requirement.
>>>>> i will add/work on Linux later.
>>>>>
>>>>> so, now, at this stage, is c:\\wksp the HDFS file system, or do i
>>>>> need to create it from the command line?
>>>>>
>>>>> please suggest
>>>>>
>>>>> regards,
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> Hello Irfan,
>>>>>
>>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>>
>>>>> The HDFS web UI doesn't provide the ability to create a file or directory.
>>>>> You can browse HDFS, view files, download files, etc., but operations like
>>>>> create, move, copy, etc. are not supported.
>>>>>
>>>>> These values look fine to me.
>>>>>
>>>>> One suggestion though: try getting a Linux machine (if possible), or at
>>>>> least use a VM. I personally feel that using Hadoop on windows is always
>>>>> messy.
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks.
>>>>>
>>>>> when i browse the file system, i get the following.
>>>>> i haven't seen any make-directory option there.
>>>>>
>>>>> do i need to create it from the command line?
>>>>>
>>>>> further, in the hdfs-site.xml file, i have given the following entries.
>>>>> are they correct?
>>>>>
>>>>> <property>
>>>>>   <name>dfs.data.dir</name>
>>>>>   <value>c:\\wksp</value>
>>>>> </property>
>>>>> <property>
>>>>>   <name>dfs.name.dir</name>
>>>>>   <value>c:\\wksp</value>
>>>>> </property>
>>>>>
>>>>> please suggest
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> *You are wrong at this:*
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>> $ ./hadoop dfs -copyFromLocal
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>> copyFromLocal: File
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>> $ ./hadoop dfs -copyFromLocal
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>> copyFromLocal: File
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>
>>>>> Because you wrote both paths as local, and you need not copy
>>>>> hadoop into hdfs... Hadoop is already working.
>>>>>
>>>>> Just check it out in the browser after starting your single-node cluster:
>>>>>
>>>>> localhost:50070
>>>>>
>>>>> then go for the "browse the filesystem" link in it.
>>>>>
>>>>> If there is no directory, then make a directory there.
>>>>> That is your hdfs directory.
>>>>> Then copy any text file there (no need to copy hadoop there), because you
>>>>> are going to do processing on the data in that text file. That's what hadoop is
>>>>> used for; first you need to make that clear in your mind. Then and only then
>>>>> will you do it... otherwise it is not possible.
>>>>>
>>>>> *Try this:*
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>> /hdfs/directory/path
>>>>>
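
[Editor's note: concretely, Manish's template expands to something like the transcript below. The file and directory names are placeholders, and this assumes the single-node Hadoop 1.x cluster from the thread is running.]

```shell
# Create an HDFS target directory, copy a local text file into it, verify.
./bin/hadoop dfs -mkdir /wksp
./bin/hadoop dfs -copyFromLocal /cygdrive/c/data/sample.txt /wksp
./bin/hadoop dfs -ls /wksp
```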
>>>>>
>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:
>>>>>
>>>>> thanks. yes, i am a newbie.
>>>>> however, i need a windows setup.
>>>>>
>>>>> let me surely refer to the doc and link which you sent, but i need this to
>>>>> be working...
>>>>> can you please help?
>>>>>
>>>>> regards
>>>>>
>>>>> --
>>>>> MANISH DUNANI
>>>>> -THANX
>>>>> +91 9426881954, +91 8460656443
>>>>> manishd207@gmail.com
>>>>>
>>>>> --
>>>>> Regards
>>>>> *Manish Dunani*
>>>>> *Contact No* : +91 9408329137
>>>>> *skype id* : manish.dunani
>>>>>
>>>>> CONFIDENTIALITY NOTICE
>>>>> NOTICE: This message is intended for the use of the individual or
>>>>> entity to which it is addressed and may contain information that is
>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>> notified that any printing, copying, dissemination, distribution,
>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>> you have received this communication in error, please contact the sender
>>>>> immediately and delete it from your system. Thank You.
>>>>>
>>>>>
>>>>> --
>>>>> Olivier Renault
>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>> +44 7500 933 036
>>>>> orenault@hortonworks.com
>>>>> www.hortonworks.com
>>>>
>>>>
>>>
>>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Your cluster-properties.txt should look something like :

#Log directory
HDP_LOG_DIR=c:\hadoop\logs

#Data directory
HDP_DATA_DIR=c:\hdp\data

#Hosts
NAMENODE_HOST=yourmaster.fqdn.com
JOBTRACKER_HOST=yourmaster.fqdn.com
HIVE_SERVER_HOST=yourmaster.fqdn.com
OOZIE_SERVER_HOST=yourmaster.fqdn.com
TEMPLETON_HOST=yourmaster.fqdn.com
SLAVE_HOSTS=yourmaster.fqdn.com,yourslave.fqdn.com

#Database host
DB_FLAVOR=derby
DB_HOSTNAME=yourmaster.fqdn.com


#Hive properties
HIVE_DB_NAME=hive
HIVE_DB_USERNAME=hive
HIVE_DB_PASSWORD=hive

#Oozie properties
OOZIE_DB_NAME=oozie
OOZIE_DB_USERNAME=oozie
OOZIE_DB_PASSWORD=oozie

You will need to replace yourmaster.fqdn.com and yourslave.fqdn.com with your
servers' names. For the time being, I suggest that you do not install HBase or
Oozie.

regards,
Olivier

On 10 September 2013 07:02, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
> irfan
>
>
>
> On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com>wrote:
>>
>>> ok.. now i made some changes and the installation went ahead,
>>> but failed on the "HIVE_SERVER_HOST" property declaration.
>>> in the cluster config file, i have commented this property out. if i
>>> uncomment it, then what server address should i give?
>>>
>>> i have only two windows machines set up:
>>> 1: for the namenode, and another for the datanode
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>
>>>> thanks.
>>>> i installed the latest java in the c:\java folder, and now there is no
>>>> error in the log file related to java.
>>>> however, now it is throwing an error about not having the cluster
>>>> properties file.
>>>> in fact, i am running/installing hdp from the location where this file
>>>> exists. still it is throwing the error.
>>>>
>>>> please find the attached
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>>> ravimu@microsoft.com> wrote:
>>>>
>>>>> Here's your issue (from the logs you attached earlier):
>>>>>
>>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>>
>>>>> It seems that you installed the Java prerequisite in the default path,
>>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>>>> or something similar (in a path with no spaces).
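
[Editor's note: the no-spaces rule above can be checked up front. The helper name and messages below are mine, not part of HDP.]

```shell
# Return non-zero (and explain on stderr) if a candidate JAVA_HOME
# contains spaces, which the HDP 1.3 installer scripts cannot handle.
check_java_home() {
  case "$1" in
    *" "*) echo "JAVA_HOME contains spaces: $1" >&2; return 1 ;;
    *)     return 0 ;;
  esac
}

check_java_home 'C:\java\jdk1.6.0_31'                        # passes
check_java_home 'C:\Program Files\Java\jdk1.6.0_31' || true  # rejected
```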
>>>>>
>>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>>> *To:* user@hadoop.apache.org
>>>>> *Subject:* Re: about replication
>>>>>
>>>>> please find the attached.
>>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>>> as it is not generated.
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>> Could you share the log files (c:\hdp.log,
>>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log) as
>>>>> well as your clusterproperties.txt?
>>>>>
>>>>> Thanks,
>>>>> Olivier
>>>>>
>>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>>
>>>>> thanks. i followed the user manual for deployment and installed all the
>>>>> pre-requisites.
>>>>> i modified the command, and still the issue persists. please suggest.
>>>>>
>>>>> please refer below:
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:****
>>>>>
>>>>> The command to install it is msiexec /i msifile /...  ****
>>>>>
>>>>> You will find the correct syntax as part of doc. ****
>>>>>
>>>>> Happy reading
>>>>> Olivier ****
>>>>>
>>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>
>>>>>  thanks. ****
>>>>>
>>>>> i referred the logs and manuals. i modified the clusterproperties file
>>>>> and then double-clicked the msi file
>>>>> however, it still failed.
>>>>> further, i started the installation on the command line by giving
>>>>> HDP_LAYOUT=<clusterproperties file path>,
>>>>> installation went ahead but it failed on the .NET Framework 4.0 and VC++
>>>>> redistributable package dependencies
>>>>>
>>>>> ** **
>>>>>
>>>>> i installed both and started again the installation. ****
>>>>>
>>>>> failed again with following error ****
>>>>>
>>>>> [image: Inline image 1]****
>>>>>
>>>>> ** **
>>>>>
>>>>> when i searched for the logs mentioned in the error, i never found them
>>>>>
>>>>> please suggest ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> irfan****
>>>>>
>>>>>
>>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:****
>>>>>
>>>>> Correct, you need to define the cluster configuration as part of a
>>>>> file. You will find some information on the configuration file as part of
>>>>> the documentation. ****
>>>>>
>>>>>
>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>
>>>>> You should make sure to have also installed the pre requisite. ****
>>>>>
>>>>> Thanks
>>>>> Olivier ****
>>>>>
>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>
>>>>>  thanks. sorry for the long break. actually got involved in some
>>>>> other priorities****
>>>>>
>>>>> i downloaded the installer and while installing i got following error
>>>>> ****
>>>>>
>>>>> ** **
>>>>>
>>>>> [image: Inline image 1]****
>>>>>
>>>>> ** **
>>>>>
>>>>> do i need to make any configuration prior to installation ??****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> irfan ****
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:****
>>>>>
>>>>> Here is the link ****
>>>>>
>>>>> http://download.hortonworks.com/products/hdp-windows/****
>>>>>
>>>>> Olivier ****
>>>>>
>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>
>>>>>  thanks.****
>>>>>
>>>>> i just followed the instructions to setup the pseudo distributed setup
>>>>> first using the url :
>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>
>>>>>
>>>>> i don't think so i am running DN on both machine ****
>>>>>
>>>>> please find the attached log****
>>>>>
>>>>> ** **
>>>>>
>>>>> hi olivier ****
>>>>>
>>>>> ** **
>>>>>
>>>>> can you please give me download link ?****
>>>>>
>>>>> let me try please ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> irfan ****
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  Are you running DN on both the machines? Could you please show me
>>>>> your DN logs?****
>>>>>
>>>>> ** **
>>>>>
>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> Warm Regards,****
>>>>>
>>>>> Tariq****
>>>>>
>>>>> cloudfront.blogspot.com****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:****
>>>>>
>>>>> Irfu, ****
>>>>>
>>>>> If you want to quickly get Hadoop running on windows platform. You may
>>>>> want to try our distribution for Windows. You will be able to find the msi
>>>>> on our website. ****
>>>>>
>>>>> Regards
>>>>> Olivier ****
>>>>>
>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>>
>>>>>  thanks. ****
>>>>>
>>>>> ok. i think i need to change the plan over here ****
>>>>>
>>>>> let me create two environments. 1: totally windows 2: totally Unix****
>>>>>
>>>>> ** **
>>>>>
>>>>> because, on windows, anyway i have to try and see how hadoop works
>>>>>
>>>>> on UNIX, it is already known that ,  it is working fine. ****
>>>>>
>>>>> ** **
>>>>>
>>>>> so, on windows , here is the setup:****
>>>>>
>>>>> ** **
>>>>>
>>>>> namenode : windows 2012 R2 ****
>>>>>
>>>>> datanode : windows 2012 R2 ****
>>>>>
>>>>> ** **
>>>>>
>>>>> now, the exact problem is :****
>>>>>
>>>>> 1: datanode is not getting started ****
>>>>>
>>>>> 2: replication : if i put any file/folder on any datanode , it should
>>>>> get replicated to all another available datanodes ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  Seriously?? You are planning to develop something using Hadoop on
>>>>> windows. Not a good idea. Anyways, could you please show me your log files? I
>>>>> also need some additional info :
>>>>>
>>>>> -The exact problem which you are facing right now****
>>>>>
>>>>> -Your cluster summary(no. of nodes etc)****
>>>>>
>>>>> -Your latest configuration files****
>>>>>
>>>>> -Your /etc/hosts file
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> Warm Regards,****
>>>>>
>>>>> Tariq****
>>>>>
>>>>> cloudfront.blogspot.com****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  ok. thanks****
>>>>>
>>>>> now, i need to start with all windows setup first as our product will
>>>>> be based on windows ****
>>>>>
>>>>> so, now, please tell me how to resolve the issue ****
>>>>>
>>>>> ** **
>>>>>
>>>>> datanode is not starting . please suggest ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards,****
>>>>>
>>>>> irfan ****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>> that. But it is not a very wise setup.****
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> Warm Regards,****
>>>>>
>>>>> Tariq****
>>>>>
>>>>> cloudfront.blogspot.com****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  please suggest****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> irfan****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  thanks.****
>>>>>
>>>>> can i have setup like this :****
>>>>>
>>>>> namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
>>>>>
>>>>> and datanodes are a combination of any OS (windows, linux, unix etc)
>>>>>
>>>>> ** **
>>>>>
>>>>> however, my doubt is: as the file systems of the two systems (windows
>>>>> and linux) are different, can datanodes of these systems not be part of a
>>>>> single cluster? do i have to make the windows cluster and the UNIX cluster
>>>>> separate?
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>
>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>>> as Cygwin PIDs so that may be causing the discrepancy. I don't know how
>>>>> well Hadoop works in Cygwin as I have never tried it. Work is in progress
>>>>> for native Windows support however there are no official releases with
>>>>> Windows support yet. It may be easier to get familiar with a release
>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are
>>>>> new to it.
>>>>>
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  thanks ****
>>>>>
>>>>> here is what i did .****
>>>>>
>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh command
>>>>> ****
>>>>>
>>>>> then deleted all pid files for namenodes and datanodes ****
>>>>>
>>>>> ** **
>>>>>
>>>>> started dfs again with command : "./start-dfs.sh"****
>>>>>
>>>>> ** **
>>>>>
>>>>> when i ran the "Jps" command . it shows****
>>>>>
>>>>> ** **
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>
>>>>> $ ./jps.exe****
>>>>>
>>>>> 4536 Jps****
>>>>>
>>>>> 2076 NameNode****
>>>>>
>>>>> ** **
>>>>>
>>>>> however, when i open the pid file for the namenode, it shows pid 4560;
>>>>> on the contrary, it should show 2076
>>>>>
>>>>> ** **
>>>>>
>>>>> please suggest ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>> aagarwal@hortonworks.com> wrote:****
>>>>>
>>>>>  Most likely there is a stale pid file. Something like
>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>> the datanode.
>>>>>
>>>>> I haven't read the entire thread so you may have looked at this
>>>>> already.
>>>>>
>>>>> -Arpit****
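Arpit's stale-pid check can be sketched as a small shell script. The pid-file path below is illustrative; the actual location depends on the Hadoop configuration:

```shell
# If the pid recorded in the file no longer corresponds to a live
# process, the file is stale and can be removed before restarting.
PID_FILE="/tmp/hadoop-Administrator-datanode.pid"   # illustrative path

if [ -f "$PID_FILE" ]; then
  pid=$(cat "$PID_FILE")
  if kill -0 "$pid" 2>/dev/null; then
    echo "datanode already running with pid $pid; stop it first"
  else
    echo "stale pid file (pid $pid is not running); removing it"
    rm -f "$PID_FILE"
  fi
else
  echo "no pid file found; safe to start the datanode"
fi
```

`kill -0` only probes whether the process exists; it sends no signal.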
>>>>>
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  datanode is trying to connect to namenode continuously but fails ****
>>>>>
>>>>> ** **
>>>>>
>>>>> when i try to run "jps" command it says :****
>>>>>
>>>>> $ ./jps.exe****
>>>>>
>>>>> 4584 NameNode****
>>>>>
>>>>> 4016 Jps****
>>>>>
>>>>> ** **
>>>>>
>>>>> and when i ran the "./start-dfs.sh" then it says :****
>>>>>
>>>>> ** **
>>>>>
>>>>> $ ./start-dfs.sh****
>>>>>
>>>>> namenode running as process 3544. Stop it first.****
>>>>>
>>>>> DFS-1: datanode running as process 4076. Stop it first.****
>>>>>
>>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>>
>>>>> ** **
>>>>>
>>>>> both these logs are contradictory ****
>>>>>
>>>>> please find the attached logs ****
>>>>>
>>>>> ** **
>>>>>
>>>>> should i attach the conf files as well ?****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>>  ****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  Your DN is still not running. Showing me the logs would be helpful.
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> Warm Regards,****
>>>>>
>>>>> Tariq****
>>>>>
>>>>> cloudfront.blogspot.com****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  i followed the url and did the steps mention in that. i have
>>>>> deployed on the windows platform****
>>>>>
>>>>> ** **
>>>>>
>>>>> Now, i am able to browse url : http://localhost:50070 (name node )****
>>>>>
>>>>> however, not able to browse url : http://localhost:50030****
>>>>>
>>>>> ** **
>>>>>
>>>>> please refer below****
>>>>>
>>>>> ** **
>>>>>
>>>>> [image: Inline image 1]****
>>>>>
>>>>> ** **
>>>>>
>>>>> i have modified all the config files as mentioned and formatted the
>>>>> hdfs file system as well ****
>>>>>
>>>>> please suggest ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  thanks. i followed this url :
>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>> ****
>>>>>
>>>>> let me follow the url which you gave for pseudo distributed setup and
>>>>> then will switch to distributed mode****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> irfan ****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  You are welcome. Which link have you followed for the
>>>>> configuration? Your *core-site.xml* is empty. Remove the property
>>>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>>>> Remove *mapred.job.tracker* as well. It belongs in *mapred-site.xml*.
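Following Tariq's advice, the two properties would end up in files roughly like these. The host/port values are placeholders, not taken from this thread:

```xml
<!-- core-site.xml (host/port illustrative) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml (host/port illustrative) -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```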
>>>>>
>>>>> ** **
>>>>>
>>>>> I would suggest you to do a pseudo distributed setup first in order to
>>>>> get yourself familiar with the process and then proceed to the distributed
>>>>> mode. You can visit this link
>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>> if you need some help. Let me know if you face any issue.
>>>>>
>>>>> ** **
>>>>>
>>>>> HTH****
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> Warm Regards,****
>>>>>
>>>>> Tariq****
>>>>>
>>>>> cloudfront.blogspot.com****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  thanks tariq for response. ****
>>>>>
>>>>> as discussed last time, i have sent you all the config files in my
>>>>> setup . ****
>>>>>
>>>>> can you please go through that ?****
>>>>>
>>>>> ** **
>>>>>
>>>>> please let me know ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> irfan ****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  I'm sorry for being unresponsive. Was out of touch for some time
>>>>> because of Ramzan and Eid. Resuming work today.
>>>>>
>>>>> ** **
>>>>>
>>>>> What's the current status?****
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> Warm Regards,****
>>>>>
>>>>> Tariq****
>>>>>
>>>>> cloudfront.blogspot.com****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  First of all read the concepts ..I hope you will like it..****
>>>>>
>>>>>
>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  please suggest ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> irfan ****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  hey Tariq,****
>>>>>
>>>>> i am still stuck .. ****
>>>>>
>>>>> can you please suggest ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> irfan ****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  please suggest ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  attachment got quarantined ****
>>>>>
>>>>> resending in txt format. please rename it to conf.rar ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  thanks.****
>>>>>
>>>>> ** **
>>>>>
>>>>> if i run the jps command on namenode :****
>>>>>
>>>>> ** **
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>
>>>>> $ ./jps.exe****
>>>>>
>>>>> 3164 NameNode****
>>>>>
>>>>> 1892 Jps****
>>>>>
>>>>> ** **
>>>>>
>>>>> same command on datanode :****
>>>>>
>>>>> ** **
>>>>>
>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>>>
>>>>> $ ./jps.exe****
>>>>>
>>>>> 3848 Jps****
>>>>>
>>>>> ** **
>>>>>
>>>>> jps does not list any process for datanode. however, on web browser i
>>>>> can see one live data node ****
>>>>>
>>>>> please find the attached conf rar file of namenode ****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  OK. we'll start fresh. Could you plz show me your latest config
>>>>> files?****
>>>>>
>>>>> ** **
>>>>>
>>>>> BTW, are your daemons running fine?Use JPS to verify that.****
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> Warm Regards,****
>>>>>
>>>>> Tariq****
>>>>>
>>>>> cloudfront.blogspot.com****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  i have created these dir "wksp_data" and "wksp_name" on both
>>>>> datanode and namenode ****
>>>>>
>>>>> made the respective changes in "hdfs-site.xml" file ****
>>>>>
>>>>> formatted the namenode ****
>>>>>
>>>>> started the dfs ****
>>>>>
>>>>> ** **
>>>>>
>>>>> but still, not able to browse the file system through web browser ****
>>>>>
>>>>> please refer below ****
>>>>>
>>>>> ** **
>>>>>
>>>>> anything still missing ?****
>>>>>
>>>>> please suggest ****
>>>>>
>>>>> ** **
>>>>>
>>>>> [image: Inline image 1]****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  these dirs need to be created on all datanodes and namenodes ?
>>>>>
>>>>> further,  hdfs-site.xml needs to be updated on both datanodes and
>>>>> namenodes for these new dir?****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  Create 2 directories manually corresponding to the values of
>>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>>> will start going inside the directory specified by dfs.data.dir and the
>>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>>> cannot see this data directly on your local/native FS.****
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> Warm Regards,****
>>>>>
>>>>> Tariq****
>>>>>
>>>>> cloudfront.blogspot.com****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  thanks. ****
>>>>>
>>>>> however, i need this to be working on windows environment as project
>>>>> requirement.****
>>>>>
>>>>> i will add/work on Linux later ****
>>>>>
>>>>> ** **
>>>>>
>>>>> so, now , at this stage , c:\\wksp is the HDFS file system OR do i
>>>>> need to create it from command line ?****
>>>>>
>>>>> ** **
>>>>>
>>>>> please suggest****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards,****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  Hello Irfan,****
>>>>>
>>>>> ** **
>>>>>
>>>>> Sorry for being unresponsive. Got stuck with some imp work.****
>>>>>
>>>>> ** **
>>>>>
>>>>> HDFS webUI doesn't provide us the ability to create a file or directory.
>>>>> You can browse HDFS, view files, download files etc. But operations like
>>>>> create, move, copy etc are not supported.
>>>>>
>>>>> ** **
>>>>>
>>>>> These values look fine to me.****
>>>>>
>>>>> ** **
>>>>>
>>>>> One suggestion though. Try getting a Linux machine(if possible). Or at
>>>>> least use a VM. I personally feel that using Hadoop on windows is always
>>>>> messy.****
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> Warm Regards,****
>>>>>
>>>>> Tariq****
>>>>>
>>>>> cloudfront.blogspot.com****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  thanks.****
>>>>>
>>>>> when i browse the file system , i am getting following :****
>>>>>
>>>>> i haven't seen any make directory option there ****
>>>>>
>>>>> ** **
>>>>>
>>>>> i need to create it from command line ?****
>>>>>
>>>>> further, in the hdfs-site.xml file , i have given following entries.
>>>>> are they correct ? ****
>>>>>
>>>>> ** **
>>>>>
>>>>> <property>****
>>>>>
>>>>>   <name>dfs.data.dir</name>****
>>>>>
>>>>>   <value>c:\\wksp</value>****
>>>>>
>>>>>   </property>****
>>>>>
>>>>> <property>****
>>>>>
>>>>>   <name>dfs.name.dir</name>****
>>>>>
>>>>>   <value>c:\\wksp</value>****
>>>>>
>>>>>   </property>****
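One likely issue with the snippet above: both properties point at the same directory, and later in the thread separate wksp_name and wksp_data directories are created. A sketch with distinct directories (the directory names are illustrative):

```xml
<!-- hdfs-site.xml: keep the name and data directories separate -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>c:\\wksp_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>c:\\wksp_data</value>
  </property>
</configuration>
```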
>>>>>
>>>>> ** **
>>>>>
>>>>> please suggest ****
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> [image: Inline image 1]****
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  *You are wrong at this:*****
>>>>>
>>>>> ** **
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>>>>
>>>>> $ ./hadoop dfs -copyFromLocal
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp****
>>>>>
>>>>> copyFromLocal: File
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>> ****
>>>>>
>>>>> ** **
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>>>>
>>>>> $ ./hadoop dfs -copyFromLocal
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp****
>>>>>
>>>>> copyFromLocal: File
>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>> ****
>>>>>
>>>>> ** **
>>>>>
>>>>> Because you wrote both paths as local paths. And you need not copy
>>>>> hadoop into hdfs... Hadoop is already working..
>>>>>
>>>>> ** **
>>>>>
>>>>> Just check out in the browser after starting ur single node cluster :
>>>>>
>>>>> ** **
>>>>>
>>>>> localhost:50070****
>>>>>
>>>>> ** **
>>>>>
>>>>> then go for browse the filesystem link in it..****
>>>>>
>>>>> ** **
>>>>>
>>>>> If there is no directory then make directory there.****
>>>>>
>>>>> That is your hdfs directory.****
>>>>>
>>>>> Then copy any text file there (no need to copy hadoop there), because u
>>>>> are going to do processing on the data in that text file. That's what hadoop
>>>>> is used for; first u need to make that clear in ur mind. Then and only then
>>>>> will u do it... otherwise not possible..
>>>>>
>>>>> ** **
>>>>>
>>>>> *Try this: *****
>>>>>
>>>>> ** **
>>>>>
>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>>> /hdfs/directory/path
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> ** **
>>>>>
>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>>> wrote:****
>>>>>
>>>>>  thanks. yes, i am a newbie.
>>>>>
>>>>> however, i need windows setup.****
>>>>>
>>>>> ** **
>>>>>
>>>>> let me surely refer the doc and link which u sent but i need this to
>>>>> be working ...****
>>>>>
>>>>> can you please help****
>>>>>
>>>>> ** **
>>>>>
>>>>> regards****
>>>>>
>>>>> ** **
>>>>>
>>>>>  ****
>>>>>
>>>>> ** **
>>>>>
>>>>>
>>>>>
>>>>> ****
>>>>>
>>>>> ** **
>>>>>
>>>>> --
>>>>> MANISH DUNANI
>>>>> -THANX
>>>>> +91 9426881954,+91 8460656443****
>>>>>
>>>>> manishd207@gmail.com****
>>>>>
>>>>>
>>>>>
>>>>> -- ****
>>>>>
>>>>> Regards****
>>>>>
>>>>> *Manish Dunani*****
>>>>>
>>>>> *Contact No* : +91 9408329137****
>>>>>
>>>>> *skype id* : manish.dunani****
>>>>>
>>>>> ** **
>>>>>
>>>>>
>>>>>
>>>>> CONFIDENTIALITY NOTICE
>>>>> NOTICE: This message is intended for the use of the individual or
>>>>> entity to which it is addressed and may contain information that is
>>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>>> If the reader of this message is not the intended recipient, you are hereby
>>>>> notified that any printing, copying, dissemination, distribution,
>>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>>> you have received this communication in error, please contact the sender
>>>>> immediately and delete it from your system. Thank You.****
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> ** **
>>>>>
>>>>> --
>>>>>
>>>>> Olivier Renault
>>>>>
>>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>>> +44 7500 933 036
>>>>> orenault@hortonworks.com
>>>>> www.hortonworks.com
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>


Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards
irfan



On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
> irfan
>
>
>
> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> ok.. now i made some changes and the installation went ahead
>> but failed on the "HIVE_SERVER_HOST" property declaration
>> in the cluster config file, i have commented this property out. if i uncomment
>> it, then what server address should i give ???
>>
>> i have only two windows machines setup.
>> 1: for namenode and another for datanode
>>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>
>>> thanks.
>>> i installed the latest java in the c:\java folder and now there is no error
>>> in the log file related to java
>>> however, now it is throwing an error about not having the cluster properties
>>> file. in fact i am running/installing hdp from the location where this file
>>> exists . still it is throwing the error
>>>
>>> please find the attached
>>>
>>> [image: Inline image 1]
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>> ravimu@microsoft.com> wrote:
>>>
>>>>  Here’s your issue (from the logs you attached earlier):
>>>>
>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>
>>>> It seems that you installed the Java prerequisite in the default path,
>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>> does not support spaces in paths, so you need to reinstall Java under c:\java\
>>>> or somewhere similar (a path with no spaces).
>>>>
>>>> ** **
>>>>
>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>> *To:* user@hadoop.apache.org
>>>> *Subject:* Re: about replication****
>>>>
>>>> ** **
>>>>
>>>> please find the attached.****
>>>>
>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>> as it is not generated ****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> irfan****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:****
>>>>
>>>>  Could you share the log files ( c:\hdp.log,
>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>>>> well as your clusterproperties.txt ?****
>>>>
>>>> ** **
>>>>
>>>> Thanks, ****
>>>>
>>>> Olivier****
>>>>
>>>> ** **
>>>>
>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>
>>>>  thanks. i followed the user manual for deployment and installed all
>>>> pre-requisites
>>>>
>>>> i modified the command and still the issue persists. please suggest
>>>>
>>>> ** **
>>>>
>>>> please refer below ****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> [image: Inline image 1]****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> irfan ****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:****
>>>>
>>>> The command to install it is msiexec /i msifile /...
>>>> You will find the correct syntax as part of the doc.
>>>>
>>>> Happy reading
>>>> Olivier
>>>>
>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>
>>>>  thanks.
>>>> i referred the logs and manuals. i modified the clusterproperties file
>>>> and then double-clicked the msi file
>>>> however, it still failed.
>>>> further, i started the installation on the command line by giving
>>>> HDP_LAYOUT=<clusterproperties file path>,
>>>> installation went ahead but failed on the .NET framework 4.0 and VC++
>>>> redistributable package dependency
>>>>
>>>> i installed both and started the installation again.
>>>> it failed again with the following error
>>>> [image: Inline image 1]
>>>>
>>>> when i search for the logs mentioned in the error , i never find them
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>> Correct, you need to define the cluster configuration as part of a
>>>> file. You will find some information on the configuration file as part of
>>>> the documentation.
>>>>
>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>
>>>> You should also make sure you have installed the prerequisites.
>>>>
>>>> Thanks
>>>> Olivier
>>>>
>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>  thanks. sorry for the long break. actually got involved in some other
>>>> priorities
>>>> i downloaded the installer and while installing i got the following error
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> do i need to make any configuration prior to installation ??
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:****
>>>>
>>>> Here is the link
>>>>
>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>
>>>> Olivier
>>>>
>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>  thanks.
>>>> i just followed the instructions to set up the pseudo-distributed setup
>>>> first using the url :
>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>
>>>> i don't think i am running a DN on both machines
>>>> please find the attached log
>>>>
>>>> hi olivier
>>>>
>>>> can you please give me the download link ?
>>>> let me try please
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:****
>>>>
>>>>  Are you running DN on both the machines? Could you please show me
>>>> your DN logs?
>>>>
>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:****
>>>>
>>>> Irfan,
>>>>
>>>> If you want to quickly get Hadoop running on the windows platform, you may
>>>> want to try our distribution for Windows. You will be able to find the msi
>>>> on our website.
>>>>
>>>> Regards
>>>> Olivier
>>>>
>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>
>>>>  thanks.
>>>> ok. i think i need to change the plan over here
>>>> let me create two environments. 1: totally windows 2: totally unix
>>>>
>>>> because, on windows , anyway i have to try and see how hadoop works
>>>> on unix, it is already known that it is working fine.
>>>>
>>>> so, on windows , here is the setup:
>>>>
>>>> namenode : windows 2012 R2
>>>> datanode : windows 2012 R2
>>>>
>>>> now, the exact problems are :
>>>> 1: the datanode is not getting started
>>>> 2: replication : if i put any file/folder on any datanode , it should
>>>> get replicated to all other available datanodes
>>>>
>>>> regards
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:****
>>>>
>>>>  Seriously?? You are planning to develop something using Hadoop on
>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>> also need some additional info :
>>>> - The exact problem which you are facing right now
>>>> - Your cluster summary (no. of nodes etc)
>>>> - Your latest configuration files
>>>> - Your /etc/hosts file
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> ** **
>>>>
>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  ok. thanks
>>>> now, i need to start with the all-windows setup first as our product will
>>>> be based on windows
>>>> so, now, please tell me how to resolve the issue
>>>>
>>>> the datanode is not starting . please suggest
>>>>
>>>> regards,
>>>> irfan
>>>>
>>>>
>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:****
>>>>
>>>>  It is possible. Theoretically Hadoop doesn't stop you from doing
>>>> that. But it is not a very wise setup.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  thanks.
>>>> can i have a setup like this :
>>>> the namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
>>>> and the datanodes are a combination of any OS (windows , linux , unix etc)
>>>>
>>>> however, my doubt is, as the file systems of both the systems (win
>>>> and linux) are different , datanodes of these systems can not be part of a
>>>> single cluster . do i have to make the windows cluster separate and the
>>>> unix cluster separate ?
>>>>
>>>> regards
>>>>
>>>>
>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>> aagarwal@hortonworks.com> wrote:****
>>>>
>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
>>>> Cygwin PIDs so that may be causing the discrepancy. I don't know how well
>>>> Hadoop works in Cygwin as I have never tried it. Work is in progress for
>>>> native Windows support, however there are no official releases with Windows
>>>> support yet. It may be easier to get familiar with a release
>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>
>>>>
>>>>
>>>>
>>>> ****
>>>>
>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  thanks
>>>> here is what i did .
>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command
>>>> then deleted all pid files for namenodes and datanodes
>>>>
>>>> started dfs again with the command : "./start-dfs.sh"
>>>>
>>>> when i ran the "jps" command , it shows :
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 4536 Jps
>>>> 2076 NameNode
>>>>
>>>> however, when i open the pid file for the namenode, it shows the pid
>>>> as 4560. on the contrary, it shud show : 2076
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>>
>>>>
>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>> aagarwal@hortonworks.com> wrote:****
>>>>
>>>>  Most likely there is a stale pid file. Something like
>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>> the datanode.
>>>>
>>>> I haven't read the entire thread so you may have looked at this already.
>>>>
>>>> -Arpit****
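Arpit's stale-pid check can be scripted. This is a sketch, assuming the default pid directory (set HADOOP_PID_DIR if yours differs); it removes only pid files whose recorded process no longer exists:

```shell
# Delete Hadoop pid files whose recorded process is gone, so start-dfs.sh
# will not refuse to start with "running as process N. Stop it first."
PID_DIR="${HADOOP_PID_DIR:-/tmp}"
for f in "$PID_DIR"/hadoop-*.pid; do
  [ -e "$f" ] || continue          # glob matched nothing
  pid=$(cat "$f")
  if ! kill -0 "$pid" 2>/dev/null; then   # signal 0 = existence check only
    echo "removing stale pid file: $f (pid $pid)"
    rm -f "$f"
  fi
done
```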
>>>>
>>>>
>>>>
>>>> ****
>>>>
>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  the datanode is trying to connect to the namenode continuously but fails
>>>>
>>>> when i try to run the "jps" command it says :
>>>>
>>>> $ ./jps.exe
>>>> 4584 NameNode
>>>> 4016 Jps
>>>>
>>>> and when i ran "./start-dfs.sh" it says :
>>>>
>>>> $ ./start-dfs.sh
>>>> namenode running as process 3544. Stop it first.
>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>
>>>> both these logs are contradictory
>>>> please find the attached logs
>>>>
>>>> should i attach the conf files as well ?
>>>>
>>>> regards
>>>>
>>>>
>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:****
>>>>
>>>>  Your DN is still not running. Showing me the logs would be helpful.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>> ** **
>>>>
>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  i followed the url and did the steps mentioned in it. i have deployed
>>>> on the windows platform
>>>>
>>>> now, i am able to browse the url : http://localhost:50070 (name node)
>>>> however, i am not able to browse the url : http://localhost:50030
>>>>
>>>> please refer below
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> i have modified all the config files as mentioned and formatted the
>>>> hdfs file system as well
>>>> please suggest
>>>>
>>>> regards
>>>>
>>>>
>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  thanks. i followed this url :
>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>
>>>> let me follow the url which you gave for the pseudo-distributed setup and
>>>> then i will switch to distributed mode
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:****
>>>>
>>>>  You are welcome. Which link have you followed for the
>>>> configuration? Your core-site.xml is empty. Remove the property
>>>> fs.default.name from hdfs-site.xml and add it to core-site.xml.
>>>> Remove mapred.job.tracker as well. It is required in mapred-site.xml.
>>>>
>>>> I would suggest you do a pseudo-distributed setup first in order to
>>>> get yourself familiar with the process and then proceed to the distributed
>>>> mode. You can visit this link
>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>> if you need some help. Let me know if you face any issue.
>>>>
>>>> HTH
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
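Concretely, the split Tariq describes looks like this for a Hadoop 1.x pseudo-distributed setup (host and port values below are the usual defaults, not taken from the thread; adjust to your machine):

```xml
<!-- core-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```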
>>>>
>>>> ** **
>>>>
>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  thanks tariq for the response.
>>>> as discussed last time, i have sent you all the config files in my
>>>> setup .
>>>> can you please go through them ?
>>>>
>>>> please let me know
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:****
>>>>
>>>>  I'm sorry for being unresponsive. Was out of touch for some time
>>>> because of ramzan and eid. Resuming work today.
>>>>
>>>> What's the current status?
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> ** **
>>>>
>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>> wrote:****
>>>>
>>>>  First of all read the concepts .. I hope you will like it..
>>>>
>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>
>>>>
>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  hey Tariq,
>>>> i am still stuck ..
>>>> can you please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  please suggest
>>>>
>>>> regards
>>>>
>>>>
>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  the attachment got quarantined
>>>> resending in txt format. please rename it to conf.rar
>>>>
>>>> regards
>>>>
>>>>
>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  thanks.
>>>>
>>>> if i run the jps command on the namenode :
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 3164 NameNode
>>>> 1892 Jps
>>>>
>>>> same command on the datanode :
>>>>
>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 3848 Jps
>>>>
>>>> jps does not list any process for the datanode. however, in the web browser
>>>> i can see one live data node
>>>> please find the attached conf rar file of the namenode
>>>>
>>>> regards
>>>>
>>>>
>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:****
>>>>
>>>>  OK. we'll start fresh. Could you plz show me your latest config files?
>>>>
>>>> BTW, are your daemons running fine? Use JPS to verify that.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  i have created the dirs "wksp_data" and "wksp_name" on both the datanode
>>>> and the namenode
>>>> made the respective changes in the "hdfs-site.xml" file
>>>> formatted the namenode
>>>> started the dfs
>>>>
>>>> but still, i am not able to browse the file system through the web browser
>>>> please refer below
>>>>
>>>> anything still missing ?
>>>> please suggest
>>>>
>>>> [image: Inline image 1]
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  do these dirs need to be created on all datanodes and namenodes ?
>>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>>> namenodes for these new dirs?
>>>>
>>>> regards
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:****
>>>>
>>>>  Create 2 directories manually corresponding to the values of the
>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>> will start going inside the directory specified by dfs.data.dir and the
>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>> cannot see this data directly on your local/native FS.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
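As a sketch, the hdfs-site.xml Tariq is asking for keeps the two paths distinct; the directory names below match the wksp_name/wksp_data dirs created later in the thread, but any two separate paths work:

```xml
<!-- hdfs-site.xml: metadata and block storage must not share a directory -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```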
>>>>
>>>> ** **
>>>>
>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  thanks.
>>>> however, i need this to be working on a windows environment as a project
>>>> requirement.
>>>> i will add/work on linux later
>>>>
>>>> so, now , at this stage , is c:\\wksp the HDFS file system OR do i need
>>>> to create it from the command line ?
>>>>
>>>> please suggest
>>>>
>>>> regards,
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:****
>>>>
>>>>  Hello Irfan,
>>>>
>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>
>>>> The HDFS webUI doesn't provide us the ability to create a file or directory.
>>>> You can browse HDFS, view files, download files etc. But operations like
>>>> create, move, copy etc are not supported.
>>>>
>>>> These values look fine to me.
>>>>
>>>> One suggestion though. Try getting a Linux machine (if possible). Or at
>>>> least use a VM. I personally feel that using Hadoop on windows is always
>>>> messy.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> ** **
>>>>
>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  thanks.
>>>> when i browse the file system , i am getting the following :
>>>> i haven't seen any make-directory option there
>>>>
>>>> do i need to create it from the command line ?
>>>> further, in the hdfs-site.xml file , i have given the following entries.
>>>> are they correct ?
>>>>
>>>> <property>
>>>>   <name>dfs.data.dir</name>
>>>>   <value>c:\\wksp</value>
>>>> </property>
>>>> <property>
>>>>   <name>dfs.name.dir</name>
>>>>   <value>c:\\wksp</value>
>>>> </property>
>>>>
>>>> please suggest
>>>>
>>>> [image: Inline image 1]
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>> wrote:****
>>>>
>>>>  *You are wrong at this:*
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>> $ ./hadoop dfs -copyFromLocal
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>> copyFromLocal: File
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>> $ ./hadoop dfs -copyFromLocal
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>> copyFromLocal: File
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>
>>>> Because you wrote both paths as local, and you need not copy hadoop
>>>> into hdfs... Hadoop is already working..
>>>>
>>>> Just check it out in the browser after starting ur single-node cluster :
>>>>
>>>> localhost:50070
>>>>
>>>> then go for the "browse the filesystem" link in it..
>>>>
>>>> If there is no directory then make a directory there.
>>>> That is your hdfs directory.
>>>> Then copy any text file there (no need to copy hadoop there), because u
>>>> are going to do processing on the data in that text file. That's what hadoop
>>>> is used for; first u need to make it clear in ur mind. Then and only then u
>>>> will do it... otherwise not possible..
>>>>
>>>> *Try this:*
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>> /hdfs/directory/path
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  thanks. yes , i am newbie.****
>>>>
>>>> however, i need windows setup.****
>>>>
>>>> ** **
>>>>
>>>> let me surely refer the doc and link which u sent but i need this to be
>>>> working ...****
>>>>
>>>> can you please help****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> ** **
>>>>
>>>>
>>>> --
>>>> MANISH DUNANI
>>>> -THANX
>>>> +91 9426881954, +91 8460656443
>>>> manishd207@gmail.com
>>>>
>>>> --
>>>> Regards
>>>> *Manish Dunani*
>>>> *Contact No* : +91 9408329137
>>>> *skype id* : manish.dunani
>>>>
>>>>
>>>> CONFIDENTIALITY NOTICE
>>>> NOTICE: This message is intended for the use of the individual or
>>>> entity to which it is addressed and may contain information that is
>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>> If the reader of this message is not the intended recipient, you are hereby
>>>> notified that any printing, copying, dissemination, distribution,
>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>> you have received this communication in error, please contact the sender
>>>> immediately and delete it from your system. Thank You.
>>>> --
>>>> Olivier Renault
>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>> +44 7500 933 036
>>>> orenault@hortonworks.com
>>>> www.hortonworks.com
>>>>
>>>
>>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards
irfan



On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
> irfan
>
>
>
> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> ok.. now i made some changes and the installation went ahead
>> but failed on the property "HIVE_SERVER_HOST" declaration
>> in the cluster config file, i have commented this property out. if i
>> uncomment it, then what server address should i give ???
>>
>> i have only two windows machines setup.
>> 1: for namenode and another for datanode
>>
>> please suggest
>>
>> regards
>> irfan
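On the HIVE_SERVER_HOST question: in a two-machine setup the master-only services can all point at the namenode machine. A sketch of the relevant clusterproperties.txt lines follows (hostnames are illustrative, and the exact property names should be verified against the HDP 1.3 Windows template, as this is reconstructed from memory of those docs):

```text
#Hosts -- single-purpose services can share the master node in a small cluster
NAMENODE_HOST=master-node
SECONDARY_NAMENODE_HOST=master-node
JOBTRACKER_HOST=master-node
HIVE_SERVER_HOST=master-node
OOZIE_SERVER_HOST=master-node
SLAVE_HOSTS=data-node1
```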
>>
>>
>>
>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>
>>> thanks.
>>> i installed the latest java in the c:\java folder and now there is no error
>>> in the log file related to java
>>> however, now it is throwing an error about not having the cluster
>>> properties file. in fact i am running/installing hdp from the directory
>>> where this file exists, yet it still throws the error
>>>
>>> please find the attached
>>>
>>> [image: Inline image 1]
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>> ravimu@microsoft.com> wrote:
>>>
>>>>  Here’s your issue (from the logs you attached earlier):****
>>>>
>>>> ** **
>>>>
>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...****
>>>>
>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.****
>>>>
>>>> ** **
>>>>
>>>> It seems that you installed Java prerequisite in the default path,
>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>>> does not like spaces in paths, do you need to reinstall Java under c:\java\
>>>> or something similar (in a path with no spaces).****
>>>>
>>>> ** **
>>>>
>>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>>> *To:* user@hadoop.apache.org
>>>> *Subject:* Re: about replication****
>>>>
>>>> ** **
>>>>
>>>> please find the attached.****
>>>>
>>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>> as it is not generated ****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> irfan****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:****
>>>>
>>>>  Could you share the log files ( c:\hdp.log,
>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>>>> well as your clusterproperties.txt ?****
>>>>
>>>> ** **
>>>>
>>>> Thanks, ****
>>>>
>>>> Olivier****
>>>>
>>>> ** **
>>>>
>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:***
>>>> *
>>>>
>>>>  thanks. i followed the user manual for deployment and installed all
>>>> pre-requisites ****
>>>>
>>>> i modified the command and still the issue persist. please suggest ****
>>>>
>>>> ** **
>>>>
>>>> please refer below ****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> [image: Inline image 1]****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> irfan ****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:****
>>>>
>>>> The command to install it is msiexec /i msifile /...  ****
>>>>
>>>> You will find the correct syntax as part of doc. ****
>>>>
>>>> Happy reading
>>>> Olivier ****
>>>>
>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>  thanks.
>>>>
>>>> i referred the logs and manuals. i modified the clusterproperties file
>>>> and then double-clicked the msi file
>>>> however, it still failed.
>>>>
>>>> further, i started the installation on the command line by giving
>>>> HDP_LAYOUT=<clusterproperties file path>,
>>>> the installation went ahead but it failed on the .NET Framework 4.0 and
>>>> VC++ redistributable package dependency
>>>>
>>>> i installed both and started the installation again.
>>>> it failed again with the following error
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> when i searched for the logs mentioned in the error , i never found them
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>> Correct, you need to define the cluster configuration as part of a
>>>> file. You will find some information on the configuration file as part of
>>>> the documentation.
>>>>
>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>
>>>> You should also make sure to have installed the prerequisites.
>>>>
>>>> Thanks
>>>> Olivier
>>>>
>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>  thanks. sorry for the long break. actually got involved in some other
>>>> priorities
>>>>
>>>> i downloaded the installer and while installing i got the following error
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> do i need to make any configuration prior to installation ??
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>> Here is the link
>>>>
>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>
>>>> Olivier
>>>>
>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>  thanks.
>>>>
>>>> i just followed the instructions to set up the pseudo-distributed setup
>>>> first using the url :
>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>
>>>> i don't think i am running a DN on both machines
>>>> please find the attached log
>>>>
>>>> hi olivier
>>>>
>>>> can you please give me the download link ?
>>>> let me try please
>>>>
>>>> regards
>>>> irfan
>>>>
>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>>  Are you running a DN on both the machines? Could you please show me
>>>> your DN logs?
>>>>
>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>> Irfan,
>>>>
>>>> If you want to quickly get Hadoop running on the windows platform, you
>>>> may want to try our distribution for Windows. You will be able to find
>>>> the msi on our website.
>>>>
>>>> Regards
>>>> Olivier
>>>>
>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>  thanks.
>>>>
>>>> ok. i think i need to change the plan over here
>>>> let me create two environments. 1: totally windows 2: totally unix
>>>>
>>>> because, on windows , anyway i have to try and see how hadoop works
>>>> on unix, it is already known that it is working fine.
>>>>
>>>> so, on windows , here is the setup:
>>>>
>>>> namenode : windows 2012 R2
>>>> datanode : windows 2012 R2
>>>>
>>>> now, the exact problem is :
>>>> 1: the datanode is not getting started
>>>> 2: replication : if i put any file/folder on any datanode , it should
>>>> get replicated to all other available datanodes
>>>>
>>>> regards
>>>>
>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>>  Seriously?? You are planning to develop something using Hadoop on
>>>> windows. Not a good idea. Anyways, could you plz show me your log files?
>>>> I also need some additional info :
>>>>
>>>> - The exact problem which you are facing right now
>>>> - Your cluster summary (no. of nodes etc.)
>>>> - Your latest configuration files
>>>> - Your /etc/hosts file
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  ok. thanks
>>>>
>>>> now, i need to start with the all-windows setup first as our product
>>>> will be based on windows
>>>> so, now, please tell me how to resolve the issue
>>>>
>>>> the datanode is not starting . please suggest
>>>>
>>>> regards,
>>>> irfan
>>>>
>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>>  It is possible. Theoretically Hadoop doesn't stop you from doing
>>>> that. But it is not a very wise setup.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  thanks.
>>>>
>>>> can i have a setup like this :
>>>> the namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
>>>> and the datanodes are a combination of any OS (windows , linux , unix
>>>> etc)
>>>>
>>>> however, my doubt is,  as the file systems of both the systems (win
>>>> and linux) are different ,  can datanodes of these systems not be part
>>>> of a single cluster ? do i have to make the windows cluster separate and
>>>> the unix cluster separate ?
>>>>
>>>> regards
>>>>
>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>> aagarwal@hortonworks.com> wrote:
>>>>
>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
>>>> Cygwin PIDs, so that may be causing the discrepancy. I don't know how
>>>> well Hadoop works in Cygwin as I have never tried it. Work is in
>>>> progress for native Windows support; however, there are no official
>>>> releases with Windows support yet. It may be easier to get familiar with
>>>> a release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
>>>> Linux if you are new to it.
>>>>
>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  thanks
>>>>
>>>> here is what i did .
>>>>
>>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>> command
>>>> then deleted all pid files for the namenodes and datanodes
>>>>
>>>> started dfs again with the command : "./start-dfs.sh"
>>>>
>>>> when i ran the "jps" command , it shows
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 4536 Jps
>>>> 2076 NameNode
>>>>
>>>> however, when i open the pid file for the namenode, it is showing the
>>>> pid as 4560. on the contrary, it should show 2076
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>>
>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>> aagarwal@hortonworks.com> wrote:
>>>>
>>>>  Most likely there is a stale pid file. Something like
>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>> the datanode.
>>>>
>>>> I haven't read the entire thread so you may have looked at this already.
>>>>
>>>> -Arpit
>>>>
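The stale-pid check Arpit describes can be sketched in shell. The pid-file path below is a stand-in under /tmp, not the real Hadoop location; the point is only the test itself: if the pid recorded in the file no longer belongs to a live process, the file is stale and can be removed before restarting the daemon.

```shell
PID_FILE=/tmp/hadoop-demo-datanode.pid   # stand-in for \tmp\hadoop-*datanode.pid

# Simulate a stale pid file: record the pid of a process that has already exited.
sleep 0 &
STALE_PID=$!
wait "$STALE_PID"
echo "$STALE_PID" > "$PID_FILE"

# If the recorded pid is not alive, the file is stale -- delete it.
if [ -f "$PID_FILE" ] && ! kill -0 "$(cat "$PID_FILE")" 2>/dev/null; then
    echo "removing stale pid file: $PID_FILE"
    rm -f "$PID_FILE"
fi
```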
>>>>
>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  the datanode is trying to connect to the namenode continuously but
>>>> fails
>>>>
>>>> when i try to run the "jps" command it says :
>>>>
>>>> $ ./jps.exe
>>>> 4584 NameNode
>>>> 4016 Jps
>>>>
>>>> and when i ran "./start-dfs.sh" then it says :
>>>>
>>>> $ ./start-dfs.sh
>>>> namenode running as process 3544. Stop it first.
>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>
>>>> both these logs are contradictory
>>>> please find the attached logs
>>>>
>>>> should i attach the conf files as well ?
>>>>
>>>> regards
>>>>
>>>>
>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>>  Your DN is still not running. Showing me the logs would be helpful.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  i followed the url and did the steps mentioned in it. i have deployed
>>>> on the windows platform
>>>>
>>>> Now, i am able to browse the url : http://localhost:50070 (name node)
>>>> however, i am not able to browse the url : http://localhost:50030
>>>>
>>>> please refer below
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> i have modified all the config files as mentioned and formatted the
>>>> hdfs file system as well
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>>
>>>>
>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  thanks. i followed this url :
>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>
>>>> let me follow the url which you gave for the pseudo-distributed setup
>>>> and then i will switch to distributed mode
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>>  You are welcome. Which link have you followed for the
>>>> configuration? Your core-site.xml is empty. Remove the property
>>>> fs.default.name from hdfs-site.xml and add it to core-site.xml.
>>>> Remove mapred.job.tracker as well. It is required in mapred-site.xml.
>>>>
>>>> I would suggest you do a pseudo-distributed setup first in order to
>>>> get yourself familiar with the process and then proceed to the
>>>> distributed mode. You can visit this link
>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>> if you need some help. Let me know if you face any issue.
>>>>
>>>> HTH
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
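The split Tariq describes can be sketched as below. These are two separate files; the port values are conventional Hadoop 1.x pseudo-distributed defaults, not values taken from this thread, and the hostnames should be adapted to the actual cluster.

```xml
<!-- core-site.xml: fs.default.name belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml (a separate file): mapred.job.tracker belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```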
>>>>
>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  thanks tariq for the response.
>>>>
>>>> as discussed last time, i have sent you all the config files in my
>>>> setup .
>>>> can you please go through them ?
>>>>
>>>> please let me know
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>>  I'm sorry for being unresponsive. Was out of touch for some time
>>>> because of Ramzan and Eid. Resuming work today.
>>>>
>>>> What's the current status?
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>> wrote:
>>>>
>>>>  First of all, read the concepts .. I hope you will like it..
>>>>
>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>
>>>>
>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  hey Tariq,
>>>>
>>>> i am still stuck ..
>>>> can you please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  please suggest
>>>>
>>>> regards
>>>>
>>>>
>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  the attachment got quarantined
>>>>
>>>> resending in txt format. please rename it to conf.rar
>>>>
>>>> regards
>>>>
>>>>
>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  thanks.
>>>>
>>>> if i run the jps command on the namenode :
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 3164 NameNode
>>>> 1892 Jps
>>>>
>>>> same command on the datanode :
>>>>
>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 3848 Jps
>>>>
>>>> jps does not list any process for the datanode. however, in the web
>>>> browser i can see one live data node
>>>>
>>>> please find the attached conf rar file of the namenode
>>>>
>>>> regards
>>>>
>>>>
>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>>  OK. We'll start fresh. Could you plz show me your latest config files?
>>>>
>>>> BTW, are your daemons running fine? Use jps to verify that.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  i have created the dirs "wksp_data" and "wksp_name" on both the
>>>> datanode and the namenode
>>>>
>>>> made the respective changes in the "hdfs-site.xml" file
>>>> formatted the namenode
>>>> started the dfs
>>>>
>>>> but still, i am not able to browse the file system through the web
>>>> browser
>>>>
>>>> please refer below
>>>>
>>>> anything still missing ?
>>>> please suggest
>>>>
>>>> [image: Inline image 1]
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  do these dirs need to be created on all datanodes and namenodes ?
>>>>
>>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>>> namenodes for these new dirs?
>>>>
>>>> regards
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>>  Create 2 directories manually corresponding to the values of the
>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>> these directories to 755. When you start pushing data into your HDFS,
>>>> data will start going inside the directory specified by dfs.data.dir and
>>>> the associated metadata will go inside dfs.name.dir. Remember, you store
>>>> data in HDFS, but it eventually gets stored in your local/native FS. But
>>>> you cannot see this data directly on your local/native FS.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
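Tariq's directory preparation can be sketched in shell. The paths are illustrative stand-ins under /tmp; a real setup would use the values configured in hdfs-site.xml.

```shell
# Placeholder paths -- substitute the dfs.name.dir / dfs.data.dir values
# from your own hdfs-site.xml.
NAME_DIR=/tmp/hdfs-demo/wksp_name    # dfs.name.dir (metadata)
DATA_DIR=/tmp/hdfs-demo/wksp_data    # dfs.data.dir (block data)

mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"

# Verify the permissions before formatting the namenode (GNU stat).
stat -c '%a %n' "$NAME_DIR" "$DATA_DIR"
```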
>>>>
>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  thanks.
>>>>
>>>> however, i need this to be working on the windows environment as a
>>>> project requirement.
>>>> i will add/work on Linux later
>>>>
>>>> so, now , at this stage , is c:\\wksp the HDFS file system OR do i need
>>>> to create it from the command line ?
>>>>
>>>> please suggest
>>>>
>>>> regards,
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>>  Hello Irfan,
>>>>
>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>
>>>> The HDFS web UI doesn't provide us the ability to create a file or
>>>> directory. You can browse HDFS, view files, download files etc. But
>>>> operations like create, move, copy etc are not supported.
>>>>
>>>> These values look fine to me.
>>>>
>>>> One suggestion though. Try getting a Linux machine (if possible). Or at
>>>> least use a VM. I personally feel that using Hadoop on windows is always
>>>> messy.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  thanks.
>>>>
>>>> when i browse the file system , i am getting the following :
>>>> i haven't seen any make-directory option there
>>>>
>>>> do i need to create it from the command line ?
>>>> further, in the hdfs-site.xml file , i have given the following entries.
>>>> are they correct ?
>>>>
>>>> <property>
>>>>   <name>dfs.data.dir</name>
>>>>   <value>c:\\wksp</value>
>>>> </property>
>>>> <property>
>>>>   <name>dfs.name.dir</name>
>>>>   <value>c:\\wksp</value>
>>>> </property>
>>>>
>>>> please suggest
>>>>
>>>> [image: Inline image 1]
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>> wrote:
>>>>
>>>>  You are wrong at this:
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>> $ ./hadoop dfs -copyFromLocal
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>> copyFromLocal: File
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>> $ ./hadoop dfs -copyFromLocal
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>> copyFromLocal: File
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not
>>>> exist.
>>>>
>>>> Because you wrote both paths as local paths. And you need not copy
>>>> hadoop into hdfs... Hadoop is already working..
>>>>
>>>> Just check it out in the browser after starting ur single node cluster :
>>>>
>>>> localhost:50070
>>>>
>>>> then go for the "browse the filesystem" link in it..
>>>>
>>>> If there is no directory then make a directory there.
>>>> That is your hdfs directory.
>>>> Then copy any text file there (no need to copy hadoop there), because u
>>>> are going to do processing on that data in the text file. That's why
>>>> hadoop is used; first u need to make it clear in ur mind. Then and only
>>>> then u will do it... otherwise it's not possible..
>>>>
>>>> Try this:
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>> /hdfs/directory/path
>>>>
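The failure Manish points out is a wrong local path, so a minimal guard is to confirm the source file exists before invoking copyFromLocal. The paths below are placeholders, and the actual hadoop command is left commented out since it needs a running cluster.

```shell
SRC=/tmp/demo-input.txt       # placeholder local file
DEST=/wksp/demo-input.txt     # placeholder HDFS destination

printf 'some sample data\n' > "$SRC"

# copyFromLocal reports "File ... does not exist" when the *local* path is
# wrong, so check it first.
if [ -f "$SRC" ]; then
    echo "local file present: $SRC"
    # hadoop dfs -copyFromLocal "$SRC" "$DEST"   # run on a live cluster
else
    echo "local file missing: $SRC" >&2
fi
```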
>>>>
>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>>  thanks. yes , i am a newbie.
>>>>
>>>> however, i need a windows setup.
>>>>
>>>> let me surely refer to the doc and link which u sent but i need this to
>>>> be working ...
>>>> can you please help
>>>>
>>>> regards
>>>>
>>>>
>>>>
>>>> --
>>>> MANISH DUNANI
>>>> -THANX
>>>> +91 9426881954, +91 8460656443
>>>> manishd207@gmail.com
>>>>
>>>>
>>>> --
>>>> Regards
>>>> Manish Dunani
>>>> Contact No : +91 9408329137
>>>> skype id : manish.dunani
>>>>
>>>>
>>>> CONFIDENTIALITY NOTICE
>>>> NOTICE: This message is intended for the use of the individual or
>>>> entity to which it is addressed and may contain information that is
>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>> If the reader of this message is not the intended recipient, you are hereby
>>>> notified that any printing, copying, dissemination, distribution,
>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>> you have received this communication in error, please contact the sender
>>>> immediately and delete it from your system. Thank You.
>>>>
>>>>
>>>> --
>>>> Olivier Renault
>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>> +44 7500 933 036
>>>> orenault@hortonworks.com
>>>> www.hortonworks.com
>>>>
>>>>
>>>>
>>>
>>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards
irfan



On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
> irfan
>
>
>
> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> ok.. now i made some changes and the installation went ahead
>> but it failed on the property "HIVE_SERVER_HOST" declaration
>> in the cluster config file, i have commented out this property. if i
>> uncomment it, then what server address should i give ???
>>
>> i have only a two-machine windows setup:
>> 1: for the namenode and another for the datanode
>>
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>
>>> thanks.
>>> i installed the latest java in the c:\java folder and now there is no
>>> error in the log file related to java
>>> however, now it is throwing an error about not having the cluster
>>> properties file. in fact i am running/installing hdp from the location
>>> where this file exists . still it is throwing the error
>>>
>>> please find the attached
>>>
>>> [image: Inline image 1]
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>> ravimu@microsoft.com> wrote:
>>>
>>>>  Here's your issue (from the logs you attached earlier):
>>>>
>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>
>>>> It seems that you installed the Java prerequisite in the default path,
>>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP
>>>> 1.3 does not like spaces in paths, so you need to reinstall Java under
>>>> c:\java\ or something similar (in a path with no spaces).
>>>>
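The failure mode Ravi describes, a JAVA_HOME containing spaces, can be sketched with a simple check; the paths are illustrative only.

```shell
# Illustrative paths only.
BAD_HOME='C:/Program Files/Java/jdk1.6.0_31'   # breaks the HDP install scripts
GOOD_HOME='C:/java/jdk1.6.0_31'                # no spaces -- safe

check_java_home() {
    case "$1" in
        *' '*) echo "BAD: '$1' contains spaces" ;;
        *)     echo "OK: '$1'" ;;
    esac
}

check_java_home "$BAD_HOME"
check_java_home "$GOOD_HOME"
```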
>>>>
>>>> From: Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>> Sent: Thursday, September 5, 2013 8:42 PM
>>>> To: user@hadoop.apache.org
>>>> Subject: Re: about replication
>>>>
>>>> please find the attached.
>>>>
>>>> i don't have
>>>> "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>> as it is not generated
>>>>
>>>> regards
>>>> irfan
>>>>
>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:****
>>>>
>>>>  Could you share the log files ( c:\hdp.log,
>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>>>> well as your clusterproperties.txt ?****
>>>>
>>>> ** **
>>>>
>>>> Thanks, ****
>>>>
>>>> Olivier****
>>>>
>>>> ** **
>>>>
>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:***
>>>> *
>>>>
>>>>  thanks. i followed the user manual for deployment and installed all
>>>> pre-requisites ****
>>>>
>>>> i modified the command and still the issue persist. please suggest ****
>>>>
>>>> ** **
>>>>
>>>> please refer below ****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> [image: Inline image 1]****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> irfan ****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:****
>>>>
>>>> The command to install it is msiexec /i msifile /...  ****
>>>>
>>>> You will find the correct syntax as part of doc. ****
>>>>
>>>> Happy reading
>>>> Olivier ****
>>>>
>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>
>>>>  thanks. ****
>>>>
>>>> i referred the logs and manuals. i modified the clusterproperties file
>>>> and then double click on the msi file ****
>>>>
>>>> however, it still failed.****
>>>>
>>>> further i started the installation on command line by giving
>>>> HDP_LAYOUT=clusterproperties file path, ****
>>>>
>>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>>> redistributable package dependency   ****
>>>>
>>>> ** **
>>>>
>>>> i installed both and started again the installation. ****
>>>>
>>>> failed again with following error ****
>>>>
>>>> [image: Inline image 1]****
>>>>
>>>> ** **
>>>>
>>>> when i search for the logs mentioned in the error , i never found that
>>>> ****
>>>>
>>>> please suggest ****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> irfan****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:****
>>>>
>>>> Correct, you need to define the cluster configuration as part of a
>>>> file. You will find some information on the configuration file as part of
>>>> the documentation. ****
>>>>
>>>>
>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>> ****
>>>>
>>>> You should make sure to have also installed the pre requisite. ****
>>>>
>>>> Thanks
>>>> Olivier ****
>>>>
>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>
>>>> Thanks, and sorry for the long break; I got involved in some other
>>>> priorities.
>>>>
>>>> I downloaded the installer, and while installing I got the following
>>>> error:
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> Do I need to do any configuration prior to installation?
>>>>
>>>> Regards,
>>>> Irfan
>>>>
>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>> Here is the link:
>>>>
>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>
>>>> Olivier
>>>>
>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>> Thanks.
>>>>
>>>> I just followed the instructions to do the pseudo-distributed setup
>>>> first, using this url:
>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>
>>>> I don't think I am running a DN on both machines.
>>>> Please find the attached log.
>>>>
>>>> Hi Olivier,
>>>>
>>>> Can you please give me the download link? Let me try it.
>>>>
>>>> Regards,
>>>> Irfan
>>>>
>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> Are you running a DN on both the machines? Could you please show me
>>>> your DN logs?
>>>>
>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>> Irfan,
>>>>
>>>> If you want to quickly get Hadoop running on the Windows platform,
>>>> you may want to try our distribution for Windows. You will be able to
>>>> find the msi on our website.
>>>>
>>>> Regards,
>>>> Olivier
>>>>
>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>> Thanks. OK, I think I need to change the plan here.
>>>>
>>>> Let me create two environments: 1) totally Windows, 2) totally Unix.
>>>>
>>>> On Windows I still have to try Hadoop and see how it works; on Unix
>>>> it is already known to work fine.
>>>>
>>>> So, on Windows, here is the setup:
>>>>
>>>> namenode : Windows 2012 R2
>>>> datanode : Windows 2012 R2
>>>>
>>>> Now, the exact problems are:
>>>> 1: the datanode is not getting started
>>>> 2: replication: if I put any file/folder on any datanode, it should
>>>>    get replicated to all other available datanodes
>>>>
>>>> Regards
>>>>
>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> Seriously?? You are planning to develop something using Hadoop on
>>>> Windows? Not a good idea. Anyway, could you please show me your log
>>>> files? I also need some additional info:
>>>>
>>>> - The exact problem you are facing right now
>>>> - Your cluster summary (no. of nodes etc.)
>>>> - Your latest configuration files
>>>> - Your /etc/hosts file
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> OK, thanks.
>>>>
>>>> Now, I need to start with an all-Windows setup first, as our product
>>>> will be based on Windows. So please tell me how to resolve the issue:
>>>> the datanode is not starting. Please suggest.
>>>>
>>>> Regards,
>>>> Irfan
>>>>
>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> It is possible. Theoretically, Hadoop doesn't stop you from doing
>>>> that. But it is not a very wise setup.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Please suggest.
>>>>
>>>> Regards,
>>>> Irfan
>>>>
>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Thanks.
>>>>
>>>> Can I have a setup like this: the namenode on Linux (the flavour may
>>>> be RHEL, CentOS, Ubuntu, etc.) and the datanodes a combination of any
>>>> OS (Windows, Linux, Unix, etc.)?
>>>>
>>>> However, my doubt is: as the file systems of the two systems (Windows
>>>> and Linux) are different, can datanodes of these systems not be part
>>>> of a single cluster? Do I have to make the Windows cluster and the
>>>> Unix cluster separate?
>>>>
>>>> Regards
>>>>
>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>> aagarwal@hortonworks.com> wrote:
>>>>
>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same
>>>> as Cygwin PIDs, so that may be causing the discrepancy. I don't know
>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>> progress for native Windows support, but there are no official
>>>> releases with Windows support yet. It may be easier to get familiar
>>>> with a release <https://www.apache.org/dyn/closer.cgi/hadoop/common/>
>>>> on Linux if you are new to it.
>>>>
>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Thanks. Here is what I did:
>>>>
>>>> I stopped all the namenodes and datanodes using the ./stop-dfs.sh
>>>> command, then deleted all pid files for the namenodes and datanodes,
>>>> and started dfs again with "./start-dfs.sh".
>>>>
>>>> When I run the jps command, it shows:
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 4536 Jps
>>>> 2076 NameNode
>>>>
>>>> However, when I open the pid file for the namenode, it still shows
>>>> pid 4560; it should show 2076.
>>>>
>>>> Please suggest.
>>>>
>>>> Regards
>>>>
>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>> aagarwal@hortonworks.com> wrote:
>>>>
>>>> Most likely there is a stale pid file. Something like
>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then
>>>> restarting the datanode.
>>>>
>>>> I haven't read the entire thread, so you may have looked at this
>>>> already.
>>>>
>>>> -Arpit
>>>>
>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> The datanode is trying to connect to the namenode continuously but
>>>> fails.
>>>>
>>>> When I run the jps command it says:
>>>>
>>>> $ ./jps.exe
>>>> 4584 NameNode
>>>> 4016 Jps
>>>>
>>>> And when I run "./start-dfs.sh" it says:
>>>>
>>>> $ ./start-dfs.sh
>>>> namenode running as process 3544. Stop it first.
>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>
>>>> These two outputs are contradictory. Please find the attached logs.
>>>> Should I attach the conf files as well?
>>>>
>>>> Regards
>>>>
>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> Your DN is still not running. Showing me the logs would be helpful.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> I followed the url and did the steps mentioned in it. I have deployed
>>>> on the Windows platform.
>>>>
>>>> Now I am able to browse http://localhost:50070 (namenode); however, I
>>>> am not able to browse http://localhost:50030.
>>>>
>>>> Please refer below:
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> I have modified all the config files as mentioned and formatted the
>>>> hdfs file system as well. Please suggest.
>>>>
>>>> Regards
>>>>
>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Thanks. I followed this url:
>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>
>>>> Let me follow the url you gave for the pseudo-distributed setup and
>>>> then switch to distributed mode.
>>>>
>>>> Regards,
>>>> Irfan
>>>>
>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> You are welcome. Which link have you followed for the configuration?
>>>> Your core-site.xml is empty. Remove the property fs.default.name from
>>>> hdfs-site.xml and add it to core-site.xml. Remove mapred.job.tracker
>>>> as well; it belongs in mapred-site.xml.
>>>>
>>>> I would suggest you do a pseudo-distributed setup first, to get
>>>> familiar with the process, and then proceed to distributed mode. You
>>>> can visit this link
>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>> if you need some help. Let me know if you face any issue.
>>>>
>>>> HTH
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
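Tariq's advice above amounts to config fragments like the following, a sketch for a Hadoop 1.x pseudo-distributed setup; the localhost:9000 and localhost:9001 addresses are common illustrative defaults, not values taken from this thread:

```xml
<!-- core-site.xml: the filesystem URI lives here, not in hdfs-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- illustrative host:port for a single-node setup -->
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: the jobtracker address lives here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <!-- illustrative host:port -->
    <value>localhost:9001</value>
  </property>
</configuration>
```

hdfs-site.xml then holds only HDFS-specific properties such as dfs.name.dir, dfs.data.dir, and dfs.replication.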
>>>>
>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Thanks for the response, Tariq.
>>>>
>>>> As discussed last time, I have sent you all the config files in my
>>>> setup. Can you please go through them? Please let me know.
>>>>
>>>> Regards,
>>>> Irfan
>>>>
>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> I'm sorry for being unresponsive. I was out of touch for some time
>>>> because of Ramzan and Eid. Resuming work today.
>>>>
>>>> What's the current status?
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>> wrote:
>>>>
>>>> First of all, read the concepts. I hope you will like it:
>>>>
>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>
>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Please suggest.
>>>>
>>>> Regards,
>>>> Irfan
>>>>
>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Hey Tariq,
>>>>
>>>> I am still stuck. Can you please suggest?
>>>>
>>>> Regards,
>>>> Irfan
>>>>
>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Please suggest.
>>>>
>>>> Regards
>>>>
>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> The attachment got quarantined. Resending in txt format; please
>>>> rename it to conf.rar.
>>>>
>>>> Regards
>>>>
>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Thanks.
>>>>
>>>> If I run the jps command on the namenode:
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 3164 NameNode
>>>> 1892 Jps
>>>>
>>>> The same command on the datanode:
>>>>
>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 3848 Jps
>>>>
>>>> jps does not list any process for the datanode; however, on the web
>>>> browser I can see one live datanode. Please find the attached conf
>>>> rar file of the namenode.
>>>>
>>>> Regards
>>>>
>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> OK, we'll start fresh. Could you please show me your latest config
>>>> files?
>>>>
>>>> BTW, are your daemons running fine? Use jps to verify that.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> I have created the directories "wksp_data" and "wksp_name" on both
>>>> the datanode and the namenode, made the respective changes in the
>>>> hdfs-site.xml file, formatted the namenode, and started dfs.
>>>>
>>>> But I am still not able to browse the file system through the web
>>>> browser. Please refer below.
>>>>
>>>> Is anything still missing? Please suggest.
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Do these directories need to be created on all datanodes and
>>>> namenodes? Further, does hdfs-site.xml need to be updated on both
>>>> datanodes and namenodes for these new directories?
>>>>
>>>> Regards
>>>>
>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> Create 2 directories manually, corresponding to the values of the
>>>> dfs.name.dir and dfs.data.dir properties, and change the permissions
>>>> of these directories to 755. When you start pushing data into your
>>>> HDFS, data will go inside the directory specified by dfs.data.dir,
>>>> and the associated metadata will go inside dfs.name.dir. Remember,
>>>> you store data in HDFS, but it eventually gets stored in your
>>>> local/native FS; you just cannot see this data directly on your
>>>> local/native FS.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
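The step above (create the two directories, then open their permissions to 755) can be sketched in plain shell; the paths here are illustrative stand-ins for whatever your hdfs-site.xml actually points dfs.name.dir and dfs.data.dir at:

```shell
#!/bin/sh
# Illustrative locations; substitute the values from your hdfs-site.xml.
BASE=$(mktemp -d)
NAME_DIR="$BASE/dfs/name"   # dfs.name.dir - namenode metadata
DATA_DIR="$BASE/dfs/data"   # dfs.data.dir - datanode blocks

# Create both directories and set 755 (rwxr-xr-x) as Tariq suggests.
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"

# Confirm the mode actually took.
ls -ld "$NAME_DIR" "$DATA_DIR"
```

755 lets the daemon user write while everyone else can only read and traverse, which is what the HDFS daemons expect for these directories.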
>>>>
>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Thanks.
>>>>
>>>> However, I need this to be working on a Windows environment as a
>>>> project requirement. I will add/work on Linux later.
>>>>
>>>> So, at this stage, is c:\\wksp the HDFS file system, or do I need to
>>>> create it from the command line?
>>>>
>>>> Please suggest.
>>>>
>>>> Regards
>>>>
>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> Hello Irfan,
>>>>
>>>> Sorry for being unresponsive. Got stuck with some important work.
>>>>
>>>> The HDFS web UI doesn't provide the ability to create a file or
>>>> directory. You can browse HDFS, view files, download files, etc., but
>>>> operations like create, move, and copy are not supported.
>>>>
>>>> These values look fine to me.
>>>>
>>>> One suggestion though: try getting a Linux machine (if possible), or
>>>> at least use a VM. I personally feel that using Hadoop on Windows is
>>>> always messy.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Thanks.
>>>>
>>>> When I browse the file system, I get the following; I haven't seen
>>>> any make-directory option there. Do I need to create it from the
>>>> command line?
>>>>
>>>> Further, in the hdfs-site.xml file I have the following entries. Are
>>>> they correct?
>>>>
>>>> <property>
>>>>   <name>dfs.data.dir</name>
>>>>   <value>c:\\wksp</value>
>>>> </property>
>>>> <property>
>>>>   <name>dfs.name.dir</name>
>>>>   <value>c:\\wksp</value>
>>>> </property>
>>>>
>>>> Please suggest.
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>> wrote:
>>>>
>>>> You are wrong at this:
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>> $ ./hadoop dfs -copyFromLocal
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>> copyFromLocal: File
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>> $ ./hadoop dfs -copyFromLocal
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>> copyFromLocal: File
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>
>>>> Both the paths you wrote are local, and you do not need to copy
>>>> Hadoop into HDFS; Hadoop is already working.
>>>>
>>>> Just check in the browser after starting your single-node cluster:
>>>>
>>>> localhost:50070
>>>>
>>>> Then go to the "browse the filesystem" link in it. If there is no
>>>> directory, make a directory there; that is your HDFS directory. Then
>>>> copy any text file there (no need to copy Hadoop there), because you
>>>> are going to do processing on the data in that text file. That's what
>>>> Hadoop is used for; first you need to make that clear in your mind.
>>>>
>>>> Try this:
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>> /hdfs/directory/path
>>>>
>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> Thanks. Yes, I am a newbie. However, I need a Windows setup.
>>>>
>>>> Let me surely refer to the doc and link which you sent, but I need
>>>> this to be working. Can you please help?
>>>>
>>>> Regards
>>>>
>>>> --
>>>> MANISH DUNANI
>>>> -THANX
>>>> +91 9426881954, +91 8460656443
>>>> manishd207@gmail.com
>>>>
>>>>
>>>> --
>>>> Regards
>>>> Manish Dunani
>>>> Contact No : +91 9408329137
>>>> skype id : manish.dunani
>>>>
>>>> CONFIDENTIALITY NOTICE
>>>> NOTICE: This message is intended for the use of the individual or
>>>> entity to which it is addressed and may contain information that is
>>>> confidential, privileged and exempt from disclosure under applicable law.
>>>> If the reader of this message is not the intended recipient, you are hereby
>>>> notified that any printing, copying, dissemination, distribution,
>>>> disclosure or forwarding of this communication is strictly prohibited. If
>>>> you have received this communication in error, please contact the sender
>>>> immediately and delete it from your system. Thank You.
>>>>
>>>> --
>>>> Olivier Renault
>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>> +44 7500 933 036
>>>> orenault@hortonworks.com
>>>> www.hortonworks.com
>>>>
>>>>
>>>
>>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards
irfan



On Sat, Sep 7, 2013 at 4:56 PM, Irfan Sayed <ir...@gmail.com> wrote:

> please suggest
>
> regards
> irfan
>
>
>
> On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> OK, now I made some changes and the installation went ahead, but it
>> failed on the "HIVE_SERVER_HOST" property declaration. In the cluster
>> config file I have commented this property out; if I uncomment it,
>> what server address should I give?
>>
>> I have only a two-machine Windows setup: one for the namenode and
>> another for the datanode.
>>
>> Please suggest.
>>
>> Regards,
>> Irfan
>>
>>
>>
>> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com> wrote:
>>
>>> Thanks.
>>> I installed the latest Java in the c:\java folder, and now there are
>>> no Java-related errors in the log file. However, it now throws an
>>> error about a missing cluster properties file, even though I am
>>> running/installing HDP from the location where this file exists.
>>>
>>> Please find the attached.
>>>
>>> [image: Inline image 1]
>>>
>>> Regards,
>>> Irfan
>>>
>>>
>>>
>>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>>> ravimu@microsoft.com> wrote:
>>>
>>>> Here's your issue (from the logs you attached earlier):
>>>>
>>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>>
>>>> It seems that you installed the Java prerequisite in the default
>>>> path, which is %PROGRAMFILES% (expands to C:\Program Files in your
>>>> case). HDP 1.3 does not like spaces in paths, so you need to
>>>> reinstall Java under c:\java\ or something similar (a path with no
>>>> spaces).
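The failure mode Ravi describes (an unquoted path containing a space breaking a batch-file check) is easy to screen for before installing. A minimal shell sketch; the JDK paths are illustrative:

```shell
#!/bin/sh
# Reject JAVA_HOME values containing spaces, the condition that breaks
# HDP 1.3's JAVA_HOME check on Windows.
check_java_home() {
  case "$1" in
    *" "*) echo "bad: '$1' contains spaces"; return 1 ;;
    *)     echo "ok: '$1'";                  return 0 ;;
  esac
}

check_java_home "C:\\Program Files\\Java\\jdk1.6.0_31" || true  # fails the space check
check_java_home "C:\\java\\jdk1.6.0_31"                         # passes
```

The same idea applies to any path the installer reads: keep it space-free, or the unquoted batch expansion misparses, exactly as in the `Files\Java\jdk1.6.0_31 was unexpected` log line above.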
>>>>
>>>> From: Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>>> Sent: Thursday, September 5, 2013 8:42 PM
>>>> To: user@hadoop.apache.org
>>>> Subject: Re: about replication
>>>>
>>>> Please find the attached.
>>>>
>>>> I don't have
>>>> "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>>> as it is not generated.
>>>>
>>>> Regards,
>>>> Irfan
>>>>
>>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>> Could you share the log files (c:\hdp.log,
>>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log)
>>>> as well as your clusterproperties.txt?
>>>>
>>>> Thanks,
>>>> Olivier
>>>>
>>>> ** **
>>>>
>>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>>
>>>> Thanks. I followed the user manual for deployment and installed all
>>>> prerequisites.
>>>>
>>>> I modified the command and the issue still persists. Please suggest.
>>>>
>>>> Please refer below:
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> Regards,
>>>> Irfan
>>>>
>>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>> The command to install it is msiexec /i msifile /...
>>>> You will find the correct syntax as part of the doc.
>>>>
>>>> Happy reading,
>>>> Olivier
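For reference, the full invocation has the shape below. This is a sketch based on the HDP for Windows install documentation, not a command taken from this thread; the msi file name, log name, and property values are illustrative and should come from your own layout:

```
:: Run from an elevated command prompt; all names here are illustrative.
msiexec /i "hdp-1.3.0.0.winpkg.msi" ^
  /lv "hdp.log" ^
  HDP_LAYOUT="C:\hdp\clusterproperties.txt" ^
  HDP_DIR="C:\hdp\hadoop" ^
  DESTROY_DATA="no"
```

`/lv` writes a verbose log, which is the first place to look when the install fails; `HDP_LAYOUT` must be an absolute path to the clusterproperties file.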
>>>>
>>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>
>>>>  thanks. ****
>>>>
>>>> i referred the logs and manuals. i modified the clusterproperties file
>>>> and then double click on the msi file ****
>>>>
>>>> however, it still failed.****
>>>>
>>>> further i started the installation on command line by giving
>>>> HDP_LAYOUT=clusterproperties file path, ****
>>>>
>>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>>> redistributable package dependency   ****
>>>>
>>>> ** **
>>>>
>>>> i installed both and started again the installation. ****
>>>>
>>>> failed again with following error ****
>>>>
>>>> [image: Inline image 1]****
>>>>
>>>> ** **
>>>>
>>>> when i search for the logs mentioned in the error , i never found that
>>>> ****
>>>>
>>>> please suggest ****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> irfan****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:****
>>>>
>>>> Correct, you need to define the cluster configuration as part of a
>>>> file. You will find some information on the configuration file as part of
>>>> the documentation. ****
>>>>
>>>>
>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>> ****
>>>>
>>>> You should make sure to have also installed the pre requisite. ****
>>>>
>>>> Thanks
>>>> Olivier ****
>>>>
>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>
>>>>  thanks. sorry for the long break. actually got involved in some other
>>>> priorities****
>>>>
>>>> i downloaded the installer and while installing i got following error *
>>>> ***
>>>>
>>>> ** **
>>>>
>>>> [image: Inline image 1]****
>>>>
>>>> ** **
>>>>
>>>> do i need to make any configuration prior to installation ??****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> irfan ****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:****
>>>>
>>>> Here is the link ****
>>>>
>>>> http://download.hortonworks.com/products/hdp-windows/****
>>>>
>>>> Olivier ****
>>>>
>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>>
>>>>  thanks.****
>>>>
>>>> i just followed the instructions to setup the pseudo distributed setup
>>>> first using the url :
>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>> ****
>>>>
>>>>  ****
>>>>
>>>> i don't think so i am running DN on both machine ****
>>>>
>>>> please find the attached log****
>>>>
>>>> ** **
>>>>
>>>> hi olivier ****
>>>>
>>>> ** **
>>>>
>>>> can you please give me download link ?****
>>>>
>>>> let me try please ****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> irfan ****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:****
>>>>
>>>>  Are you running DN on both the machines? Could you please show me
>>>> your DN logs?****
>>>>
>>>> ** **
>>>>
>>>> Also, consider Oliver's suggestion. It's definitely a better option.***
>>>> *
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>>
>>>> ****
>>>>
>>>> Warm Regards,****
>>>>
>>>> Tariq****
>>>>
>>>> cloudfront.blogspot.com****
>>>>
>>>> ** **
>>>>
>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>> Irfan,
>>>>
>>>> If you want to quickly get Hadoop running on a windows platform, you may
>>>> want to try our distribution for Windows. You will be able to find the msi
>>>> on our website.
>>>>
>>>> Regards
>>>> Olivier
>>>>
>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>> thanks.
>>>> ok. i think i need to change the plan over here
>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>
>>>> because, on windows , anyway i have to try and see how hadoop works
>>>> on UNIX, it is already known that it is working fine.
>>>>
>>>> so, on windows , here is the setup:
>>>>
>>>> namenode : windows 2012 R2
>>>> datanode : windows 2012 R2
>>>>
>>>> now, the exact problem is :
>>>> 1: datanode is not getting started
>>>> 2: replication : if i put any file/folder on any datanode , it should
>>>> get replicated to all other available datanodes
>>>>
>>>> regards
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> Seriously?? You are planning to develop something using Hadoop on
>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>> also need some additional info :
>>>>
>>>> -The exact problem which you are facing right now
>>>> -Your cluster summary (no. of nodes etc)
>>>> -Your latest configuration files
>>>> -Your /etc/hosts file
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> ok. thanks
>>>> now, i need to start with all windows setup first as our product will
>>>> be based on windows
>>>> so, now, please tell me how to resolve the issue
>>>>
>>>> datanode is not starting . please suggest
>>>>
>>>> regards,
>>>> irfan
>>>>
>>>>
>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>> that. But it is not a very wise setup.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> thanks.
>>>> can i have setup like this :
>>>> namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
>>>> and datanodes are the combination of any OS (windows , linux , unix etc )
>>>>
>>>> however, my doubt is,  as the file systems of  both the systems (win
>>>> and linux ) are different ,  datanodes of these systems can not be part of
>>>> a single cluster . do i have to make a windows cluster and a UNIX cluster
>>>> separately ?
>>>>
>>>> regards
>>>>
>>>>
>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>> aagarwal@hortonworks.com> wrote:
>>>>
>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
>>>> Cygwin PIDs so that may be causing the discrepancy. I don't know how well
>>>> Hadoop works in Cygwin as I have never tried it. Work is in progress for
>>>> native Windows support however there are no official releases with Windows
>>>> support yet. It may be easier to get familiar with a release
>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are
>>>> new to it.
>>>>
>>>>
>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> thanks
>>>> here is what i did .
>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh command
>>>> then deleted all pid files for namenodes and datanodes
>>>>
>>>> started dfs again with command : "./start-dfs.sh"
>>>>
>>>> when i ran the "Jps" command . it shows
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 4536 Jps
>>>> 2076 NameNode
>>>>
>>>> however, when i open the pid file for namenode then it is not showing
>>>> pid as : 4560. on the contrary, it shud show : 2076
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>>
>>>>
>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>> aagarwal@hortonworks.com> wrote:
>>>>
>>>> Most likely there is a stale pid file. Something like
>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>> the datanode.
>>>>
>>>> I haven't read the entire thread so you may have looked at this already.
>>>>
>>>> -Arpit
>>>>
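Arpit's stale-pid cleanup can be sketched as a small shell helper. The /tmp location and the hadoop-*-datanode.pid naming below are assumptions; check where your distribution's start scripts actually write their pid files before using anything like this.

```shell
# Sketch of the stale-pid cleanup described above. The pid directory and
# the file-name pattern are assumptions; adjust them to match where your
# Hadoop scripts actually write pid files.
remove_stale_pids() {
  # $1 = directory holding the pid files (defaults to /tmp)
  dir="${1:-/tmp}"
  # Delete leftover datanode/namenode pid files so start-dfs.sh does
  # not mistake a dead process for a running daemon.
  rm -f "$dir"/hadoop-*-datanode.pid "$dir"/hadoop-*-namenode.pid
}

# Typical flow (hypothetical): stop everything, clear pids, start again.
# ./stop-dfs.sh
# remove_stale_pids /tmp
# ./start-dfs.sh
```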
>>>>
>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> datanode is trying to connect to namenode continuously but fails
>>>>
>>>> when i try to run "jps" command it says :
>>>> $ ./jps.exe
>>>> 4584 NameNode
>>>> 4016 Jps
>>>>
>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>
>>>> $ ./start-dfs.sh
>>>> namenode running as process 3544. Stop it first.
>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>>
>>>> both these logs are contradictory
>>>> please find the attached logs
>>>>
>>>> should i attach the conf files as well ?
>>>>
>>>> regards
>>>>
>>>>
>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> Your DN is still not running. Showing me the logs would be helpful.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> i followed the url and did the steps mentioned in that. i have deployed
>>>> on the windows platform
>>>>
>>>> Now, i am able to browse url : http://localhost:50070 (name node )
>>>> however, not able to browse url : http://localhost:50030
>>>>
>>>> please refer below
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> i have modified all the config files as mentioned and formatted the
>>>> hdfs file system as well
>>>> please suggest
>>>>
>>>> regards
>>>>
>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> thanks. i followed this url :
>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>
>>>> let me follow the url which you gave for pseudo distributed setup and
>>>> then will switch to distributed mode
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> You are welcome. Which link have you followed for the configuration?
>>>> Your *core-site.xml* is empty. Remove the property *fs.default.name*
>>>> from *hdfs-site.xml* and add it to *core-site.xml*. Remove
>>>> *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
>>>>
>>>> I would suggest you to do a pseudo distributed setup first in order to
>>>> get yourself familiar with the process and then proceed to the distributed
>>>> mode. You can visit this link
>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>> if you need some help. Let me know if you face any issue.
>>>>
>>>> HTH
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
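The property split Tariq describes can be sketched as follows for Hadoop 1.x. The hdfs://localhost:9000 and localhost:9001 values are illustrative placeholders for a pseudo-distributed setup, not taken from the poster's configuration:

```xml
<!-- core-site.xml: the filesystem URI lives here, not in hdfs-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: the job tracker address lives here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```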
>>>>
>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  thanks tariq for response. ****
>>>>
>>>> as discussed last time, i have sent you all the config files in my
>>>> setup . ****
>>>>
>>>> can you please go through that ?****
>>>>
>>>> ** **
>>>>
>>>> please let me know ****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> irfan ****
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> ** **
>>>>
>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:****
>>>>
>>>>  I'm sorry for being unresponsive. Was out of touch for sometime
>>>> because of ramzan and eid. Resuming work today.****
>>>>
>>>> ** **
>>>>
>>>> What's the current status?****
>>>>
>>>>
>>>> ****
>>>>
>>>> Warm Regards,****
>>>>
>>>> Tariq****
>>>>
>>>> cloudfront.blogspot.com****
>>>>
>>>> ** **
>>>>
>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>>> wrote:
>>>>
>>>> First of all read the concepts .. I hope you will like it..
>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>
>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> hey Tariq,
>>>> i am still stuck ..
>>>> can you please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> please suggest
>>>>
>>>> regards
>>>>
>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> attachment got quarantined
>>>> resending in txt format. please rename it to conf.rar
>>>>
>>>> regards
>>>>
>>>>
>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> thanks.
>>>>
>>>> if i run the jps command on namenode :
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 3164 NameNode
>>>> 1892 Jps
>>>>
>>>> same command on datanode :
>>>>
>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>> $ ./jps.exe
>>>> 3848 Jps
>>>>
>>>> jps does not list any process for datanode. however, on web browser i
>>>> can see one live data node
>>>> please find the attached conf rar file of namenode
>>>>
>>>> regards
>>>>
>>>>
>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> OK. we'll start fresh. Could you plz show me your latest config files?
>>>>
>>>> BTW, are your daemons running fine? Use JPS to verify that.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> i have created these dir "wksp_data" and "wksp_name" on both datanode
>>>> and namenode
>>>> made the respective changes in "hdfs-site.xml" file
>>>> formatted the namenode
>>>> started the dfs
>>>>
>>>> but still, not able to browse the file system through web browser
>>>> please refer below
>>>>
>>>> anything still missing ?
>>>> please suggest
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> these dir needs to be created on all datanodes and namenodes ?
>>>> further,  hdfs-site.xml needs to be updated on both datanodes and
>>>> namenodes for these new dir?
>>>>
>>>> regards
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> Create 2 directories manually corresponding to the values of
>>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>>> these directories to 755. When you start pushing data into your HDFS, data
>>>> will start going inside the directory specified by dfs.data.dir and the
>>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>>> cannot see this data directly on your local/native FS.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
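The directory preparation Tariq describes can be sketched as a small helper; the path arguments are placeholders, so substitute your actual dfs.name.dir and dfs.data.dir values:

```shell
# Pre-create the dfs.name.dir and dfs.data.dir locations with 755
# permissions, as suggested above. The paths passed in are placeholders,
# not the poster's actual values.
prepare_dfs_dirs() {
  # $1 = dfs.name.dir path, $2 = dfs.data.dir path
  mkdir -p "$1" "$2"
  chmod 755 "$1" "$2"
}

# Example (hypothetical paths):
# prepare_dfs_dirs /home/hadoop/wksp_name /home/hadoop/wksp_data
```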
>>>>
>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> thanks.
>>>> however, i need this to be working on windows environment as project
>>>> requirement.
>>>> i will add/work on Linux later
>>>>
>>>> so, now , at this stage , c:\\wksp is the HDFS file system OR do i need
>>>> to create it from command line ?
>>>>
>>>> please suggest
>>>>
>>>> regards,
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>
>>>> Hello Irfan,
>>>>
>>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>>
>>>> HDFS webUI doesn't provide us the ability to create file or directory.
>>>> You can browse HDFS, view files, download files etc. But operations like
>>>> create, move, copy etc are not supported.
>>>>
>>>> These values look fine to me.
>>>>
>>>> One suggestion though. Try getting a Linux machine (if possible). Or at
>>>> least use a VM. I personally feel that using Hadoop on windows is always
>>>> messy.
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:
>>>>
>>>> thanks.
>>>> when i browse the file system , i am getting following :
>>>> i haven't seen any make directory option there
>>>>
>>>> i need to create it from command line ?
>>>> further, in the hdfs-site.xml file , i have given following entries.
>>>> are they correct ?
>>>>
>>>> <property>
>>>>   <name>dfs.data.dir</name>
>>>>   <value>c:\\wksp</value>
>>>> </property>
>>>> <property>
>>>>   <name>dfs.name.dir</name>
>>>>   <value>c:\\wksp</value>
>>>> </property>
>>>>
>>>> please suggest
>>>>
>>>> [image: Inline image 1]
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>>> wrote:
>>>>
>>>> *You are wrong at this:*
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>> $ ./hadoop dfs -copyFromLocal
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>> copyFromLocal: File
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>> $ ./hadoop dfs -copyFromLocal
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>> copyFromLocal: File
>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>
>>>> Because you wrote both the paths as local, and you need not copy hadoop
>>>> into hdfs... Hadoop is already working..
>>>>
>>>> Just check out in browser after starting ur single node cluster :
>>>>
>>>> localhost:50070
>>>>
>>>> then go for the browse the filesystem link in it..
>>>>
>>>> If there is no directory then make a directory there.
>>>> That is your hdfs directory.
>>>> Then copy any text file there (no need to copy hadoop there), because u
>>>> are going to do processing on that data in the text file. That's what
>>>> hadoop is used for; first u need to make it clear in ur mind. Then and
>>>> then u will do it... otherwise not possible..
>>>>
>>>> *Try this:*
>>>>
>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>>> /hdfs/directory/path
>>>>
>>>>
>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>>> wrote:****
>>>>
>>>>  thanks. yes , i am newbie.****
>>>>
>>>> however, i need windows setup.****
>>>>
>>>> ** **
>>>>
>>>> let me surely refer the doc and link which u sent but i need this to be
>>>> working ...****
>>>>
>>>> can you please help****
>>>>
>>>> ** **
>>>>
>>>> regards****
>>>>
>>>> ** **
>>>>
>>>>  ****
>>>>
>>>> ** **
>>>>
>>>>
>>>>
>>>> ****
>>>>
>>>> ** **
>>>>
>>>> --
>>>> MANISH DUNANI
>>>> -THANX
>>>> +91 9426881954,+91 8460656443****
>>>>
>>>> manishd207@gmail.com****
>>>>
>>>>
>>>> --
>>>> Regards
>>>> *Manish Dunani*
>>>> *Contact No* : +91 9408329137
>>>> *skype id* : manish.dunani
>>>>
>>>>
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or
entity to which it is addressed and may contain information that is
confidential, privileged and exempt from disclosure under applicable law.
If the reader of this message is not the intended recipient, you are hereby
notified that any printing, copying, dissemination, distribution,
disclosure or forwarding of this communication is strictly prohibited. If
you have received this communication in error, please contact the sender
immediately and delete it from your system. Thank You.
>>>>
>>>>
>>>> --
>>>> Olivier Renault
>>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>>> +44 7500 933 036
>>>> orenault@hortonworks.com
>>>> www.hortonworks.com
>>>>
>>>
>>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards
irfan



On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com> wrote:

> ok.. now i made some changes and installation went ahead
> but failed in property "HIVE_SERVER_HOST" declaration
> in the cluster config file, i have commented this property. if i uncomment,
> then what server address should i give ???
>
> i have only two windows machines setup.
> 1: for namenode and another for datanode
>
> please suggest
>
> regards
> irfan
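For a two-node layout like the one described (namenode on one Windows machine, datanode on the other), the host entries in clusterproperties.txt would typically point the Hive server at one of the existing machines. The fragment below only illustrates the key=value format; the key names should be verified against the HDP for Windows documentation, and the hostnames are placeholders:

```text
# Illustrative fragment of clusterproperties.txt -- verify the exact
# key names against the HDP 1.3 for Windows install guide.
NAMENODE_HOST=winnode1
SLAVE_HOSTS=winnode2
HIVE_SERVER_HOST=winnode1
```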
>
>
>
> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> thanks.
>> i installed the latest java in c:\java folder and now no error in log
>> file related to java
>> however, now it is throwing error on not having cluster properties file.
>> in fact i am running/installing hdp from the location where this file
>> exist . still it is throwing error
>>
>> please find the attached
>>
>> [image: Inline image 1]
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>> ravimu@microsoft.com> wrote:
>>
>>>  Here’s your issue (from the logs you attached earlier):
>>>
>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>
>>> It seems that you installed the Java prerequisite in the default path,
>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP
>>> 1.3 does not like spaces in paths, so you need to reinstall Java under
>>> c:\java\ or something similar (in a path with no spaces).
>>>
>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>> *To:* user@hadoop.apache.org
>>> *Subject:* Re: about replication
>>>
>>>
>>> please find the attached.
>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>> as it is not generated
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> Could you share the log files ( c:\hdp.log,
>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>> well as your clusterproperties.txt ?
>>>
>>> Thanks,
>>> Olivier
>>>
>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>
>>> thanks. i followed the user manual for deployment and installed all
>>> pre-requisites
>>> i modified the command and still the issue persists. please suggest
>>>
>>> please refer below
>>>
>>> [image: Inline image 1]
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> The command to install it is msiexec /i msifile /...
>>> You will find the correct syntax as part of the doc.
>>>
>>> Happy reading
>>> Olivier
>>>
>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>> thanks.
>>> i referred the logs and manuals. i modified the clusterproperties file
>>> and then double clicked on the msi file
>>> however, it still failed.
>>> further i started the installation on the command line by giving
>>> HDP_LAYOUT=clusterproperties file path,
>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>> redistributable package dependency
>>>
>>> i installed both and started the installation again.
>>> failed again with following error
>>>
>>> [image: Inline image 1]
>>>
>>> when i search for the logs mentioned in the error , i never found them
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> Correct, you need to define the cluster configuration as part of a file.
>>> You will find some information on the configuration file as part of the
>>> documentation.
>>>
>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>
>>> You should make sure to have also installed the pre requisites.
>>>
>>> Thanks
>>> Olivier
>>>
>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>
>>>  thanks. sorry for the long break. actually got involved in some other
>>> priorities****
>>>
>>> i downloaded the installer and while installing i got following error **
>>> **
>>>
>>> ** **
>>>
>>> [image: Inline image 1]****
>>>
>>> ** **
>>>
>>> do i need to make any configuration prior to installation ??****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> irfan ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:****
>>>
>>> Here is the link ****
>>>
>>> http://download.hortonworks.com/products/hdp-windows/****
>>>
>>> Olivier ****
>>>
>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>
>>>  thanks.****
>>>
>>> i just followed the instructions to setup the pseudo distributed setup
>>> first using the url :
>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>> ****
>>>
>>>  ****
>>>
>>> i don't think so i am running DN on both machine ****
>>>
>>> please find the attached log****
>>>
>>> ** **
>>>
>>> hi olivier ****
>>>
>>> ** **
>>>
>>> can you please give me download link ?****
>>>
>>> let me try please ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> irfan ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  Are you running DN on both the machines? Could you please show me your
>>> DN logs?****
>>>
>>> ** **
>>>
>>> Also, consider Oliver's suggestion. It's definitely a better option.****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:****
>>>
>>> Irfu, ****
>>>
>>> If you want to quickly get Hadoop running on windows platform. You may
>>> want to try our distribution for Windows. You will be able to find the msi
>>> on our website. ****
>>>
>>> Regards
>>> Olivier ****
>>>
>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>
>>>  thanks. ****
>>>
>>> ok. i think i need to change the plan over here ****
>>>
>>> let me create two environments. 1: totally windows 2: totally Unix****
>>>
>>> ** **
>>>
>>> because, on windows , anyway i have to try and see how hadoop works ****
>>>
>>> on UNIX, it is already known that ,  it is working fine. ****
>>>
>>> ** **
>>>
>>> so, on windows , here is the setup:****
>>>
>>> ** **
>>>
>>> namenode : windows 2012 R2 ****
>>>
>>> datanode : windows 2012 R2 ****
>>>
>>> ** **
>>>
>>> now, the exact problem is :****
>>>
>>> 1: datanode is not getting started ****
>>>
>>> 2: replication : if i put any file/folder on any datanode , it should
>>> get replicated to all another available datanodes ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Seriously?? You are planning to develop something using Hadoop on
>>> Windows. Not a good idea. Anyways, could you please show me your log files? I
>>> also need some additional info :
>>> - The exact problem which you are facing right now
>>> - Your cluster summary (no. of nodes etc.)
>>> - Your latest configuration files
>>> - Your /etc/hosts file
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> ok. thanks
>>> now, i need to start with an all-Windows setup first as our product will be
>>> based on Windows.
>>> so, now, please tell me how to resolve the issue.
>>>
>>> datanode is not starting . please suggest
>>>
>>> regards,
>>> irfan
>>>
>>>
>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> It is possible. Theoretically Hadoop doesn't stop you from doing that.
>>> But it is not a very wise setup.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks.
>>> can i have a setup like this :
>>> namenode will be on Linux (flavour may be RHEL, CentOS, Ubuntu etc.)
>>> and datanodes are a combination of any OS (Windows, Linux, Unix etc.)
>>>
>>> however, my doubt is, as the file systems of both the systems (Windows and
>>> Linux) are different, datanodes of these systems can not be part of a
>>> single cluster . do i have to make the Windows cluster separate and the UNIX
>>> cluster separate ?
>>>
>>> regards
>>>
>>>
>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>> aagarwal@hortonworks.com> wrote:
>>>
>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
>>> Cygwin PIDs, so that may be causing the discrepancy. I don't know how well
>>> Hadoop works in Cygwin as I have never tried it. Work is in progress for
>>> native Windows support; however, there are no official releases with Windows
>>> support yet. It may be easier to get familiar with a
>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux
>>> if you are new to it.
>>>
>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks
>>> here is what i did .
>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command
>>> then deleted all pid files for namenodes and datanodes
>>>
>>> started dfs again with the command : "./start-dfs.sh"
>>>
>>> when i ran the "jps" command , it shows
>>>
>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>> $ ./jps.exe
>>> 4536 Jps
>>> 2076 NameNode
>>>
>>> however, when i open the pid file for the namenode, it shows pid : 4560.
>>> on the contrary, it should show : 2076
>>>
>>> please suggest
>>>
>>> regards
>>>
>>>
>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>
>>> wrote:
>>>
>>> Most likely there is a stale pid file. Something like
>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>> the datanode.
>>>
>>> I haven't read the entire thread so you may have looked at this already.
>>>
>>> -Arpit
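[Editor's note: Arpit's suggestion can be sketched as a small shell sequence. The paths below are illustrative only; the real pid file lives under your HADOOP_PID_DIR, e.g. \tmp\hadoop-*datanode.pid as mentioned above.]

```shell
# Sketch of clearing a stale datanode pid file (illustrative paths only).
# A pid file is "stale" when the process it names no longer exists;
# start-dfs.sh refuses to start a daemon while its pid file is present.
PID_DIR=$(mktemp -d)                         # stands in for HADOOP_PID_DIR
PID_FILE="$PID_DIR/hadoop-admin-datanode.pid"
echo 99999999 > "$PID_FILE"                  # leftover pid from a dead daemon
if ! kill -0 "$(cat "$PID_FILE")" 2>/dev/null; then
  rm -f "$PID_FILE"                          # no such process: safe to delete
fi
test ! -e "$PID_FILE" && echo "stale pid file removed"
```

After the stale file is gone, start-dfs.sh can write a fresh pid file when it launches the daemon.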
>>>
>>>
>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> datanode is trying to connect to the namenode continuously but fails
>>>
>>> when i try to run the "jps" command it says :
>>> $ ./jps.exe
>>> 4584 NameNode
>>> 4016 Jps
>>>
>>> and when i ran "./start-dfs.sh" then it says :
>>>
>>> $ ./start-dfs.sh
>>> namenode running as process 3544. Stop it first.
>>> DFS-1: datanode running as process 4076. Stop it first.
>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>
>>> both these outputs are contradictory
>>> please find the attached logs
>>>
>>> should i attach the conf files as well ?
>>>
>>> regards
>>>
>>>
>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Your DN is still not running. Showing me the logs would be helpful.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> i followed the url and did the steps mentioned in it. i have deployed
>>> on the Windows platform.
>>>
>>> Now, i am able to browse the url : http://localhost:50070 (name node)
>>> however, i am not able to browse the url : http://localhost:50030
>>>
>>> please refer below
>>>
>>> [image: Inline image 1]
>>>
>>> i have modified all the config files as mentioned and formatted the hdfs
>>> file system as well.
>>> please suggest
>>>
>>> regards
>>>
>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks. i followed this url :
>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>
>>> let me follow the url which you gave for the pseudo-distributed setup and
>>> then i will switch to distributed mode.
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> You are welcome. Which link have you followed for the
>>> configuration? Your *core-site.xml* is empty. Remove the property
>>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>>> Remove *mapred.job.tracker* as well. It belongs in *mapred-site.xml*.
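[Editor's note: Tariq's advice maps onto config fragments like the following. This is a hedged sketch for Hadoop 1.x; the host names and ports are placeholders for your own cluster, not values from this thread.]

```xml
<!-- core-site.xml: the default filesystem belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value> <!-- placeholder host:port -->
  </property>
</configuration>

<!-- mapred-site.xml: the job tracker address belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>namenode-host:9001</value> <!-- placeholder host:port -->
  </property>
</configuration>
```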
>>>
>>>
>>> I would suggest you do a pseudo-distributed setup first in order to
>>> get yourself familiar with the process and then proceed to distributed
>>> mode. You can visit this
>>> link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>> if you need some help. Let me know if you face any issue.
>>>
>>> HTH
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks tariq for the response.
>>> as discussed last time, i have sent you all the config files in my setup.
>>> can you please go through them ?
>>>
>>> please let me know
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> I'm sorry for being unresponsive. I was out of touch for some time
>>> because of Ramzan and Eid. Resuming work today.
>>>
>>> What's the current status?
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>> wrote:
>>>
>>> First of all read the concepts .. I hope you will like it..
>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>
>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> hey Tariq,
>>> i am still stuck ..
>>> can you please suggest
>>>
>>> regards
>>> irfan
>>>
>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> please suggest
>>>
>>> regards
>>>
>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> attachment got quarantined
>>> resending in txt format. please rename it to conf.rar
>>>
>>> regards
>>>
>>>
>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks.
>>>
>>> if i run the jps command on the namenode :
>>>
>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>> $ ./jps.exe
>>> 3164 NameNode
>>> 1892 Jps
>>>
>>> same command on the datanode :
>>>
>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>> $ ./jps.exe
>>> 3848 Jps
>>>
>>> jps does not list any process for the datanode. however, in the web
>>> browser i can see one live data node.
>>> please find the attached conf rar file of the namenode.
>>>
>>> regards
>>>
>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> OK. we'll start fresh. Could you please show me your latest config files?
>>>
>>> BTW, are your daemons running fine? Use jps to verify that.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> i have created the dirs "wksp_data" and "wksp_name" on both the datanode
>>> and the namenode
>>> made the respective changes in the "hdfs-site.xml" file
>>> formatted the namenode
>>> started the dfs
>>>
>>> but still, i am not able to browse the file system through the web browser
>>> please refer below
>>>
>>> anything still missing ?
>>> please suggest
>>>
>>> [image: Inline image 1]
>>>
>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> do these dirs need to be created on all datanodes and namenodes ?
>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>> namenodes for these new dirs ?
>>>
>>> regards
>>>
>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Create 2 directories manually corresponding to the values of the
>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>> these directories to 755. When you start pushing data into your HDFS, data
>>> will start going inside the directory specified by dfs.data.dir and the
>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>> cannot see this data directly on your local/native FS.
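[Editor's note: a sketch of what Tariq describes as an hdfs-site.xml fragment. The paths are placeholders; the point is that dfs.name.dir and dfs.data.dir should name two different directories, unlike the identical c:\\wksp values quoted later in this thread.]

```xml
<!-- hdfs-site.xml: separate metadata and block-storage directories (placeholder paths) -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>c:\\wksp_name</value> <!-- namenode metadata (fsimage, edits) -->
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>c:\\wksp_data</value> <!-- datanode block storage -->
  </property>
</configuration>
```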
>>>
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks.
>>> however, i need this to be working on the Windows environment as a
>>> project requirement.
>>> i will add/work on Linux later.
>>>
>>> so, now, at this stage, is c:\\wksp the HDFS file system OR do i need
>>> to create it from the command line ?
>>>
>>> please suggest
>>>
>>> regards,
>>>
>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Hello Irfan,
>>>
>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>
>>> The HDFS webUI doesn't provide us the ability to create a file or
>>> directory. You can browse HDFS, view files, download files etc. But
>>> operations like create, move, copy etc. are not supported.
>>>
>>> These values look fine to me.
>>>
>>> One suggestion though. Try getting a Linux machine (if possible). Or at
>>> least use a VM. I personally feel that using Hadoop on Windows is always
>>> messy.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks.
>>> when i browse the file system , i am getting the following :
>>> i haven't seen any make directory option there
>>>
>>> i need to create it from the command line ?
>>> further, in the hdfs-site.xml file , i have given the following entries.
>>> are they correct ?
>>>
>>> <property>
>>>   <name>dfs.data.dir</name>
>>>   <value>c:\\wksp</value>
>>> </property>
>>> <property>
>>>   <name>dfs.name.dir</name>
>>>   <value>c:\\wksp</value>
>>> </property>
>>>
>>> please suggest
>>>
>>> [image: Inline image 1]
>>>
>>>
>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>> wrote:
>>>
>>> *You are wrong at this:*
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>> $ ./hadoop dfs -copyFromLocal
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>> copyFromLocal: File
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>> $ ./hadoop dfs -copyFromLocal
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>> copyFromLocal: File
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>
>>> Because you had written both the paths as local, and you do not need to
>>> copy hadoop into hdfs... Hadoop is already working..
>>>
>>> Just check in the browser after starting your single node cluster :
>>>
>>> localhost:50070
>>>
>>> then go for the "browse the filesystem" link in it..
>>>
>>> If there is no directory then make a directory there.
>>> That is your hdfs directory.
>>> Then copy any text file there (no need to copy hadoop there), because you
>>> are going to do processing on the data in that text file. That's what
>>> hadoop is used for; first you need to make that clear in your mind. Then
>>> and only then will you do it... otherwise it is not possible..
>>>
>>> *Try this: *
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>> /hdfs/directory/path
>>>
>>>
>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks. yes , i am a newbie.
>>> however, i need a windows setup.
>>>
>>> let me surely refer to the doc and link which u sent but i need this to
>>> be working ...
>>> can you please help
>>>
>>> regards
>>>
>>> --
>>> MANISH DUNANI
>>> -THANX
>>> +91 9426881954,+91 8460656443
>>> manishd207@gmail.com
>>>
>>> --
>>> Regards
>>> *Manish Dunani*
>>> *Contact No* : +91 9408329137
>>> *skype id* : manish.dunani
>>>
>>>
>>> CONFIDENTIALITY NOTICE
>>> NOTICE: This message is intended for the use of the individual or entity
>>> to which it is addressed and may contain information that is confidential,
>>> privileged and exempt from disclosure under applicable law. If the reader
>>> of this message is not the intended recipient, you are hereby notified that
>>> any printing, copying, dissemination, distribution, disclosure or
>>> forwarding of this communication is strictly prohibited. If you have
>>> received this communication in error, please contact the sender immediately
>>> and delete it from your system. Thank You.
>>>
>>> --
>>> Olivier Renault
>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>> +44 7500 933 036
>>> orenault@hortonworks.com
>>> www.hortonworks.com
>>>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards
irfan



On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com> wrote:

> ok.. now i made some changes and the installation went ahead
> but failed on the "HIVE_SERVER_HOST" property declaration.
> in the cluster config file, i have commented this property out. if i
> uncomment it, then what server address should i give ???
>
> i have only a two-machine windows setup:
> 1: for the namenode and another for the datanode
>
> please suggest
>
> regards
> irfan
>
>
>
> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> thanks.
>> i installed the latest java in the c:\java folder and now there is no
>> error in the log file related to java.
>> however, now it is throwing an error about not having the cluster
>> properties file. in fact i am running/installing hdp from the location
>> where this file exists . still it is throwing the error
>>
>> please find the attached
>>
>> [image: Inline image 1]
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>> ravimu@microsoft.com> wrote:
>>
>>> Here's your issue (from the logs you attached earlier):
>>>
>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>
>>> It seems that you installed the Java prerequisite in the default path,
>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>> or something similar (in a path with no spaces).
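[Editor's note: a rough POSIX-shell analogy of why the space breaks the installer's scripts. The actual failure above happens in cmd.exe, but the word-splitting effect is the same; the paths are illustrative.]

```shell
# An unquoted path containing a space splits into two words, so scripts
# that expand JAVA_HOME unquoted misparse it.
JAVA_HOME="C:/Program Files/Java/jdk1.6.0_31"   # illustrative path with a space
set -- $JAVA_HOME        # unquoted expansion word-splits on the space
echo "$#"                # prints 2: the path became two tokens
JAVA_HOME="C:/java/jdk1.6.0_31"                 # path with no spaces
set -- $JAVA_HOME
echo "$#"                # prints 1: the path survives intact
```

Reinstalling under a space-free path (or consistently quoting the variable) avoids the misparse.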
>>>
>>> ** **
>>>
>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>> *To:* user@hadoop.apache.org
>>> *Subject:* Re: about replication
>>>
>>> please find the attached.
>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>> as it is not generated
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> Could you share the log files ( c:\hdp.log,
>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>>> well as your clusterproperties.txt ?
>>>
>>> Thanks,
>>> Olivier
>>>
>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>
>>> thanks. i followed the user manual for deployment and installed all the
>>> pre-requisites.
>>> i modified the command and still the issue persists. please suggest
>>>
>>> please refer below
>>>
>>> [image: Inline image 1]
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> The command to install it is msiexec /i msifile /...
>>> You will find the correct syntax as part of the doc.
>>>
>>> Happy reading
>>> Olivier
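[Editor's note: a hedged sketch of what such a command line might look like; the msi file name, log path, and clusterproperties path are placeholders, and the exact public properties should be taken from the HDP documentation. /i and /lv are standard msiexec options; HDP_LAYOUT is the property Irfan uses later in the thread.]

```shell
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "c:\hdp.log" HDP_LAYOUT="c:\hdp\clusterproperties.txt"
```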
>>>
>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>> thanks.
>>> i referred to the logs and manuals. i modified the clusterproperties file
>>> and then double-clicked on the msi file.
>>> however, it still failed.
>>> further, i started the installation on the command line by giving
>>> HDP_LAYOUT=clusterproperties file path;
>>> installation went ahead and it failed on the .NET framework 4.0 and VC++
>>> redistributable package dependencies.
>>>
>>> i installed both and started the installation again.
>>> it failed again with the following error
>>>
>>> [image: Inline image 1]
>>>
>>> when i searched for the logs mentioned in the error , i never found them
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> Correct, you need to define the cluster configuration as part of a file.
>>> You will find some information on the configuration file as part of the
>>> documentation.
>>>
>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>
>>> You should make sure to have also installed the prerequisites.
>>>
>>> Thanks
>>> Olivier
>>>
>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>> thanks. sorry for the long break. actually got involved in some other
>>> priorities.
>>> i downloaded the installer and while installing i got the following error
>>>
>>> [image: Inline image 1]
>>>
>>> do i need to make any configuration prior to installation ??
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:****
>>>
>>> Here is the link ****
>>>
>>> http://download.hortonworks.com/products/hdp-windows/****
>>>
>>> Olivier ****
>>>
>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>
>>>  thanks.****
>>>
>>> i just followed the instructions to setup the pseudo distributed setup
>>> first using the url :
>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>> ****
>>>
>>>  ****
>>>
>>> i don't think so i am running DN on both machine ****
>>>
>>> please find the attached log****
>>>
>>> ** **
>>>
>>> hi olivier ****
>>>
>>> ** **
>>>
>>> can you please give me download link ?****
>>>
>>> let me try please ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> irfan ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  Are you running DN on both the machines? Could you please show me your
>>> DN logs?****
>>>
>>> ** **
>>>
>>> Also, consider Oliver's suggestion. It's definitely a better option.****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:****
>>>
>>> Irfu, ****
>>>
>>> If you want to quickly get Hadoop running on windows platform. You may
>>> want to try our distribution for Windows. You will be able to find the msi
>>> on our website. ****
>>>
>>> Regards
>>> Olivier ****
>>>
>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>
>>>  thanks. ****
>>>
>>> ok. i think i need to change the plan over here ****
>>>
>>> let me create two environments. 1: totally windows 2: totally Unix****
>>>
>>> ** **
>>>
>>> because, on windows , anyway i have to try and see how hadoop works ****
>>>
>>> on UNIX, it is already known that ,  it is working fine. ****
>>>
>>> ** **
>>>
>>> so, on windows , here is the setup:****
>>>
>>> ** **
>>>
>>> namenode : windows 2012 R2 ****
>>>
>>> datanode : windows 2012 R2 ****
>>>
>>> ** **
>>>
>>> now, the exact problem is :****
>>>
>>> 1: datanode is not getting started ****
>>>
>>> 2: replication : if i put any file/folder on any datanode , it should
>>> get replicated to all another available datanodes ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  Seriously??You are planning to develop something using Hadoop on
>>> windows. Not a good idea. Anyways, cold you plz show me your log files?I
>>> also need some additional info :****
>>>
>>> -The exact problem which you are facing right now****
>>>
>>> -Your cluster summary(no. of nodes etc)****
>>>
>>> -Your latest configuration files****
>>>
>>> -Your /etc.hosts file****
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  ok. thanks****
>>>
>>> now, i need to start with all windows setup first as our product will be
>>> based on windows ****
>>>
>>> so, now, please tell me how to resolve the issue ****
>>>
>>> ** **
>>>
>>> datanode is not starting . please suggest ****
>>>
>>> ** **
>>>
>>> regards,****
>>>
>>> irfan ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  It is possible. Theoretically Hadoop doesn't stop you from doing that.
>>> But it is not a very wise setup.****
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  please suggest****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> irfan****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  thanks.****
>>>
>>> can i have setup like this :****
>>>
>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)****
>>>
>>> and datanodes are the combination of any OS (windows , linux , unix etc )
>>> ****
>>>
>>> ** **
>>>
>>> however, my doubt is,  as the file systems of  both the systems (win and
>>> linux ) are different ,  datanodes of these systems can not be part of
>>> single cluster . i have to make windows cluster separate and UNIX cluster
>>> separate ?****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>> aagarwal@hortonworks.com> wrote:****
>>>
>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
>>> Cygwin PIDs so that may be causing the discrepancy. I don't know how well
>>> Hadoop works in Cygwin as I have never tried it. Work is in progress for
>>> native Windows support however there are no official releases with Windows
>>> support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
>>> ****
>>>
>>>
>>>
>>> ****
>>>
>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks
>>> here is what i did :
>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command
>>> then deleted all pid files for the namenodes and datanodes
>>>
>>> started dfs again with the command : "./start-dfs.sh"
>>>
>>> when i ran the "jps" command, it shows :
>>>
>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>> $ ./jps.exe
>>> 4536 Jps
>>> 2076 NameNode
>>>
>>> however, when i open the pid file for the namenode, it shows the pid as
>>> 4560. instead, it should show : 2076
>>>
>>> please suggest
>>>
>>> regards
>>>
>>>
>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>
>>> wrote:
>>>
>>> Most likely there is a stale pid file, something like
>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>> the datanode.
>>>
>>> I haven't read the entire thread, so you may have looked at this already.
>>>
>>> -Arpit
>>>
>>>
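[editor's note] Arpit's stale-pid check can be sketched in shell as below. The pid-file path and pid value are assumptions for the demo (Hadoop 1.x keeps pid files under $HADOOP_PID_DIR, often /tmp); on a real node you would inspect the actual hadoop-<user>-datanode.pid file.

```shell
# Simulate a stale pid file left behind by a dead datanode; the path and
# pid are fabricated for this demo.
pidfile=/tmp/hadoop-demo-datanode.pid
echo 4194305 > "$pidfile"   # a pid that cannot belong to a live process

pid=$(cat "$pidfile")
# kill -0 only tests whether the process exists; it sends no signal
if kill -0 "$pid" 2>/dev/null; then
  echo "process $pid is alive; pid file is current"
else
  echo "pid $pid is not running; removing stale pid file"
  rm -f "$pidfile"
fi
```

Once the stale file is gone, ./start-dfs.sh should stop reporting "datanode running as process N. Stop it first."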
>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> the datanode is trying to connect to the namenode continuously but fails
>>>
>>> when i try to run the "jps" command it says :
>>> $ ./jps.exe
>>> 4584 NameNode
>>> 4016 Jps
>>>
>>> and when i ran "./start-dfs.sh" it says :
>>>
>>> $ ./start-dfs.sh
>>> namenode running as process 3544. Stop it first.
>>> DFS-1: datanode running as process 4076. Stop it first.
>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>
>>> these two outputs are contradictory
>>> please find the attached logs
>>>
>>> should i attach the conf files as well ?
>>>
>>> regards
>>>
>>>
>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Your DN is still not running. Showing me the logs would be helpful.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> i followed the url and did the steps mentioned in it. i have deployed
>>> on the windows platform
>>>
>>> now, i am able to browse the url : http://localhost:50070 (name node)
>>> however, i am not able to browse the url : http://localhost:50030
>>>
>>> please refer below
>>>
>>> [image: Inline image 1]
>>>
>>> i have modified all the config files as mentioned and formatted the hdfs
>>> file system as well
>>> please suggest
>>>
>>> regards
>>>
>>>
>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks. i followed this url :
>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>
>>> let me follow the url which you gave for the pseudo distributed setup
>>> and then i will switch to distributed mode
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> You are welcome. Which link have you followed for the configuration?
>>> Your *core-site.xml* is empty. Remove the property *fs.default.name*
>>> from *hdfs-site.xml* and add it to *core-site.xml*. Remove
>>> *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
>>>
>>> I would suggest you to do a pseudo distributed setup first in order to
>>> get yourself familiar with the process and then proceed to the distributed
>>> mode. You can visit this link
>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>> if you need some help. Let me know if you face any issue.
>>>
>>> HTH
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
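[editor's note] Tariq's advice maps onto config snippets like the following — a sketch for Hadoop 1.x, where the host and port values are placeholders, not values taken from this thread:

```xml
<!-- core-site.xml : fs.default.name belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml : mapred.job.tracker belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```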
>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks tariq for the response.
>>> as discussed last time, i have sent you all the config files in my setup.
>>> can you please go through that ?
>>>
>>> please let me know
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> I'm sorry for being unresponsive. Was out of touch for some time
>>> because of ramzan and eid. Resuming work today.
>>>
>>> What's the current status?
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>> wrote:
>>>
>>> First of all read the concepts. I hope you will like it:
>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>
>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> hey Tariq,
>>> i am still stuck ..
>>> can you please suggest
>>>
>>> regards
>>> irfan
>>>
>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> please suggest
>>>
>>> regards
>>>
>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> the attachment got quarantined
>>> resending in txt format. please rename it to conf.rar
>>>
>>> regards
>>>
>>>
>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks.
>>>
>>> if i run the jps command on the namenode :
>>>
>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>> $ ./jps.exe
>>> 3164 NameNode
>>> 1892 Jps
>>>
>>> the same command on the datanode :
>>>
>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>> $ ./jps.exe
>>> 3848 Jps
>>>
>>> jps does not list any process for the datanode. however, on the web
>>> browser i can see one live data node
>>> please find the attached conf rar file of the namenode
>>>
>>> regards
>>>
>>>
>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> OK, we'll start fresh. Could you please show me your latest config files?
>>>
>>> BTW, are your daemons running fine? Use jps to verify that.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> i have created the dirs "wksp_data" and "wksp_name" on both the datanode
>>> and the namenode
>>> made the respective changes in the "hdfs-site.xml" file
>>> formatted the namenode
>>> started the dfs
>>>
>>> but still, i am not able to browse the file system through the web browser
>>> please refer below
>>>
>>> anything still missing ?
>>> please suggest
>>>
>>> [image: Inline image 1]
>>>
>>>
>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> do these dirs need to be created on all datanodes and namenodes ?
>>> further, does hdfs-site.xml need to be updated on both datanodes and
>>> namenodes for these new dirs ?
>>>
>>> regards
>>>
>>>
>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Create 2 directories manually corresponding to the values of the
>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>> these directories to 755. When you start pushing data into your HDFS, data
>>> will start going inside the directory specified by dfs.data.dir and the
>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>> cannot see this data directly on your local/native FS.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
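[editor's note] The two steps Tariq describes — create the directories, set 755 permissions — look like this in shell. The paths below are placeholders for the demo; substitute the values from your own hdfs-site.xml.

```shell
# Create the dfs.name.dir and dfs.data.dir locations and give them 755
# permissions, as described above. Paths are placeholders.
NAME_DIR=/tmp/hadoop-demo/wksp_name
DATA_DIR=/tmp/hadoop-demo/wksp_data

mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
ls -ld "$NAME_DIR" "$DATA_DIR"
```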
>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks.
>>> however, i need this to be working on a windows environment as a project
>>> requirement.
>>> i will add/work on Linux later
>>>
>>> so, now, at this stage, is c:\\wksp the HDFS file system OR do i need
>>> to create it from the command line ?
>>>
>>> please suggest
>>>
>>> regards,
>>>
>>>
>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Hello Irfan,
>>>
>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>
>>> The HDFS web UI doesn't provide us the ability to create a file or
>>> directory. You can browse HDFS, view files, download files etc. But
>>> operations like create, move, copy etc. are not supported.
>>>
>>> These values look fine to me.
>>>
>>> One suggestion though. Try getting a Linux machine (if possible), or at
>>> least use a VM. I personally feel that using Hadoop on windows is always
>>> messy.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks.
>>> when i browse the file system, i am getting the following:
>>> i haven't seen any make directory option there
>>>
>>> do i need to create it from the command line ?
>>> further, in the hdfs-site.xml file, i have given the following entries.
>>> are they correct ?
>>>
>>> <property>
>>>   <name>dfs.data.dir</name>
>>>   <value>c:\\wksp</value>
>>> </property>
>>> <property>
>>>   <name>dfs.name.dir</name>
>>>   <value>c:\\wksp</value>
>>> </property>
>>>
>>> please suggest
>>>
>>> [image: Inline image 1]
>>>
>>>
>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>> wrote:
>>>
>>> *You are wrong at this:*
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>> $ ./hadoop dfs -copyFromLocal
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>> copyFromLocal: File
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>> $ ./hadoop dfs -copyFromLocal
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>> copyFromLocal: File
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>
>>> Because you wrote both paths as local, and you do not need to copy
>>> hadoop into hdfs... Hadoop is already working..
>>>
>>> Just check in the browser after starting ur single node cluster :
>>>
>>> localhost:50070
>>>
>>> then go for the "browse the filesystem" link in it..
>>>
>>> If there is no directory there, then make a directory.
>>> That is your hdfs directory.
>>> Then copy any text file there (no need to copy hadoop there), because u
>>> are going to do processing on that data in the text file. That's what
>>> hadoop is used for; first u need to make it clear in ur mind. Then and
>>> only then u will do it... otherwise not possible..
>>>
>>> *Try this:*
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>> /hdfs/directory/path
>>>
>>>
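[editor's note] manish's point — the -copyFromLocal calls failed because the *local* source path did not exist — can be checked up front. A sketch follows; the paths are placeholders, and the hadoop invocation itself is commented out since it needs a running cluster.

```shell
# Verify the local source exists before calling copyFromLocal; this is
# exactly the failure mode quoted above ("File ... does not exist").
src=/tmp/demo-input.txt
echo "some sample data" > "$src"

if [ -f "$src" ]; then
  echo "local source exists: $src"
  # ./bin/hadoop dfs -copyFromLocal "$src" /wksp/demo-input.txt
else
  echo "copyFromLocal would fail: $src does not exist"
fi
```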
>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks. yes, i am a newbie.
>>> however, i need a windows setup.
>>>
>>> let me surely refer to the doc and link which u sent, but i need this to
>>> be working ...
>>> can you please help
>>>
>>> regards
>>>
>>> --
>>> MANISH DUNANI
>>> -THANX
>>> +91 9426881954, +91 8460656443
>>> manishd207@gmail.com
>>>
>>>
>>>
>>> --
>>> Regards
>>> Manish Dunani
>>> Contact No : +91 9408329137
>>> skype id : manish.dunani
>>>
>>>
>>> CONFIDENTIALITY NOTICE
>>> NOTICE: This message is intended for the use of the individual or entity
>>> to which it is addressed and may contain information that is confidential,
>>> privileged and exempt from disclosure under applicable law. If the reader
>>> of this message is not the intended recipient, you are hereby notified that
>>> any printing, copying, dissemination, distribution, disclosure or
>>> forwarding of this communication is strictly prohibited. If you have
>>> received this communication in error, please contact the sender immediately
>>> and delete it from your system. Thank You.
>>>
>>> --
>>> Olivier Renault
>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>> +44 7500 933 036
>>> orenault@hortonworks.com
>>> www.hortonworks.com
>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards
irfan



On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com> wrote:

> ok.. now i made some changes and installation went ahead
> but it failed on the "HIVE_SERVER_HOST" property declaration
> in the cluster config file, i have commented this property out. if i
> uncomment it, then what server address should i give ???
>
> i have only two windows machines setup.
> 1: for namenode and another for datanode
>
> please suggest
>
> regards
> irfan
>
>
>
> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> thanks.
>> i installed the latest java in the c:\java folder and now there is no
>> java-related error in the log file
>> however, now it is throwing an error about the cluster properties file
>> not being found. in fact i am running/installing hdp from the location
>> where this file exists. still it is throwing the error
>>
>> please find the attached
>>
>> [image: Inline image 1]
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>> ravimu@microsoft.com> wrote:
>>
>>> Here's your issue (from the logs you attached earlier):
>>>
>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>
>>> It seems that you installed the Java prerequisite in the default path,
>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>>> or something similar (in a path with no spaces).
>>>
>>>
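[editor's note] The condition Ravi diagnoses — a space inside JAVA_HOME — can be reproduced outside the installer. A sketch for a Cygwin/POSIX shell; the example JAVA_HOME value is an assumption chosen to show the problematic default install path.

```shell
# Detect spaces in JAVA_HOME, the condition that trips the installer's
# JAVA_HOME check. The value below is a demonstration assumption.
JAVA_HOME="C:/Program Files/Java/jdk1.6.0_31"

case "$JAVA_HOME" in
  *" "*) echo "JAVA_HOME contains spaces; reinstall Java under e.g. C:/java" ;;
  *)     echo "JAVA_HOME looks fine" ;;
esac
```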
>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>> *To:* user@hadoop.apache.org
>>> *Subject:* Re: about replication
>>>
>>> please find the attached.
>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>> as it is not generated
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:****
>>>
>>>  Could you share the log files ( c:\hdp.log,
>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>>> well as your clusterproperties.txt ?****
>>>
>>> ** **
>>>
>>> Thanks, ****
>>>
>>> Olivier****
>>>
>>> ** **
>>>
>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>>
>>> thanks. i followed the user manual for deployment and installed all the
>>> pre-requisites
>>> i modified the command and still the issue persists. please suggest
>>>
>>> please refer below
>>>
>>> [image: Inline image 1]
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> The command to install it is msiexec /i msifile /...
>>> You will find the correct syntax as part of the doc.
>>>
>>> Happy reading
>>> Olivier
>>>
>>>
>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>
>>>  thanks. ****
>>>
>>> i referred the logs and manuals. i modified the clusterproperties file
>>> and then double click on the msi file ****
>>>
>>> however, it still failed.****
>>>
>>> further i started the installation on command line by giving
>>> HDP_LAYOUT=clusterproperties file path, ****
>>>
>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>> redistributable package dependency   ****
>>>
>>> ** **
>>>
>>> i installed both and started again the installation. ****
>>>
>>> failed again with following error ****
>>>
>>> [image: Inline image 1]****
>>>
>>> ** **
>>>
>>> when i search for the logs mentioned in the error , i never found that *
>>> ***
>>>
>>> please suggest ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> irfan****
>>>
>>> ** **
>>>
>>> ** **
>>>
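[editor's note] For orientation, the command-line invocation being discussed looks roughly like this. It is a sketch only: the file names and paths are placeholders, /i and /lv are standard msiexec flags (install, verbose log), and HDP_LAYOUT is the property named in the thread; the full set of MSI properties should be taken from the HDP documentation.

```bat
:: Sketch of the HDP-for-Windows install command (cmd.exe); placeholders only.
msiexec /i "C:\hdp\hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp\hdp.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```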
>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> Correct, you need to define the cluster configuration as part of a file.
>>> You will find some information on the configuration file as part of the
>>> documentation:
>>>
>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>
>>> You should make sure to have also installed the pre-requisites.
>>>
>>> Thanks
>>> Olivier
>>>
>>>
>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>> thanks. sorry for the long break. actually got involved in some other
>>> priorities
>>> i downloaded the installer and while installing i got the following error
>>>
>>> [image: Inline image 1]
>>>
>>> do i need to make any configuration prior to installation ??
>>>
>>> regards
>>> irfan
>>>
>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> Here is the link
>>> http://download.hortonworks.com/products/hdp-windows/
>>>
>>> Olivier
>>>
>>>
>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>> thanks.
>>> i just followed the instructions to set up the pseudo distributed setup
>>> first using the url :
>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>
>>> i don't think i am running a DN on both machines
>>> please find the attached log
>>>
>>> hi olivier
>>> can you please give me the download link ?
>>> let me try please
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Are you running DN on both the machines? Could you please show me your
>>> DN logs?
>>>
>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> Irfan,
>>>
>>> If you want to quickly get Hadoop running on a windows platform, you may
>>> want to try our distribution for Windows. You will be able to find the msi
>>> on our website.
>>>
>>> Regards
>>> Olivier
>>>
>>>
>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>> thanks.
>>> ok. i think i need to change the plan over here
>>> let me create two environments. 1: totally windows 2: totally Unix
>>>
>>> because, on windows, anyway i have to try and see how hadoop works
>>> on UNIX, it is already known that it is working fine.
>>>
>>> so, on windows, here is the setup:
>>>
>>> namenode : windows 2012 R2
>>> datanode : windows 2012 R2
>>>
>>> now, the exact problem is :
>>> 1: the datanode is not getting started
>>> 2: replication : if i put any file/folder on any datanode, it should
>>> get replicated to all other available datanodes
>>>
>>> regards
>>>
>>>
>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Seriously?? You are planning to develop something using Hadoop on
>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>> also need some additional info :
>>> - The exact problem which you are facing right now
>>> - Your cluster summary (no. of nodes etc)
>>> - Your latest configuration files
>>> - Your /etc/hosts file
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> ok. thanks
>>> now, i need to start with an all-windows setup first, as our product will
>>> be based on windows
>>> so, now, please tell me how to resolve the issue
>>>
>>> the datanode is not starting . please suggest
>>>
>>> regards,
>>> irfan
>>>
>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> It is possible. Theoretically Hadoop doesn't stop you from doing that.
>>> But it is not a very wise setup.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  thanks.****
>>>
>>> can i have setup like this :****
>>>
>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)****
>>>
>>> and datanodes are the combination of any OS (windows , linux , unix etc )
>>> ****
>>>
>>> ** **
>>>
>>> however, my doubt is,  as the file systems of  both the systems (win and
>>> linux ) are different ,  datanodes of these systems can not be part of
>>> single cluster . i have to make windows cluster separate and UNIX cluster
>>> separate ?****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>> aagarwal@hortonworks.com> wrote:****
>>>
>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
>>> Cygwin PIDs so that may be causing the discrepancy. I don't know how well
>>> Hadoop works in Cygwin as I have never tried it. Work is in progress for
>>> native Windows support however there are no official releases with Windows
>>> support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
>>> ****
>>>
>>>
>>>
>>> ****
>>>
>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  thanks ****
>>>
>>> here is what i did .****
>>>
>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh command **
>>> **
>>>
>>> then deleted all pid files for namenodes and datanodes ****
>>>
>>> ** **
>>>
>>> started dfs again with command : "./start-dfs.sh"****
>>>
>>> ** **
>>>
>>> when i ran the "Jps" command . it shows****
>>>
>>> ** **
>>>
>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>
>>> $ ./jps.exe****
>>>
>>> 4536 Jps****
>>>
>>> 2076 NameNode****
>>>
>>> ** **
>>>
>>> however, when i open the pid file for namenode then it is not showing
>>> pid as : 4560. on the contrary, it shud show : 2076****
>>>
>>> ** **
>>>
>>> please suggest ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>
>>> wrote:****
>>>
>>>  Most likely there is a stale pid file. Something like
>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>> the datanode.
>>>
>>> I haven't read the entire thread so you may have looked at this already.
>>>
>>> -Arpit****
>>>
>>>
>>>
>>> ****
>>>
>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  datanode is trying to connect to namenode continuously but fails ****
>>>
>>> ** **
>>>
>>> when i try to run "jps" command it says :****
>>>
>>> $ ./jps.exe****
>>>
>>> 4584 NameNode****
>>>
>>> 4016 Jps****
>>>
>>> ** **
>>>
>>> and when i ran the "./start-dfs.sh" then it says :****
>>>
>>> ** **
>>>
>>> $ ./start-dfs.sh****
>>>
>>> namenode running as process 3544. Stop it first.****
>>>
>>> DFS-1: datanode running as process 4076. Stop it first.****
>>>
>>> localhost: secondarynamenode running as process 4792. Stop it first.****
>>>
>>> ** **
>>>
>>> both these logs are contradictory ****
>>>
>>> please find the attached logs ****
>>>
>>> ** **
>>>
>>> should i attach the conf files as well ?****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>>  ****
>>>
>>> ** **
>>>
>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  Your DN is still not running. Showing me the logs would be helpful.****
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  i followed the url and did the steps mention in that. i have deployed
>>> on the windows platform****
>>>
>>> ** **
>>>
>>> Now, i am able to browse url : http://localhost:50070 (name node )****
>>>
>>> however, not able to browse url : http://localhost:50030****
>>>
>>> ** **
>>>
>>> please refer below****
>>>
>>> ** **
>>>
>>> [image: Inline image 1]****
>>>
>>> ** **
>>>
>>> i have modified all the config files as mentioned and formatted the hdfs
>>> file system as well ****
>>>
>>> please suggest ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  thanks. i followed this url :
>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>> ****
>>>
>>> let me follow the url which you gave for pseudo distributed setup and
>>> then will switch to distributed mode****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> irfan ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> You are welcome. Which link have you followed for the configuration?
>>> Your *core-site.xml* is empty. Remove the property *fs.default.name*
>>> from *hdfs-site.xml* and add it to *core-site.xml*.
>>> Remove *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
>>>
>>> I would suggest you to do a pseudo distributed setup first in order to
>>> get yourself familiar with the process and then proceed to the distributed
>>> mode. You can visit this link
>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>> if you need some help. Let me know if you face any issue.
>>>
>>> HTH
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks tariq for response.
>>> as discussed last time, i have sent you all the config files in my setup.
>>> can you please go through that ?
>>>
>>> please let me know
>>>
>>> regards
>>> irfan
>>>
>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> I'm sorry for being unresponsive. Was out of touch for sometime
>>> because of ramzan and eid. Resuming work today.
>>>
>>> What's the current status?
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>> wrote:
>>>
>>> First of all read the concepts ..I hope you will like it..
>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>
>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> hey Tariq,
>>> i am still stuck ..
>>> can you please suggest
>>>
>>> regards
>>> irfan
>>>
>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> please suggest
>>>
>>> regards
>>>
>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> attachment got quarantined
>>> resending in txt format. please rename it to conf.rar
>>>
>>> regards
>>>
>>>
>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks.
>>>
>>> if i run the jps command on namenode :
>>>
>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>> $ ./jps.exe
>>> 3164 NameNode
>>> 1892 Jps
>>>
>>> same command on datanode :
>>>
>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>> $ ./jps.exe
>>> 3848 Jps
>>>
>>> jps does not list any process for datanode. however, on web browser i
>>> can see one live data node
>>> please find the attached conf rar file of namenode
>>>
>>> regards
>>>
>>>
>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> OK. we'll start fresh. Could you plz show me your latest config files?
>>>
>>> BTW, are your daemons running fine? Use JPS to verify that.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> i have created these dir "wksp_data" and "wksp_name" on both datanode
>>> and namenode
>>> made the respective changes in "hdfs-site.xml" file
>>> formatted the namenode
>>> started the dfs
>>>
>>> but still, not able to browse the file system through web browser
>>> please refer below
>>>
>>> anything still missing ?
>>> please suggest
>>>
>>> [image: Inline image 1]
>>>
>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> these dir needs to be created on all datanodes and namenodes ?
>>> further, hdfs-site.xml needs to be updated on both datanodes and
>>> namenodes for these new dir?
>>>
>>> regards
>>>
>>>
>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Create 2 directories manually corresponding to the values of
>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>> these directories to 755. When you start pushing data into your HDFS, data
>>> will start going inside the directory specified by dfs.data.dir and the
>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>> cannot see this data directly on your local/native FS.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
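[Editorial note: the directory recipe above can be sketched as a couple of shell commands. The paths below are placeholders, not from the thread; substitute whatever your hdfs-site.xml points dfs.name.dir and dfs.data.dir at.]

```shell
# Create the two local directories HDFS will use and set them to 755,
# as suggested above. The paths are illustrative stand-ins.
NAME_DIR=/tmp/hdfs_demo/wksp_name   # backs dfs.name.dir (metadata)
DATA_DIR=/tmp/hdfs_demo/wksp_data   # backs dfs.data.dir (block data)

mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
ls -ld "$NAME_DIR" "$DATA_DIR"
```

On the Cygwin setup discussed in this thread the same commands apply, with the directories living under /cygdrive/c.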
>>>
>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  thanks. ****
>>>
>>> however, i need this to be working on windows environment as project
>>> requirement.****
>>>
>>> i will add/work on Linux later ****
>>>
>>> ** **
>>>
>>> so, now , at this stage , c:\\wksp is the HDFS file system OR do i need
>>> to create it from command line ?****
>>>
>>> ** **
>>>
>>> please suggest****
>>>
>>> ** **
>>>
>>> regards,****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Hello Irfan,
>>>
>>> Sorry for being unresponsive. Got stuck with some imp work.
>>>
>>> HDFS webUI doesn't provide us the ability to create files or directories.
>>> You can browse HDFS, view files, download files etc. But operations like
>>> create, move, copy etc are not supported.
>>>
>>> These values look fine to me.
>>>
>>> One suggestion though. Try getting a Linux machine (if possible). Or at
>>> least use a VM. I personally feel that using Hadoop on windows is always
>>> messy.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks.
>>> when i browse the file system , i am getting following :
>>> i haven't seen any make directory option there
>>>
>>> i need to create it from command line ?
>>> further, in the hdfs-site.xml file , i have given following entries. are
>>> they correct ?
>>>
>>> <property>
>>>   <name>dfs.data.dir</name>
>>>   <value>c:\\wksp</value>
>>> </property>
>>> <property>
>>>   <name>dfs.name.dir</name>
>>>   <value>c:\\wksp</value>
>>> </property>
>>>
>>> please suggest
>>>
>>> [image: Inline image 1]
>>>
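[Editorial note: for reference, the same snippet with separate data and name directories — the shape the thread itself later moves to with "wksp_data" and "wksp_name". The paths are illustrative; pointing both properties at the same directory, as above, mixes block data and metadata and is best avoided.]

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```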
>>>
>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>> wrote:
>>>
>>> *You are wrong at this:*
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>> $ ./hadoop dfs -copyFromLocal
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>> copyFromLocal: File
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>> $ ./hadoop dfs -copyFromLocal
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>> copyFromLocal: File
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>
>>> Because you had written both the paths as local, and you need not copy
>>> hadoop into hdfs... Hadoop is already working..
>>>
>>> Just check out in the browser after starting ur single node cluster :
>>>
>>> localhost:50070
>>>
>>> then go for the "browse the filesystem" link in it..
>>>
>>> If there is no directory then make a directory there.
>>> That is your hdfs directory.
>>> Then copy any text file there (no need to copy hadoop there), because u
>>> are going to do processing on the data in that text file. That's what
>>> hadoop is used for; first u need to make that clear in ur mind. Then and
>>> only then u will do it... otherwise not possible..
>>>
>>> *Try this:*
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>> /hdfs/directory/path
>>>
>>>
>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks. yes , i am newbie.
>>> however, i need windows setup.
>>>
>>> let me surely refer the doc and link which u sent but i need this to be
>>> working ...
>>> can you please help
>>>
>>> regards
>>>
>>> --
>>> MANISH DUNANI
>>> -THANX
>>> +91 9426881954,+91 8460656443
>>> manishd207@gmail.com
>>>
>>>
>>>
>>> --
>>> Regards
>>> *Manish Dunani*
>>> *Contact No* : +91 9408329137
>>> *skype id* : manish.dunani
>>>
>>>
>>> CONFIDENTIALITY NOTICE
>>> NOTICE: This message is intended for the use of the individual or entity
>>> to which it is addressed and may contain information that is confidential,
>>> privileged and exempt from disclosure under applicable law. If the reader
>>> of this message is not the intended recipient, you are hereby notified that
>>> any printing, copying, dissemination, distribution, disclosure or
>>> forwarding of this communication is strictly prohibited. If you have
>>> received this communication in error, please contact the sender immediately
>>> and delete it from your system. Thank You.
>>>
>>>
>>> --
>>> Olivier Renault
>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>> +44 7500 933 036
>>> orenault@hortonworks.com
>>> www.hortonworks.com
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please suggest

regards
irfan



On Fri, Sep 6, 2013 at 12:16 PM, Irfan Sayed <ir...@gmail.com> wrote:

> ok.. now i made some changes and installation went ahead
> but it failed on the "HIVE_SERVER_HOST" property declaration.
> in the cluster config file, i have commented this property out. if i
> uncomment it, then what server address should i give ???
>
> i have only two windows machines setup.
> 1: for namenode and another for datanode
>
> please suggest
>
> regards
> irfan
>
>
>
> On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>> thanks.
>> i installed the latest java in c:\java folder and now no error in log
>> file related to java
>> however, now it is throwing an error about not having the cluster
>> properties file. in fact i am running/installing hdp from the location
>> where this file exists. still it is throwing the error
>>
>> please find the attached
>>
>> [image: Inline image 1]
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
>> ravimu@microsoft.com> wrote:
>>
>>> Here’s your issue (from the logs you attached earlier):
>>>
>>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>>
>>> It seems that you installed the Java prerequisite in the default path,
>>> which is %PROGRAMFILES% (expands to C:\Program Files in your case).
>>> HDP 1.3 does not like spaces in paths, so you need to reinstall Java
>>> under c:\java\ or something similar (in a path with no spaces).
>>>
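[Editorial note: the failure mode described above — a JAVA_HOME containing a space — can be checked for mechanically. A small sketch; the path mirrors the one in the log and is only an example.]

```shell
# Flag a JAVA_HOME that contains a space, which trips the HDP 1.3
# installer's batch-file check quoted in the log above.
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'

case "$JAVA_HOME" in
  *' '*) echo "JAVA_HOME contains a space: reinstall Java under a path without spaces" ;;
  *)     echo "JAVA_HOME looks usable" ;;
esac
```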
>>>
>>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>>> *Sent:* Thursday, September 5, 2013 8:42 PM
>>> *To:* user@hadoop.apache.org
>>> *Subject:* Re: about replication
>>>
>>> please find the attached.
>>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>>> as it is not generated
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> Could you share the log files ( c:\hdp.log,
>>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )
>>> as well as your clusterproperties.txt ?
>>>
>>> Thanks,
>>> Olivier
>>>
>>>
>>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:****
>>>
>>>  thanks. i followed the user manual for deployment and installed all
>>> pre-requisites ****
>>>
>>> i modified the command and still the issue persist. please suggest ****
>>>
>>> ** **
>>>
>>> please refer below ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> [image: Inline image 1]****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> irfan ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> The command to install it is msiexec /i msifile /...
>>> You will find the correct syntax as part of the doc.
>>>
>>> Happy reading
>>> Olivier
>>>
>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>>
>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>> thanks.
>>> i referred the logs and manuals. i modified the clusterproperties file
>>> and then double clicked on the msi file
>>> however, it still failed.
>>> further i started the installation on the command line by giving
>>> HDP_LAYOUT=<clusterproperties file path>,
>>> installation went ahead and it failed for the .NET framework 4.0 and VC++
>>> redistributable package dependency
>>>
>>> i installed both and started the installation again.
>>> it failed again with the following error
>>>
>>> [image: Inline image 1]
>>>
>>> when i searched for the logs mentioned in the error , i never found them
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> Correct, you need to define the cluster configuration as part of a file.
>>> You will find some information on the configuration file as part of the
>>> documentation.
>>>
>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>
>>> You should make sure to have also installed the prerequisites.
>>>
>>> Thanks
>>> Olivier
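[Editorial note: as a rough illustration only, a minimal clusterproperties.txt for a two-machine layout follows the key=value shape below. The property names and hostnames are a sketch from memory, not from the thread — the HDP documentation page linked above is the authoritative list.]

```properties
#Log/data locations (no spaces in the paths)
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data

#Single-master layout: every master role on the namenode machine
NAMENODE_HOST=master.example.local
SECONDARY_NAMENODE_HOST=master.example.local
JOBTRACKER_HOST=master.example.local
HIVE_SERVER_HOST=master.example.local

#Worker machines
SLAVE_HOSTS=worker1.example.local
```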
>>>
>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>> thanks. sorry for the long break. actually got involved in some other
>>> priorities
>>> i downloaded the installer and while installing i got the following error
>>>
>>> [image: Inline image 1]
>>>
>>> do i need to make any configuration prior to installation ??
>>>
>>> regards
>>> irfan
>>>
>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> Here is the link
>>> http://download.hortonworks.com/products/hdp-windows/
>>> Olivier
>>>
>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>> thanks.
>>> i just followed the instructions to setup the pseudo distributed setup
>>> first using the url :
>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>
>>> i don't think i am running DN on both machines
>>> please find the attached log
>>>
>>> hi olivier
>>>
>>> can you please give me the download link ?
>>> let me try please
>>>
>>> regards
>>> irfan
>>>
>>>
>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> Are you running DN on both the machines? Could you please show me your
>>> DN logs?
>>>
>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>> Irfu,
>>>
>>> If you want to quickly get Hadoop running on the windows platform, you
>>> may want to try our distribution for Windows. You will be able to find
>>> the msi on our website.
>>>
>>> Regards
>>> Olivier
>>>
>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>> thanks.
>>> ok. i think i need to change the plan over here
>>> let me create two environments. 1: totally windows 2: totally Unix
>>>
>>> because, on windows , anyway i have to try and see how hadoop works
>>> on UNIX, it is already known that it is working fine.
>>>
>>> so, on windows , here is the setup:
>>>
>>> namenode : windows 2012 R2
>>> datanode : windows 2012 R2
>>>
>>> now, the exact problem is :
>>> 1: datanode is not getting started
>>> 2: replication : if i put any file/folder on any datanode , it should
>>> get replicated to all other available datanodes
>>>
>>> regards
>>>
>>>
>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  Seriously??You are planning to develop something using Hadoop on
>>> windows. Not a good idea. Anyways, cold you plz show me your log files?I
>>> also need some additional info :****
>>>
>>> -The exact problem which you are facing right now****
>>>
>>> -Your cluster summary(no. of nodes etc)****
>>>
>>> -Your latest configuration files****
>>>
>>> -Your /etc.hosts file****
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> ok. thanks
>>> now, i need to start with the all-windows setup first as our product
>>> will be based on windows
>>> so, now, please tell me how to resolve the issue
>>>
>>> datanode is not starting . please suggest
>>>
>>> regards,
>>> irfan
>>>
>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>>
>>> It is possible. Theoretically Hadoop doesn't stop you from doing that.
>>> But it is not a very wise setup.
>>>
>>> Warm Regards,
>>> Tariq
>>> cloudfront.blogspot.com
>>>
>>>
>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks.
>>> can i have a setup like this :
>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)
>>> and datanodes are a combination of any OS (windows , linux , unix etc )
>>>
>>> however, my doubt is, as the file systems of both the systems (win and
>>> linux ) are different , datanodes of these systems can not be part of a
>>> single cluster . do i have to make the windows cluster separate and the
>>> UNIX cluster separate ?
>>>
>>> regards
>>>
>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>> aagarwal@hortonworks.com> wrote:
>>>
>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
>>> Cygwin PIDs so that may be causing the discrepancy. I don't know how well
>>> Hadoop works in Cygwin as I have never tried it. Work is in progress for
>>> native Windows support however there are no official releases with Windows
>>> support yet. It may be easier to get familiar with a release
>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you
>>> are new to it.
>>>
>>>
>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> thanks
>>> here is what i did .
>>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command
>>> then deleted all pid files for namenodes and datanodes
>>>
>>> started dfs again with command : "./start-dfs.sh"
>>>
>>> when i ran the "Jps" command . it shows
>>>
>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>> $ ./jps.exe
>>> 4536 Jps
>>> 2076 NameNode
>>>
>>> however, when i open the pid file for the namenode, it shows pid 4560.
>>> it should show 2076
>>>
>>> please suggest
>>>
>>> regards
>>>
>>>
>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>
>>> wrote:
>>>
>>> Most likely there is a stale pid file. Something like
>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>> the datanode.
>>>
>>> I haven't read the entire thread so you may have looked at this already.
>>>
>>> -Arpit
>>>
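[Editorial note: the stale-pid cleanup suggested above can be sketched as follows. The pid-file path is a stand-in — real Hadoop pid files live under the directory named by HADOOP_PID_DIR (often /tmp), with names like hadoop-<user>-datanode.pid.]

```shell
# Simulate a stale datanode pid file, then remove it only if no live
# process owns that pid -- the cleanup step suggested above.
PID_FILE=/tmp/hdfs_demo-datanode.pid
echo 99999999 > "$PID_FILE"    # a pid far above any realistic pid_max

if [ -f "$PID_FILE" ] && ! kill -0 "$(cat "$PID_FILE")" 2>/dev/null; then
    echo "removing stale pid file: $PID_FILE"
    rm -f "$PID_FILE"
fi
```

With the stale file gone, ./start-dfs.sh should no longer refuse with "datanode running as process ... Stop it first."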
>>>
>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:
>>>
>>> datanode is trying to connect to namenode continuously but fails
>>>
>>> when i try to run "jps" command it says :
>>> $ ./jps.exe
>>> 4584 NameNode
>>> 4016 Jps
>>>
>>> and when i ran the "./start-dfs.sh" then it says :
>>>
>>> $ ./start-dfs.sh
>>> namenode running as process 3544. Stop it first.
>>> DFS-1: datanode running as process 4076. Stop it first.
>>> localhost: secondarynamenode running as process 4792. Stop it first.
>>>
>>> both these logs are contradictory
>>> please find the attached logs
>>>
>>> should i attach the conf files as well ?
>>>
>>> regards
>>>
>>>
>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  Your DN is still not running. Showing me the logs would be helpful.****
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  i followed the url and did the steps mention in that. i have deployed
>>> on the windows platform****
>>>
>>> ** **
>>>
>>> Now, i am able to browse url : http://localhost:50070 (name node )****
>>>
>>> however, not able to browse url : http://localhost:50030****
>>>
>>> ** **
>>>
>>> please refer below****
>>>
>>> ** **
>>>
>>> [image: Inline image 1]****
>>>
>>> ** **
>>>
>>> i have modified all the config files as mentioned and formatted the hdfs
>>> file system as well ****
>>>
>>> please suggest ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  thanks. i followed this url :
>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>> ****
>>>
>>> let me follow the url which you gave for pseudo distributed setup and
>>> then will switch to distributed mode****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> irfan ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  You are welcome. Which link have you followed for the
>>> configuration?Your *core-site.xml* is empty. Remove the property *
>>> fs.default.name *from *hdfs-site.xml* and add it to *core-site.xml*.
>>> Remove *mapred.job.tracker* as well. It is required in *mapred-site.xml*
>>> .****
>>>
>>> ** **
>>>
>>> I would suggest you to do a pseudo distributed setup first in order to
>>> get yourself familiar with the process and then proceed to the distributed
>>> mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>if you need some help. Let me know if you face any issue.
>>> ****
>>>
>>> ** **
>>>
>>> HTH****
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  thanks tariq for response. ****
>>>
>>> as discussed last time, i have sent you all the config files in my setup
>>> . ****
>>>
>>> can you please go through that ?****
>>>
>>> ** **
>>>
>>> please let me know ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> irfan ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  I'm sorry for being unresponsive. Was out of touch for sometime
>>> because of ramzan and eid. Resuming work today.****
>>>
>>> ** **
>>>
>>> What's the current status?****
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>>> wrote:****
>>>
>>>  First of all read the concepts ..I hope you will like it..****
>>>
>>>
>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf****
>>>
>>> ** **
>>>
>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  please suggest ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> irfan ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  hey Tariq,****
>>>
>>> i am still stuck .. ****
>>>
>>> can you please suggest ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> irfan ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  please suggest ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  attachment got quarantined ****
>>>
>>> resending in txt format. please rename it to conf.rar ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  thanks.****
>>>
>>> ** **
>>>
>>> if i run the jps command on namenode :****
>>>
>>> ** **
>>>
>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>
>>> $ ./jps.exe****
>>>
>>> 3164 NameNode****
>>>
>>> 1892 Jps****
>>>
>>> ** **
>>>
>>> same command on datanode :****
>>>
>>> ** **
>>>
>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>>>
>>> $ ./jps.exe****
>>>
>>> 3848 Jps****
>>>
>>> ** **
>>>
>>> jps does not list any process for datanode. however, on web browser i
>>> can see one live data node ****
>>>
>>> please find the attached conf rar file of namenode ****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  OK. We'll start fresh. Could you please show me your latest config files?
>>>
>>> ** **
>>>
>>> BTW, are your daemons running fine?Use JPS to verify that.****
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  i have created these dir "wksp_data" and "wksp_name" on both datanode
>>> and namenode ****
>>>
>>> made the respective changes in "hdfs-site.xml" file ****
>>>
>>> formatted the namenode ****
>>>
>>> started the dfs ****
>>>
>>> ** **
>>>
>>> but still, not able to browse the file system through web browser ****
>>>
>>> please refer below ****
>>>
>>> ** **
>>>
>>> anything still missing ?****
>>>
>>> please suggest ****
>>>
>>> ** **
>>>
>>> [image: Inline image 1]****
>>>
>>> ** **
>>>
>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  these dir needs to be created on all datanodes and namenodes ?****
>>>
>>> further,  hdfs-site.xml needs to be updated on both datanodes and
>>> namenodes for these new dir?****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  Create 2 directories manually corresponding to the values of
>>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>>> these directories to 755. When you start pushing data into your HDFS, data
>>> will start going inside the directory specified by dfs.data.dir and the
>>> associated metadata will go inside dfs.name.dir. Remember, you store data
>>> in HDFS, but it eventually gets stored in your local/native FS. But you
>>> cannot see this data directly on your local/native FS.****
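Tariq's directory advice above, as a runnable sketch. The paths below are illustrative stand-ins, not the poster's actual dfs.name.dir / dfs.data.dir values:

```shell
# Create the two directories that dfs.name.dir and dfs.data.dir would
# point at, with 755 permissions as suggested. Demo paths only.
NAME_DIR=/tmp/hdfs_demo/wksp_name   # would hold namenode metadata
DATA_DIR=/tmp/hdfs_demo/wksp_data   # would hold datanode block files
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
ls -ld "$NAME_DIR" "$DATA_DIR"
```

After this, the same paths would go into hdfs-site.xml before formatting the namenode.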
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  thanks. ****
>>>
>>> however, i need this to be working on windows environment as project
>>> requirement.****
>>>
>>> i will add/work on Linux later ****
>>>
>>> ** **
>>>
>>> so, now , at this stage , c:\\wksp is the HDFS file system OR do i need
>>> to create it from command line ?****
>>>
>>> ** **
>>>
>>> please suggest****
>>>
>>> ** **
>>>
>>> regards,****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:****
>>>
>>>  Hello Irfan,****
>>>
>>> ** **
>>>
>>> Sorry for being unresponsive. Got stuck with some imp work.****
>>>
>>> ** **
>>>
>>> HDFS webUI doesn't provide us the ability to create file or directory.
>>> You can browse HDFS, view files, download files etc. But operation like
>>> create, move, copy etc are not supported.****
>>>
>>> ** **
>>>
>>> These values look fine to me.****
>>>
>>> ** **
>>>
>>> One suggestion though. Try getting a Linux machine(if possible). Or at
>>> least use a VM. I personally feel that using Hadoop on windows is always
>>> messy.****
>>>
>>>
>>> ****
>>>
>>> Warm Regards,****
>>>
>>> Tariq****
>>>
>>> cloudfront.blogspot.com****
>>>
>>> ** **
>>>
>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  thanks.****
>>>
>>> when i browse the file system , i am getting following :****
>>>
>>> i haven't seen any make directory option there ****
>>>
>>> ** **
>>>
>>> i need to create it from command line ?****
>>>
>>> further, in the hdfs-site.xml file , i have given following entries. are
>>> they correct ? ****
>>>
>>> ** **
>>>
>>> <property>****
>>>
>>>   <name>dfs.data.dir</name>****
>>>
>>>   <value>c:\\wksp</value>****
>>>
>>>   </property>****
>>>
>>> <property>****
>>>
>>>   <name>dfs.name.dir</name>****
>>>
>>>   <value>c:\\wksp</value>****
>>>
>>>   </property>****
>>>
>>> ** **
>>>
>>> please suggest ****
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> [image: Inline image 1]****
>>>
>>> ** **
>>>
>>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>>> wrote:****
>>>
>>>  *You are wrong at this:*****
>>>
>>> ** **
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>>
>>> $ ./hadoop dfs -copyFromLocal
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp****
>>>
>>> copyFromLocal: File
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>> ****
>>>
>>> ** **
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>>
>>> $ ./hadoop dfs -copyFromLocal
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp****
>>>
>>> copyFromLocal: File
>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>> ****
>>>
>>> ** **
>>>
>>> Because you wrote both paths as local paths, and you do not need to copy
>>> hadoop into hdfs... Hadoop is already working..
>>>
>>> ** **
>>>
>>> Just check in the browser after starting your single-node cluster:
>>>
>>> ** **
>>>
>>> localhost:50070****
>>>
>>> ** **
>>>
>>> then go for browse the filesystem link in it..****
>>>
>>> ** **
>>>
>>> If there is no directory then make directory there.****
>>>
>>> That is your hdfs directory.****
>>>
>>> Then copy any text file there (no need to copy hadoop there), because you
>>> are going to do the processing on the data in that text file. That is what
>>> hadoop is used for; first you need to make that clear in your mind, and
>>> then you will be able to do it.
>>>
>>> ** **
>>>
>>> *Try this: *****
>>>
>>> ** **
>>>
>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2****
>>>
>>> $ .bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>>> /hdfs/directory/path****
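The failure manish points out is that the *local* source path did not exist. A small sketch of checking that before invoking hadoop; the file and HDFS paths here are hypothetical examples:

```shell
# copyFromLocal fails when the local source path does not exist,
# so verify it first. Paths are made up for illustration.
SRC=/cygdrive/c/Users/Administrator/Desktop/sample.txt
DEST=/wksp
if [ -e "$SRC" ]; then
    # would run from the hadoop install directory
    ./bin/hadoop dfs -copyFromLocal "$SRC" "$DEST"
else
    echo "local source not found: $SRC"
fi
```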
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> ** **
>>>
>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>>> wrote:****
>>>
>>>  thanks. yes , i am newbie.****
>>>
>>> however, i need windows setup.****
>>>
>>> ** **
>>>
>>> let me surely refer the doc and link which u sent but i need this to be
>>> working ...****
>>>
>>> can you please help****
>>>
>>> ** **
>>>
>>> regards****
>>>
>>> ** **
>>>
>>>  ****
>>>
>>> ** **
>>>
>>>
>>>
>>> ****
>>>
>>> ** **
>>>
>>> --
>>> MANISH DUNANI
>>> -THANX
>>> +91 9426881954,+91 8460656443****
>>>
>>> manishd207@gmail.com****
>>>
>>> -- ****
>>>
>>> Regards****
>>>
>>> *Manish Dunani*****
>>>
>>> *Contact No* : +91 9408329137****
>>>
>>> *skype id* : manish.dunani****
>>>
>>> CONFIDENTIALITY NOTICE
>>> NOTICE: This message is intended for the use of the individual or entity
>>> to which it is addressed and may contain information that is confidential,
>>> privileged and exempt from disclosure under applicable law. If the reader
>>> of this message is not the intended recipient, you are hereby notified that
>>> any printing, copying, dissemination, distribution, disclosure or
>>> forwarding of this communication is strictly prohibited. If you have
>>> received this communication in error, please contact the sender immediately
>>> and delete it from your system. Thank You.****
>>>
>>> -- ****
>>>
>>> Olivier Renault****
>>>
>>> Solution Engineer - Big Data - Hortonworks, Inc.
>>> +44 7500 933 036
>>> orenault@hortonworks.com
>>> www.hortonworks.com****
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
ok.. now i made some changes and the installation went ahead
but failed on the "HIVE_SERVER_HOST" property declaration.
in the cluster config file, i have commented out this property. if i uncomment
it, then what server address should i give ???

i have only two windows machines setup.
1: for namenode and another for datanode

please suggest

regards
irfan



On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com> wrote:

> thanks.
> i installed the latest java in c:\java folder and now no error in log file
> related to java
> however, now it is throwing error on not having cluster properties file.
> in fact i am running/installing hdp from the location where this file
> exist . still it is throwing error
>
> please find the attached
>
> [image: Inline image 1]
>
> regards
> irfan
>
>
>
> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
> ravimu@microsoft.com> wrote:
>
>>  Here’s your issue (from the logs you attached earlier):****
>>
>> ** **
>>
>> CAQuietExec:  Checking JAVA_HOME is set correctly...****
>>
>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.****
>>
>> ** **
>>
>> It seems that you installed the Java prerequisite in the default path,
>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>> or something similar (in a path with no spaces).
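A quick way to spot this class of problem before running the installer is to test JAVA_HOME for spaces. The value below is only an example of a problematic setting:

```shell
# Flag a JAVA_HOME that contains spaces (the HDP 1.3 failure mode
# described above). Example value, not read from any real machine.
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'
case "$JAVA_HOME" in
    *' '*) echo "JAVA_HOME has spaces - reinstall Java to e.g. c:\java" ;;
    *)     echo "JAVA_HOME looks fine" ;;
esac
```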
>>
>> ** **
>>
>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>> *Sent:* Thursday, September 5, 2013 8:42 PM
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: about replication****
>>
>> ** **
>>
>> please find the attached.****
>>
>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>> as it is not generated ****
>>
>> ** **
>>
>> regards****
>>
>> irfan****
>>
>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>
>> wrote:****
>>
>>  Could you share the log files ( c:\hdp.log,
>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>> well as your clusterproperties.txt ?****
>>
>> ** **
>>
>> Thanks, ****
>>
>> Olivier****
>>
>> ** **
>>
>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:****
>>
>>  thanks. i followed the user manual for deployment and installed all
>> pre-requisites ****
>>
>> i modified the command and still the issue persist. please suggest ****
>>
>> ** **
>>
>> please refer below ****
>>
>> ** **
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>
>> wrote:****
>>
>> The command to install it is msiexec /i msifile /...  ****
>>
>> You will find the correct syntax as part of doc. ****
>>
>> Happy reading
>> Olivier ****
>>
>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks. ****
>>
>> i referred the logs and manuals. i modified the clusterproperties file
>> and then double click on the msi file ****
>>
>> however, it still failed.****
>>
>> further i started the installation on command line by giving
>> HDP_LAYOUT=clusterproperties file path, ****
>>
>> installation went ahead and it failed for .NET framework 4.0 and VC++
>> redistributable package dependency   ****
>>
>> ** **
>>
>> i installed both and started again the installation. ****
>>
>> failed again with following error ****
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> when i search for the logs mentioned in the error , i never found that **
>> **
>>
>> please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> irfan****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:****
>>
>> Correct, you need to define the cluster configuration as part of a file.
>> You will find some information on the configuration file as part of the
>> documentation. ****
>>
>>
>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>> ****
>>
>> You should make sure to have also installed the pre requisite. ****
>>
>> Thanks
>> Olivier ****
>>
>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks. sorry for the long break. actually got involved in some other
>> priorities****
>>
>> i downloaded the installer and while installing i got following error ***
>> *
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> do i need to make any configuration prior to installation ??****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:****
>>
>> Here is the link ****
>>
>> http://download.hortonworks.com/products/hdp-windows/****
>>
>> Olivier ****
>>
>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks.****
>>
>> i just followed the instructions to setup the pseudo distributed setup
>> first using the url :
>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>> ****
>>
>>  ****
>>
>> i don't think so i am running DN on both machine ****
>>
>> please find the attached log****
>>
>> ** **
>>
>> hi olivier ****
>>
>> ** **
>>
>> can you please give me download link ?****
>>
>> let me try please ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Are you running DN on both the machines? Could you please show me your
>> DN logs?****
>>
>> ** **
>>
>> Also, consider Oliver's suggestion. It's definitely a better option.****
>>
>> ** **
>>
>> ** **
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:****
>>
>> Irfu, ****
>>
>> If you want to quickly get Hadoop running on windows platform. You may
>> want to try our distribution for Windows. You will be able to find the msi
>> on our website. ****
>>
>> Regards
>> Olivier ****
>>
>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks. ****
>>
>> ok. i think i need to change the plan over here ****
>>
>> let me create two environments. 1: totally windows 2: totally Unix****
>>
>> ** **
>>
>> because, on windows , anyway i have to try and see how hadoop works ****
>>
>> on UNIX, it is already known that ,  it is working fine. ****
>>
>> ** **
>>
>> so, on windows , here is the setup:****
>>
>> ** **
>>
>> namenode : windows 2012 R2 ****
>>
>> datanode : windows 2012 R2 ****
>>
>> ** **
>>
>> now, the exact problem is :****
>>
>> 1: datanode is not getting started ****
>>
>> 2: replication : if i put any file/folder on any datanode , it should get
>> replicated to all another available datanodes ****
>>
>> ** **
>>
>> regards****
>>
>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Seriously?? You are planning to develop something using Hadoop on
>> windows. Not a good idea. Anyway, could you please show me your log files?
>> I also need some additional info:
>>
>> - The exact problem which you are facing right now
>> - Your cluster summary (no. of nodes etc.)
>> - Your latest configuration files
>> - Your /etc/hosts file
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  ok. thanks****
>>
>> now, i need to start with all windows setup first as our product will be
>> based on windows ****
>>
>> so, now, please tell me how to resolve the issue ****
>>
>> ** **
>>
>> datanode is not starting . please suggest ****
>>
>> ** **
>>
>> regards,****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  It is possible. Theoretically Hadoop doesn't stop you from doing that.
>> But it is not a very wise setup.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  please suggest****
>>
>> ** **
>>
>> regards****
>>
>> irfan****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks.****
>>
>> can i have setup like this :****
>>
>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)****
>>
>> and datanodes are the combination of any OS (windows , linux , unix etc )
>> ****
>>
>> ** **
>>
>> however, my doubt is,  as the file systems of  both the systems (win and
>> linux ) are different ,  datanodes of these systems can not be part of
>> single cluster . i have to make windows cluster separate and UNIX cluster
>> separate ?****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com>
>> wrote:****
>>
>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
>> Cygwin PIDs so that may be causing the discrepancy. I don't know how well
>> Hadoop works in Cygwin as I have never tried it. Work is in progress for
>> native Windows support however there are no official releases with Windows
>> support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
>> ****
>>
>>
>>
>> ****
>>
>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks ****
>>
>> here is what i did .****
>>
>> i stopped all the namenodes and datanodes using ./stop-dfs.sh command ***
>> *
>>
>> then deleted all pid files for namenodes and datanodes ****
>>
>> ** **
>>
>> started dfs again with command : "./start-dfs.sh"****
>>
>> ** **
>>
>> when i ran the "Jps" command . it shows****
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>
>> $ ./jps.exe****
>>
>> 4536 Jps****
>>
>> 2076 NameNode****
>>
>> ** **
>>
>> however, when i open the pid file for namenode then it is not showing pid
>> as : 4560. on the contrary, it shud show : 2076****
>>
>> ** **
>>
>> please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>
>> wrote:****
>>
>>  Most likely there is a stale pid file. Something like
>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>> the datanode.
>>
>> I haven't read the entire thread so you may have looked at this already.
>>
>> -Arpit****
>>
>>
>>
>> ****
>>
>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  datanode is trying to connect to namenode continuously but fails ****
>>
>> ** **
>>
>> when i try to run "jps" command it says :****
>>
>> $ ./jps.exe****
>>
>> 4584 NameNode****
>>
>> 4016 Jps****
>>
>> ** **
>>
>> and when i ran the "./start-dfs.sh" then it says :****
>>
>> ** **
>>
>> $ ./start-dfs.sh****
>>
>> namenode running as process 3544. Stop it first.****
>>
>> DFS-1: datanode running as process 4076. Stop it first.****
>>
>> localhost: secondarynamenode running as process 4792. Stop it first.****
>>
>> ** **
>>
>> both these logs are contradictory ****
>>
>> please find the attached logs ****
>>
>> ** **
>>
>> should i attach the conf files as well ?****
>>
>> ** **
>>
>> regards****
>>
>>  ****
>>
>> ** **
>>
>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Your DN is still not running. Showing me the logs would be helpful.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  i followed the url and did the steps mention in that. i have deployed
>> on the windows platform****
>>
>> ** **
>>
>> Now, i am able to browse url : http://localhost:50070 (name node )****
>>
>> however, not able to browse url : http://localhost:50030****
>>
>> ** **
>>
>> please refer below****
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> i have modified all the config files as mentioned and formatted the hdfs
>> file system as well ****
>>
>> please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks. i followed this url :
>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>> ****
>>
>> let me follow the url which you gave for pseudo distributed setup and
>> then will switch to distributed mode****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  You are welcome. Which link have you followed for the
>> configuration?Your *core-site.xml* is empty. Remove the property *
>> fs.default.name *from *hdfs-site.xml* and add it to *core-site.xml*.
>> Remove *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
>> ****
>>
>> ** **
>>
>> I would suggest you to do a pseudo distributed setup first in order to
>> get yourself familiar with the process and then proceed to the distributed
>> mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>if you need some help. Let me know if you face any issue.
>> ****
>>
>> ** **
>>
>> HTH****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks tariq for response. ****
>>
>> as discussed last time, i have sent you all the config files in my setup
>> . ****
>>
>> can you please go through that ?****
>>
>> ** **
>>
>> please let me know ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  I'm sorry for being unresponsive. Was out of touch for sometime because
>> of ramzan and eid. Resuming work today.****
>>
>> ** **
>>
>> What's the current status?****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>> wrote:****
>>
>>  First of all read the concepts ..I hope you will like it..****
>>
>>
>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf****
>>
>> ** **
>>
>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  hey Tariq,****
>>
>> i am still stuck .. ****
>>
>> can you please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  attachment got quarantined ****
>>
>> resending in txt format. please rename it to conf.rar ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  thanks.****
>>
>> ** **
>>
>> if i run the jps command on namenode :****
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>
>> $ ./jps.exe****
>>
>> 3164 NameNode****
>>
>> 1892 Jps****
>>
>> ** **
>>
>> same command on datanode :****
>>
>> ** **
>>
>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>>
>> $ ./jps.exe****
>>
>> 3848 Jps****
>>
>> ** **
>>
>> jps does not list any process for datanode. however, on web browser i can
>> see one live data node ****
>>
>> please find the attached conf rar file of namenode ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  OK. We'll start fresh. Could you please show me your latest config files?
>>
>> ** **
>>
>> BTW, are your daemons running fine?Use JPS to verify that.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  i have created these dir "wksp_data" and "wksp_name" on both datanode
>> and namenode ****
>>
>> made the respective changes in "hdfs-site.xml" file ****
>>
>> formatted the namenode ****
>>
>> started the dfs ****
>>
>> ** **
>>
>> but still, not able to browse the file system through web browser ****
>>
>> please refer below ****
>>
>> ** **
>>
>> anything still missing ?****
>>
>> please suggest ****
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  these dir needs to be created on all datanodes and namenodes ?****
>>
>> further,  hdfs-site.xml needs to be updated on both datanodes and
>> namenodes for these new dir?****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Create 2 directories manually corresponding to the values of
>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>> these directories to 755. When you start pushing data into your HDFS, data
>> will start going inside the directory specified by dfs.data.dir and the
>> associated metadata will go inside dfs.name.dir. Remember, you store data
>> in HDFS, but it eventually gets stored in your local/native FS. But you
>> cannot see this data directly on your local/native FS.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  thanks. ****
>>
>> however, i need this to be working on windows environment as project
>> requirement.****
>>
>> i will add/work on Linux later ****
>>
>> ** **
>>
>> so, now , at this stage , c:\\wksp is the HDFS file system OR do i need
>> to create it from command line ?****
>>
>> ** **
>>
>> please suggest****
>>
>> ** **
>>
>> regards,****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Hello Irfan,
>>
>> Sorry for being unresponsive. Got stuck with some important work.
>>
>> The HDFS web UI doesn't provide the ability to create a file or directory.
>> You can browse HDFS, view files, download files, etc., but operations like
>> create, move, copy, etc. are not supported.
>>
>> These values look fine to me.
>>
>> One suggestion though: try to get a Linux machine (if possible), or at
>> least use a VM. I personally feel that using Hadoop on windows is always
>> messy.
>>
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>>
>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com> wrote:
>>
>>
>>  thanks.
>> when i browse the file system , i am getting the following :
>> i haven't seen any make-directory option there
>>
>> do i need to create it from the command line ?
>> further, in the hdfs-site.xml file , i have given the following entries.
>> are they correct ?
>>
>> <property>
>>   <name>dfs.data.dir</name>
>>   <value>c:\\wksp</value>
>> </property>
>> <property>
>>   <name>dfs.name.dir</name>
>>   <value>c:\\wksp</value>
>> </property>
>>
>> please suggest
>>
>> [image: Inline image 1]
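One thing worth flagging in the snippet above: dfs.name.dir and dfs.data.dir both point at the same c:\\wksp. They normally name two distinct directories, since one holds namenode metadata and the other holds block data. A hedged fragment (the directory names are illustrative, echoing the wksp_name/wksp_data names used later in this thread):

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```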
>>
>>
>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>> wrote:
>>
>>  *You are wrong at this:*
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>> $ ./hadoop dfs -copyFromLocal
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>> copyFromLocal: File
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>> $ ./hadoop dfs -copyFromLocal
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>> copyFromLocal: File
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>
>> Both of the paths you wrote are local, and you do not need to copy
>> hadoop into hdfs... Hadoop is already working.
>>
>> Just check in the browser after starting your single-node cluster :
>>
>> localhost:50070
>>
>> then follow the "browse the filesystem" link there.
>>
>> If there is no directory, then make a directory there.
>> That is your hdfs directory.
>> Then copy any text file there (no need to copy hadoop there), because you
>> are going to process the data in that text file. That is what hadoop is
>> used for; first you need to make that clear in your mind. Then, and only
>> then, will you be able to do it... otherwise it is not possible.
>>
>> *Try this: *
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/your/file
>> /hdfs/directory/path
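Manish's point, that copyFromLocal fails when the *local* path itself is wrong, can be guarded in a small sketch. This is illustrative only: the hadoop invocation is merely echoed (not run), and the sample file path comes from mktemp rather than a real setup.

```shell
# Stand-in for a real local text file you want to push into HDFS.
src=$(mktemp)
echo "some sample data" > "$src"

# Verify the local file exists before handing it to hadoop; this is the
# check whose failure produced the "does not exist" errors above.
if [ -f "$src" ]; then
    echo "would run: ./bin/hadoop dfs -copyFromLocal $src /wksp"
else
    echo "local file missing: $src"
fi
```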
>>
>>
>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>
>>  thanks. yes , i am a newbie.
>> however, i need a windows setup.
>>
>> let me surely refer to the doc and link which you sent, but i need this
>> to be working ...
>> can you please help
>>
>> regards
>>
>>
>> --
>> MANISH DUNANI
>> -THANX
>> +91 9426881954,+91 8460656443****
>>
>> manishd207@gmail.com****
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>
>>
>> ****
>>
>> -- ****
>>
>> Regards****
>>
>> *Manish Dunani*****
>>
>> *Contact No* : +91 9408329137****
>>
>> *skype id* : manish.dunani****
>>
>> ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity
>> to which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.
>>
>>  ** **
>>
>>
>>
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity
>> to which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.****
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity
>> to which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.****
>>
>>  ** **
>>
>>  ** **
>>
>>
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity
>> to which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.****
>>
>>  ** **
>>
>>
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity
>> to which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.****
>>
>>  ** **
>>
>>
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity
>> to which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.****
>>
>>  ** **
>>
>>
>>
>> ****
>>
>> ** **
>>
>> -- ****
>>
>> Olivier Renault
>>
>> Solution Engineer - Big Data - Hortonworks, Inc.
>> +44 7500 933 036
>> orenault@hortonworks.com
>> www.hortonworks.com
>>
>>
>>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
ok.. now i made some changes and the installation went ahead,
but failed at the "HIVE_SERVER_HOST" property declaration.
in the cluster config file, i have commented out this property. if i
uncomment it, then what server address should i give ???

i have only a two-windows-machine setup:
1: for the namenode, and another for the datanode

please suggest

regards
irfan
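For reference, a hedged sketch of the relevant clusterproperties.txt line: HIVE_SERVER_HOST names the machine that should run the Hive server, and in a two-machine setup the namenode host is a natural choice. The hostname below is a placeholder, not a value from Irfan's environment:

```
HIVE_SERVER_HOST=NAMENODE-HOSTNAME
```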



On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com> wrote:

> thanks.
> i installed the latest java in the c:\java folder, and now there is no
> error in the log file related to java.
> however, now it is throwing an error about not having the cluster
> properties file. in fact, i am running/installing hdp from the location
> where this file exists . still it is throwing the error
>
> please find the attached
>
> [image: Inline image 1]
>
> regards
> irfan
>
>
>
> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
> ravimu@microsoft.com> wrote:
>
>> Here's your issue (from the logs you attached earlier):
>>
>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>
>> It seems that you installed the Java prerequisite in the default path,
>> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
>> does not like spaces in paths, so you need to reinstall Java under c:\java\
>> or something similar (in a path with no spaces).
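A quick sanity check for the problem Ravi describes, as a hedged sketch: a path containing a space (like "C:\Program Files\Java\...") trips the installer's batch scripts, while a space-free path does not. The jdk path below is illustrative.

```shell
# Space-free path, as Ravi recommends; "C:\Program Files\..." would fail.
JAVA_HOME='C:\java\jdk1.6.0_31'

case "$JAVA_HOME" in
    *' '*) echo "JAVA_HOME contains spaces - reinstall Java in a space-free path" ;;
    *)     echo "JAVA_HOME looks safe for HDP" ;;   # prints this branch here
esac
```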
>>
>> ** **
>>
>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>> *Sent:* Thursday, September 5, 2013 8:42 PM
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: about replication****
>>
>> ** **
>>
>> please find the attached.****
>>
>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>> as it is not generated ****
>>
>> ** **
>>
>> regards****
>>
>> irfan****
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>
>> wrote:
>>
>> Could you share the log files ( c:\hdp.log,
>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
>> well as your clusterproperties.txt ?
>>
>> Thanks,
>> Olivier
>>
>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>>
>> thanks. i followed the user manual for deployment and installed all
>> the pre-requisites.
>> i modified the command and still the issue persists. please suggest
>>
>> please refer below
>>
>> [image: Inline image 1]
>>
>> regards
>> irfan
>>
>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>
>> wrote:
>>
>> The command to install it is msiexec /i msifile /...
>> You will find the correct syntax as part of the doc.
>>
>> Happy reading
>> Olivier
>>
>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>  thanks.
>> i referred to the logs and manuals. i modified the clusterproperties file
>> and then double-clicked on the msi file.
>> however, it still failed.
>> further, i started the installation on the command line, giving
>> HDP_LAYOUT=<clusterproperties file path>;
>> the installation went ahead and then failed on the .NET framework 4.0 and
>> VC++ redistributable package dependency.
>>
>> i installed both and started the installation again.
>> it failed again with the following error :
>>
>> [image: Inline image 1]
>>
>> when i search for the logs mentioned in the error , i never find them
>> please suggest
>>
>> regards
>> irfan
>>
>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>> Correct, you need to define the cluster configuration as part of a file.
>> You will find some information on the configuration file as part of the
>> documentation.
>>
>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>
>> You should also make sure to have installed the prerequisites.
>>
>> Thanks
>> Olivier
>>
>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>  thanks. sorry for the long break. actually got involved in some other
>> priorities.
>> i downloaded the installer, and while installing i got the following error :
>>
>> [image: Inline image 1]
>>
>> do i need to make any configuration prior to installation ??
>>
>> regards
>> irfan
>>
>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>> Here is the link
>>
>> http://download.hortonworks.com/products/hdp-windows/
>>
>> Olivier
>>
>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>> thanks.
>> i just followed the instructions to set up the pseudo-distributed setup
>> first, using the url :
>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>
>> i don't think i am running a DN on both machines
>> please find the attached log
>>
>> hi olivier
>> can you please give me the download link ?
>> let me try please
>>
>> regards
>> irfan
>>
>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:
>>
>> Are you running a DN on both of the machines? Could you please show me
>> your DN logs?
>>
>> Also, consider Olivier's suggestion. It's definitely a better option.
>>
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>>
>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>> Irfu,
>>
>> If you want to quickly get Hadoop running on the windows platform, you may
>> want to try our distribution for Windows. You will be able to find the msi
>> on our website.
>>
>> Regards
>> Olivier
>>
>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>  thanks.
>> ok. i think i need to change the plan over here.
>> let me create two environments. 1: totally windows 2: totally Unix
>>
>> because, on windows , anyway i have to try and see how hadoop works;
>> on UNIX, it is already known that it is working fine.
>>
>> so, on windows , here is the setup:
>>
>> namenode : windows 2012 R2
>> datanode : windows 2012 R2
>>
>> now, the exact problems are :
>> 1: the datanode is not getting started
>> 2: replication : if i put any file/folder on any datanode , it should get
>> replicated to all other available datanodes
>>
>> regards
>>
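On the replication point raised above: in HDFS, replication is driven by the dfs.replication setting rather than by which datanode first receives the file; a file written by a client is automatically replicated to that many datanodes. A hedged hdfs-site.xml fragment (the value 2 simply matches the two-node setup described in this thread):

```xml
<property>
  <name>dfs.replication</name>
  <value>2</value>
</property>
```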
>>
>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>> wrote:
>>
>> Seriously?? You are planning to develop something using Hadoop on
>> windows. Not a good idea. Anyways, could you please show me your log
>> files? I also need some additional info :
>> - The exact problem which you are facing right now
>> - Your cluster summary (no. of nodes etc.)
>> - Your latest configuration files
>> - Your /etc/hosts file
>>
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>>
>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>
>> ok. thanks
>> now, i need to start with the all-windows setup first, as our product
>> will be based on windows.
>> so, now, please tell me how to resolve the issue.
>>
>> the datanode is not starting . please suggest
>>
>> regards,
>> irfan
>>
>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:
>>
>> It is possible. Theoretically Hadoop doesn't stop you from doing that.
>> But it is not a very wise setup.
>>
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>>
>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>
>> please suggest
>>
>> regards
>> irfan
>>
>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>  thanks.
>> can i have a setup like this :
>> the namenode will be on linux (the flavour may be RHEL, CentOS, Ubuntu,
>> etc.) and the datanodes are a combination of any OS (windows , linux ,
>> unix, etc. )
>>
>> however, my doubt is, as the file systems of both the systems (win and
>> linux ) are different , can the datanodes of these systems not be part of
>> a single cluster ? do i have to make the windows cluster separate and the
>> UNIX cluster separate ?
>>
>> regards
>>
>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com>
>> wrote:
>> I just noticed you are on Cygwin. IIRC, Windows PIDs are not the same as
>> Cygwin PIDs, so that may be causing the discrepancy. I don't know how well
>> Hadoop works in Cygwin as I have never tried it. Work is in progress for
>> native Windows support; however, there are no official releases with
>> Windows support yet. It may be easier to get familiar with a release
>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are
>> new to it.
>>
>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>
>>  thanks
>> here is what i did .
>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command,
>> then deleted all pid files for the namenodes and datanodes.
>>
>> started dfs again with the command : "./start-dfs.sh"
>>
>> when i ran the "jps" command , it showed :
>>
>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>> $ ./jps.exe
>> 4536 Jps
>> 2076 NameNode
>>
>> however, when i open the pid file for the namenode, it is showing the pid
>> as : 4560. on the contrary, it should show : 2076
>>
>> please suggest
>>
>> regards
>>
>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>
>> wrote:
>>
>>  Most likely there is a stale pid file, something like
>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>> the datanode.
>>
>> I haven't read the entire thread, so you may have looked at this already.
>>
>> -Arpit
>>
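Arpit's stale-pid check can be sketched like this. It is illustrative only: the pid file name and location are placeholders (the real ones depend on your platform and user name), and a throwaway background process simulates a datanode that has already exited.

```shell
# Simulate a pid file left behind by a datanode that is no longer running.
piddir=$(mktemp -d)
true & dead=$!          # start and immediately reap a short-lived process
wait "$dead"
echo "$dead" > "$piddir/hadoop-datanode.pid"

# kill -0 probes whether the pid is alive without sending a real signal;
# if the probe fails, the pid file is stale and safe to delete.
pid=$(cat "$piddir/hadoop-datanode.pid")
if ! kill -0 "$pid" 2>/dev/null; then
    echo "stale pid file for pid $pid - removing"
    rm "$piddir/hadoop-datanode.pid"
fi
```

After removing the stale file, restarting the datanode lets start-dfs.sh write a fresh pid instead of refusing with "datanode running as process NNNN. Stop it first."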
>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>
>> the datanode is trying to connect to the namenode continuously, but fails
>>
>> when i try to run the "jps" command it says :
>>
>> $ ./jps.exe
>> 4584 NameNode
>> 4016 Jps
>>
>> and when i ran "./start-dfs.sh" it said :
>>
>> $ ./start-dfs.sh
>> namenode running as process 3544. Stop it first.
>> DFS-1: datanode running as process 4076. Stop it first.
>> localhost: secondarynamenode running as process 4792. Stop it first.
>>
>> both these logs are contradictory
>> please find the attached logs
>>
>> should i attach the conf files as well ?
>>
>> regards
>>
>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:
>>
>>  Your DN is still not running. Showing me the logs would be helpful.
>>
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>>
>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>
>>  i followed the url and did the steps mentioned in it. i have deployed
>> on the windows platform
>>
>> Now, i am able to browse the url : http://localhost:50070 (name node );
>> however, i am not able to browse the url : http://localhost:50030
>>
>> please refer below
>>
>> [image: Inline image 1]
>>
>> i have modified all the config files as mentioned, and formatted the hdfs
>> file system as well.
>> please suggest
>>
>> regards
>>
>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>
>> thanks. i followed this url :
>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>
>> let me follow the url which you gave for the pseudo-distributed setup,
>> and then i will switch to distributed mode
>>
>> regards
>> irfan
>>
>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:
>>  You are welcome. Which link have you followed for the
>> configuration? Your *core-site.xml* is empty. Remove the property
>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>> Remove *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
>>
>> I would suggest you do a pseudo-distributed setup first in order to
>> get yourself familiar with the process, and then proceed to the
>> distributed mode. You can visit this link
>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>> if you need some help. Let me know if you face any issue.
>>
>> HTH
>>
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>>
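The property placement Tariq describes can be sketched as hedged Hadoop 1.x config fragments. The host and port values below are common illustrative defaults, not values taken from Irfan's setup:

```xml
<!-- core-site.xml: the filesystem URI belongs here, not in hdfs-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml: the jobtracker address belongs here -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```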
>>
>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>
>> thanks tariq for the response.
>> as discussed last time, i have sent you all the config files in my setup .
>> can you please go through them ?
>>
>> please let me know
>>
>> regards
>> irfan
>>
>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:
>>
>> I'm sorry for being unresponsive. Was out of touch for some time because
>> of ramzan and eid. Resuming work today.
>>
>> What's the current status?
>>
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>>
>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>> wrote:
>>
>> First of all, read the concepts .. I hope you will like it..
>>
>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>
>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>
>> please suggest
>>
>> regards
>> irfan
>>
>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>
>> hey Tariq,
>> i am still stuck ..
>> can you please suggest
>>
>> regards
>> irfan
>>
>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com> wrote:
>>
>> please suggest
>>
>> regards
>>
>>
>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
>>
>> the attachment got quarantined.
>> resending in txt format. please rename it to conf.rar
>>
>> regards
>>
>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com> wrote:
>>
>> thanks.
>>
>> if i run the jps command on the namenode :
>>
>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>> $ ./jps.exe
>> 3164 NameNode
>> 1892 Jps
>>
>> the same command on the datanode :
>>
>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>> $ ./jps.exe
>> 3848 Jps
>>
>> jps does not list any process for the datanode. however, in the web
>> browser i can see one live data node.
>> please find the attached conf rar file of the namenode
>>
>> regards
>>
>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>> wrote:
>>
>> OK, we'll start fresh. Could you please show me your latest config files?
>>
>> BTW, are your daemons running fine? Use jps to verify that.
>>
>> Warm Regards,
>> Tariq
>> cloudfront.blogspot.com
>>
>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>  i have created the dirs "wksp_data" and "wksp_name" on both the datanode
>> and the namenode,
>> made the respective changes in the "hdfs-site.xml" file,
>> formatted the namenode, and
>> started the dfs.
>>
>> but still, i am not able to browse the file system through the web
>> browser.
>> please refer below.
>>
>> is anything still missing ?
>> please suggest
>>
>> [image: Inline image 1]
>>
>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:
>>  do these dirs need to be created on all the datanodes and namenodes ?
>> further, does hdfs-site.xml need to be updated on both the datanodes and
>> namenodes for these new dirs?
>>
>> regards
>>
>>
>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Create 2 directories manually corresponding to the values of
>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>> these directories to 755. When you start pushing data into your HDFS, data
>> will start going inside the directory specified by dfs.data.dir and the
>> associated metadata will go inside dfs.name.dir. Remember, you store data
>> in HDFS, but it eventually gets stored in your local/native FS. But you
>> cannot see this data directly on your local/native FS.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  thanks. ****
>>
>> however, i need this to be working on windows environment as project
>> requirement.****
>>
>> i will add/work on Linux later ****
>>
>> ** **
>>
>> so, now , at this stage , c:\\wksp is the HDFS file system OR do i need
>> to create it from command line ?****
>>
>> ** **
>>
>> please suggest****
>>
>> ** **
>>
>> regards,****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Hello Irfan,****
>>
>> ** **
>>
>> Sorry for being unresponsive. Got stuck with some imp work.****
>>
>> ** **
>>
>> HDFS webUI doesn't provide us the ability to create file or directory.
>> You can browse HDFS, view files, download files etc. But operation like
>> create, move, copy etc are not supported.****
>>
>> ** **
>>
>> These values look fine to me.****
>>
>> ** **
>>
>> One suggestion though. Try getting a Linux machine(if possible). Or at
>> least use a VM. I personally feel that using Hadoop on windows is always
>> messy.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  thanks.****
>>
>> when i browse the file system , i am getting following :****
>>
>> i haven't seen any make directory option there ****
>>
>> ** **
>>
>> i need to create it from command line ?****
>>
>> further, in the hdfs-site.xml file , i have given following entries. are
>> they correct ? ****
>>
>> ** **
>>
>> <property>****
>>
>>   <name>dfs.data.dir</name>****
>>
>>   <value>c:\\wksp</value>****
>>
>>   </property>****
>>
>> <property>****
>>
>>   <name>dfs.name.dir</name>****
>>
>>   <value>c:\\wksp</value>****
>>
>>   </property>****
>>
>> ** **
>>
>> please suggest ****
>>
>> ** **
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>> wrote:****
>>
>>  *You are wrong at this:*****
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>
>> $ ./hadoop dfs -copyFromLocal
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp****
>>
>> copyFromLocal: File
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.*
>> ***
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>
>> $ ./hadoop dfs -copyFromLocal
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp****
>>
>> copyFromLocal: File
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>> ****
>>
>> ** **
>>
>> Because,You had wrote both the paths local and You need not to copy
>> hadoop into hdfs...Hadoop is already working..****
>>
>> ** **
>>
>> Just check out in browser by after starting ur single node cluster :****
>>
>> ** **
>>
>> localhost:50070****
>>
>> ** **
>>
>> then go for browse the filesystem link in it..****
>>
>> ** **
>>
>> If there is no directory then make directory there.****
>>
>> That is your hdfs directory.****
>>
>> Then copy any text file there(no need to copy hadoop there).beacause u
>> are going to do processing on that data in text file.That's why hadoop is
>> used for ,first u need to make it clear in ur mind.Then and then u will do
>> it...otherwise not possible..****
>>
>> ** **
>>
>> *Try this: *****
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2****
>>
>> $ .bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>> /hdfs/directory/path****
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks. yes , i am newbie.****
>>
>> however, i need windows setup.****
>>
>> ** **
>>
>> let me surely refer the doc and link which u sent but i need this to be
>> working ...****
>>
>> can you please help****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>>  ****
>>
>> ** **
>>
>>
>>
>> ****
>>
>> ** **
>>
>> --
>> MANISH DUNANI
>> -THANX
>> +91 9426881954,+91 8460656443****
>>
>> manishd207@gmail.com****
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>
>>
>> ****
>>
>> -- ****
>>
>> Regards****
>>
>> *Manish Dunani*****
>>
>> *Contact No* : +91 9408329137****
>>
>> *skype id* : manish.dunani****
>>
>> ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>>  ** **
>>
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity
>> to which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.****
>>
>> -- ****
>>
>> Olivier Renault****
>>
>> Solution Engineer - Big Data - Hortonworks, Inc.
>> +44 7500 933 036
>> orenault@hortonworks.com
>> www.hortonworks.com****
>>
>>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
ok.. now i made some changes and the installation went ahead,
but it failed on the "HIVE_SERVER_HOST" property declaration.
i have commented this property out in the cluster config file. if i uncomment it,
then what server address should i give?

i have only two windows machines setup.
1: for namenode and another for datanode

please suggest

regards
irfan
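For reference, a minimal sketch of what that part of clusterproperties.txt could look like for a two-machine cluster. The host names below (DFS-DC for the master, DFS-1 for the slave) are assumptions taken from log snippets earlier in this thread, not verified values; with no separate Hive machine, pointing HIVE_SERVER_HOST at the master node is a common choice.

```shell
# Hypothetical two-node layout; host names are assumptions, not verified values.
cat > clusterproperties.txt <<'EOF'
NAMENODE_HOST=DFS-DC
HIVE_SERVER_HOST=DFS-DC
SLAVE_HOSTS=DFS-1
EOF
grep '^HIVE_SERVER_HOST=' clusterproperties.txt
```

The exact property names for your HDP version are listed in the sample clusterproperties.txt that ships with the installer, so check it against that.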



On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com> wrote:

> thanks.
> i installed the latest java in c:\java folder and now no error in log file
> related to java
> however, now it is throwing an error about not having the cluster properties file.
> in fact i am running/installing hdp from the location where this file
> exists. still it is throwing the error
>
> please find the attached
>
> [image: Inline image 1]
>
> regards
> irfan
>
>
>
> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
> ravimu@microsoft.com> wrote:
>
>>  Here’s your issue (from the logs you attached earlier):****
>>
>> ** **
>>
>> CAQuietExec:  Checking JAVA_HOME is set correctly...****
>>
>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.****
>>
>> ** **
>>
>> It seems that you installed the Java prerequisite in the default path, which
>> is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3 does
>> not like spaces in paths, so you need to reinstall Java under c:\java\ or
>> something similar (a path with no spaces).****
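The failure mode is easy to reproduce. A sketch of the same condition (the path below is the assumed default install location, and the check itself is illustrative, not the installer's actual code):

```shell
# "Files\Java\jdk1.6.0_31 was unexpected at this time" is cmd.exe choking on
# the unquoted space in the path; this sketch just flags the same condition.
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'
case "$JAVA_HOME" in
  *' '*) echo "JAVA_HOME contains a space: reinstall under e.g. C:\\java" ;;
  *)     echo "JAVA_HOME is space-free" ;;
esac
```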
>>
>> ** **
>>
>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>> *Sent:* Thursday, September 5, 2013 8:42 PM
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: about replication****
>>
>> ** **
>>
>> please find the attached.****
>>
>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>> as it is not generated ****
>>
>> ** **
>>
>> regards****
>>
>> irfan****
>>
>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>
>> wrote:****
>>
>>  Could you share the log files ( c:\hdp.log,
>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>> well as your clusterproperties.txt ?****
>>
>> ** **
>>
>> Thanks, ****
>>
>> Olivier****
>>
>> ** **
>>
>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:****
>>
>>  thanks. i followed the user manual for deployment and installed all
>> pre-requisites ****
>>
>> i modified the command and still the issue persist. please suggest ****
>>
>> ** **
>>
>> please refer below ****
>>
>> ** **
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>
>> wrote:****
>>
>> The command to install it is msiexec /i msifile /...  ****
>>
>> You will find the correct syntax as part of doc. ****
>>
>> Happy reading
>> Olivier ****
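For readers hitting the same point, an illustrative form of the install command (the msi file name and paths here are assumptions; check the HDP for Windows documentation for the exact syntax for your version):

```shell
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "hdp.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt" HDP_DIR="C:\hdp" DESTROY_DATA="no"
```

Running it with /lv gives you a verbose log (hdp.log) to inspect when the install fails.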
>>
>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks. ****
>>
>> i referred the logs and manuals. i modified the clusterproperties file
>> and then double click on the msi file ****
>>
>> however, it still failed.****
>>
>> further i started the installation on command line by giving
>> HDP_LAYOUT=clusterproperties file path, ****
>>
>> installation went ahead and it failed for .NET framework 4.0 and VC++
>> redistributable package dependency   ****
>>
>> ** **
>>
>> i installed both and started again the installation. ****
>>
>> failed again with following error ****
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> when i search for the logs mentioned in the error , i never found them ****
>>
>> please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> irfan****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:****
>>
>> Correct, you need to define the cluster configuration as part of a file.
>> You will find some information on the configuration file as part of the
>> documentation. ****
>>
>>
>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>> ****
>>
>> You should make sure to have also installed the pre requisite. ****
>>
>> Thanks
>> Olivier ****
>>
>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks. sorry for the long break. actually got involved in some other
>> priorities****
>>
>> i downloaded the installer and while installing i got the following error ****
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> do i need to make any configuration prior to installation ??****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:****
>>
>> Here is the link ****
>>
>> http://download.hortonworks.com/products/hdp-windows/****
>>
>> Olivier ****
>>
>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks.****
>>
>> i just followed the instructions to setup the pseudo distributed setup
>> first using the url :
>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>> ****
>>
>>  ****
>>
>> i don't think so i am running DN on both machine ****
>>
>> please find the attached log****
>>
>> ** **
>>
>> hi olivier ****
>>
>> ** **
>>
>> can you please give me download link ?****
>>
>> let me try please ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>>
>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Are you running DN on both the machines? Could you please show me your
>> DN logs?****
>>
>> ** **
>>
>> Also, consider Oliver's suggestion. It's definitely a better option.****
>>
>> ** **
>>
>> ** **
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:****
>>
>> Irfu, ****
>>
>> If you want to quickly get Hadoop running on windows platform. You may
>> want to try our distribution for Windows. You will be able to find the msi
>> on our website. ****
>>
>> Regards
>> Olivier ****
>>
>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks. ****
>>
>> ok. i think i need to change the plan over here ****
>>
>> let me create two environments. 1: totally windows 2: totally Unix****
>>
>> ** **
>>
>> because, on windows , anyway i have to try and see how hadoop works ****
>>
>> on UNIX, it is already known that ,  it is working fine. ****
>>
>> ** **
>>
>> so, on windows , here is the setup:****
>>
>> ** **
>>
>> namenode : windows 2012 R2 ****
>>
>> datanode : windows 2012 R2 ****
>>
>> ** **
>>
>> now, the exact problem is :****
>>
>> 1: datanode is not getting started ****
>>
>> 2: replication : if i put any file/folder on any datanode , it should get
>> replicated to all another available datanodes ****
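One note on expectation (2): HDFS replicates at the block level when data is written through HDFS (e.g. with -copyFromLocal); dropping a file directly onto a datanode's local disk does not replicate it. The replication factor is set in hdfs-site.xml; a sketch for this setup (the value shown is an assumption chosen to fit a two-datanode cluster):

```xml
<!-- hdfs-site.xml fragment: HDFS keeps at most one copy of a block per
     datanode, so 2 is the highest useful value on a two-datanode cluster. -->
<property>
  <name>dfs.replication</name>
  <value>2</value>
</property>
```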
>>
>> ** **
>>
>> regards****
>>
>>
>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Seriously?? You are planning to develop something using Hadoop on
>> windows. Not a good idea. Anyway, could you please show me your log files? I
>> also need some additional info :****
>>
>> -The exact problem which you are facing right now****
>>
>> -Your cluster summary(no. of nodes etc)****
>>
>> -Your latest configuration files****
>>
>> -Your /etc/hosts file****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  ok. thanks****
>>
>> now, i need to start with all windows setup first as our product will be
>> based on windows ****
>>
>> so, now, please tell me how to resolve the issue ****
>>
>> ** **
>>
>> datanode is not starting . please suggest ****
>>
>> ** **
>>
>> regards,****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  It is possible. Theoretically Hadoop doesn't stop you from doing that.
>> But it is not a very wise setup.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  please suggest****
>>
>> ** **
>>
>> regards****
>>
>> irfan****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks.****
>>
>> can i have setup like this :****
>>
>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)****
>>
>> and datanodes are the combination of any OS (windows , linux , unix etc )
>> ****
>>
>> ** **
>>
>> however, my doubt is,  as the file systems of  both the systems (win and
>> linux ) are different ,  datanodes of these systems can not be part of
>> single cluster . i have to make windows cluster separate and UNIX cluster
>> separate ?****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com>
>> wrote:****
>>
>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
>> Cygwin PIDs so that may be causing the discrepancy. I don't know how well
>> Hadoop works in Cygwin as I have never tried it. Work is in progress for
>> native Windows support however there are no official releases with Windows
>> support yet. It may be easier to get familiar with a release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>> ****
>>
>>
>>
>> ****
>>
>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks ****
>>
>> here is what i did .****
>>
>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command ****
>>
>> then deleted all pid files for namenodes and datanodes ****
>>
>> ** **
>>
>> started dfs again with command : "./start-dfs.sh"****
>>
>> ** **
>>
>> when i ran the "Jps" command . it shows****
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>
>> $ ./jps.exe****
>>
>> 4536 Jps****
>>
>> 2076 NameNode****
>>
>> ** **
>>
>> however, when i open the pid file for namenode then it is not showing pid
>> as : 4560. on the contrary, it shud show : 2076****
>>
>> ** **
>>
>> please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>
>> wrote:****
>>
>>  Most likely there is a stale pid file. Something like
>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>> the datanode.
>>
>> I haven't read the entire thread so you may have looked at this already.
>>
>> -Arpit****
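A sketch of that check and cleanup (the pid-file path and the liveness test are assumptions modelled on hadoop-daemon.sh; under Cygwin the recorded pid can match an unrelated Windows process, which is what makes the file "stale"):

```shell
# If the pid in the file does not belong to a live process we control,
# the file is stale and can be removed before restarting the datanode.
PIDFILE=/tmp/hadoop-demo-datanode.pid
echo 99999 > "$PIDFILE"                       # simulate a leftover pid file
if kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
  echo "datanode running as process $(cat "$PIDFILE"). Stop it first."
else
  echo "stale pid file, removing $PIDFILE"
  rm -f "$PIDFILE"
fi
```

This is why "datanode running as process 4076. Stop it first." can appear even when jps shows no DataNode.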
>>
>>
>>
>> ****
>>
>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  datanode is trying to connect to namenode continuously but fails ****
>>
>> ** **
>>
>> when i try to run "jps" command it says :****
>>
>> $ ./jps.exe****
>>
>> 4584 NameNode****
>>
>> 4016 Jps****
>>
>> ** **
>>
>> and when i ran the "./start-dfs.sh" then it says :****
>>
>> ** **
>>
>> $ ./start-dfs.sh****
>>
>> namenode running as process 3544. Stop it first.****
>>
>> DFS-1: datanode running as process 4076. Stop it first.****
>>
>> localhost: secondarynamenode running as process 4792. Stop it first.****
>>
>> ** **
>>
>> both these logs are contradictory ****
>>
>> please find the attached logs ****
>>
>> ** **
>>
>> should i attach the conf files as well ?****
>>
>> ** **
>>
>> regards****
>>
>>  ****
>>
>> ** **
>>
>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Your DN is still not running. Showing me the logs would be helpful.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  i followed the url and did the steps mention in that. i have deployed
>> on the windows platform****
>>
>> ** **
>>
>> Now, i am able to browse url : http://localhost:50070 (name node )****
>>
>> however, not able to browse url : http://localhost:50030****
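Port 50070 is served by the NameNode, which start-dfs.sh launches; port 50030 is the JobTracker web UI, which only comes up once the MapReduce daemons are started. A sketch (script names are from the Hadoop 1.x tarball layout; run from the Hadoop install directory):

```shell
$ bin/start-mapred.sh        # starts the JobTracker and TaskTrackers
$ jps                        # JobTracker should now appear in the list
```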
>>
>> ** **
>>
>> please refer below****
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> i have modified all the config files as mentioned and formatted the hdfs
>> file system as well ****
>>
>> please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks. i followed this url :
>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>> ****
>>
>> let me follow the url which you gave for pseudo distributed setup and
>> then will switch to distributed mode****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  You are welcome. Which link have you followed for the
>> configuration?Your *core-site.xml* is empty. Remove the property *
>> fs.default.name *from *hdfs-site.xml* and add it to *core-site.xml*.
>> Remove *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
>> ****
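A sketch of the split Tariq describes (the host and port values are illustrative pseudo-distributed defaults, not values taken from this thread):

```xml
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```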
>>
>> ** **
>>
>> I would suggest you to do a pseudo distributed setup first in order to
>> get yourself familiar with the process and then proceed to the distributed
>> mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>> ****
>>
>> ** **
>>
>> HTH****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks tariq for response. ****
>>
>> as discussed last time, i have sent you all the config files in my setup
>> . ****
>>
>> can you please go through that ?****
>>
>> ** **
>>
>> please let me know ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  I'm sorry for being unresponsive. Was out of touch for sometime because
>> of ramzan and eid. Resuming work today.****
>>
>> ** **
>>
>> What's the current status?****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>> wrote:****
>>
>>  First of all read the concepts ..I hope you will like it..****
>>
>>
>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf****
>>
>> ** **
>>
>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  hey Tariq,****
>>
>> i am still stuck .. ****
>>
>> can you please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  attachment got quarantined ****
>>
>> resending in txt format. please rename it to conf.rar ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  thanks.****
>>
>> ** **
>>
>> if i run the jps command on namenode :****
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>
>> $ ./jps.exe****
>>
>> 3164 NameNode****
>>
>> 1892 Jps****
>>
>> ** **
>>
>> same command on datanode :****
>>
>> ** **
>>
>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>>
>> $ ./jps.exe****
>>
>> 3848 Jps****
>>
>> ** **
>>
>> jps does not list any process for datanode. however, on web browser i can
>> see one live data node ****
>>
>> please find the attached conf rar file of namenode ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  OK. We'll start fresh. Could you please show me your latest config files?****
>>
>> ** **
>>
>> BTW, are your daemons running fine?Use JPS to verify that.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  i have created these dir "wksp_data" and "wksp_name" on both datanode
>> and namenode ****
>>
>> made the respective changes in "hdfs-site.xml" file ****
>>
>> formatted the namenode ****
>>
>> started the dfs ****
>>
>> ** **
>>
>> but still, not able to browse the file system through web browser ****
>>
>> please refer below ****
>>
>> ** **
>>
>> anything still missing ?****
>>
>> please suggest ****
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  these dir needs to be created on all datanodes and namenodes ?****
>>
>> further,  hdfs-site.xml needs to be updated on both datanodes and
>> namenodes for these new dir?****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Create 2 directories manually corresponding to the values of
>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>> these directories to 755. When you start pushing data into your HDFS, data
>> will start going inside the directory specified by dfs.data.dir and the
>> associated metadata will go inside dfs.name.dir. Remember, you store data
>> in HDFS, but it eventually gets stored in your local/native FS. But you
>> cannot see this data directly on your local/native FS.****
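A sketch of the two properties with separate directories (the paths are illustrative; note that pointing dfs.name.dir and dfs.data.dir at the same directory, as in the earlier c:\\wksp example, is best avoided):

```xml
<!-- hdfs-site.xml fragment: keep NameNode metadata and DataNode block
     storage in separate directories, each with 755 permissions. -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\hdfs\\name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\hdfs\\data</value>
</property>
```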
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  thanks. ****
>>
>> however, i need this to be working on windows environment as project
>> requirement.****
>>
>> i will add/work on Linux later ****
>>
>> ** **
>>
>> so, now , at this stage , c:\\wksp is the HDFS file system OR do i need
>> to create it from command line ?****
>>
>> ** **
>>
>> please suggest****
>>
>> ** **
>>
>> regards,****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Hello Irfan,****
>>
>> ** **
>>
>> Sorry for being unresponsive. Got stuck with some imp work.****
>>
>> ** **
>>
>> HDFS webUI doesn't provide us the ability to create a file or directory.
>> You can browse HDFS, view files, download files etc. But operations like
>> create, move, copy etc. are not supported.****
>>
>> ** **
>>
>> These values look fine to me.****
>>
>> ** **
>>
>> One suggestion though. Try getting a Linux machine(if possible). Or at
>> least use a VM. I personally feel that using Hadoop on windows is always
>> messy.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  thanks.****
>>
>> when i browse the file system , i am getting following :****
>>
>> i haven't seen any make directory option there ****
>>
>> ** **
>>
>> i need to create it from command line ?****
>>
>> further, in the hdfs-site.xml file , i have given following entries. are
>> they correct ? ****
>>
>> ** **
>>
>> <property>****
>>
>>   <name>dfs.data.dir</name>****
>>
>>   <value>c:\\wksp</value>****
>>
>>   </property>****
>>
>> <property>****
>>
>>   <name>dfs.name.dir</name>****
>>
>>   <value>c:\\wksp</value>****
>>
>>   </property>****
>>
>> ** **
>>
>> please suggest ****
>>
>> ** **
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>> wrote:****
>>
>>  *You are wrong at this:*****
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>
>> $ ./hadoop dfs -copyFromLocal
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp****
>>
>> copyFromLocal: File
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.*
>> ***
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>
>> $ ./hadoop dfs -copyFromLocal
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp****
>>
>> copyFromLocal: File
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>> ****
>>
>> ** **
>>
>> Because you wrote both paths as local paths, and you do not need to copy
>> hadoop into hdfs... Hadoop is already working.****
>>
>> ** **
>>
>> Just check out in browser by after starting ur single node cluster :****
>>
>> ** **
>>
>> localhost:50070****
>>
>> ** **
>>
>> then go for browse the filesystem link in it..****
>>
>> ** **
>>
>> If there is no directory then make directory there.****
>>
>> That is your hdfs directory.****
>>
>> Then copy any text file there (no need to copy hadoop there), because you
>> are going to do processing on the data in that text file. That is what hadoop
>> is used for; first you need to make that clear in your mind, then you will be
>> able to do it.****
>>
>> ** **
>>
>> *Try this: *****
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2****
>>
>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>> /hdfs/directory/path****
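Filled in with illustrative paths (the sample file and the /wksp target directory below are assumptions), the sequence would look like:

```shell
$ echo "some sample data" > /cygdrive/c/sample.txt
$ bin/hadoop dfs -mkdir /wksp                      # create the HDFS directory
$ bin/hadoop dfs -copyFromLocal /cygdrive/c/sample.txt /wksp
$ bin/hadoop dfs -ls /wksp                         # verify the file landed
```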
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks. yes , i am newbie.****
>>
>> however, i need windows setup.****
>>
>> ** **
>>
>> let me surely refer the doc and link which u sent but i need this to be
>> working ...****
>>
>> can you please help****
>>
>> ** **
>>
>> regards****
>>
>>
>>
>> --
>> MANISH DUNANI
>> -THANX
>> +91 9426881954,+91 8460656443****
>>
>> manishd207@gmail.com****
>>
>>
>>
>> -- ****
>>
>> Regards****
>>
>> *Manish Dunani*****
>>
>> *Contact No* : +91 9408329137****
>>
>> *skype id* : manish.dunani****
>>
>>
>> -- ****
>>
>> Olivier Renault****
>>
>> Solution Engineer - Big Data - Hortonworks, Inc.
>> +44 7500 933 036
>> orenault@hortonworks.com
>> www.hortonworks.com****
>>
>>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
ok.. now i made some changes and the installation went ahead,
but it failed on the "HIVE_SERVER_HOST" property declaration.
in the cluster config file, i have commented this property out. if i
uncomment it, what server address should i give?

i have only two windows machines set up:
one for the namenode and another for the datanode

please suggest

regards
irfan
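For reference: the failure above suggests the installer validates HIVE_SERVER_HOST rather than letting it be commented out. In a two-box setup, one option is to point every master-service host entry at the namenode machine. A sketch of the relevant clusterproperties.txt lines — the hostnames here are placeholders, not values from this thread:

```ini
# master services co-located on the namenode machine (hostnames assumed)
NAMENODE_HOST=WIN-NN
SECONDARY_NAMENODE_HOST=WIN-NN
JOBTRACKER_HOST=WIN-NN
HIVE_SERVER_HOST=WIN-NN
OOZIE_SERVER_HOST=WIN-NN
WEBHCAT_HOST=WIN-NN
# worker machine(s)
SLAVE_HOSTS=WIN-DN
```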



On Fri, Sep 6, 2013 at 11:42 AM, Irfan Sayed <ir...@gmail.com> wrote:

> thanks.
> i installed the latest java in c:\java folder and now no error in log file
> related to java
> however, now it is throwing error on not having cluster properties file.
> in fact i am running/installing hdp from the location where this file
> exist . still it is throwing error
>
> please find the attached
>
> [image: Inline image 1]
>
> regards
> irfan
>
>
>
> On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
> ravimu@microsoft.com> wrote:
>
>>  Here’s your issue (from the logs you attached earlier):
>>
>> CAQuietExec:  Checking JAVA_HOME is set correctly...
>>
>> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>>
>>
>> It seems that you installed the Java prerequisite in the default path, which
>> is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3 does
>> not like spaces in paths, so you need to reinstall Java under c:\java\ or
>> something similar (in a path with no spaces).
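The "was unexpected at this time" message is cmd.exe word-splitting an unquoted path that contains a space. The same general failure mode can be reproduced in any POSIX shell — a minimal illustration, not HDP-specific:

```shell
# An unquoted expansion of a path containing a space splits into two words,
# which is the kind of breakage behind the installer's JAVA_HOME check.
JAVA_HOME="/opt/Program Files/java"
set -- $JAVA_HOME   # deliberately unquoted
echo "$#"           # prints 2: two words instead of one path
```

Installing Java into a space-free path such as c:\java sidesteps the problem entirely.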
>>
>> ** **
>>
>> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
>> *Sent:* Thursday, September 5, 2013 8:42 PM
>> *To:* user@hadoop.apache.org
>> *Subject:* Re: about replication****
>>
>> ** **
>>
>> please find the attached.****
>>
>> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
>> as it is not generated ****
>>
>> ** **
>>
>> regards****
>>
>> irfan****
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>
>> wrote:****
>>
>>  Could you share the log files ( c:\hdp.log,
>> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
>> well as your clusterproperties.txt ?****
>>
>> ** **
>>
>> Thanks, ****
>>
>> Olivier****
>>
>> ** **
>>
>> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:****
>>
>>  thanks. i followed the user manual for deployment and installed all
>> pre-requisites ****
>>
>> i modified the command and still the issue persist. please suggest ****
>>
>> ** **
>>
>> please refer below ****
>>
>> ** **
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>
>> wrote:****
>>
>> The command to install it is msiexec /i msifile /...  ****
>>
>> You will find the correct syntax as part of doc. ****
>>
>> Happy reading
>> Olivier ****
>>
>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks. ****
>>
>> i referred the logs and manuals. i modified the clusterproperties file
>> and then double click on the msi file ****
>>
>> however, it still failed.****
>>
>> further i started the installation on command line by giving
>> HDP_LAYOUT=clusterproperties file path, ****
>>
>> installation went ahead and it failed for .NET framework 4.0 and VC++
>> redistributable package dependency   ****
>>
>> ** **
>>
>> i installed both and started again the installation. ****
>>
>> failed again with following error ****
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> when i search for the logs mentioned in the error , i never found that
>>
>> please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> irfan****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:****
>>
>> Correct, you need to define the cluster configuration as part of a file.
>> You will find some information on the configuration file as part of the
>> documentation. ****
>>
>>
>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>> ****
>>
>> You should make sure to have also installed the pre requisite. ****
>>
>> Thanks
>> Olivier ****
>>
>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks. sorry for the long break. actually got involved in some other
>> priorities****
>>
>> i downloaded the installer and while installing i got following error ***
>> *
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> do i need to make any configuration prior to installation ??****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:****
>>
>> Here is the link ****
>>
>> http://download.hortonworks.com/products/hdp-windows/****
>>
>> Olivier ****
>>
>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks.****
>>
>> i just followed the instructions to setup the pseudo distributed setup
>> first using the url :
>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>> ****
>>
>>  ****
>>
>> i don't think so i am running DN on both machine ****
>>
>> please find the attached log****
>>
>> ** **
>>
>> hi olivier ****
>>
>> ** **
>>
>> can you please give me download link ?****
>>
>> let me try please ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Are you running DN on both the machines? Could you please show me your
>> DN logs?****
>>
>> ** **
>>
>> Also, consider Olivier's suggestion. It's definitely a better option.
>>
>> ** **
>>
>> ** **
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:****
>>
>> Irfu, ****
>>
>> If you want to quickly get Hadoop running on windows platform. You may
>> want to try our distribution for Windows. You will be able to find the msi
>> on our website. ****
>>
>> Regards
>> Olivier ****
>>
>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:****
>>
>>  thanks. ****
>>
>> ok. i think i need to change the plan over here ****
>>
>> let me create two environments. 1: totally windows 2: totally Unix****
>>
>> ** **
>>
>> because, on windows , anyway i have to try and see how hadoop works ****
>>
>> on UNIX, it is already known that ,  it is working fine. ****
>>
>> ** **
>>
>> so, on windows , here is the setup:****
>>
>> ** **
>>
>> namenode : windows 2012 R2 ****
>>
>> datanode : windows 2012 R2 ****
>>
>> ** **
>>
>> now, the exact problem is :****
>>
>> 1: datanode is not getting started ****
>>
>> 2: replication : if i put any file/folder on any datanode , it should get
>> replicated to all another available datanodes ****
>>
>> ** **
>>
>> regards****
>>
>>
>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Seriously?? You are planning to develop something using Hadoop on
>> windows. Not a good idea. Anyways, could you please show me your log files? I
>> also need some additional info :
>>
>> -The exact problem which you are facing right now
>>
>> -Your cluster summary (no. of nodes etc.)
>>
>> -Your latest configuration files
>>
>> -Your /etc/hosts file
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  ok. thanks****
>>
>> now, i need to start with all windows setup first as our product will be
>> based on windows ****
>>
>> so, now, please tell me how to resolve the issue ****
>>
>> ** **
>>
>> datanode is not starting . please suggest ****
>>
>> ** **
>>
>> regards,****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  It is possible. Theoretically Hadoop doesn't stop you from doing that.
>> But it is not a very wise setup.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  please suggest****
>>
>> ** **
>>
>> regards****
>>
>> irfan****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks.****
>>
>> can i have setup like this :****
>>
>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)****
>>
>> and datanodes are the combination of any OS (windows , linux , unix etc )
>> ****
>>
>> ** **
>>
>> however, my doubt is,  as the file systems of  both the systems (win and
>> linux ) are different ,  datanodes of these systems can not be part of
>> single cluster . i have to make windows cluster separate and UNIX cluster
>> separate ?****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com>
>> wrote:****
>>
>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
>> Cygwin PIDs so that may be causing the discrepancy. I don't know how well
>> Hadoop works in Cygwin as I have never tried it. Work is in progress for
>> native Windows support however there are no official releases with Windows
>> support yet. It may be easier to get familiar with a release
>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are
>> new to it.
>>
>>
>>
>> ****
>>
>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks ****
>>
>> here is what i did .****
>>
>> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command
>>
>> then deleted all pid files for namenodes and datanodes ****
>>
>> ** **
>>
>> started dfs again with command : "./start-dfs.sh"****
>>
>> ** **
>>
>> when i ran the "Jps" command . it shows****
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>
>> $ ./jps.exe****
>>
>> 4536 Jps****
>>
>> 2076 NameNode****
>>
>> ** **
>>
>> however, when i open the pid file for the namenode it shows the pid as
>> 4560; on the contrary, it should show 2076
>>
>> ** **
>>
>> please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>
>> wrote:****
>>
>>  Most likely there is a stale pid file. Something like
>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>> the datanode.
>>
>> I haven't read the entire thread so you may have looked at this already.
>>
>> -Arpit
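Arpit's suggestion amounts to clearing the stale .pid files before restarting the daemons. A sketch, assuming the hadoop-1.x default pid location of /tmp (set via HADOOP_PID_DIR):

```shell
# Remove stale HDFS daemon pid files so start-dfs.sh does not refuse to
# start with "running as process N. Stop it first." (pid dir assumed: /tmp)
PID_DIR=/tmp
rm -f "$PID_DIR"/hadoop-*-namenode.pid \
      "$PID_DIR"/hadoop-*-datanode.pid \
      "$PID_DIR"/hadoop-*-secondarynamenode.pid
# then restart: ./stop-dfs.sh && ./start-dfs.sh
```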
>>
>>
>>
>> ****
>>
>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  datanode is trying to connect to namenode continuously but fails ****
>>
>> ** **
>>
>> when i try to run "jps" command it says :****
>>
>> $ ./jps.exe****
>>
>> 4584 NameNode****
>>
>> 4016 Jps****
>>
>> ** **
>>
>> and when i ran the "./start-dfs.sh" then it says :****
>>
>> ** **
>>
>> $ ./start-dfs.sh****
>>
>> namenode running as process 3544. Stop it first.****
>>
>> DFS-1: datanode running as process 4076. Stop it first.****
>>
>> localhost: secondarynamenode running as process 4792. Stop it first.****
>>
>> ** **
>>
>> both these logs are contradictory ****
>>
>> please find the attached logs ****
>>
>> ** **
>>
>> should i attach the conf files as well ?****
>>
>> ** **
>>
>> regards****
>>
>>  ****
>>
>> ** **
>>
>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Your DN is still not running. Showing me the logs would be helpful.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  i followed the url and did the steps mention in that. i have deployed
>> on the windows platform****
>>
>> ** **
>>
>> Now, i am able to browse url : http://localhost:50070 (name node )****
>>
>> however, not able to browse url : http://localhost:50030****
>>
>> ** **
>>
>> please refer below****
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> i have modified all the config files as mentioned and formatted the hdfs
>> file system as well ****
>>
>> please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks. i followed this url :
>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>> ****
>>
>> let me follow the url which you gave for pseudo distributed setup and
>> then will switch to distributed mode****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  You are welcome. Which link have you followed for the
>> configuration? Your *core-site.xml* is empty. Remove the property
>> *fs.default.name* from *hdfs-site.xml* and add it to *core-site.xml*.
>> Remove *mapred.job.tracker* as well. It is required in *mapred-site.xml*.
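The property placement Tariq describes looks like this in hadoop-1.x — the host and port values below are placeholders, not taken from this thread:

```xml
<!-- core-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://namenode-host:9000</value>
</property>

<!-- mapred-site.xml -->
<property>
  <name>mapred.job.tracker</name>
  <value>namenode-host:9001</value>
</property>
```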
>>
>> ** **
>>
>> I would suggest you to do a pseudo distributed setup first in order to
>> get yourself familiar with the process and then proceed to the distributed
>> mode. You can visit this link
>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>> if you need some help. Let me know if you face any issue.
>>
>> ** **
>>
>> HTH****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks tariq for response. ****
>>
>> as discussed last time, i have sent you all the config files in my setup
>> . ****
>>
>> can you please go through that ?****
>>
>> ** **
>>
>> please let me know ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  I'm sorry for being unresponsive. Was out of touch for sometime because
>> of ramzan and eid. Resuming work today.****
>>
>> ** **
>>
>> What's the current status?****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
>> wrote:****
>>
>>  First of all read the concepts ..I hope you will like it..****
>>
>>
>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf****
>>
>> ** **
>>
>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  hey Tariq,****
>>
>> i am still stuck .. ****
>>
>> can you please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> irfan ****
>>
>> ** **
>>
>> ** **
>>
>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  please suggest ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  attachment got quarantined ****
>>
>> resending in txt format. please rename it to conf.rar ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  thanks.****
>>
>> ** **
>>
>> if i run the jps command on namenode :****
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>>
>> $ ./jps.exe****
>>
>> 3164 NameNode****
>>
>> 1892 Jps****
>>
>> ** **
>>
>> same command on datanode :****
>>
>> ** **
>>
>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>>
>> $ ./jps.exe****
>>
>> 3848 Jps****
>>
>> ** **
>>
>> jps does not list any process for datanode. however, on web browser i can
>> see one live data node ****
>>
>> please find the attached conf rar file of namenode ****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  OK. we'll start fresh. Could you plz show me your latest config files?**
>> **
>>
>> ** **
>>
>> BTW, are your daemons running fine?Use JPS to verify that.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  i have created these dir "wksp_data" and "wksp_name" on both datanode
>> and namenode ****
>>
>> made the respective changes in "hdfs-site.xml" file ****
>>
>> formatted the namenode ****
>>
>> started the dfs ****
>>
>> ** **
>>
>> but still, not able to browse the file system through web browser ****
>>
>> please refer below ****
>>
>> ** **
>>
>> anything still missing ?****
>>
>> please suggest ****
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  these dir needs to be created on all datanodes and namenodes ?****
>>
>> further,  hdfs-site.xml needs to be updated on both datanodes and
>> namenodes for these new dir?****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Create 2 directories manually corresponding to the values of the
>> dfs.name.dir and dfs.data.dir properties and change the permissions of
>> these directories to 755. When you start pushing data into your HDFS, data
>> will start going inside the directory specified by dfs.data.dir and the
>> associated metadata will go inside dfs.name.dir. Remember, you store data
>> in HDFS, but it eventually gets stored in your local/native FS. But you
>> cannot see this data directly on your local/native FS.
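Following that advice, the two directories map onto hdfs-site.xml like so — the Windows paths below match the wksp_name/wksp_data directory names used later in the thread, but are otherwise assumptions:

```xml
<!-- hdfs-site.xml: separate dirs for namenode metadata and datanode blocks -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```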
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  thanks. ****
>>
>> however, i need this to be working on windows environment as project
>> requirement.****
>>
>> i will add/work on Linux later ****
>>
>> ** **
>>
>> so, now , at this stage , c:\\wksp is the HDFS file system OR do i need
>> to create it from command line ?****
>>
>> ** **
>>
>> please suggest****
>>
>> ** **
>>
>> regards,****
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:****
>>
>>  Hello Irfan,****
>>
>> ** **
>>
>> Sorry for being unresponsive. Got stuck with some imp work.****
>>
>> ** **
>>
>> HDFS webUI doesn't provide us the ability to create file or directory.
>> You can browse HDFS, view files, download files etc. But operation like
>> create, move, copy etc are not supported.****
>>
>> ** **
>>
>> These values look fine to me.****
>>
>> ** **
>>
>> One suggestion though. Try getting a Linux machine(if possible). Or at
>> least use a VM. I personally feel that using Hadoop on windows is always
>> messy.****
>>
>>
>> ****
>>
>> Warm Regards,****
>>
>> Tariq****
>>
>> cloudfront.blogspot.com****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com> wrote:
>> ****
>>
>>  thanks.****
>>
>> when i browse the file system , i am getting following :****
>>
>> i haven't seen any make directory option there ****
>>
>> ** **
>>
>> i need to create it from command line ?****
>>
>> further, in the hdfs-site.xml file , i have given following entries. are
>> they correct ? ****
>>
>> ** **
>>
>> <property>****
>>
>>   <name>dfs.data.dir</name>****
>>
>>   <value>c:\\wksp</value>****
>>
>>   </property>****
>>
>> <property>****
>>
>>   <name>dfs.name.dir</name>****
>>
>>   <value>c:\\wksp</value>****
>>
>>   </property>****
>>
>> ** **
>>
>> please suggest ****
>>
>> ** **
>>
>> ** **
>>
>> [image: Inline image 1]****
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
>> wrote:****
>>
>>  *You are wrong at this:*****
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>
>> $ ./hadoop dfs -copyFromLocal
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp****
>>
>> copyFromLocal: File
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.*
>> ***
>>
>> ** **
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>>
>> $ ./hadoop dfs -copyFromLocal
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp****
>>
>> copyFromLocal: File
>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>> ****
>>
>> ** **
>>
>> Because,You had wrote both the paths local and You need not to copy
>> hadoop into hdfs...Hadoop is already working..****
>>
>> ** **
>>
>> Just check out in browser by after starting ur single node cluster :****
>>
>> ** **
>>
>> localhost:50070****
>>
>> ** **
>>
>> then go for browse the filesystem link in it..****
>>
>> ** **
>>
>> If there is no directory then make directory there.****
>>
>> That is your hdfs directory.****
>>
>> Then copy any text file there (no need to copy hadoop there), because you
>> are going to do processing on the data in that text file. That is what
>> hadoop is used for; first you need to make that clear in your mind. Then,
>> and only then, will it work; otherwise it is not possible.
>>
>> ** **
>>
>> *Try this:*
>>
>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>
>> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
>> /hdfs/directory/path
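A concrete run of that template might look like the following — the file name is a placeholder, and the commands assume a started hadoop-1.1.2 cluster:

```shell
# make an HDFS target directory, copy a local text file in, and list it
./bin/hadoop dfs -mkdir /wksp
./bin/hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/sample.txt /wksp
./bin/hadoop dfs -ls /wksp
```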
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> ** **
>>
>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
>> wrote:****
>>
>>  thanks. yes , i am newbie.****
>>
>> however, i need windows setup.****
>>
>> ** **
>>
>> let me surely refer the doc and link which u sent but i need this to be
>> working ...****
>>
>> can you please help****
>>
>> ** **
>>
>> regards****
>>
>> ** **
>>
>>  ****
>>
>> ** **
>>
>>
>>
>> ****
>>
>> ** **
>>
>> --
>> MANISH DUNANI
>> -THANX
>> +91 9426881954,+91 8460656443****
>>
>> manishd207@gmail.com****
>>
>>
>>
>>
>> ****
>>
>> -- ****
>>
>> Regards****
>>
>> *Manish Dunani*****
>>
>> *Contact No* : +91 9408329137****
>>
>> *skype id* : manish.dunani****
>>
>> ** **
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>>
>> --
>>
>> Olivier Renault
>>
>> Solution Engineer - Big Data - Hortonworks, Inc.
>> +44 7500 933 036
>> orenault@hortonworks.com
>> www.hortonworks.com
>>
>>
>>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
i installed the latest java in the c:\java folder and now there is no
java-related error in the log file.
however, now it throws an error about the cluster properties file not
being found. in fact i am running/installing hdp from the location where
this file exists. still it throws the error

please find the attached

[image: Inline image 1]

regards
irfan
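Earlier in the thread the install only proceeded once HDP_LAYOUT was passed on the msiexec command line, which suggests the installer wants an explicit path to the properties file rather than finding it in the current directory. A sketch of the invocation documented for HDP 1.3 on Windows — every path below is an assumption for illustration:

```bat
msiexec /i "C:\hdp\hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp\hdp.log" ^
  HDP_LAYOUT="C:\hdp\clusterproperties.txt" ^
  HDP_DIR="C:\hdp\hadoop" DESTROY_DATA="no"
```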



On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
ravimu@microsoft.com> wrote:

>  Here’s your issue (from the logs you attached earlier):
>
> CAQuietExec:  Checking JAVA_HOME is set correctly...
>
> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>
>
> It seems that you installed the Java prerequisite in the default path, which
> is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3 does
> not like spaces in paths, so you need to reinstall Java under c:\java\ or
> something similar (in a path with no spaces).
>
> ** **
>
> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
> *Sent:* Thursday, September 5, 2013 8:42 PM
> *To:* user@hadoop.apache.org
> *Subject:* Re: about replication****
>
> ** **
>
> please find the attached.****
>
> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
> as it is not generated ****
>
> ** **
>
> regards****
>
> irfan****
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>
> wrote:****
>
>  Could you share the log files ( c:\hdp.log,
> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
> well as your clusterproperties.txt ?****
>
> ** **
>
> Thanks, ****
>
> Olivier****
>
> ** **
>
> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:****
>
>  thanks. i followed the user manual for deployment and installed all
> pre-requisites ****
>
> i modified the command and still the issue persist. please suggest ****
>
> ** **
>
> please refer below ****
>
> ** **
>
> ** **
>
> [image: Inline image 1]****
>
> ** **
>
> regards****
>
> irfan ****
>
> ** **
>
> ** **
>
> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>
> wrote:****
>
> The command to install it is msiexec /i msifile /...  ****
>
> You will find the correct syntax as part of doc. ****
>
> Happy reading
> Olivier ****
>
> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:****
>
>  thanks. ****
>
> i referred the logs and manuals. i modified the clusterproperties file and
> then double click on the msi file ****
>
> however, it still failed.****
>
> further i started the installation on command line by giving
> HDP_LAYOUT=clusterproperties file path, ****
>
> installation went ahead and it failed for .NET framework 4.0 and VC++
> redistributable package dependency   ****
>
> ** **
>
> i installed both and started again the installation. ****
>
> failed again with following error ****
>
> [image: Inline image 1]****
>
> ** **
>
> when i search for the logs mentioned in the error , i never found that
>
> please suggest ****
>
> ** **
>
> regards****
>
> irfan****
>
> ** **
>
> ** **
>
> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <or...@hortonworks.com>
> wrote:****
>
> Correct, you need to define the cluster configuration as part of a file.
> You will find some information on the configuration file as part of the
> documentation. ****
>
>
> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
> ****
>
> You should make sure to have also installed the pre requisite. ****
>
> Thanks
> Olivier ****
>
> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>
>  thanks. sorry for the long break. actually got involved in some other
> priorities****
>
> i downloaded the installer and while installing i got following error ****
>
> ** **
>
> [image: Inline image 1]****
>
> ** **
>
> do i need to make any configuration prior to installation ??****
>
> ** **
>
> regards****
>
> irfan ****
>
> ** **
>
> ** **
>
> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <or...@hortonworks.com> wrote:
>
> Here is the link:
>
> http://download.hortonworks.com/products/hdp-windows/
>
> Olivier
>
> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>
> thanks.
> i just followed the instructions to set up the pseudo-distributed setup
> first, using this url:
> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>
> i don't think i am running a DN on both machines.
> please find the attached log.
>
> hi olivier,
>
> can you please give me the download link?
> let me try please.
>
> regards
> irfan
>
> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
> Are you running a DN on both the machines? Could you please show me your
> DN logs?
>
> Also, consider Olivier's suggestion. It's definitely a better option.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
> orenault@hortonworks.com> wrote:
>
> Irfan,
>
> If you want to quickly get Hadoop running on a windows platform, you may
> want to try our distribution for Windows. You will be able to find the msi
> on our website.
>
> Regards
> Olivier
>
> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>
> thanks.
> ok. i think i need to change the plan over here.
> let me create two environments. 1: totally windows, 2: totally Unix.
>
> because, on windows, anyway i have to try and see how hadoop works.
> on UNIX, it is already known that it is working fine.
>
> so, on windows, here is the setup:
>
> namenode: windows 2012 R2
> datanode: windows 2012 R2
>
> now, the exact problems are:
> 1: the datanode is not getting started
> 2: replication: if i put any file/folder on any datanode, it should get
> replicated to all other available datanodes
>
> regards
>
> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com> wrote:
>
>  Seriously?? You are planning to develop something using Hadoop on
> windows? Not a good idea. Anyway, could you please show me your log files?
> I also need some additional info:
>
> - The exact problem which you are facing right now
> - Your cluster summary (no. of nodes etc.)
> - Your latest configuration files
> - Your /etc/hosts file
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  ok. thanks.
> now, i need to start with an all-windows setup first, as our product will
> be based on windows.
> so, now, please tell me how to resolve the issue.
>
> the datanode is not starting. please suggest.
>
> regards,
> irfan
>
> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
> It is possible. Theoretically Hadoop doesn't stop you from doing that.
> But it is not a very wise setup.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  please suggest.
>
> regards
> irfan
>
> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks.
> can i have a setup like this:
> the namenode will be on linux (the flavour may be RHEL, CentOS, Ubuntu etc.),
> and the datanodes are a combination of any OS (windows, linux, unix etc.)
>
> however, my doubt is: as the file systems of the two systems (win and
> linux) are different, datanodes of these systems cannot be part of a
> single cluster. do i have to make the windows cluster and the UNIX cluster
> separate?
>
> regards
>
> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com> wrote:
>
> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
> Cygwin PIDs, so that may be causing the discrepancy. I don't know how well
> Hadoop works in Cygwin as I have never tried it. Work is in progress for
> native Windows support; however, there are no official releases with
> Windows support yet. It may be easier to get familiar with a release
> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are
> new to it.
>
> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  thanks.
> here is what i did:
> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command,
> then deleted all pid files for namenodes and datanodes.
>
> started dfs again with the command "./start-dfs.sh"
>
> when i ran the "jps" command, it shows:
>
> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
> $ ./jps.exe
> 4536 Jps
> 2076 NameNode
>
> however, when i open the pid file for the namenode, it shows the pid as
> 4560, whereas it should show 2076.
>
> please suggest.
>
> regards
>
> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com> wrote:
>
> Most likely there is a stale pid file, something like
> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
> the datanode.
>
> I haven't read the entire thread, so you may have looked at this already.
>
> -Arpit
>
>
>
>
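Arpit's suggestion can be scripted. A minimal sketch follows; the pid directory and file pattern are assumptions, so check HADOOP_PID_DIR in your hadoop-env.sh for the real location:

```shell
#!/bin/sh
# Remove datanode pid files whose recorded process is no longer alive,
# so a fresh "start-dfs.sh" is not blocked by a stale pid.
clean_stale_pids() {
    pid_dir=$1
    for f in "$pid_dir"/hadoop-*-datanode.pid; do
        [ -e "$f" ] || continue            # glob matched nothing
        pid=$(cat "$f")
        # kill -0 only probes for process existence; it sends no signal.
        if ! kill -0 "$pid" 2>/dev/null; then
            echo "removing stale pid file $f (pid $pid not running)"
            rm -f "$f"
        fi
    done
}
```

After clearing the stale files, restart with start-dfs.sh as before.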
> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> the datanode is trying to connect to the namenode continuously but fails.
>
> when i try to run the "jps" command, it says:
>
> $ ./jps.exe
> 4584 NameNode
> 4016 Jps
>
> and when i ran "./start-dfs.sh", it said:
>
> $ ./start-dfs.sh
> namenode running as process 3544. Stop it first.
> DFS-1: datanode running as process 4076. Stop it first.
> localhost: secondarynamenode running as process 4792. Stop it first.
>
> these two outputs are contradictory.
> please find the attached logs.
>
> should i attach the conf files as well?
>
> regards
>
> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
>  Your DN is still not running. Showing me the logs would be helpful.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> i followed the url and did the steps mentioned in it. i have deployed on
> the windows platform.
>
> Now, i am able to browse the url http://localhost:50070 (name node).
> however, i am not able to browse the url http://localhost:50030
>
> please refer below:
>
> [image: Inline image 1]
>
> i have modified all the config files as mentioned and formatted the hdfs
> file system as well.
> please suggest.
>
> regards
>
> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks. i followed this url:
> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>
> let me follow the url which you gave for the pseudo-distributed setup, and
> then i will switch to distributed mode.
>
> regards
> irfan
>
> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
>  You are welcome. Which link have you followed for the configuration? Your
> *core-site.xml* is empty. Remove the property *fs.default.name* from
> *hdfs-site.xml* and add it to *core-site.xml*. Remove *mapred.job.tracker*
> as well; it is required in *mapred-site.xml*.
>
> I would suggest you do a pseudo-distributed setup first in order to get
> yourself familiar with the process and then proceed to the distributed
> mode. You can visit this link
> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
> if you need some help. Let me know if you face any issue.
>
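For reference, the split Tariq describes would look roughly like this; the localhost addresses are assumptions for a pseudo-distributed Hadoop 1.x setup, not values from the thread:

```xml
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```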
>
> HTH
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
>
> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks, tariq, for the response.
> as discussed last time, i have sent you all the config files in my setup.
> can you please go through them?
>
> please let me know.
>
> regards
> irfan
>
> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
> I'm sorry for being unresponsive. Was out of touch for some time because
> of ramzan and eid. Resuming work today.
>
> What's the current status?
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com> wrote:
>
>  First of all, read the concepts. I hope you will like it:
>
> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>
> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
> please suggest.
>
> regards
> irfan
>
> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> hey Tariq,
> i am still stuck.
> can you please suggest?
>
> regards
> irfan
>
> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
> please suggest.
>
> regards
>
> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
> the attachment got quarantined.
> resending in txt format. please rename it to conf.rar.
>
> regards
>
> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  thanks.
>
> if i run the jps command on the namenode:
>
> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
> $ ./jps.exe
> 3164 NameNode
> 1892 Jps
>
> the same command on the datanode:
>
> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
> $ ./jps.exe
> 3848 Jps
>
> jps does not list any process for the datanode. however, on the web browser
> i can see one live data node.
> please find the attached conf rar file of the namenode.
>
> regards
>
> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com> wrote:
>
> OK. we'll start fresh. Could you please show me your latest config files?
>
> BTW, are your daemons running fine? Use jps to verify that.
>
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> i have created the dirs "wksp_data" and "wksp_name" on both the datanode
> and the namenode,
> made the respective changes in the "hdfs-site.xml" file,
> formatted the namenode, and
> started the dfs.
>
> but still, i am not able to browse the file system through the web browser.
> please refer below.
>
> anything still missing?
> please suggest.
>
> [image: Inline image 1]
>
> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> do these dirs need to be created on all datanodes and namenodes?
> further, does hdfs-site.xml need to be updated on both datanodes and
> namenodes for these new dirs?
>
> regards
>
> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
>  Create 2 directories manually, corresponding to the values of the
> dfs.name.dir and dfs.data.dir properties, and change the permissions of
> these directories to 755. When you start pushing data into your HDFS, data
> will start going inside the directory specified by dfs.data.dir, and the
> associated metadata will go inside dfs.name.dir. Remember, you store data
> in HDFS, but it eventually gets stored in your local/native FS. But you
> cannot see this data directly on your local/native FS.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks.
> however, i need this to be working on a windows environment as a project
> requirement.
> i will add/work on Linux later.
>
> so, now, at this stage, is c:\\wksp the HDFS file system, OR do i need to
> create it from the command line?
>
> please suggest.
>
> regards,
>
> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
>  Hello Irfan,
>
> Sorry for being unresponsive. Got stuck with some imp work.
>
> The HDFS web UI doesn't provide the ability to create a file or directory.
> You can browse HDFS, view files, download files, etc. But operations like
> create, move, copy, etc. are not supported.
>
> These values look fine to me.
>
> One suggestion though. Try getting a Linux machine (if possible), or at
> least use a VM. I personally feel that using Hadoop on windows is always
> messy.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  thanks.
> when i browse the file system, i get the following.
> i haven't seen any "make directory" option there.
>
> do i need to create it from the command line?
> further, in the hdfs-site.xml file, i have given the following entries. are
> they correct?
>
> <property>
>   <name>dfs.data.dir</name>
>   <value>c:\\wksp</value>
> </property>
> <property>
>   <name>dfs.name.dir</name>
>   <value>c:\\wksp</value>
> </property>
>
> please suggest.
>
> [image: Inline image 1]
>
> ** **
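As an aside, pointing dfs.name.dir and dfs.data.dir at the same directory is asking for trouble, since namenode metadata and datanode blocks would then share one tree. A sketch with separate subdirectories (the exact paths are only examples, not from the thread):

```xml
<!-- hdfs-site.xml -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp\\name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp\\data</value>
</property>
```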
>
> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com> wrote:
>
> *You are wrong at this:*
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
> $ ./hadoop dfs -copyFromLocal
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
> copyFromLocal: File
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
> $ ./hadoop dfs -copyFromLocal
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
> copyFromLocal: File
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>
> Both of the paths you wrote are local, and you do not need to copy hadoop
> into hdfs... Hadoop is already working.
>
> Just check in the browser, after starting your single-node cluster:
>
> localhost:50070
>
> then follow the "browse the filesystem" link in it.
>
> If there is no directory there, then make a directory.
> That is your hdfs directory.
> Then copy any text file there (no need to copy hadoop there), because you
> are going to do processing on the data in that text file. That is what
> hadoop is used for; first you need to make that clear in your mind. Then,
> and only then, will you get it working... otherwise it is not possible.
>
> *Try this:*
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
> /hdfs/directory/path
>
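The copyFromLocal failures above are just the local source path not existing. A small wrapper like the following surfaces that mistake before hadoop is even invoked; HADOOP_BIN and the paths are placeholders, not part of the original commands:

```shell
#!/bin/sh
# copy_from_local: guard wrapper around "hadoop dfs -copyFromLocal".
# The "File ... does not exist" error in the thread means the *local*
# source path was wrong, so check it first.
copy_from_local() {
    local_src=$1
    hdfs_dest=$2
    if [ ! -f "$local_src" ]; then
        echo "error: local file '$local_src' does not exist" >&2
        return 1
    fi
    # HADOOP_BIN is an assumption; point it at your real hadoop script.
    "${HADOOP_BIN:-./bin/hadoop}" dfs -copyFromLocal "$local_src" "$hdfs_dest"
}
```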
>
> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks. yes, i am a newbie.
> however, i need a windows setup.
>
> let me surely refer to the doc and link which u sent, but i need this to be
> working...
> can you please help?
>
> regards
>
> --
> MANISH DUNANI
> -THANX
> +91 9426881954, +91 8460656443
> manishd207@gmail.com
>
>
> --
> Regards
> *Manish Dunani*
> *Contact No*: +91 9408329137
> *skype id*: manish.dunani
>
>
> CONFIDENTIALITY NOTICE
> NOTICE: This message is intended for the use of the individual or entity
> to which it is addressed and may contain information that is confidential,
> privileged and exempt from disclosure under applicable law. If the reader
> of this message is not the intended recipient, you are hereby notified that
> any printing, copying, dissemination, distribution, disclosure or
> forwarding of this communication is strictly prohibited. If you have
> received this communication in error, please contact the sender immediately
> and delete it from your system. Thank You.
>
>
> --
> Olivier Renault
> Solution Engineer - Big Data - Hortonworks, Inc.
> +44 7500 933 036
> orenault@hortonworks.com
> www.hortonworks.com
>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
i installed the latest java in the c:\java folder, and now there is no error
in the log file related to java.
however, now it is throwing an error about the cluster properties file not
being found. in fact, i am running/installing hdp from the location where
this file exists, yet it still throws the error.

please find the attached:

[image: Inline image 1]

regards
irfan



On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
ravimu@microsoft.com> wrote:

>  Here’s your issue (from the logs you attached earlier):
>
> CAQuietExec:  Checking JAVA_HOME is set correctly...
> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>
> It seems that you installed the Java prerequisite in the default path, which
> is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3 does
> not like spaces in paths, so you need to reinstall Java under c:\java\ or
> something similar (in a path with no spaces).
>
> ** **
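Ravi's diagnosis generalizes: an unquoted JAVA_HOME containing spaces breaks the installer's batch steps ("Files\Java\jdk1.6.0_31 was unexpected at this time" is cmd.exe tripping over "C:\Program Files"). A tiny pre-flight check along these lines would catch it before installing; the suggested replacement path is only an example:

```shell
#!/bin/sh
# Reject a JAVA_HOME that contains spaces; HDP 1.3's install scripts
# use the value unquoted, so "C:\Program Files\..." breaks them.
check_java_home() {
    case "$1" in
        *" "*)
            echo "JAVA_HOME '$1' contains spaces; reinstall Java under a space-free path such as C:\\java"
            return 1
            ;;
        *)
            echo "JAVA_HOME '$1' looks safe"
            return 0
            ;;
    esac
}
```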
>
> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
> *Sent:* Thursday, September 5, 2013 8:42 PM
> *To:* user@hadoop.apache.org
> *Subject:* Re: about replication
>
> please find the attached.
> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log",
> as it is not generated.
>
> regards
> irfan
>
> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com> wrote:
>
> Could you share the log files (c:\hdp.log,
> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log) as
> well as your clusterproperties.txt?
>
> Thanks,
> Olivier
>
> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>
>  thanks. i followed the user manual for deployment and installed all
> the pre-requisites.
> i modified the command, and still the issue persists. please suggest.
>
> please refer below:
>
> [image: Inline image 1]
>
> regards
> irfan
>
> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com> wrote:
>
> The command to install it is msiexec /i msifile /...
> You will find the correct syntax as part of the doc.
>
> Happy reading
> Olivier
>
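The documented shape of that command is roughly the following one-liner; the file names and paths here are illustrative, not from the thread, so check the HDP 1.3 install guide for the exact property list:

```
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```

The /lv switch writes a verbose install log, which is the first place to look when the GUI error points at a log file you cannot find.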
> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>
> thanks.
> i referred the logs and manuals. i modified the clusterproperties file and
> then double-clicked the msi file.
> however, it still failed.
> further, i started the installation on the command line by giving
> HDP_LAYOUT=<clusterproperties file path>.
> installation went ahead, and it failed on the .NET framework 4.0 and VC++
> redistributable package dependencies.
>
> i installed both and started the installation again.
> it failed again with the following error:
>
> [image: Inline image 1]
>
> when i searched for the logs mentioned in the error, i never found them.
> please suggest.
>
> regards
> irfan
>
> ** **
>
> ** **
>
> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <or...@hortonworks.com>
> wrote:****
>
> Correct, you need to define the cluster configuration as part of a file.
> You will find some information on the configuration file as part of the
> documentation. ****
>
>
> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
> ****
>
> You should make sure to have also installed the pre requisite. ****
>
> Thanks
> Olivier ****
>
> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:****
>
>  thanks. sorry for the long break. actually got involved in some other
> priorities****
>
> i downloaded the installer and while installing i got following error ****
>
> ** **
>
> [image: Inline image 1]****
>
> ** **
>
> do i need to make any configuration prior to installation ??****
>
> ** **
>
> regards****
>
> irfan ****
>
> ** **
>
> ** **
>
> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <or...@hortonworks.com>
> wrote:****
>
> Here is the link ****
>
> http://download.hortonworks.com/products/hdp-windows/****
>
> Olivier ****
>
> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:****
>
>  thanks.****
>
> i just followed the instructions to setup the pseudo distributed setup
> first using the url :
> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
> ****
>
>  ****
>
> i don't think so i am running DN on both machine ****
>
> please find the attached log****
>
> ** **
>
> hi olivier ****
>
> ** **
>
> can you please give me download link ?****
>
> let me try please ****
>
> ** **
>
> regards****
>
> irfan ****
>
> ** **
>
> ** **
>
> ** **
>
> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
> wrote:****
>
>  Are you running DN on both the machines? Could you please show me your
> DN logs?****
>
> ** **
>
> Also, consider Oliver's suggestion. It's definitely a better option.****
>
> ** **
>
> ** **
>
>
> ****
>
> Warm Regards,****
>
> Tariq****
>
> cloudfront.blogspot.com****
>
> ** **
>
> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
> orenault@hortonworks.com> wrote:****
>
> Irfu, ****
>
> If you want to quickly get Hadoop running on windows platform. You may
> want to try our distribution for Windows. You will be able to find the msi
> on our website. ****
>
> Regards
> Olivier ****
>
> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:****
>
>  thanks. ****
>
> ok. i think i need to change the plan over here ****
>
> let me create two environments. 1: totally windows 2: totally Unix****
>
> ** **
>
> because, on windows , anyway i have to try and see how hadoop works ****
>
> on UNIX, it is already known that ,  it is working fine. ****
>
> ** **
>
> so, on windows , here is the setup:****
>
> ** **
>
> namenode : windows 2012 R2 ****
>
> datanode : windows 2012 R2 ****
>
> ** **
>
> now, the exact problem is :****
>
> 1: datanode is not getting started ****
>
> 2: replication : if i put any file/folder on any datanode , it should get
> replicated to all another available datanodes ****
>
> ** **
>
> regards****
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
> wrote:****
>
>  Seriously??You are planning to develop something using Hadoop on
> windows. Not a good idea. Anyways, cold you plz show me your log files?I
> also need some additional info :****
>
> -The exact problem which you are facing right now****
>
> -Your cluster summary(no. of nodes etc)****
>
> -Your latest configuration files****
>
> -Your /etc.hosts file****
>
>
> ****
>
> Warm Regards,****
>
> Tariq****
>
> cloudfront.blogspot.com****
>
> ** **
>
> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  ok. thanks****
>
> now, i need to start with all windows setup first as our product will be
> based on windows ****
>
> so, now, please tell me how to resolve the issue ****
>
> ** **
>
> datanode is not starting . please suggest ****
>
> ** **
>
> regards,****
>
> irfan ****
>
> ** **
>
> ** **
>
> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
> wrote:****
>
>  It is possible. Theoretically Hadoop doesn't stop you from doing that.
> But it is not a very wise setup.****
>
>
> ****
>
> Warm Regards,****
>
> Tariq****
>
> cloudfront.blogspot.com****
>
> ** **
>
> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  please suggest****
>
> ** **
>
> regards****
>
> irfan****
>
> ** **
>
> ** **
>
> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
> wrote:****
>
>  thanks.****
>
> can i have setup like this :****
>
> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)****
>
> and datanodes are the combination of any OS (windows , linux , unix etc )*
> ***
>
> ** **
>
> however, my doubt is,  as the file systems of  both the systems (win and
> linux ) are different ,  datanodes of these systems can not be part of
> single cluster . i have to make windows cluster separate and UNIX cluster
> separate ?****
>
> ** **
>
> regards****
>
> ** **
>
> ** **
>
> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com>
> wrote:****
>
> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
> Cygwin PIDs so that may be causing the discrepancy. I don't know how well
> Hadoop works in Cygwin as I have never tried it. Work is in progress for
> native Windows support however there are no official releases with Windows
> support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
> ****
>
>
>
> ****
>
> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
> wrote:****
>
>  thanks ****
>
> here is what i did .****
>
> i stopped all the namenodes and datanodes using ./stop-dfs.sh command ****
>
> then deleted all pid files for namenodes and datanodes ****
>
> ** **
>
> started dfs again with command : "./start-dfs.sh"****
>
> ** **
>
> when i ran the "Jps" command . it shows****
>
> ** **
>
> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>
> $ ./jps.exe****
>
> 4536 Jps****
>
> 2076 NameNode****
>
> ** **
>
> however, when i open the pid file for namenode then it is not showing pid
> as : 4560. on the contrary, it shud show : 2076****
>
> ** **
>
> please suggest ****
>
> ** **
>
> regards****
>
> ** **
>
> ** **
>
> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>
> wrote:****
>
>  Most likely there is a stale pid file. Something like
> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
> the datanode.
>
> I haven't read the entire thread so you may have looked at this already.
>
> -Arpit****
>
>
>
> ****
>
> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  datanode is trying to connect to namenode continuously but fails
>
> when i try to run "jps" command it says :
>
> $ ./jps.exe
> 4584 NameNode
> 4016 Jps
>
> and when i ran the "./start-dfs.sh" then it says :
>
> $ ./start-dfs.sh
> namenode running as process 3544. Stop it first.
> DFS-1: datanode running as process 4076. Stop it first.
> localhost: secondarynamenode running as process 4792. Stop it first.
>
> both these logs are contradictory
> please find the attached logs
>
> should i attach the conf files as well ?
>
> regards
>
> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
>  Your DN is still not running. Showing me the logs would be helpful.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  i followed the url and did the steps mentioned in it. i have deployed on
> the windows platform
>
> Now, i am able to browse url : http://localhost:50070 (name node )
> however, not able to browse url : http://localhost:50030
>
> please refer below
>
> [image: Inline image 1]
>
> i have modified all the config files as mentioned and formatted the hdfs
> file system as well
> please suggest
>
> regards
>
> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  thanks. i followed this url :
> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>
> let me follow the url which you gave for pseudo distributed setup and then
> will switch to distributed mode
>
> regards
> irfan
>
> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
>  You are welcome. Which link have you followed for the configuration? Your
> *core-site.xml* is empty. Remove the property *fs.default.name* from
> *hdfs-site.xml* and add it to *core-site.xml*. Remove *mapred.job.tracker*
> as well; it belongs in *mapred-site.xml*.
>
> I would suggest you to do a pseudo distributed setup first in order to get
> yourself familiar with the process and then proceed to the distributed
> mode. You can visit this link
> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
> if you need some help. Let me know if you face any issue.
>
> HTH
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
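Tariq's property-placement advice, as minimal config sketches. The host/port values are illustrative assumptions, not taken from this cluster:

```xml
<!-- core-site.xml: fs.default.name belongs here (value is an example) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: mapred.job.tracker belongs here (value is an example) -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```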
>
> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  thanks tariq for response.
> as discussed last time, i have sent you all the config files in my setup .
> can you please go through that ?
>
> please let me know
>
> regards
> irfan
>
> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
>  I'm sorry for being unresponsive. Was out of touch for some time because
> of Ramzan and Eid. Resuming work today.
>
> What's the current status?
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
> wrote:
>
>  First of all read the concepts ..I hope you will like it..
>
> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>
> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  please suggest
>
> regards
> irfan
>
> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
> wrote:
>
>  hey Tariq,
> i am still stuck ..
> can you please suggest
>
> regards
> irfan
>
> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  please suggest
>
> regards
>
> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  attachment got quarantined
> resending in txt format. please rename it to conf.rar
>
> regards
>
> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  thanks.
>
> if i run the jps command on namenode :
>
> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
> $ ./jps.exe
> 3164 NameNode
> 1892 Jps
>
> same command on datanode :
>
> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
> $ ./jps.exe
> 3848 Jps
>
> jps does not list any process for datanode. however, on web browser i can
> see one live data node
> please find the attached conf rar file of namenode
>
> regards
>
> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com> wrote:
>
>  OK. we'll start fresh. Could you plz show me your latest config files?
>
> BTW, are your daemons running fine? Use jps to verify that.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  i have created these dir "wksp_data" and "wksp_name" on both datanode
> and namenode
> made the respective changes in "hdfs-site.xml" file
> formatted the namenode
> started the dfs
>
> but still, not able to browse the file system through web browser
> please refer below
>
> anything still missing ?
> please suggest
>
> [image: Inline image 1]
>
> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  these dir needs to be created on all datanodes and namenodes ?
> further,  hdfs-site.xml needs to be updated on both datanodes and
> namenodes for these new dir?
>
> regards
>
> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
>  Create 2 directories manually corresponding to the values of
> dfs.name.dir and dfs.data.dir properties and change the permissions of
> these directories to 755. When you start pushing data into your HDFS, data
> will start going inside the directory specified by dfs.data.dir and the
> associated metadata will go inside dfs.name.dir. Remember, you store data
> in HDFS, but it eventually gets stored in your local/native FS. But you
> cannot see this data directly on your local/native FS.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
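The two steps above can be sketched as shell commands; `$base` is an illustrative stand-in for wherever dfs.name.dir and dfs.data.dir actually point:

```shell
# Create the dfs.name.dir and dfs.data.dir locations and set 755 on both.
base=$(mktemp -d)                  # stand-in for a real path such as /hadoop/dfs
mkdir -p "$base/name" "$base/data"
chmod 755 "$base/name" "$base/data"
ls -ld "$base/name" "$base/data"   # verify drwxr-xr-x on both directories
```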
>
> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  thanks.
> however, i need this to be working on windows environment as a project
> requirement.
> i will add/work on Linux later
>
> so, now , at this stage , is c:\\wksp the HDFS file system, OR do i need
> to create it from command line ?
>
> please suggest
>
> regards,
>
> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
>  Hello Irfan,
>
> Sorry for being unresponsive. Got stuck with some imp work.
>
> HDFS webUI doesn't provide us the ability to create a file or directory.
> You can browse HDFS, view files, download files etc. But operations like
> create, move, copy etc are not supported.
>
> These values look fine to me.
>
> One suggestion though. Try getting a Linux machine (if possible). Or at
> least use a VM. I personally feel that using Hadoop on windows is always
> messy.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  thanks.
> when i browse the file system , i am getting following :
> i haven't seen any make directory option there
>
> i need to create it from command line ?
> further, in the hdfs-site.xml file , i have given following entries. are
> they correct ?
>
> <property>
>   <name>dfs.data.dir</name>
>   <value>c:\\wksp</value>
> </property>
> <property>
>   <name>dfs.name.dir</name>
>   <value>c:\\wksp</value>
> </property>
>
> please suggest
>
> [image: Inline image 1]
>
> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
> wrote:
>
>  *You are wrong at this:*
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
> $ ./hadoop dfs -copyFromLocal
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
> copyFromLocal: File
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
> $ ./hadoop dfs -copyFromLocal
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
> copyFromLocal: File
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>
> Because you gave both paths as local paths. And you do not need to copy
> hadoop into hdfs... Hadoop is already working..
>
> Just check out in browser after starting ur single node cluster :
>
> localhost:50070
>
> then go for the "browse the filesystem" link in it..
>
> If there is no directory then make a directory there.
> That is your hdfs directory.
> Then copy any text file there (no need to copy hadoop there), because u
> are going to do processing on that data in the text file. That's what
> hadoop is used for; first u need to make it clear in ur mind. Then and
> then u will do it... otherwise not possible..
>
> *Try this: *
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
> /hdfs/directory/path
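The copyFromLocal failures above come down to a missing *local* source path, which is worth checking before invoking hadoop at all. A sketch — `$src` is a stand-in for the real file, and the hadoop invocation is left commented out because it needs a live cluster:

```shell
# copyFromLocal reads the source from the *local* filesystem, so the
# quickest sanity check is a plain test on the path.
src=$(mktemp)                      # stand-in for e.g. /cygdrive/c/.../some-file.txt
if [ -f "$src" ]; then
    echo "source exists: $src"
    # ./bin/hadoop dfs -copyFromLocal "$src" /wksp   # would now have a valid source
else
    echo "copyFromLocal would fail: $src does not exist"
fi
```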
>
> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  thanks. yes , i am newbie.
> however, i need windows setup.
>
> let me surely refer the doc and link which u sent but i need this to be
> working ...
> can you please help
>
> regards
>
> --
> MANISH DUNANI
> -THANX
> +91 9426881954,+91 8460656443
> manishd207@gmail.com
>
>
> --
> Regards
> *Manish Dunani*
> *Contact No* : +91 9408329137
> *skype id* : manish.dunani
>
>
> CONFIDENTIALITY NOTICE
> NOTICE: This message is intended for the use of the individual or entity
> to which it is addressed and may contain information that is confidential,
> privileged and exempt from disclosure under applicable law. If the reader
> of this message is not the intended recipient, you are hereby notified that
> any printing, copying, dissemination, distribution, disclosure or
> forwarding of this communication is strictly prohibited. If you have
> received this communication in error, please contact the sender immediately
> and delete it from your system. Thank You.
>
>
>
>
>
>
>
> --
> Olivier Renault
> Solution Engineer - Big Data - Hortonworks, Inc.
> +44 7500 933 036
> orenault@hortonworks.com
> www.hortonworks.com
>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
i installed the latest java in the c:\java folder and now there is no
java-related error in the log file.
however, now it is throwing an error about a missing cluster properties
file, even though i am running/installing hdp from the location where this
file exists.

please find the attached

[image: Inline image 1]

regards
irfan



On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
ravimu@microsoft.com> wrote:

>  Here’s your issue (from the logs you attached earlier):
>
> CAQuietExec:  Checking JAVA_HOME is set correctly...
> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>
> It seems that you installed the Java prerequisite in the default path,
> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP
> 1.3 does not like spaces in paths, so you need to reinstall Java under
> c:\java\ or something similar (a path with no spaces).
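The failure mode Ravi points at ("Files\Java\jdk1.6.0_31 was unexpected at this time" is cmd.exe choking on the space in C:\Program Files) can be caught before launching the installer. A sketch — the JAVA_HOME value below is an example of the bad case from the log, not a recommendation:

```shell
# Reject a JAVA_HOME containing spaces before launching the HDP installer.
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'   # example bad value from the log
case "$JAVA_HOME" in
    *' '*) verdict="bad: contains a space; reinstall under e.g. C:\java" ;;
    *)     verdict="ok" ;;
esac
echo "JAVA_HOME is $verdict"
```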
>
> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
> *Sent:* Thursday, September 5, 2013 8:42 PM
> *To:* user@hadoop.apache.org
> *Subject:* Re: about replication
>
>
> please find the attached.
> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
> as it is not generated
>
> regards
> irfan
>
> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>
> wrote:
>
>  Could you share the log files ( c:\hdp.log,
> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
> well as your clusterproperties.txt ?
>
> Thanks,
> Olivier
>
> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>
>  thanks. i followed the user manual for deployment and installed all
> pre-requisites
> i modified the command and still the issue persists. please suggest
>
> please refer below
>
> [image: Inline image 1]
>
> regards
> irfan
>
> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>
> wrote:
>
> The command to install it is msiexec /i msifile /...
> You will find the correct syntax as part of the doc.
>
> Happy reading
> Olivier
>
> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>  thanks.
> i referred the logs and manuals. i modified the clusterproperties file and
> then double-clicked on the msi file
> however, it still failed.
> further i started the installation on the command line by giving
> HDP_LAYOUT=<clusterproperties file path>,
> installation went ahead and it failed for .NET framework 4.0 and VC++
> redistributable package dependency
>
> i installed both and started the installation again.
> failed again with following error
>
> [image: Inline image 1]
>
> when i search for the logs mentioned in the error , i never found them
> please suggest
>
> regards
> irfan
>
> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <or...@hortonworks.com>
> wrote:
>
> Correct, you need to define the cluster configuration as part of a file.
> You will find some information on the configuration file as part of the
> documentation.
>
> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>
> You should make sure to have also installed the prerequisites.
>
> Thanks
> Olivier
>
> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>  thanks. sorry for the long break. actually got involved in some other
> priorities
> i downloaded the installer and while installing i got following error
>
> [image: Inline image 1]
>
> do i need to make any configuration prior to installation ??
>
> regards
> irfan
>
> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <or...@hortonworks.com>
> wrote:
>
> Here is the link
> http://download.hortonworks.com/products/hdp-windows/
>
> Olivier
>
> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>  thanks.
> i just followed the instructions to setup the pseudo distributed setup
> first using the url :
> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>
> i don't think i am running a DN on both machines
> please find the attached log
>
> hi olivier
>
> can you please give me the download link ?
> let me try please
>
> regards
> irfan
>
> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
>  Are you running a DN on both the machines? Could you please show me your
> DN logs?
>
> Also, consider Olivier's suggestion. It's definitely a better option.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
> orenault@hortonworks.com> wrote:
>
> Irfu,
>
> If you want to quickly get Hadoop running on a windows platform, you may
> want to try our distribution for Windows. You will be able to find the msi
> on our website.
>
> Regards
> Olivier
>
>  thanks. ****
>
> ok. i think i need to change the plan over here ****
>
> let me create two environments. 1: totally windows 2: totally Unix****
>
> ** **
>
> because, on windows , anyway i have to try and see how hadoop works ****
>
> on UNIX, it is already known that ,  it is working fine. ****
>
> ** **
>
> so, on windows , here is the setup:****
>
> ** **
>
> namenode : windows 2012 R2 ****
>
> datanode : windows 2012 R2 ****
>
> ** **
>
> now, the exact problem is :****
>
> 1: datanode is not getting started ****
>
> 2: replication : if i put any file/folder on any datanode , it should get
> replicated to all another available datanodes ****
>
> ** **
>
> regards****
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
>  Seriously?? You are planning to develop something using Hadoop on
> windows. Not a good idea. Anyways, could you plz show me your log files? I
> also need some additional info :
> - The exact problem which you are facing right now
> - Your cluster summary (no. of nodes etc)
> - Your latest configuration files
> - Your /etc/hosts file
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  ok. thanks
> now, i need to start with an all-windows setup first as our product will
> be based on windows
> so, now, please tell me how to resolve the issue
>
> datanode is not starting . please suggest
>
> regards,
> irfan
>
> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
>  It is possible. Theoretically Hadoop doesn't stop you from doing that.
> But it is not a very wise setup.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
>  please suggest
>
> regards
> irfan
>
> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
> wrote:
>
>  thanks.
> can i have a setup like this :
> namenode will be on linux (flavour may be RHEL, CentOS, Ubuntu etc)
> and datanodes are a combination of any OS (windows , linux , unix etc )
>
> however, my doubt is,  as the file systems of both the systems (win and
> linux ) are different ,  datanodes of these systems can not be part of a
> single cluster . do i have to make a windows cluster separate and a UNIX
> cluster separate ?
>
> regards
>
> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com>
> wrote:
>
> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
> Cygwin PIDs, so that may be causing the discrepancy. I don't know how well
> Hadoop works in Cygwin as I have never tried it. Work is in progress for
> native Windows support, however there are no official releases with
> Windows support yet. It may be easier to get familiar with a release
> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are
> new to it.
>
>
>
> ****
>
> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
> wrote:****
>
>  thanks ****
>
> here is what i did .****
>
> i stopped all the namenodes and datanodes using ./stop-dfs.sh command ****
>
> then deleted all pid files for namenodes and datanodes ****
>
> ** **
>
> started dfs again with command : "./start-dfs.sh"****
>
> ** **
>
> when i ran the "Jps" command . it shows****
>
> ** **
>
> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>
> $ ./jps.exe****
>
> 4536 Jps****
>
> 2076 NameNode****
>
> ** **
>
> however, when i open the pid file for namenode then it is not showing pid
> as : 4560. on the contrary, it shud show : 2076****
>
> ** **
>
> please suggest ****
>
> ** **
>
> regards****
>
> ** **
>
> ** **
>
> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>
> wrote:****
>
>  Most likely there is a stale pid file. Something like
> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
> the datanode.
>
> I haven't read the entire thread so you may have looked at this already.
>
> -Arpit****
>
>
>
> ****
>
> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  datanode is trying to connect to namenode continuously but fails ****
>
> ** **
>
> when i try to run "jps" command it says :****
>
> $ ./jps.exe****
>
> 4584 NameNode****
>
> 4016 Jps****
>
> ** **
>
> and when i ran the "./start-dfs.sh" then it says :****
>
> ** **
>
> $ ./start-dfs.sh****
>
> namenode running as process 3544. Stop it first.****
>
> DFS-1: datanode running as process 4076. Stop it first.****
>
> localhost: secondarynamenode running as process 4792. Stop it first.****
>
> ** **
>
> both these logs are contradictory ****
>
> please find the attached logs ****
>
> ** **
>
> should i attach the conf files as well ?****
>
> ** **
>
> regards****
>
>  ****
>
> ** **
>
> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
> wrote:****
>
>  Your DN is still not running. Showing me the logs would be helpful.****
>
>
> ****
>
> Warm Regards,****
>
> Tariq****
>
> cloudfront.blogspot.com****
>
> ** **
>
> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  i followed the url and did the steps mention in that. i have deployed on
> the windows platform****
>
> ** **
>
> Now, i am able to browse url : http://localhost:50070 (name node )****
>
> however, not able to browse url : http://localhost:50030****
>
> ** **
>
> please refer below****
>
> ** **
>
> [image: Inline image 1]****
>
> ** **
>
> i have modified all the config files as mentioned and formatted the hdfs
> file system as well ****
>
> please suggest ****
>
> ** **
>
> regards****
>
> ** **
>
> ** **
>
> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  thanks. i followed this url :
> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html*
> ***
>
> let me follow the url which you gave for pseudo distributed setup and then
> will switch to distributed mode****
>
> ** **
>
> regards****
>
> irfan ****
>
> ** **
>
> ** **
>
> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
> wrote:****
>
>  You are welcome. Which link have you followed for the configuration?Your
> *core-site.xml* is empty. Remove the property *fs.default.name *from *
> hdfs-site.xml* and add it to *core-site.xml*. Remove *mapred.job.tracker*as well. It is required in
> *mapred-site.xml*.****
>
> ** **
>
> I would suggest you to do a pseudo distributed setup first in order to get
> yourself familiar with the process and then proceed to the distributed
> mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>if you need some help. Let me know if you face any issue.
> ****
>
> ** **
>
> HTH****
>
>
> ****
>
> Warm Regards,****
>
> Tariq****
>
> cloudfront.blogspot.com****
>
> ** **
>
> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  thanks tariq for response. ****
>
> as discussed last time, i have sent you all the config files in my setup .
> ****
>
> can you please go through that ?****
>
> ** **
>
> please let me know ****
>
> ** **
>
> regards****
>
> irfan ****
>
> ** **
>
> ** **
>
> ** **
>
> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
> wrote:****
>
>  I'm sorry for being unresponsive. Was out of touch for sometime because
> of ramzan and eid. Resuming work today.****
>
> ** **
>
> What's the current status?****
>
>
> ****
>
> Warm Regards,****
>
> Tariq****
>
> cloudfront.blogspot.com****
>
> ** **
>
> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
> wrote:****
>
>  First of all read the concepts ..I hope you will like it..****
>
>
> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf****
>
> ** **
>
> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  please suggest ****
>
> ** **
>
> regards****
>
> irfan ****
>
> ** **
>
> ** **
>
> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
> wrote:****
>
>  hey Tariq,****
>
> i am still stuck .. ****
>
> can you please suggest ****
>
> ** **
>
> regards****
>
> irfan ****
>
> ** **
>
> ** **
>
> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com> wrote:*
> ***
>
>  please suggest ****
>
> ** **
>
> regards****
>
> ** **
>
> ** **
>
> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com> wrote:*
> ***
>
>  attachment got quarantined ****
>
> resending in txt format. please rename it to conf.rar ****
>
> ** **
>
> regards****
>
> ** **
>
> ** **
>
> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com> wrote:*
> ***
>
>  thanks.****
>
> ** **
>
> if i run the jps command on namenode :****
>
> ** **
>
> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin****
>
> $ ./jps.exe****
>
> 3164 NameNode****
>
> 1892 Jps****
>
> ** **
>
> same command on datanode :****
>
> ** **
>
> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin****
>
> $ ./jps.exe****
>
> 3848 Jps****
>
> ** **
>
> jps does not list any process for datanode. however, on web browser i can
> see one live data node ****
>
> please find the attached conf rar file of namenode ****
>
> ** **
>
> regards****
>
> ** **
>
> ** **
>
> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com> wrote:
> ****
>
>  OK. we'll start fresh. Could you plz show me your latest config files?***
> *
>
> ** **
>
> BTW, are your daemons running fine?Use JPS to verify that.****
>
>
> ****
>
> Warm Regards,****
>
> Tariq****
>
> cloudfront.blogspot.com****
>
> ** **
>
> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  i have created these dir "wksp_data" and "wksp_name" on both datanode
> and namenode ****
>
> made the respective changes in "hdfs-site.xml" file ****
>
> formatted the namenode ****
>
> started the dfs ****
>
> ** **
>
> but still, not able to browse the file system through web browser ****
>
> please refer below ****
>
> ** **
>
> anything still missing ?****
>
> please suggest ****
>
> ** **
>
> [image: Inline image 1]****
>
> ** **
>
> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  these dir needs to be created on all datanodes and namenodes ?****
>
> further,  hdfs-site.xml needs to be updated on both datanodes and
> namenodes for these new dir?****
>
> ** **
>
> regards****
>
> ** **
>
> ** **
>
> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
> Create 2 directories manually corresponding to the values of the
> dfs.name.dir and dfs.data.dir properties and change the permissions of
> these directories to 755. When you start pushing data into your HDFS, the
> data will go inside the directory specified by dfs.data.dir and the
> associated metadata will go inside dfs.name.dir. Remember, you store data
> in HDFS, but it eventually gets stored on your local/native FS. You
> cannot, however, see this data directly on your local/native FS.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
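The directory preparation described above boils down to a couple of shell commands. This is only a sketch; the paths below are made-up examples, not the poster's actual layout:

```shell
# create the directories that dfs.name.dir and dfs.data.dir will point to,
# then give them the 755 permissions suggested above (paths are examples)
NAME_DIR=/tmp/hadoop/wksp_name
DATA_DIR=/tmp/hadoop/wksp_data
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
ls -ld "$NAME_DIR" "$DATA_DIR"
```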
>
> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks.
> however, i need this to be working in a windows environment as a project
> requirement. i will add/work on Linux later.
>
> so, at this stage, is c:\\wksp the HDFS file system, or do i need to
> create it from the command line?
>
> please suggest
>
> regards,
>
> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
> Hello Irfan,
>
> Sorry for being unresponsive. Got stuck with some imp work.
>
> The HDFS web UI doesn't provide the ability to create a file or directory.
> You can browse HDFS, view files, download files, etc., but operations like
> create, move, and copy are not supported.
>
> These values look fine to me.
>
> One suggestion though: try getting a Linux machine (if possible), or at
> least use a VM. I personally feel that using Hadoop on windows is always
> messy.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks.
> when i browse the file system, i am getting the following; i haven't seen
> any make-directory option there.
>
> do i need to create it from the command line?
> further, in the hdfs-site.xml file, i have given the following entries.
> are they correct?
>
> <property>
>   <name>dfs.data.dir</name>
>   <value>c:\\wksp</value>
> </property>
> <property>
>   <name>dfs.name.dir</name>
>   <value>c:\\wksp</value>
> </property>
>
> please suggest
>
> [image: Inline image 1]
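For comparison, a conventional hdfs-site.xml keeps the two properties pointing at different directories rather than the single c:\\wksp used above. The sketch below is illustrative only; the directory values are invented examples, not taken from the thread:

```xml
<!-- hdfs-site.xml sketch: separate name and data directories (example paths) -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>c:\\wksp_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>c:\\wksp_data</value>
  </property>
</configuration>
```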
>
> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
> wrote:
>
> *You are wrong at this:*
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
> $ ./hadoop dfs -copyFromLocal
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
> copyFromLocal: File
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
> $ ./hadoop dfs -copyFromLocal
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
> copyFromLocal: File
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>
> Both of the paths you wrote are local, and you do not need to copy Hadoop
> into HDFS anyway; Hadoop is already working.
>
> Just check in a browser after starting your single-node cluster:
>
> localhost:50070
>
> then follow the "browse the filesystem" link there.
>
> If there is no directory, make a directory there; that is your HDFS
> directory. Then copy any text file there (no need to copy Hadoop itself),
> because that text file holds the data you are going to process, and that
> is what Hadoop is for. First make that clear in your mind, and then it
> will work; otherwise it is not possible.
>
> *Try this:*
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
> $ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/your/file
> /hdfs/directory/path
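Since both failures above were "File ... does not exist" on the local side, a quick existence check before running -copyFromLocal avoids the confusion. A sketch only; the source path is an example:

```shell
# verify the local source file exists before handing it to -copyFromLocal
SRC=/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz   # example path
if [ -f "$SRC" ]; then
  echo "source exists; safe to run: ./bin/hadoop dfs -copyFromLocal $SRC /wksp"
else
  echo "source missing: $SRC"
fi
```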
>
> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks. yes, i am a newbie.
> however, i need a windows setup.
>
> let me surely refer to the doc and link which u sent, but i need this to
> be working. can you please help?
>
> regards
>
> --
> MANISH DUNANI
> -THANX
> +91 9426881954, +91 8460656443
> manishd207@gmail.com
>
>
> --
> Regards
> *Manish Dunani*
> *Contact No*: +91 9408329137
> *skype id*: manish.dunani
>
> CONFIDENTIALITY NOTICE
> NOTICE: This message is intended for the use of the individual or entity
> to which it is addressed and may contain information that is confidential,
> privileged and exempt from disclosure under applicable law. If the reader
> of this message is not the intended recipient, you are hereby notified that
> any printing, copying, dissemination, distribution, disclosure or
> forwarding of this communication is strictly prohibited. If you have
> received this communication in error, please contact the sender immediately
> and delete it from your system. Thank You.
>
> --
> Olivier Renault
> Solution Engineer - Big Data - Hortonworks, Inc.
> +44 7500 933 036
> orenault@hortonworks.com
> www.hortonworks.com

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
i installed the latest java in the c:\java folder and now there is no
java-related error in the log file.
however, now it is throwing an error about a missing cluster properties
file. in fact i am running/installing hdp from the directory where this
file exists, yet it still throws the error.

please find the attached

[image: Inline image 1]

regards
irfan



On Fri, Sep 6, 2013 at 11:12 AM, Ravi Mummulla (BIG DATA) <
ravimu@microsoft.com> wrote:

>  Here's your issue (from the logs you attached earlier):
>
> CAQuietExec:  Checking JAVA_HOME is set correctly...
> CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.
>
> It seems that you installed the Java prerequisite in the default path,
> which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3
> does not like spaces in paths, so you need to reinstall Java under c:\java\
> or something similar (a path with no spaces).
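The failure mode above (a space in JAVA_HOME splitting the value inside the installer's batch scripts) can be spotted with a tiny check. A sketch only; the path value here is an example:

```shell
# flag a JAVA_HOME containing spaces, which trips up HDP 1.3's batch scripts
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'   # example value with a space
case "$JAVA_HOME" in
  *' '*) echo "JAVA_HOME contains spaces: reinstall Java to a space-free path" ;;
  *)     echo "JAVA_HOME looks OK" ;;
esac
```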
>
> *From:* Irfan Sayed [mailto:irfu.sayed@gmail.com]
> *Sent:* Thursday, September 5, 2013 8:42 PM
> *To:* user@hadoop.apache.org
> *Subject:* Re: about replication
>
> please find the attached.
> i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
> as it is not generated
>
> regards
> irfan
>
> On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>
> wrote:
>
> Could you share the log files ( c:\hdp.log,
> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log ) as
> well as your clusterproperties.txt ?
>
> Thanks,
> Olivier
>
> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks. i followed the user manual for deployment and installed all the
> prerequisites.
> i modified the command, but the issue still persists. please suggest.
>
> please refer below:
>
> [image: Inline image 1]
>
> regards
> irfan
>
> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>
> wrote:
>
> The command to install it is msiexec /i msifile /...
> You will find the correct syntax in the doc.
>
> Happy reading
> Olivier
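Pieced together from the doc pointer above and the HDP_LAYOUT property mentioned later in the thread, the install command has roughly the shape below. This sketch only assembles and prints the string; the msi file name, log path, and layout path are illustrative, so verify them against the HDP installation guide:

```shell
# assemble the msiexec invocation the docs describe; all values are examples
MSI='hdp-1.3.0.0.winpkg.msi'
LAYOUT='C:\hdp\clusterproperties.txt'
CMD="msiexec /i $MSI /lv C:\\hdp.log HDP_LAYOUT=$LAYOUT"
echo "$CMD"
```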
>
> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>
> thanks.
> i referred to the logs and manuals. i modified the clusterproperties file
> and then double-clicked the msi file; however, it still failed.
> i then started the installation on the command line, passing
> HDP_LAYOUT=clusterproperties file path. the installation went ahead but
> failed on the .NET framework 4.0 and VC++ redistributable package
> dependencies.
>
> i installed both and started the installation again.
> it failed again with the following error:
>
> [image: Inline image 1]
>
> when i searched for the logs mentioned in the error, i never found them.
> please suggest.
>
> regards
> irfan
>
> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <or...@hortonworks.com>
> wrote:
>
> Correct, you need to define the cluster configuration in a file.
> You will find information on the configuration file in the documentation:
>
> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>
> You should also make sure you have installed the prerequisites.
>
> Thanks
> Olivier
>
> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>
> thanks. sorry for the long break; i actually got involved in some other
> priorities.
> i downloaded the installer, and while installing i got the following error:
>
> [image: Inline image 1]
>
> do i need to make any configuration prior to installation?
>
> regards
> irfan
>
> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <or...@hortonworks.com>
> wrote:
>
> Here is the link
>
> http://download.hortonworks.com/products/hdp-windows/
>
> Olivier
>
> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>
> thanks.
> i just followed the instructions to set up the pseudo-distributed setup
> first, using the url:
> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>
> i don't think i am running a DN on both machines.
> please find the attached log.
>
> hi olivier,
>
> can you please give me the download link? let me try please.
>
> regards
> irfan
>
> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
> Are you running DN on both the machines? Could you please show me your
> DN logs?
>
> Also, consider Olivier's suggestion. It's definitely a better option.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
> orenault@hortonworks.com> wrote:
>
> Irfu,
>
> If you want to quickly get Hadoop running on the Windows platform, you may
> want to try our distribution for Windows. You will be able to find the msi
> on our website.
>
> Regards
> Olivier
>
> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>
> thanks.
> ok, i think i need to change the plan over here.
> let me create two environments: 1) totally windows, 2) totally Unix.
>
> because on windows, anyway, i have to try and see how hadoop works;
> on UNIX it is already known to be working fine.
>
> so, on windows, here is the setup:
>
> namenode: windows 2012 R2
> datanode: windows 2012 R2
>
> now, the exact problems are:
> 1: the datanode is not getting started
> 2: replication: if i put any file/folder on any datanode, it should get
> replicated to all other available datanodes
>
> regards
>
> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
> Seriously?? You are planning to develop something using Hadoop on
> windows? Not a good idea. Anyways, could you please show me your log
> files? I also need some additional info:
>
> - The exact problem which you are facing right now
> - Your cluster summary (no. of nodes etc.)
> - Your latest configuration files
> - Your /etc/hosts file
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> ** **
>
> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
> ok, thanks.
> now i need to start with an all-windows setup first, as our product will
> be based on windows.
> so please tell me how to resolve the issue:
> the datanode is not starting. please suggest.
>
> regards,
> irfan
>
> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
> It is possible; theoretically, Hadoop doesn't stop you from doing that.
> But it is not a very wise setup.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> ** **
>
> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> please suggest
>
> regards
> irfan
>
> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>
> wrote:
>
> thanks.
> can i have a setup like this:
> the namenode on linux (the flavour may be RHEL, CentOS, Ubuntu, etc.)
> and the datanodes a combination of any OS (windows, linux, unix, etc.)?
>
> however, my doubt is: as the file systems of the two (win and linux) are
> different, can datanodes of these systems not be part of a single
> cluster? do i have to make the windows cluster and the UNIX cluster
> separate?
>
> regards
>
> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com>
> wrote:
>
> I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as
> Cygwin PIDs, so that may be causing the discrepancy. I don't know how well
> Hadoop works in Cygwin as I have never tried it. Work is in progress for
> native Windows support; however, there are no official releases with
> Windows support yet. It may be easier to get familiar with a release
> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are
> new to it.
>
> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>
> wrote:
>
> thanks. here is what i did:
> i stopped all the namenodes and datanodes using the ./stop-dfs.sh command,
> then deleted all the pid files for the namenodes and datanodes,
> and started dfs again with the command "./start-dfs.sh".
>
> when i ran the "jps" command, it showed:
>
> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
> $ ./jps.exe
> 4536 Jps
> 2076 NameNode
>
> however, when i open the pid file for the namenode, it shows 4560,
> whereas it should show 2076.
>
> please suggest
>
> regards
>
> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>
> wrote:
>
> Most likely there is a stale pid file, something like
> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
> the datanode.
>
> I haven't read the entire thread, so you may have looked at this already.
>
> -Arpit
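The stale-pid cleanup suggested above can be sketched as below. The pid directory and file-name pattern follow the Hadoop default shape mentioned in the message; adjust both to your own installation:

```shell
# remove any stale datanode pid file before restarting the daemon
PID_DIR=/tmp                                         # Hadoop's default pid directory
touch "$PID_DIR/hadoop-administrator-datanode.pid"   # stand-in for a stale file
rm -f "$PID_DIR"/hadoop-*-datanode.pid
ls "$PID_DIR"/hadoop-*-datanode.pid 2>/dev/null || echo "no stale pid files left"
```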
>
>
>
> ****
>
> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> the datanode is trying to connect to the namenode continuously but fails.
>
> when i run the "jps" command it says:
>
> $ ./jps.exe
> 4584 NameNode
> 4016 Jps
>
> and when i run "./start-dfs.sh" it says:
>
> $ ./start-dfs.sh
> namenode running as process 3544. Stop it first.
> DFS-1: datanode running as process 4076. Stop it first.
> localhost: secondarynamenode running as process 4792. Stop it first.
>
> these two outputs are contradictory.
> please find the attached logs.
>
> should i attach the conf files as well?
>
> regards
>
> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
> Your DN is still not running. Showing me the logs would be helpful.
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> i followed the url and did the steps mentioned in it. i have deployed on
> the windows platform.
>
> now i am able to browse the url http://localhost:50070 (name node);
> however, i am not able to browse http://localhost:50030.
>
> please refer below:
>
> [image: Inline image 1]
>
> i have modified all the config files as mentioned and formatted the hdfs
> file system as well.
> please suggest.
>
> regards
>
> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks. i followed this url:
> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>
> let me follow the url which you gave for the pseudo-distributed setup and
> then i will switch to distributed mode.
>
> regards
> irfan
>
> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
> You are welcome. Which link have you followed for the configuration? Your
> *core-site.xml* is empty. Remove the property *fs.default.name* from
> *hdfs-site.xml* and add it to *core-site.xml*. Remove *mapred.job.tracker*
> as well; it belongs in *mapred-site.xml*.
>
> I would suggest you do a pseudo-distributed setup first in order to get
> yourself familiar with the process and then proceed to distributed
> mode. You can visit this link
> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
> if you need some help. Let me know if you face any issue.
>
> HTH
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
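Laid out as files, the property moves described above would look roughly like the sketch below. The host and port values are illustrative placeholders, not taken from the thread:

```xml
<!-- core-site.xml sketch: fs.default.name belongs here (example value) -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml sketch: mapred.job.tracker belongs here (example value) -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```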
>
> ** **
>
> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks, tariq, for the response.
> as discussed last time, i have sent you all the config files in my setup.
> can you please go through them?
>
> please let me know.
>
> regards
> irfan
>
> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
>
> I'm sorry for being unresponsive. I was out of touch for some time because
> of Ramzan and Eid. Resuming work today.
>
> What's the current status?
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
>
> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>
> wrote:
>
> First of all, read the concepts. I hope you will like it:
>
> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>
> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
> please suggest
>
> regards
> irfan
>
> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>
> wrote:
>
> hey Tariq,
> i am still stuck. can you please suggest?
>
> regards
> irfan
>
> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
> please suggest
>
> regards
>
> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
> the attachment got quarantined.
> resending in txt format; please rename it to conf.rar.
>
> regards
>
> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com> wrote:
>
> thanks.
>
> if i run the jps command on the namenode:
> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
> $ ./jps.exe
> 3164 NameNode
> 1892 Jps
>
> same command on the datanode:
>
> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
> $ ./jps.exe
> 3848 Jps
>
> jps does not list any process for the datanode; however, in the web
> browser i can see one live datanode.
>
> please find the attached conf rar file of the namenode.
>
> regards
>
> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com> wrote:
> ****
>
>  OK. we'll start fresh. Could you plz show me your latest config files?***
> *
>
> ** **
>
> BTW, are your daemons running fine?Use JPS to verify that.****
>
>
> ****
>
> Warm Regards,****
>
> Tariq****
>
> cloudfront.blogspot.com****
>
> ** **
>
> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  i have created these dir "wksp_data" and "wksp_name" on both datanode
> and namenode ****
>
> made the respective changes in "hdfs-site.xml" file ****
>
> formatted the namenode ****
>
> started the dfs ****
>
> ** **
>
> but still, not able to browse the file system through web browser ****
>
> please refer below ****
>
> ** **
>
> anything still missing ?****
>
> please suggest ****
>
> ** **
>
> [image: Inline image 1]****
>
> ** **
>
> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  these dir needs to be created on all datanodes and namenodes ?****
>
> further,  hdfs-site.xml needs to be updated on both datanodes and
> namenodes for these new dir?****
>
> ** **
>
> regards****
>
> ** **
>
> ** **
>
> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com> wrote:
> ****
>
>  Create 2 directories manually corresponding to the values of
> dfs.name.dir and dfs.data.dir properties and change the permissions of
> these directories to 755. When you start pushing data into your HDFS, data
> will start going inside the directory specified by dfs.data.dir and the
> associated metadata will go inside dfs.name.dir. Remember, you store data
> in HDFS, but it eventually gets stored in your local/native FS. But you
> cannot see this data directly on your local/native FS.****
>
>
> ****
>
> Warm Regards,****
>
> Tariq****
>
> cloudfront.blogspot.com****
>
> ** **
>
> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com> wrote:*
> ***
>
>  thanks. ****
>
> however, i need this to be working on windows environment as project
> requirement.****
>
> i will add/work on Linux later ****
>
> ** **
>
> so, now , at this stage , c:\\wksp is the HDFS file system OR do i need to
> create it from command line ?****
>
> ** **
>
> please suggest****
>
> ** **
>
> regards,****
>
> ** **
>
> ** **
>
> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com> wrote:
> ****
>
>  Hello Irfan,****
>
> ** **
>
> Sorry for being unresponsive. Got stuck with some imp work.****
>
> ** **
>
> HDFS webUI doesn't provide us the ability to create file or directory. You
> can browse HDFS, view files, download files etc. But operation like create,
> move, copy etc are not supported.****
>
> ** **
>
> These values look fine to me.****
>
> ** **
>
> One suggestion though. Try getting a Linux machine(if possible). Or at
> least use a VM. I personally feel that using Hadoop on windows is always
> messy.****
>
>
> ****
>
> Warm Regards,****
>
> Tariq****
>
> cloudfront.blogspot.com****
>
> ** **
>
> On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com> wrote:*
> ***
>
>  thanks.****
>
> when i browse the file system , i am getting following :****
>
> i haven't seen any make directory option there ****
>
> ** **
>
> i need to create it from command line ?****
>
> further, in the hdfs-site.xml file , i have given following entries. are
> they correct ? ****
>
> ** **
>
> <property>****
>
>   <name>dfs.data.dir</name>****
>
>   <value>c:\\wksp</value>****
>
>   </property>****
>
> <property>****
>
>   <name>dfs.name.dir</name>****
>
>   <value>c:\\wksp</value>****
>
>   </property>****
>
> ** **
>
> please suggest ****
>
> ** **
>
> ** **
>
> [image: Inline image 1]****
>
> ** **
>
> On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>
> wrote:****
>
>  *You are wrong at this:*****
>
> ** **
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>
> $ ./hadoop dfs -copyFromLocal
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp****
>
> copyFromLocal: File
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.**
> **
>
> ** **
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin****
>
> $ ./hadoop dfs -copyFromLocal
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp****
>
> copyFromLocal: File
> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
> ****
>
> ** **
>
> Because,You had wrote both the paths local and You need not to copy hadoop
> into hdfs...Hadoop is already working..****
>
> ** **
>
> Just check out in browser by after starting ur single node cluster :****
>
> ** **
>
> localhost:50070****
>
> ** **
>
> then go for browse the filesystem link in it..****
>
> ** **
>
> If there is no directory then make directory there.****
>
> That is your hdfs directory.****
>
> Then copy any text file there(no need to copy hadoop there).beacause u are
> going to do processing on that data in text file.That's why hadoop is used
> for ,first u need to make it clear in ur mind.Then and then u will do
> it...otherwise not possible..****
>
> ** **
>
> *Try this: *****
>
> ** **
>
> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2****
>
> $ .bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file
> /hdfs/directory/path****
>
> ** **
>
> ** **
>
> ** **
>
> ** **
>
> On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com> wrote:
> ****
>
>  thanks. yes , i am newbie.****
>
> however, i need windows setup.****
>
> ** **
>
> let me surely refer the doc and link which u sent but i need this to be
> working ...****
>
> can you please help****
>
> ** **
>
> regards****
>
> ** **
>
>  ****
>
> ** **
>
>
>
> ****
>
> ** **
>
> --
> MANISH DUNANI
> -THANX
> +91 9426881954,+91 8460656443****
>
> manishd207@gmail.com****
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>
>
> ****
>
> -- ****
>
> Regards****
>
> *Manish Dunani*****
>
> *Contact No* : +91 9408329137****
>
> *skype id* : manish.dunani****
>
> ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
>  ** **
>
> CONFIDENTIALITY NOTICE
> NOTICE: This message is intended for the use of the individual or entity
> to which it is addressed and may contain information that is confidential,
> privileged and exempt from disclosure under applicable law. If the reader
> of this message is not the intended recipient, you are hereby notified that
> any printing, copying, dissemination, distribution, disclosure or
> forwarding of this communication is strictly prohibited. If you have
> received this communication in error, please contact the sender immediately
> and delete it from your system. Thank You.****
>
> --
> Olivier Renault
> Solution Engineer - Big Data - Hortonworks, Inc.
> +44 7500 933 036
> orenault@hortonworks.com
> www.hortonworks.com

RE: about replication

Posted by "Ravi Mummulla (BIG DATA)" <ra...@microsoft.com>.
Here's your issue (from the logs you attached earlier):

CAQuietExec:  Checking JAVA_HOME is set correctly...
CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.

It seems that you installed the Java prerequisite in the default path, %PROGRAMFILES% (which expands to C:\Program Files in your case). HDP 1.3 does not support spaces in paths, so you need to reinstall Java under c:\java\ or something similar (a path with no spaces).
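Before rerunning the installer, the spaces check can be done quickly from a shell; a minimal sketch, where the JDK path is just an example of a space-free install target:

```shell
# Fail fast if JAVA_HOME contains spaces, since the HDP 1.3 installer's
# batch scripts break on them. The path below is an example, not the
# required location.
JAVA_HOME="C:\java\jdk1.6.0_31"

case "$JAVA_HOME" in
  *" "*) echo "JAVA_HOME contains spaces - reinstall Java to a space-free path" ;;
  *)     echo "JAVA_HOME ok" ;;
esac
```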

From: Irfan Sayed [mailto:irfu.sayed@gmail.com]
Sent: Thursday, September 5, 2013 8:42 PM
To: user@hadoop.apache.org
Subject: Re: about replication

please find the attached.
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log" as it is not generated

regards
irfan





On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>> wrote:
Could you share the log files ( c:\hdp.log, c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as well as your clusterproperties.txt ?

Thanks,
Olivier

On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. i followed the user manual for deployment and installed all pre-requisites
i modified the command and still the issue persists. please suggest

please refer below


[Inline image 1]

regards
irfan


On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>> wrote:

The command to install it is msiexec /i msifile /...

You will find the correct syntax as part of doc.

Happy reading
Olivier
On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
i referred the logs and manuals. i modified the clusterproperties file and then double-clicked the msi file
however, it still failed.
further i started the installation on command line by giving HDP_LAYOUT=clusterproperties file path,
installation went ahead and it failed for .NET framework 4.0 and VC++ redistributable package dependency

i installed both and started again the installation.
failed again with following error
[Inline image 1]

when i searched for the logs mentioned in the error, i never found them
please suggest

regards
irfan


On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Correct, you need to define the cluster configuration as part of a file. You will find some information on the configuration file as part of the documentation.

http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html

You should also make sure to have installed the prerequisites.

Thanks
Olivier
On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks. sorry for the long break. actually got involved in some other priorities
i downloaded the installer and while installing i got following error

[Inline image 1]

do i need to make any configuration prior to installation ??

regards
irfan


On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Here is the link

http://download.hortonworks.com/products/hdp-windows/

Olivier
On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
i just followed the instructions to setup the pseudo distributed setup first using the url : http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I

i don't think i am running a DN on both machines
please find the attached log

hi olivier

can you please give me download link ?
let me try please

regards
irfan



On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Are you running DN on both the machines? Could you please show me your DN logs?

Also, consider Oliver's suggestion. It's definitely a better option.



Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Irfu,

If you want to quickly get Hadoop running on windows platform. You may want to try our distribution for Windows. You will be able to find the msi on our website.

Regards
Olivier
On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
ok. i think i need to change the plan over here
let me create two environments. 1: totally windows 2: totally Unix

because, on windows , anyway i have to try and see how hadoop works
on UNIX, it is already known that ,  it is working fine.

so, on windows , here is the setup:

namenode : windows 2012 R2
datanode : windows 2012 R2

now, the exact problem is :
1: datanode is not getting started
2: replication : if i put any file/folder on any datanode , it should get replicated to all another available datanodes

regards








On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>> wrote:
Seriously?? You are planning to develop something using Hadoop on windows. Not a good idea. Anyways, could you please show me your log files? I also need some additional info :
-The exact problem which you are facing right now
-Your cluster summary(no. of nodes etc)
-Your latest configuration files
-Your /etc/hosts file

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>> wrote:
ok. thanks
now, i need to start with all windows setup first as our product will be based on windows
so, now, please tell me how to resolve the issue

datanode is not starting . please suggest

regards,
irfan


On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>> wrote:
It is possible. Theoretically Hadoop doesn't stop you from doing that. But it is not a very wise setup.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards
irfan


On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
can i have setup like this :
namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)
and datanodes are the combination of any OS (windows , linux , unix etc )

however, my doubt is,  as the file systems of  both the systems (win and linux ) are different ,  datanodes of these systems can not be part of single cluster . i have to make windows cluster separate and UNIX cluster separate ?

regards


On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com>> wrote:
I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as Cygwin PIDs so that may be causing the discrepancy. I don't know how well Hadoop works in Cygwin as I have never tried it. Work is in progress for native Windows support however there are no official releases with Windows support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.


On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks
here is what i did .
i stopped all the namenodes and datanodes using ./stop-dfs.sh command
then deleted all pid files for namenodes and datanodes

started dfs again with command : "./start-dfs.sh"

when i ran the "Jps" command . it shows

Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
4536 Jps
2076 NameNode

however, when i open the pid file for the namenode, it shows the pid as 4560; on the contrary, it should show 2076

please suggest

regards


On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>> wrote:
Most likely there is a stale pid file. Something like \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting the datanode.
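For illustration, the cleanup Arpit describes looks roughly like this; the pid-file directory and file names vary by install, so a temp dir and a hypothetical file name stand in for the real ones here:

```shell
# Demo of clearing a stale pid file. A temp dir stands in for \tmp,
# and the pid file name is hypothetical.
dir=$(mktemp -d)
touch "$dir/hadoop-Administrator-datanode.pid"

# Remove any stale datanode pid files; the datanode can then be restarted.
rm -f "$dir"/hadoop-*-datanode.pid

if [ ! -e "$dir/hadoop-Administrator-datanode.pid" ]; then
  echo "stale pid file removed"
fi
```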

I haven't read the entire thread so you may have looked at this already.

-Arpit


On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>> wrote:
datanode is trying to connect to namenode continuously but fails

when i try to run "jps" command it says :
$ ./jps.exe
4584 NameNode
4016 Jps

and when i ran the "./start-dfs.sh" then it says :

$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792. Stop it first.

both these logs are contradictory
please find the attached logs

should i attach the conf files as well ?

regards


On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Your DN is still not running. Showing me the logs would be helpful.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>> wrote:
i followed the url and did the steps mentioned in it. i have deployed on the windows platform

Now, i am able to browse url : http://localhost:50070 (name node )
however, not able to browse url : http://localhost:50030

please refer below

[Inline image 1]

i have modified all the config files as mentioned and formatted the hdfs file system as well
please suggest

regards


On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. i followed this url : http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
let me follow the url which you gave for pseudo distributed setup and then will switch to distributed mode

regards
irfan


On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>> wrote:
You are welcome. Which link have you followed for the configuration? Your core-site.xml is empty. Remove the property fs.default.name from hdfs-site.xml and add it to core-site.xml. Remove mapred.job.tracker as well; it is required in mapred-site.xml.

I would suggest you to do a pseudo distributed setup first in order to get yourself familiar with the process and then proceed to the distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
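Spelled out, the split described above looks like this for Hadoop 1.x; the host and port values are placeholders for a pseudo-distributed setup:

```xml
<!-- core-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```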

HTH

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks tariq for response.
as discussed last time, i have sent you all the config files in my setup .
can you please go through that ?

please let me know

regards
irfan



On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>> wrote:
I'm sorry for being unresponsive. Was out of touch for sometime because of ramzan and eid. Resuming work today.

What's the current status?

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>> wrote:
First of all read the concepts ..I hope you will like it..

https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf

On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards
irfan


On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>> wrote:
hey Tariq,
i am still stuck ..
can you please suggest

regards
irfan


On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards


On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
attachment got quarantined
resending in txt format. please rename it to conf.rar

regards


On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.

if i run the jps command on namenode :

Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3164 NameNode
1892 Jps

same command on datanode :

Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3848 Jps

jps does not list any process for datanode. however, on web browser i can see one live data node
please find the attached conf rar file of namenode

regards


On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>> wrote:
OK. we'll start fresh. Could you plz show me your latest config files?

BTW, are your daemons running fine?Use JPS to verify that.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>> wrote:
i have created these dir "wksp_data" and "wksp_name" on both datanode and namenode
made the respective changes in "hdfs-site.xml" file
formatted the namenode
started the dfs

but still, not able to browse the file system through web browser
please refer below

anything still missing ?
please suggest

[Inline image 1]

On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>> wrote:
these dir needs to be created on all datanodes and namenodes ?
further,  hdfs-site.xml needs to be updated on both datanodes and namenodes for these new dir?

regards


On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Create 2 directories manually corresponding to the values of dfs.name.dir and dfs.data.dir properties and change the permissions of these directories to 755. When you start pushing data into your HDFS, data will start going inside the directory specified by dfs.data.dir and the associated metadata will go inside dfs.name.dir. Remember, you store data in HDFS, but it eventually gets stored in your local/native FS. But you cannot see this data directly on your local/native FS.
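As a concrete sketch of those two steps, with a temp dir standing in for the real locations (the actual paths are whatever dfs.name.dir and dfs.data.dir point to in hdfs-site.xml):

```shell
# Create the name/data directories and set 755 permissions, as described
# above. $base is a stand-in for the real parent directory.
base=$(mktemp -d)
mkdir -p "$base/dfs/name" "$base/dfs/data"
chmod 755 "$base/dfs/name" "$base/dfs/data"

ls -ld "$base/dfs/name" | cut -c1-10   # prints drwxr-xr-x
```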

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
however, i need this to be working on windows environment as project requirement.
i will add/work on Linux later

so, now , at this stage , c:\\wksp is the HDFS file system OR do i need to create it from command line ?

please suggest

regards,


On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Hello Irfan,

Sorry for being unresponsive. Got stuck with some imp work.

HDFS webUI doesn't provide us the ability to create a file or directory. You can browse HDFS, view files, download files etc. But operations like create, move, copy etc. are not supported.

These values look fine to me.

One suggestion though. Try getting a Linux machine(if possible). Or at least use a VM. I personally feel that using Hadoop on windows is always messy.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
when i browse the file system , i am getting following :
i haven't seen any make directory option there

i need to create it from command line ?
further, in the hdfs-site.xml file , i have given following entries. are they correct ?

<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp</value>
  </property>
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp</value>
  </property>

please suggest


[Inline image 1]

On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>> wrote:
You are wrong at this:

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.

Because you wrote both the paths as local paths, and you do not need to copy hadoop into hdfs... Hadoop is already working..

Just check out in browser by after starting ur single node cluster :

localhost:50070

then go for browse the filesystem link in it..

If there is no directory then make directory there.
That is your hdfs directory.
Then copy any text file there (no need to copy hadoop there), because you are going to do processing on the data in that text file. That's what hadoop is used for; first you need to make that clear in your mind, and then you will be able to do it... otherwise it is not possible..

Try this:

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
$ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file /hdfs/directory/path




On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. yes , i am newbie.
however, i need windows setup.

let me surely refer the doc and link which u sent but i need this to be working ...
can you please help

regards






--
MANISH DUNANI
-THANX
+91 9426881954, +91 8460656443
manishd207@gmail.com














--
Regards
Manish Dunani
Contact No : +91 9408329137
skype id : manish.dunani













--
Olivier Renault
Solution Engineer - Big Data - Hortonworks, Inc.
+44 7500 933 036
orenault@hortonworks.com
www.hortonworks.com

RE: about replication

Posted by "Ravi Mummulla (BIG DATA)" <ra...@microsoft.com>.
Here's your issue (from the logs you attached earlier):

CAQuietExec:  Checking JAVA_HOME is set correctly...
CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.

It seems that you installed Java prerequisite in the default path, which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3 does not like spaces in paths, do you need to reinstall Java under c:\java\ or something similar (in a path with no spaces).

From: Irfan Sayed [mailto:irfu.sayed@gmail.com]
Sent: Thursday, September 5, 2013 8:42 PM
To: user@hadoop.apache.org
Subject: Re: about replication

please find the attached.
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log" as it is not generated

regards
irfan





On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>> wrote:
Could you share the log files ( c:\hdp.log, c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as well as your clusterproperties.txt ?

Thanks,
Olivier

On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. i followed the user manual for deployment and installed all pre-requisites
i modified the command and still the issue persist. please suggest

please refer below


[Inline image 1]

regards
irfan


On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>> wrote:

The command to install it is msiexec /i msifile /...

You will find the correct syntax as part of doc.

Happy reading
Olivier
On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
i referred the logs and manuals. i modified the clusterproperties file and then double click on the msi file
however, it still failed.
further i started the installation on command line by giving HDP_LAYOUT=clusterproperties file path,
installation went ahead and it failed for .NET framework 4.0 and VC++ redistributable package dependency

i installed both and started again the installation.
failed again with following error
[Inline image 1]

when i search for the logs mentioned in the error , i never found that
please suggest

regards
irfan


On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Correct, you need to define the cluster configuration as part of a file. You will find some information on the configuration file as part of the documentation.

http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html

You should make sure to have also installed the pre requisite.

Thanks
Olivier
On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks. sorry for the long break. actually got involved in some other priorities
i downloaded the installer and while installing i got following error

[Inline image 1]

do i need to make any configuration prior to installation ??

regards
irfan


On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Here is the link

http://download.hortonworks.com/products/hdp-windows/

Olivier
On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
i just followed the instructions to setup the pseudo distributed setup first using the url : http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I

i don't think so i am running DN on both machine
please find the attached log

hi olivier

can you please give me download link ?
let me try please

regards
irfan



On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Are you running DN on both the machines? Could you please show me your DN logs?

Also, consider Oliver's suggestion. It's definitely a better option.



Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Irfu,

If you want to quickly get Hadoop running on windows platform. You may want to try our distribution for Windows. You will be able to find the msi on our website.

Regards
Olivier
On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
ok. i think i need to change the plan over here
let me create two environments. 1: totally windows 2: totally Unix

because, on windows , anyway i have to try and see how hadoop works
on UNIX, it is already known that ,  it is working fine.

so, on windows , here is the setup:

namenode : windows 2012 R2
datanode : windows 2012 R2

now, the exact problem is :
1: datanode is not getting started
2: replication : if i put any file/folder on any datanode , it should get replicated to all another available datanodes

regards








On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>> wrote:
Seriously??You are planning to develop something using Hadoop on windows. Not a good idea. Anyways, cold you plz show me your log files?I also need some additional info :
-The exact problem which you are facing right now
-Your cluster summary(no. of nodes etc)
-Your latest configuration files
-Your /etc.hosts file

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>> wrote:
ok. thanks
now, i need to start with all windows setup first as our product will be based on windows
so, now, please tell me how to resolve the issue

datanode is not starting . please suggest

regards,
irfan


On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>> wrote:
It is possible. Theoretically Hadoop doesn't stop you from doing that. But it is not a very wise setup.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards
irfan


On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
can i have setup like this :
namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)
and datanodes are the combination of any OS (windows , linux , unix etc )

however, my doubt is,  as the file systems of  both the systems (win and linux ) are different ,  datanodes of these systems can not be part of single cluster . i have to make windows cluster separate and UNIX cluster separate ?

regards


On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com>> wrote:
I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as Cygwin PIDs so that may be causing the discrepancy. I don't know how well Hadoop works in Cygwin as I have never tried it. Work is in progress for native Windows support however there are no official releases with Windows support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.


On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks
here is what i did .
i stopped all the namenodes and datanodes using ./stop-dfs.sh command
then deleted all pid files for namenodes and datanodes

started dfs again with command : "./start-dfs.sh"

when i ran the "Jps" command . it shows

Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
4536 Jps
2076 NameNode

however, when i open the pid file for namenode then it is not showing pid as : 4560. on the contrary, it shud show : 2076

please suggest

regards


On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>> wrote:
Most likely there is a stale pid file. Something like \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting the datanode.

I haven't read the entire thread so you may have looked at this already.

-Arpit


On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>> wrote:
datanode is trying to connect to namenode continuously but fails

when i try to run "jps" command it says :
$ ./jps.exe
4584 NameNode
4016 Jps

and when i ran the "./start-dfs.sh" then it says :

$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792. Stop it first.

both these logs are contradictory
please find the attached logs

should i attach the conf files as well ?

regards


On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Your DN is still not running. Showing me the logs would be helpful.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>> wrote:
i followed the url and did the steps mention in that. i have deployed on the windows platform

Now, i am able to browse url : http://localhost:50070 (name node )
however, not able to browse url : http://localhost:50030

please refer below

[Inline image 1]

i have modified all the config files as mentioned and formatted the hdfs file system as well
please suggest

regards


On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. i followed this url : http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
let me follow the url which you gave for pseudo distributed setup and then will switch to distributed mode

regards
irfan


On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>> wrote:
You are welcome. Which link have you followed for the configuration? Your core-site.xml is empty. Remove the property fs.default.name from hdfs-site.xml and add it to core-site.xml. Remove mapred.job.tracker from hdfs-site.xml as well; it belongs in mapred-site.xml.
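The split Tariq describes would look roughly like this; the hdfs://localhost:9000 and localhost:9001 values are the usual single-node defaults, shown only as an illustration:

```xml
<!-- core-site.xml: the default filesystem belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: the jobtracker address belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```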

I would suggest you to do a pseudo distributed setup first in order to get yourself familiar with the process and then proceed to the distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.

HTH

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks tariq for response.
as discussed last time, i have sent you all the config files in my setup .
can you please go through that ?

please let me know

regards
irfan



On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>> wrote:
I'm sorry for being unresponsive. Was out of touch for sometime because of ramzan and eid. Resuming work today.

What's the current status?

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>> wrote:
First of all read the concepts ..I hope you will like it..

https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf

On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards
irfan


On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>> wrote:
hey Tariq,
i am still stuck ..
can you please suggest

regards
irfan


On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards


On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
attachment got quarantined
resending in txt format. please rename it to conf.rar

regards


On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.

if i run the jps command on namenode :

Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3164 NameNode
1892 Jps

same command on datanode :

Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3848 Jps

jps does not list any process for datanode. however, on web browser i can see one live data node
please find the attached conf rar file of namenode

regards


On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>> wrote:
OK. we'll start fresh. Could you plz show me your latest config files?

BTW, are your daemons running fine? Use jps to verify that.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>> wrote:
i have created these dir "wksp_data" and "wksp_name" on both datanode and namenode
made the respective changes in "hdfs-site.xml" file
formatted the namenode
started the dfs

but still, not able to browse the file system through web browser
please refer below

anything still missing ?
please suggest

[Inline image 1]

On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>> wrote:
these dir needs to be created on all datanodes and namenodes ?
further,  hdfs-site.xml needs to be updated on both datanodes and namenodes for these new dir?

regards


On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Create 2 directories manually corresponding to the values of dfs.name.dir and dfs.data.dir properties and change the permissions of these directories to 755. When you start pushing data into your HDFS, data will start going inside the directory specified by dfs.data.dir and the associated metadata will go inside dfs.name.dir. Remember, you store data in HDFS, but it eventually gets stored in your local/native FS. But you cannot see this data directly on your local/native FS.
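Tariq's steps can be sketched as follows; the /tmp paths are illustrative (on the thread's Windows/Cygwin setup they were wksp_name / wksp_data under c:\):

```shell
# Create the directories behind dfs.name.dir and dfs.data.dir
# and give them 755 permissions, as suggested above.
NAME_DIR="/tmp/wksp_name"
DATA_DIR="/tmp/wksp_data"

mkdir -p "$NAME_DIR" "$DATA_DIR"   # -p: no error if they already exist
chmod 755 "$NAME_DIR" "$DATA_DIR"  # owner rwx, group/other r-x

ls -ld "$NAME_DIR" "$DATA_DIR"     # verify ownership and permissions
```

Note that name and data directories should be two separate paths; pointing dfs.name.dir and dfs.data.dir at the same directory (as in the c:\\wksp config earlier in the thread) is asking for trouble.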

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
however, i need this to be working on windows environment as project requirement.
i will add/work on Linux later

so, now , at this stage , c:\\wksp is the HDFS file system OR do i need to create it from command line ?

please suggest

regards,


On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Hello Irfan,

Sorry for being unresponsive. Got stuck with some imp work.

HDFS webUI doesn't provide us the ability to create a file or directory. You can browse HDFS, view files, download files etc. But operations like create, move, copy etc. are not supported.

These values look fine to me.

One suggestion though. Try getting a Linux machine(if possible). Or at least use a VM. I personally feel that using Hadoop on windows is always messy.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
when i browse the file system , i am getting following :
i haven't seen any make directory option there

i need to create it from command line ?
further, in the hdfs-site.xml file , i have given following entries. are they correct ?

<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp</value>
  </property>
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp</value>
  </property>

please suggest


[Inline image 1]

On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>> wrote:
You are wrong at this:

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.

Because you wrote both paths as local paths. Also, you need not copy hadoop itself into hdfs; hadoop is already working.

Just check in the browser after starting your single node cluster:

localhost:50070

then follow the "Browse the filesystem" link in it.

If there is no directory there, make a directory.
That is your hdfs directory.
Then copy any text file there (no need to copy hadoop there), because you are going to do processing on the data in that text file. That is what hadoop is used for; first you need to make that clear in your mind, then you will be able to do it.

Try this:

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
$ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/your/file /hdfs/directory/path




On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. yes , i am newbie.
however, i need windows setup.

let me surely refer the doc and link which u sent but i need this to be working ...
can you please help

regards






--
MANISH DUNANI
-THANX
+91 9426881954, +91 8460656443
manishd207@gmail.com














--
Regards
Manish Dunani
Contact No : +91 9408329137
skype id : manish.dunani









CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to which it is addressed and may contain information that is confidential, privileged and exempt from disclosure under applicable law. If the reader of this message is not the intended recipient, you are hereby notified that any printing, copying, dissemination, distribution, disclosure or forwarding of this communication is strictly prohibited. If you have received this communication in error, please contact the sender immediately and delete it from your system. Thank You.





















--
Olivier Renault
Solution Engineer - Big Data - Hortonworks, Inc.
+44 7500 933 036
orenault@hortonworks.com<ma...@hortonworks.com>
www.hortonworks.com<http://www.hortonworks.com/>


RE: about replication

Posted by "Ravi Mummulla (BIG DATA)" <ra...@microsoft.com>.
Here's your issue (from the logs you attached earlier):

CAQuietExec:  Checking JAVA_HOME is set correctly...
CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.

It seems that you installed the Java prerequisite in the default path, which is %PROGRAMFILES% (expands to C:\Program Files in your case). HDP 1.3 does not like spaces in paths, so you need to reinstall Java under c:\java\ or something similar (a path with no spaces).
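The failure "Files\Java\jdk1.6.0_31 was unexpected at this time" is the classic symptom of an unquoted space in JAVA_HOME. A simple pre-install sanity check can be sketched like this (the JAVA_HOME value is an illustrative bad one, matching the log above):

```shell
# Sketch: flag a JAVA_HOME that contains spaces before running the installer.
JAVA_HOME='C:\Program Files\Java\jdk1.6.0_31'   # illustrative bad value

case "$JAVA_HOME" in
  *" "*) RESULT="JAVA_HOME contains spaces: reinstall Java under a space-free path such as C:\\java" ;;
  *)     RESULT="JAVA_HOME path is space-free" ;;
esac
echo "$RESULT"
```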

From: Irfan Sayed [mailto:irfu.sayed@gmail.com]
Sent: Thursday, September 5, 2013 8:42 PM
To: user@hadoop.apache.org
Subject: Re: about replication

please find the attached.
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log" as it is not generated

regards
irfan





On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>> wrote:
Could you share the log files ( c:\hdp.log, c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as well as your clusterproperties.txt ?

Thanks,
Olivier

On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. i followed the user manual for deployment and installed all pre-requisites
i modified the command and still the issue persists. please suggest

please refer below


[Inline image 1]

regards
irfan


On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>> wrote:

The command to install it is msiexec /i msifile /...

You will find the correct syntax as part of doc.

Happy reading
Olivier
On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
i referred the logs and manuals. i modified the clusterproperties file and then double click on the msi file
however, it still failed.
further, i started the installation on the command line by giving HDP_LAYOUT=clusterproperties file path;
installation went ahead but it failed on the .NET framework 4.0 and VC++ redistributable package dependencies

i installed both and started again the installation.
failed again with following error
[Inline image 1]

when i search for the logs mentioned in the error , i never found that
please suggest

regards
irfan


On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Correct, you need to define the cluster configuration as part of a file. You will find some information on the configuration file as part of the documentation.

http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html

You should make sure to have also installed the pre requisite.
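As a rough sketch only (the exact key names must be taken from the documentation linked above, not from this example; the hostnames reuse the DFS-DC / DFS-1 machines from this thread), a clusterproperties.txt looks something like:

```text
#Log and data directories
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data
#Hosts
NAMENODE_HOST=DFS-DC
SECONDARY_NAMENODE_HOST=DFS-DC
JOBTRACKER_HOST=DFS-DC
SLAVE_HOSTS=DFS-1
```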

Thanks
Olivier
On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks. sorry for the long break. actually got involved in some other priorities
i downloaded the installer and while installing i got following error

[Inline image 1]

do i need to make any configuration prior to installation ??

regards
irfan


On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Here is the link

http://download.hortonworks.com/products/hdp-windows/

Olivier
On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
i just followed the instructions to setup the pseudo distributed setup first using the url : http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I

i don't think i am running DN on both machines
please find the attached log

hi olivier

can you please give me download link ?
let me try please

regards
irfan



On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Are you running DN on both the machines? Could you please show me your DN logs?

Also, consider Oliver's suggestion. It's definitely a better option.



Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Irfu,

If you want to quickly get Hadoop running on the windows platform, you may want to try our distribution for Windows. You will be able to find the msi on our website.

Regards
Olivier
On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
ok. i think i need to change the plan over here
let me create two environments. 1: totally windows 2: totally Unix

because, on windows , anyway i have to try and see how hadoop works
on UNIX, it is already known that ,  it is working fine.

so, on windows , here is the setup:

namenode : windows 2012 R2
datanode : windows 2012 R2

now, the exact problem is :
1: datanode is not getting started
2: replication : if i put any file/folder on any datanode , it should get replicated to all another available datanodes

regards








On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>> wrote:
Seriously?? You are planning to develop something using Hadoop on windows? Not a good idea. Anyways, could you please show me your log files? I also need some additional info:
-The exact problem which you are facing right now
-Your cluster summary(no. of nodes etc)
-Your latest configuration files
-Your /etc/hosts file

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>> wrote:
ok. thanks
now, i need to start with all windows setup first as our product will be based on windows
so, now, please tell me how to resolve the issue

datanode is not starting . please suggest

regards,
irfan


On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>> wrote:
It is possible. Theoretically Hadoop doesn't stop you from doing that. But it is not a very wise setup.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards
irfan


On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
can i have setup like this :
namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)
and datanodes are the combination of any OS (windows , linux , unix etc )

however, my doubt is,  as the file systems of  both the systems (win and linux ) are different ,  datanodes of these systems can not be part of single cluster . i have to make windows cluster separate and UNIX cluster separate ?

regards


On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com>> wrote:
I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as Cygwin PIDs so that may be causing the discrepancy. I don't know how well Hadoop works in Cygwin as I have never tried it. Work is in progress for native Windows support however there are no official releases with Windows support yet. It may be easier to get familiar with a release<https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.


On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks
here is what i did .
i stopped all the namenodes and datanodes using ./stop-dfs.sh command
then deleted all pid files for namenodes and datanodes

started dfs again with command : "./start-dfs.sh"

when i ran the "Jps" command . it shows

Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
4536 Jps
2076 NameNode

however, when i open the pid file for namenode then it is not showing pid as : 4560. on the contrary, it shud show : 2076

please suggest

regards


On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>> wrote:
Most likely there is a stale pid file. Something like \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting the datanode.

I haven't read the entire thread so you may have looked at this already.

-Arpit


On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>> wrote:
datanode is trying to connect to namenode continuously but fails

when i try to run "jps" command it says :
$ ./jps.exe
4584 NameNode
4016 Jps

and when i ran the "./start-dfs.sh" then it says :

$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792. Stop it first.

both these logs are contradictory
please find the attached logs

should i attach the conf files as well ?

regards


On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Your DN is still not running. Showing me the logs would be helpful.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>> wrote:
i followed the url and did the steps mention in that. i have deployed on the windows platform

Now, i am able to browse url : http://localhost:50070 (name node )
however, not able to browse url : http://localhost:50030

please refer below

[Inline image 1]

i have modified all the config files as mentioned and formatted the hdfs file system as well
please suggest

regards


On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. i followed this url : http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
let me follow the url which you gave for pseudo distributed setup and then will switch to distributed mode

regards
irfan


On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>> wrote:
You are welcome. Which link have you followed for the configuration?Your core-site.xml is empty. Remove the property fs.default.name<http://fs.default.name> from hdfs-site.xml and add it to core-site.xml. Remove mapred.job.tracker as well. It is required in mapred-site.xml.

I would suggest you to do a pseudo distributed setup first in order to get yourself familiar with the process and then proceed to the distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.

HTH

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks tariq for response.
as discussed last time, i have sent you all the config files in my setup .
can you please go through that ?

please let me know

regards
irfan



On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>> wrote:
I'm sorry for being unresponsive. Was out of touch for sometime because of ramzan and eid. Resuming work today.

What's the current status?

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>> wrote:
First of all read the concepts ..I hope you will like it..

https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf

On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards
irfan


On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>> wrote:
hey Tariq,
i am still stuck ..
can you please suggest

regards
irfan


On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards


On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
attachment got quarantined
resending in txt format. please rename it to conf.rar

regards


On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.

if i run the jps command on namenode :

Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3164 NameNode
1892 Jps

same command on datanode :

Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3848 Jps

jps does not list any process for datanode. however, on web browser i can see one live data node
please find the attached conf rar file of namenode

regards


On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>> wrote:
OK. we'll start fresh. Could you plz show me your latest config files?

BTW, are your daemons running fine?Use JPS to verify that.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>> wrote:
i have created these dir "wksp_data" and "wksp_name" on both datanode and namenode
made the respective changes in "hdfs-site.xml" file
formatted the namenode
started the dfs

but still, not able to browse the file system through web browser
please refer below

anything still missing ?
please suggest

[Inline image 1]

On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>> wrote:
these dir needs to be created on all datanodes and namenodes ?
further,  hdfs-site.xml needs to be updated on both datanodes and namenodes for these new dir?

regards


On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Create 2 directories manually corresponding to the values of dfs.name.dir and dfs.data.dir properties and change the permissions of these directories to 755. When you start pushing data into your HDFS, data will start going inside the directory specified by dfs.data.dir and the associated metadata will go inside dfs.name.dir. Remember, you store data in HDFS, but it eventually gets stored in your local/native FS. But you cannot see this data directly on your local/native FS.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
however, i need this to be working on windows environment as project requirement.
i will add/work on Linux later

so, now , at this stage , c:\\wksp is the HDFS file system OR do i need to create it from command line ?

please suggest

regards,


On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Hello Irfan,

Sorry for being unresponsive. Got stuck with some imp work.

HDFS webUI doesn't provide us the ability to create file or directory. You can browse HDFS, view files, download files etc. But operation like create, move, copy etc are not supported.

These values look fine to me.

One suggestion though. Try getting a Linux machine(if possible). Or at least use a VM. I personally feel that using Hadoop on windows is always messy.

Warm Regards,
Tariq
cloudfront.blogspot.com<http://cloudfront.blogspot.com>

On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
when i browse the file system , i am getting following :
i haven't seen any make directory option there

i need to create it from command line ?
further, in the hdfs-site.xml file , i have given following entries. are they correct ?

<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp</value>
  </property>
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp</value>
  </property>

please suggest


[Inline image 1]

On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>> wrote:
You are wrong at this:

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.

Because,You had wrote both the paths local and You need not to copy hadoop into hdfs...Hadoop is already working..

Just check out in browser by after starting ur single node cluster :

localhost:50070

then go for browse the filesystem link in it..

If there is no directory then make directory there.
That is your hdfs directory.
Then copy any text file there(no need to copy hadoop there).beacause u are going to do processing on that data in text file.That's why hadoop is used for ,first u need to make it clear in ur mind.Then and then u will do it...otherwise not possible..

Try this:

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
$ .bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file /hdfs/directory/path




On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. yes , i am newbie.
however, i need windows setup.

let me surely refer the doc and link which u sent but i need this to be working ...
can you please help

regards






--
MANISH DUNANI
-THANX
+91 9426881954<tel:%2B91%209426881954>,+91 8460656443<tel:%2B91%208460656443>
manishd207@gmail.com<ma...@gmail.com>














--
Regards
Manish Dunani
Contact No : +91 9408329137<tel:%2B91%209408329137>
skype id : manish.dunani









CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to which it is addressed and may contain information that is confidential, privileged and exempt from disclosure under applicable law. If the reader of this message is not the intended recipient, you are hereby notified that any printing, copying, dissemination, distribution, disclosure or forwarding of this communication is strictly prohibited. If you have received this communication in error, please contact the sender immediately and delete it from your system. Thank You.



CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to which it is addressed and may contain information that is confidential, privileged and exempt from disclosure under applicable law. If the reader of this message is not the intended recipient, you are hereby notified that any printing, copying, dissemination, distribution, disclosure or forwarding of this communication is strictly prohibited. If you have received this communication in error, please contact the sender immediately and delete it from your system. Thank You.








--
Olivier Renault
Solution Engineer - Big Data - Hortonworks, Inc.
+44 7500 933 036
orenault@hortonworks.com
www.hortonworks.com


RE: about replication

Posted by "Ravi Mummulla (BIG DATA)" <ra...@microsoft.com>.
Here's your issue (from the logs you attached earlier):

CAQuietExec:  Checking JAVA_HOME is set correctly...
CAQuietExec:  Files\Java\jdk1.6.0_31 was unexpected at this time.

It seems that you installed the Java prerequisite in the default path, %PROGRAMFILES% (which expands to C:\Program Files in your case). HDP 1.3 does not handle spaces in paths, so you need to reinstall Java under c:\java\ or something similar (a path with no spaces).
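For the record, the equivalent fix from an elevated command prompt might look like this; the JDK version and target path are assumptions for illustration:

```shell
:: Windows cmd sketch: after reinstalling the JDK to a space-free path,
:: point the machine-wide JAVA_HOME at it before re-running the installer.
setx JAVA_HOME "C:\java\jdk1.6.0_31" /M
```

Re-open the command prompt afterwards so the new value is picked up.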

From: Irfan Sayed [mailto:irfu.sayed@gmail.com]
Sent: Thursday, September 5, 2013 8:42 PM
To: user@hadoop.apache.org
Subject: Re: about replication

please find the attached.
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log" as it is not generated

regards
irfan





On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>> wrote:
Could you share the log files ( c:\hdp.log, c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as well as your clusterproperties.txt ?

Thanks,
Olivier

On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. i followed the user manual for deployment and installed all pre-requisites
i modified the command and still the issue persist. please suggest

please refer below


[Inline image 1]

regards
irfan


On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>> wrote:

The command to install it is msiexec /i msifile /...

You will find the correct syntax as part of doc.
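As a sketch only (the property names below should be checked against the HDP 1.3 for Windows install guide before use), the unattended command-line form looks roughly like:

```shell
:: Windows cmd sketch of an unattended HDP install; paths are illustrative.
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp\hdp.log" ^
  HDP_LAYOUT="C:\hdp\clusterproperties.txt" HDP_DIR="C:\hdp" DESTROY_DATA="no"
```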

Happy reading
Olivier
On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
i referred the logs and manuals. i modified the clusterproperties file and then double click on the msi file
however, it still failed.
further i started the installation on command line by giving HDP_LAYOUT=clusterproperties file path,
installation went ahead and it failed for .NET framework 4.0 and VC++ redistributable package dependency

i installed both and started again the installation.
failed again with following error
[Inline image 1]

when i search for the logs mentioned in the error , i never found that
please suggest

regards
irfan


On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Correct, you need to define the cluster configuration as part of a file. You will find some information on the configuration file as part of the documentation.

http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html

You should make sure to have also installed the pre requisite.
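For orientation, a minimal clusterproperties.txt might look like the fragment below; the hostnames and paths are assumptions, and the linked documentation has the authoritative key list:

```shell
# Illustrative clusterproperties.txt fragment (single-master layout)
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data
NAMENODE_HOST=DFS-DC
SECONDARY_NAMENODE_HOST=DFS-DC
JOBTRACKER_HOST=DFS-DC
SLAVE_HOSTS=DFS-1
```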

Thanks
Olivier
On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks. sorry for the long break. actually got involved in some other priorities
i downloaded the installer and while installing i got following error

[Inline image 1]

do i need to make any configuration prior to installation ??

regards
irfan


On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Here is the link

http://download.hortonworks.com/products/hdp-windows/

Olivier
On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
i just followed the instructions to setup the pseudo distributed setup first using the url : http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I

i don't think so i am running DN on both machine
please find the attached log

hi olivier

can you please give me download link ?
let me try please

regards
irfan



On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Are you running DN on both the machines? Could you please show me your DN logs?

Also, consider Olivier's suggestion. It's definitely a better option.



Warm Regards,
Tariq
cloudfront.blogspot.com

On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <or...@hortonworks.com>> wrote:

Irfu,

If you want to quickly get Hadoop running on windows platform. You may want to try our distribution for Windows. You will be able to find the msi on our website.

Regards
Olivier
On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com>> wrote:
thanks.
ok. i think i need to change the plan over here
let me create two environments. 1: totally windows 2: totally Unix

because, on windows , anyway i have to try and see how hadoop works
on UNIX, it is already known that ,  it is working fine.

so, on windows , here is the setup:

namenode : windows 2012 R2
datanode : windows 2012 R2

now, the exact problem is :
1: datanode is not getting started
2: replication : if i put any file/folder on any datanode , it should get replicated to all another available datanodes

regards








On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>> wrote:
Seriously?? You are planning to develop something using Hadoop on windows? Not a good idea. Anyways, could you please show me your log files? I also need some additional info :
-The exact problem which you are facing right now
-Your cluster summary (no. of nodes etc.)
-Your latest configuration files
-Your /etc/hosts file

Warm Regards,
Tariq
cloudfront.blogspot.com

On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>> wrote:
ok. thanks
now, i need to start with all windows setup first as our product will be based on windows
so, now, please tell me how to resolve the issue

datanode is not starting . please suggest

regards,
irfan


On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <do...@gmail.com>> wrote:
It is possible. Theoretically Hadoop doesn't stop you from doing that. But it is not a very wise setup.

Warm Regards,
Tariq
cloudfront.blogspot.com

On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards
irfan


On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
can i have setup like this :
namenode will be on linux (flavour may be RHEL, CentOS, UBuntu etc)
and datanodes are the combination of any OS (windows , linux , unix etc )

however, my doubt is: as the file systems of both systems (win and linux) are different, can datanodes of these systems not be part of a single cluster? do i have to make a separate windows cluster and a separate UNIX cluster?

regards


On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <aa...@hortonworks.com>> wrote:
I just noticed you are on Cygwin. IIRC Windows PIDs are not the same as Cygwin PIDs, so that may be causing the discrepancy. I don't know how well Hadoop works in Cygwin as I have never tried it. Work is in progress for native Windows support, however there are no official releases with Windows support yet. It may be easier to get familiar with a release (https://www.apache.org/dyn/closer.cgi/hadoop/common/) on Linux if you are new to it.


On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks
here is what i did .
i stopped all the namenodes and datanodes using ./stop-dfs.sh command
then deleted all pid files for namenodes and datanodes

started dfs again with command : "./start-dfs.sh"

when i ran the "Jps" command . it shows

Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
4536 Jps
2076 NameNode

however, when i open the pid file for namenode then it is not showing pid as : 4560. on the contrary, it shud show : 2076

please suggest

regards


On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <aa...@hortonworks.com>> wrote:
Most likely there is a stale pid file. Something like \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting the datanode.
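The mechanics of that cleanup can be sketched portably like this; `$pid_dir` stands in for the real pid directory (e.g. /tmp under Cygwin), and the file name is illustrative:

```shell
# Simulate and clear a stale datanode pid file.
pid_dir=$(mktemp -d)                               # stand-in for /tmp
touch "$pid_dir/hadoop-Administrator-datanode.pid" # the stale file
rm -f "$pid_dir"/hadoop-*-datanode.pid             # remove it by glob
ls -A "$pid_dir"                                   # directory is empty again
```

After the pid file is gone, restarting the datanode should write a fresh one.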

I haven't read the entire thread so you may have looked at this already.

-Arpit


On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <ir...@gmail.com>> wrote:
datanode is trying to connect to namenode continuously but fails

when i try to run "jps" command it says :
$ ./jps.exe
4584 NameNode
4016 Jps

and when i ran the "./start-dfs.sh" then it says :

$ ./start-dfs.sh
namenode running as process 3544. Stop it first.
DFS-1: datanode running as process 4076. Stop it first.
localhost: secondarynamenode running as process 4792. Stop it first.

both these logs are contradictory
please find the attached logs

should i attach the conf files as well ?

regards


On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Your DN is still not running. Showing me the logs would be helpful.

Warm Regards,
Tariq
cloudfront.blogspot.com

On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <ir...@gmail.com>> wrote:
i followed the url and did the steps mention in that. i have deployed on the windows platform

Now, i am able to browse url : http://localhost:50070 (name node )
however, not able to browse url : http://localhost:50030

please refer below

[Inline image 1]

i have modified all the config files as mentioned and formatted the hdfs file system as well
please suggest

regards


On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. i followed this url : http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
let me follow the url which you gave for pseudo distributed setup and then will switch to distributed mode

regards
irfan


On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <do...@gmail.com>> wrote:
You are welcome. Which link have you followed for the configuration? Your core-site.xml is empty. Remove the property fs.default.name from hdfs-site.xml and add it to core-site.xml. Remove mapred.job.tracker as well; it belongs in mapred-site.xml.
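For reference, the relevant fragments for a pseudo-distributed Hadoop 1.x setup usually end up looking like this (localhost and the standard ports are shown as assumptions):

```xml
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```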

I would suggest you to do a pseudo distributed setup first in order to get yourself familiar with the process and then proceed to the distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.

HTH

Warm Regards,
Tariq
cloudfront.blogspot.com

On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks tariq for response.
as discussed last time, i have sent you all the config files in my setup .
can you please go through that ?

please let me know

regards
irfan



On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <do...@gmail.com>> wrote:
I'm sorry for being unresponsive. Was out of touch for sometime because of ramzan and eid. Resuming work today.

What's the current status?

Warm Regards,
Tariq
cloudfront.blogspot.com

On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <ma...@gmail.com>> wrote:
First of all read the concepts ..I hope you will like it..

https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf

On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards
irfan


On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <ir...@gmail.com>> wrote:
hey Tariq,
i am still stuck ..
can you please suggest

regards
irfan


On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <ir...@gmail.com>> wrote:
please suggest

regards


On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
attachment got quarantined
resending in txt format. please rename it to conf.rar

regards


On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.

if i run the jps command on namenode :

Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3164 NameNode
1892 Jps

same command on datanode :

Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
$ ./jps.exe
3848 Jps

jps does not list any process for datanode. however, on web browser i can see one live data node
please find the attached conf rar file of namenode

regards


On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <do...@gmail.com>> wrote:
OK. we'll start fresh. Could you plz show me your latest config files?

BTW, are your daemons running fine?Use JPS to verify that.

Warm Regards,
Tariq
cloudfront.blogspot.com

On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <ir...@gmail.com>> wrote:
i have created these dir "wksp_data" and "wksp_name" on both datanode and namenode
made the respective changes in "hdfs-site.xml" file
formatted the namenode
started the dfs

but still, not able to browse the file system through web browser
please refer below

anything still missing ?
please suggest

[Inline image 1]

On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <ir...@gmail.com>> wrote:
these dir needs to be created on all datanodes and namenodes ?
further,  hdfs-site.xml needs to be updated on both datanodes and namenodes for these new dir?

regards


On Tue, Aug 6, 2013 at 5:30 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Create 2 directories manually corresponding to the values of dfs.name.dir and dfs.data.dir properties and change the permissions of these directories to 755. When you start pushing data into your HDFS, data will start going inside the directory specified by dfs.data.dir and the associated metadata will go inside dfs.name.dir. Remember, you store data in HDFS, but it eventually gets stored in your local/native FS. But you cannot see this data directly on your local/native FS.
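A portable sketch of those two steps (`$base` stands in for the real location, e.g. c:\hadoop on the Windows nodes; the dfs/name and dfs/data names are illustrative):

```shell
# Create the metadata (dfs.name.dir) and block (dfs.data.dir) directories
# and give them 755 permissions, as described above.
base=$(mktemp -d)                      # stand-in for the real root
mkdir -p "$base/dfs/name" "$base/dfs/data"
chmod 755 "$base/dfs/name" "$base/dfs/data"
stat -c '%a' "$base/dfs/name"          # prints 755
```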

Warm Regards,
Tariq
cloudfront.blogspot.com

On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
however, i need this to be working on windows environment as project requirement.
i will add/work on Linux later

so, now , at this stage , c:\\wksp is the HDFS file system OR do i need to create it from command line ?

please suggest

regards,


On Tue, Aug 6, 2013 at 5:19 PM, Mohammad Tariq <do...@gmail.com>> wrote:
Hello Irfan,

Sorry for being unresponsive. Got stuck with some imp work.

HDFS webUI doesn't provide us the ability to create file or directory. You can browse HDFS, view files, download files etc. But operation like create, move, copy etc are not supported.
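Those operations are done from the command line instead; with Hadoop 1.x the commands would look roughly like this (paths are illustrative):

```shell
./bin/hadoop fs -mkdir /wksp              # create a directory in HDFS
./bin/hadoop fs -put local.txt /wksp/     # copy a local file into it
./bin/hadoop fs -ls /wksp                 # verify
```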

These values look fine to me.

One suggestion though. Try getting a Linux machine(if possible). Or at least use a VM. I personally feel that using Hadoop on windows is always messy.

Warm Regards,
Tariq
cloudfront.blogspot.com

On Tue, Aug 6, 2013 at 5:09 PM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks.
when i browse the file system , i am getting following :
i haven't seen any make directory option there

i need to create it from command line ?
further, in the hdfs-site.xml file , i have given following entries. are they correct ?

<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp</value>
  </property>
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp</value>
  </property>

please suggest


[Inline image 1]

On Tue, Aug 6, 2013 at 12:40 PM, manish dunani <ma...@gmail.com>> wrote:
You are wrong at this:

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
$ ./hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
copyFromLocal: File /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.

Because you wrote both paths as local paths, and you do not need to copy Hadoop itself into HDFS... Hadoop is already working.

Just check out in browser by after starting ur single node cluster :

localhost:50070

then follow the "browse the filesystem" link in it.

If there is no directory there, make one.
That is your hdfs directory.
Then copy any text file there (no need to copy Hadoop there), because that text file holds the data you are going to process. That is what Hadoop is used for; first you need to make that clear in your mind, and then you will be able to do it.

Try this:

Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
$ ./bin/hadoop dfs -copyFromLocal /full/local/path/to/ur/file /hdfs/directory/path




On Tue, Aug 6, 2013 at 11:49 AM, Irfan Sayed <ir...@gmail.com>> wrote:
thanks. yes , i am newbie.
however, i need windows setup.

let me surely refer the doc and link which u sent but i need this to be working ...
can you please help

regards






--
MANISH DUNANI
-THANX
+91 9426881954, +91 8460656443
manishd207@gmail.com














--
Regards
Manish Dunani
Contact No : +91 9408329137
skype id : manish.dunani













--
Olivier Renault
Solution Engineer - Big Data - Hortonworks, Inc.
+44 7500 933 036
orenault@hortonworks.com
www.hortonworks.com


Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please find the attached.
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
as it is not generated

regards
irfan






On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>wrote:

> Could you share the log files ( c:\hdp.log,
> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
> well as your clusterproperties.txt ?
>
> Thanks,
> Olivier
>
>
> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>
>> thanks. i followed the user manual for deployment and installed all
>> pre-requisites
>> i modified the command and still the issue persist. please suggest
>>
>> please refer below
>>
>>
>> [image: Inline image 1]
>>
>> regards
>> irfan
>>
>>
>>
>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <orenault@hortonworks.com
>> > wrote:
>>
>>> The command to install it is msiexec /i msifile /...
>>>
>>> You will find the correct syntax as part of doc.
>>>
>>> Happy reading
>>> Olivier
>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks.
>>>> i referred the logs and manuals. i modified the clusterproperties file
>>>> and then double click on the msi file
>>>> however, it still failed.
>>>> further i started the installation on command line by giving
>>>> HDP_LAYOUT=clusterproperties file path,
>>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>>> redistributable package dependency
>>>>
>>>> i installed both and started again the installation.
>>>> failed again with following error
>>>> [image: Inline image 1]
>>>>
>>>> when i search for the logs mentioned in the error , i never found that
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Correct, you need to define the cluster configuration as part of a
>>>>> file. You will find some information on the configuration file as part of
>>>>> the documentation.
>>>>>
>>>>>
>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>
>>>>> You should make sure to have also installed the pre requisite.
>>>>>
>>>>> Thanks
>>>>> Olivier
>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks. sorry for the long break. actually got involved in some other
>>>>>> priorities
>>>>>> i downloaded the installer and while installing i got following error
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> do i need to make any configuration prior to installation ??
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>>> Here is the link
>>>>>>>
>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>
>>>>>>> Olivier
>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>>> setup first using the url :
>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>
>>>>>>>> i don't think so i am running DN on both machine
>>>>>>>> please find the attached log
>>>>>>>>
>>>>>>>> hi olivier
>>>>>>>>
>>>>>>>> can you please give me download link ?
>>>>>>>> let me try please
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>>>> your DN logs?
>>>>>>>>>
>>>>>>>>> Also, consider Oliver's suggestion. It's definitely a better
>>>>>>>>> option.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>>> Irfu,
>>>>>>>>>>
>>>>>>>>>> If you want to quickly get Hadoop running on windows platform.
>>>>>>>>>> You may want to try our distribution for Windows. You will be able to find
>>>>>>>>>> the msi on our website.
>>>>>>>>>>
>>>>>>>>>> Regards
>>>>>>>>>> Olivier
>>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> thanks.
>>>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>>>> let me create two environments. 1: totally windows 2: totally
>>>>>>>>>>> Unix
>>>>>>>>>>>
>>>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>>>> works
>>>>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>>>>
>>>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>>>
>>>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>>>
>>>>>>>>>>> now, the exact problem is :
>>>>>>>>>>> 1: datanode is not getting started
>>>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>>>> should get replicated to all another available datanodes
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Seriously??You are planning to develop something using Hadoop
>>>>>>>>>>>> on windows. Not a good idea. Anyways, cold you plz show me your log files?I
>>>>>>>>>>>> also need some additional info :
>>>>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>>>>>> -Your latest configuration files
>>>>>>>>>>>> -Your /etc.hosts file
>>>>>>>>>>>>
>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>> Tariq
>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> ok. thanks
>>>>>>>>>>>>> now, i need to start with all windows setup first as our
>>>>>>>>>>>>> product will be based on windows
>>>>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>>>>
>>>>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards,
>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>>>> UBuntu etc)
>>>>>>>>>>>>>>>> and datanodes are the combination of any OS (windows ,
>>>>>>>>>>>>>>>> linux , unix etc )
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> however, my doubt is,  as the file systems of  both the
>>>>>>>>>>>>>>>> systems (win and linux ) are different ,  datanodes of these systems can
>>>>>>>>>>>>>>>> not be part of single cluster . i have to make windows cluster separate and
>>>>>>>>>>>>>>>> UNIX cluster separate ?
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are
>>>>>>>>>>>>>>>>> not the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>>>>> release<https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> however, when i open the pid file for namenode then it is
>>>>>>>>>>>>>>>>>> not showing pid as : 4560. on the contrary, it shud show : 2076
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked
>>>>>>>>>>>>>>>>>>> at this already.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>>>>>
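The stale-pid check Arpit describes can be sketched as a small shell function; the pid-file path pattern comes from his message, while the function name and structure here are illustrative:

```shell
# Sketch of the stale-pid cleanup Arpit suggests. A pid file is stale when
# it names a process that is no longer running. The path pattern
# \tmp\hadoop-*datanode.pid is from his message; the function name is ours.
clean_stale_pid() {
  pidfile=$1
  [ -f "$pidfile" ] || return 0            # no pid file, nothing to do
  pid=$(cat "$pidfile")
  # kill -0 sends no signal; it only tests whether the process exists
  if ! kill -0 "$pid" 2>/dev/null; then
    echo "removing stale pid file: $pidfile (pid $pid not running)"
    rm -f "$pidfile"
  fi
}
```

After clearing a stale file, restarting with ./start-dfs.sh writes a fresh pid file whose contents should match what jps reports.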
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs
>>>>>>>>>>>>>>>>>>>>> would be helpful.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that.
>>>>>>>>>>>>>>>>>>>>>> i have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>>>>> however, not able to browse url :
>>>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for
>>>>>>>>>>>>>>>>>>>>>>>> the configuration?Your *core-site.xml* is empty.
>>>>>>>>>>>>>>>>>>>>>>>> Remove the property *fs.default.name* from
>>>>>>>>>>>>>>>>>>>>>>>> *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>>> Remove *mapred.job.tracker* as well. It is
>>>>>>>>>>>>>>>>>>>>>>>> required in *mapred-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed
>>>>>>>>>>>>>>>>>>>>>>>> setup first in order to get yourself familiar with the process and then
>>>>>>>>>>>>>>>>>>>>>>>> proceed to the distributed mode. You can visit this
>>>>>>>>>>>>>>>>>>>>>>>> link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>
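Tariq's advice above maps onto the two files roughly like this; the host/port values are illustrative pseudo-distributed defaults, not taken from the thread:

```xml
<!-- core-site.xml: fs.default.name belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: mapred.job.tracker belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```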
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of
>>>>>>>>>>>>>>>>>>>>>>>>>> touch for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> me your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
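The manual steps Tariq describes can be sketched as below; BASE is an illustrative location, and on real nodes the two paths must match the values of dfs.name.dir and dfs.data.dir:

```shell
# Create the name/data directories manually and give them 755 permissions,
# as Tariq describes. BASE is an illustrative location for demonstration.
BASE=${BASE:-/tmp/hdfs-demo}
mkdir -p "$BASE/wksp_name" "$BASE/wksp_data"
chmod 755 "$BASE/wksp_name" "$BASE/wksp_data"
```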
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> stuck with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file ,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
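For reference, an hdfs-site.xml fragment that keeps the two properties pointed at separate directories, using the wksp_name/wksp_data directory names that appear elsewhere in the thread (the exact paths are illustrative):

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```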
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you wrote both paths as local paths, and you need not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copy hadoop into hdfs... Hadoop is already working..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need to copy hadoop there), because you are going to do processing on the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data in that text file. That's what hadoop is used for; first you need to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make it clear in your mind, and then you will do it... otherwise it is not possible.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
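A concrete instance of the copyFromLocal flow manish shows, with the sample file name and HDFS path as illustrative assumptions:

```
$ echo "some sample text" > /tmp/sample.txt
$ bin/hadoop dfs -mkdir /wksp
$ bin/hadoop dfs -copyFromLocal /tmp/sample.txt /wksp
$ bin/hadoop dfs -ls /wksp
```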
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>
>
> --
> Olivier Renault
> Solution Engineer - Big Data - Hortonworks, Inc.
> +44 7500 933 036
> orenault@hortonworks.com
> www.hortonworks.com
>  <http://hortonworks.com/products/hortonworks-sandbox/>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please find the attached.
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
as it is not generated

regards
irfan






On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>wrote:

> Could you share the log files ( c:\hdp.log,
> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
> well as your clusterproperties.txt ?
>
> Thanks,
> Olivier
>
>
> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>
>> thanks. i followed the user manual for deployment and installed all
>> pre-requisites
>> i modified the command and still the issue persists. please suggest
>>
>> please refer below
>>
>>
>> [image: Inline image 1]
>>
>> regards
>> irfan
>>
>>
>>
>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <orenault@hortonworks.com
>> > wrote:
>>
>>> The command to install it is msiexec /i msifile /...
>>>
>>> You will find the correct syntax as part of doc.
>>>
>>> Happy reading
>>> Olivier
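For reference, the documented install command generally takes this shape; the msi file name, log path, and property values here are illustrative assumptions (the thread itself mentions HDP_LAYOUT), so check the documentation Olivier refers to for the exact syntax:

```
msiexec /qn /i "hdp-1.3.0.0.winpkg.msi" /lv "hdp.log" ^
  HDP_LAYOUT="C:\config\clusterproperties.txt" ^
  HDP_DIR="C:\hdp\hadoop" DESTROY_DATA="no"
```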
>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks.
>>>> i referred the logs and manuals. i modified the clusterproperties file
>>>> and then double click on the msi file
>>>> however, it still failed.
>>>> further i started the installation on command line by giving
>>>> HDP_LAYOUT=clusterproperties file path,
>>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>>> redistributable package dependency
>>>>
>>>> i installed both and started again the installation.
>>>> failed again with following error
>>>> [image: Inline image 1]
>>>>
>>>> when i search for the logs mentioned in the error , i never found that
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Correct, you need to define the cluster configuration as part of a
>>>>> file. You will find some information on the configuration file as part of
>>>>> the documentation.
>>>>>
>>>>>
>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>
>>>>> You should also make sure to have installed the prerequisites.
>>>>>
>>>>> Thanks
>>>>> Olivier
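The cluster configuration file Olivier refers to is a flat key=value properties file; a sketch of a single-node layout follows, with host names and paths as illustrative assumptions (verify the exact key names against the documentation he links):

```
#cluster host layout (illustrative single-node example)
NAMENODE_HOST=WIN-NODE1
SECONDARY_NAMENODE_HOST=WIN-NODE1
JOBTRACKER_HOST=WIN-NODE1
SLAVE_HOSTS=WIN-NODE1
#install and data directories
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data
```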
>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks. sorry for the long break. actually got involved in some other
>>>>>> priorities
>>>>>> i downloaded the installer and while installing i got following error
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> do i need to make any configuration prior to installation ??
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>>> Here is the link
>>>>>>>
>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>
>>>>>>> Olivier
>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>>> setup first using the url :
>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>
>>>>>>>> i don't think so i am running DN on both machine
>>>>>>>> please find the attached log
>>>>>>>>
>>>>>>>> hi olivier
>>>>>>>>
>>>>>>>> can you please give me download link ?
>>>>>>>> let me try please
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>>>> your DN logs?
>>>>>>>>>
>>>>>>>>> Also, consider Olivier's suggestion. It's definitely a better
>>>>>>>>> option.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>>> Irfu,
>>>>>>>>>>
>>>>>>>>>> If you want to quickly get Hadoop running on the windows platform,
>>>>>>>>>> you may want to try our distribution for Windows. You will be able to find
>>>>>>>>>> the msi on our website.
>>>>>>>>>>
>>>>>>>>>> Regards
>>>>>>>>>> Olivier
>>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> thanks.
>>>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>>>> let me create two environments. 1: totally windows 2: totally
>>>>>>>>>>> Unix
>>>>>>>>>>>
>>>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>>>> works
>>>>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>>>>
>>>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>>>
>>>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>>>
>>>>>>>>>>> now, the exact problem is :
>>>>>>>>>>> 1: datanode is not getting started
>>>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>>>> should get replicated to all another available datanodes
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop
>>>>>>>>>>>> on windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>>>>>> also need some additional info :
>>>>>>>>>>>> - The exact problem which you are facing right now
>>>>>>>>>>>> - Your cluster summary (no. of nodes etc)
>>>>>>>>>>>> - Your latest configuration files
>>>>>>>>>>>> - Your /etc/hosts file
>>>>>>>>>>>>
>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>> Tariq
>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> ok. thanks
>>>>>>>>>>>>> now, i need to start with all windows setup first as our
>>>>>>>>>>>>> product will be based on windows
>>>>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>>>>
>>>>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards,
>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>>>> UBuntu etc)
>>>>>>>>>>>>>>>> and datanodes are the combination of any OS (windows ,
>>>>>>>>>>>>>>>> linux , unix etc )
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> however, my doubt is,  as the file systems of  both the
>>>>>>>>>>>>>>>> systems (win and linux ) are different ,  datanodes of these systems can
>>>>>>>>>>>>>>>> not be part of single cluster . i have to make windows cluster separate and
>>>>>>>>>>>>>>>> UNIX cluster separate ?
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are
>>>>>>>>>>>>>>>>> not the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> however, when i open the pid file for namenode then it is
>>>>>>>>>>>>>>>>>> not showing pid as : 4560. on the contrary, it should show : 2076
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>>>> the datanode.
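The stale-pid check this suggests can be sketched in plain shell. The scratch directory, file name, and pid value below are illustrative stand-ins, not Hadoop's real paths or tooling:

```shell
# Simulate a stale pid file in a scratch directory, then remove any pid
# file whose process is no longer running, the usual cause of spurious
# "running as process N. Stop it first." messages.
dir=$(mktemp -d)
echo 99999999 > "$dir/hadoop-demo-datanode.pid"   # a pid that cannot be running
for f in "$dir"/hadoop-*-datanode.pid; do
  pid=$(cat "$f")
  if ! kill -0 "$pid" 2>/dev/null; then           # kill -0 only probes the pid
    echo "removing stale pid file: $f"
    rm -f "$f"
  fi
done
```

Once the stale file is gone, restarting the daemon writes a fresh pid file.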
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked
>>>>>>>>>>>>>>>>>>> at this already.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs
>>>>>>>>>>>>>>>>>>>>> would be helpful.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that.
>>>>>>>>>>>>>>>>>>>>>> i have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>>>>> however, not able to browse url :
>>>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for
>>>>>>>>>>>>>>>>>>>>>>>> the configuration?Your *core-site.xml* is empty.
>>>>>>>>>>>>>>>>>>>>>>>> Remove the property *fs.default.name *from *
>>>>>>>>>>>>>>>>>>>>>>>> hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>>> Remove *mapred.job.tracker* as well. It is
>>>>>>>>>>>>>>>>>>>>>>>> required in *mapred-site.xml*.
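Concretely, the split described here would look like the following; the host name and ports are placeholders, not values confirmed in this thread:

```xml
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>namenode-host:9001</value>
  </property>
</configuration>
```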
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> I would suggest you do a pseudo distributed
>>>>>>>>>>>>>>>>>>>>>>>> setup first in order to get yourself familiar with the process and then
>>>>>>>>>>>>>>>>>>>>>>>> proceed to the distributed mode. You can visit this
>>>>>>>>>>>>>>>>>>>>>>>> link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of
>>>>>>>>>>>>>>>>>>>>>>>>>> touch for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> me your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
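A minimal sketch of those two steps, using a temporary directory and the illustrative names wksp_name / wksp_data rather than real cluster paths:

```shell
# Create the directories backing dfs.name.dir and dfs.data.dir,
# then set their permissions to 755 as suggested.
base=$(mktemp -d)                        # stand-in for a real disk location
mkdir -p "$base/wksp_name" "$base/wksp_data"
chmod 755 "$base/wksp_name" "$base/wksp_data"
stat -c '%a %n' "$base"/wksp_*           # prints 755 for each directory
```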
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> stuck with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file ,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
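For comparison, an hdfs-site.xml that keeps the metadata and block directories separate might look like this; the paths are illustrative, not a confirmed fix for this setup:

```xml
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>c:\\wksp_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>c:\\wksp_data</value>
  </property>
</configuration>
```

Pointing both properties at the same directory, as in the snippet above, lets namenode metadata and datanode blocks mix in one tree.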
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because, you had written both the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> paths as local ones, and you need not copy hadoop into hdfs... Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need to copy hadoop there), because you are going to do processing on the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data in that text file. That's what hadoop is used for. First you need to make it
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> clear in your mind, and then you will be able to do it... otherwise it is not possible..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>
>
> --
> Olivier Renault
> Solution Engineer - Big Data - Hortonworks, Inc.
> +44 7500 933 036
> orenault@hortonworks.com
> www.hortonworks.com
>  <http://hortonworks.com/products/hortonworks-sandbox/>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please find the attached.
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
as it is not generated

regards
irfan






On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com> wrote:

> Could you share the log files ( c:\hdp.log,
> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
> well as your clusterproperties.txt ?
>
> Thanks,
> Olivier
>
>
> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>
>> thanks. i followed the user manual for deployment and installed all
>> pre-requisites
>> i modified the command and still the issue persists. please suggest
>>
>> please refer below
>>
>>
>> [image: Inline image 1]
>>
>> regards
>> irfan
>>
>>
>>
>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <orenault@hortonworks.com
>> > wrote:
>>
>>> The command to install it is msiexec /i msifile /...
>>>
>>> You will find the correct syntax as part of doc.
>>>
>>> Happy reading
>>> Olivier
>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks.
>>>> i referred to the logs and manuals. i modified the clusterproperties file
>>>> and then double clicked on the msi file;
>>>> however, it still failed.
>>>> further, i started the installation on the command line by giving
>>>> HDP_LAYOUT=clusterproperties file path,
>>>> and the installation went ahead but failed for the .NET framework 4.0 and VC++
>>>> redistributable package dependencies
>>>>
>>>> i installed both and started again the installation.
>>>> failed again with following error
>>>> [image: Inline image 1]
>>>>
>>>> when i searched for the logs mentioned in the error, i never found them
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
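
[Editor's note: for reference, the unattended install being described above is usually launched from an elevated command prompt. A rough sketch of the full invocation follows; the MSI file name, paths, and the HDP_DIR/DESTROY_DATA properties are illustrative assumptions in the style of the HDP-for-Windows documentation, not verified commands:]

```
msiexec /qn /i "hdp-1.3.0.0.winpkg.msi" /lv "C:\hdp.log" ^
    HDP_LAYOUT="C:\hdp\clusterproperties.txt" ^
    HDP_DIR="C:\hdp\hadoop" DESTROY_DATA="no"
```

[The /lv flag writes a verbose installer log, which is the first place to look when the MSI fails without a clear message.]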
>>>>
>>>>
>>>>
>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Correct, you need to define the cluster configuration as part of a
>>>>> file. You will find some information on the configuration file as part of
>>>>> the documentation.
>>>>>
>>>>>
>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>
>>>>> You should make sure to have also installed the pre requisite.
>>>>>
>>>>> Thanks
>>>>> Olivier
>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks. sorry for the long break. actually got involved in some other
>>>>>> priorities
>>>>>> i downloaded the installer and while installing i got following error
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> do i need to make any configuration prior to installation ??
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>>> Here is the link
>>>>>>>
>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>
>>>>>>> Olivier
>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i just followed the instructions to set up the pseudo distributed
>>>>>>>> mode first using the url :
>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>
>>>>>>>> i don't think i am running a DN on both machines
>>>>>>>> please find the attached log
>>>>>>>>
>>>>>>>> hi olivier
>>>>>>>>
>>>>>>>> can you please give me download link ?
>>>>>>>> let me try please
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>>>> your DN logs?
>>>>>>>>>
>>>>>>>>> Also, consider Oliver's suggestion. It's definitely a better
>>>>>>>>> option.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>>> Irfu,
>>>>>>>>>>
>>>>>>>>>> If you want to quickly get Hadoop running on the Windows platform,
>>>>>>>>>> you may want to try our distribution for Windows. You will be able to find
>>>>>>>>>> the MSI on our website.
>>>>>>>>>>
>>>>>>>>>> Regards
>>>>>>>>>> Olivier
>>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> thanks.
>>>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>>>> let me create two environments. 1: totally windows 2: totally
>>>>>>>>>>> Unix
>>>>>>>>>>>
>>>>>>>>>>> because, on windows, anyway i have to try and see how hadoop
>>>>>>>>>>> works;
>>>>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>>>>
>>>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>>>
>>>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>>>
>>>>>>>>>>> now, the exact problem is :
>>>>>>>>>>> 1: datanode is not getting started
>>>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop
>>>>>>>>>>>> on Windows? Not a good idea. Anyway, could you please show me your log files? I
>>>>>>>>>>>> also need some additional info :
>>>>>>>>>>>> - The exact problem which you are facing right now
>>>>>>>>>>>> - Your cluster summary (no. of nodes etc.)
>>>>>>>>>>>> - Your latest configuration files
>>>>>>>>>>>> - Your /etc/hosts file
>>>>>>>>>>>>
>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>> Tariq
>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> ok. thanks
>>>>>>>>>>>>> now, i need to start with all windows setup first as our
>>>>>>>>>>>>> product will be based on windows
>>>>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>>>>
>>>>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards,
>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>>>> UBuntu etc)
>>>>>>>>>>>>>>>> and datanodes are the combination of any OS (windows ,
>>>>>>>>>>>>>>>> linux , unix etc )
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> however, my doubt is: as the file systems of both the
>>>>>>>>>>>>>>>> systems (win and linux) are different, can datanodes of these systems
>>>>>>>>>>>>>>>> be part of a single cluster, or do i have to make the windows cluster
>>>>>>>>>>>>>>>> and the UNIX cluster separate ?
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are
>>>>>>>>>>>>>>>>> not the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> however, when i open the pid file for the namenode, it is
>>>>>>>>>>>>>>>>>> showing the pid as 4560; on the contrary, it should show 2076
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked
>>>>>>>>>>>>>>>>>>> at this already.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> -Arpit
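
[Editor's note: Arpit's stale-pid diagnosis above can be checked mechanically. The helper below is an illustrative sketch (not part of Hadoop) that deletes a pid file only when the process it names is gone, assuming a POSIX shell; the path is an example:]

```shell
# remove_stale_pid: delete a pid file whose process no longer exists
remove_stale_pid() {
    pidfile=$1
    [ -f "$pidfile" ] || return 0                  # no file, nothing to do
    if ! kill -0 "$(cat "$pidfile")" 2>/dev/null; then
        echo "stale pid file: $pidfile"            # process is gone
        rm -f "$pidfile"
    fi
}

# example: check the datanode pid file before restarting (path illustrative)
remove_stale_pid /tmp/hadoop-datanode.pid
```

[If the pid file survives this check, the daemon really is running; if it is removed, restarting via start-dfs.sh should work again.]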
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs
>>>>>>>>>>>>>>>>>>>>> would be helpful.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mentioned in it.
>>>>>>>>>>>>>>>>>>>>>> i have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Now, i am able to browse the url http://localhost:50070 (name node);
>>>>>>>>>>>>>>>>>>>>>> however, i am not able to browse the url :
>>>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for
>>>>>>>>>>>>>>>>>>>>>>>> the configuration? Your *core-site.xml* is empty.
>>>>>>>>>>>>>>>>>>>>>>>> Remove the property *fs.default.name* from
>>>>>>>>>>>>>>>>>>>>>>>> *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>>> Remove *mapred.job.tracker* as well. It is
>>>>>>>>>>>>>>>>>>>>>>>> required in *mapred-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed
>>>>>>>>>>>>>>>>>>>>>>>> setup first in order to get yourself familiar with the process and then
>>>>>>>>>>>>>>>>>>>>>>>> proceed to the distributed mode. You can visit this
>>>>>>>>>>>>>>>>>>>>>>>> link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
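
[Editor's note: for Hadoop 1.x, the split Tariq describes looks roughly like this; the hostname and ports are the conventional single-node defaults and are only illustrative:]

```xml
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```

[hdfs-site.xml then keeps only HDFS-specific properties such as dfs.name.dir and dfs.data.dir.]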
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of
>>>>>>>>>>>>>>>>>>>>>>>>>> touch for some time because of Ramzan and Eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> me your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
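
[Editor's note: a concrete sketch of that advice would point dfs.name.dir and dfs.data.dir in hdfs-site.xml at two separate, pre-created directories rather than one shared path. The names below match the wksp_name/wksp_data directories Irfan reports creating elsewhere in the thread; the drive path is illustrative:]

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```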
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> stuck with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operations like create, move, copy etc. are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> do i need to create it from the command
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file ,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because,You had wrote both the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> paths local and You need not to copy hadoop into hdfs...Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need to copy hadoop there), because u are going to do processing on the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data in that text file. That's what hadoop is used for; first u need to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make it clear in ur mind, then and only then will it work...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
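For reference, the corrected sequence would look roughly like this (the local file name `sample.txt` and the `/wksp` target are illustrative, and the commands assume a running single-node HDFS from the 1.1.2 install root):

```
# create the target directory in HDFS, copy a local text file in, then verify
bin/hadoop dfs -mkdir /wksp
bin/hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/sample.txt /wksp
bin/hadoop dfs -ls /wksp
```

`hadoop dfs` is the 1.x form used throughout this thread; later releases prefer `hadoop fs` / `hdfs dfs`.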
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>
>
> --
> Olivier Renault
> Solution Engineer - Big Data - Hortonworks, Inc.
> +44 7500 933 036
> orenault@hortonworks.com
> www.hortonworks.com
>  <http://hortonworks.com/products/hortonworks-sandbox/>
>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
please find the attached.
i don't have "c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log"
as it is not generated

regards
irfan






On Thu, Sep 5, 2013 at 6:09 PM, Olivier Renault <or...@hortonworks.com>wrote:

> Could you share the log files ( c:\hdp.log,
> c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
> well as your clusterproperties.txt ?
>
> Thanks,
> Olivier
>
>
> On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:
>
>> thanks. i followed the user manual for deployment and installed all
>> pre-requisites
>> i modified the command and still the issue persist. please suggest
>>
>> please refer below
>>
>>
>> [image: Inline image 1]
>>
>> regards
>> irfan
>>
>>
>>
>> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <orenault@hortonworks.com
>> > wrote:
>>
>>> The command to install it is msiexec /i msifile /...
>>>
>>> You will find the correct syntax as part of doc.
>>>
>>> Happy reading
>>> Olivier
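As a sketch of that syntax (flag names follow the HDP 1.3 for Windows install guide; the msi file name, log path, and directories below are placeholders to adapt):

```
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "c:\hdp.log" ^
  HDP_LAYOUT="c:\hdp\clusterproperties.txt" ^
  HDP_DIR="c:\hdp\hadoop" DESTROY_DATA="no"
```

Run it from an elevated command prompt; `/lv` writes a verbose install log that is the first thing to inspect on failure.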
>>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks.
>>>> i referred the logs and manuals. i modified the clusterproperties file
>>>> and then double click on the msi file
>>>> however, it still failed.
>>>> further i started the installation on command line by giving
>>>> HDP_LAYOUT=clusterproperties file path,
>>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>>> redistributable package dependency
>>>>
>>>> i installed both and started again the installation.
>>>> failed again with following error
>>>> [image: Inline image 1]
>>>>
>>>> when i search for the logs mentioned in the error , i never found that
>>>> please suggest
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Correct, you need to define the cluster configuration as part of a
>>>>> file. You will find some information on the configuration file as part of
>>>>> the documentation.
>>>>>
>>>>>
>>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>>
>>>>> You should make sure to have also installed the pre requisite.
>>>>>
>>>>> Thanks
>>>>> Olivier
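A minimal clusterproperties.txt for a single-node Windows install looks roughly like this (a sketch only; the exact field names and required entries vary by HDP version, so verify against the documentation linked above, and the host name and paths here are placeholders):

```
#Log and data directories
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data

#Hosts (single-node: all point at the same machine)
NAMENODE_HOST=WIN-NODE1
SECONDARY_NAMENODE_HOST=WIN-NODE1
JOBTRACKER_HOST=WIN-NODE1
SLAVE_HOSTS=WIN-NODE1
```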
>>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks. sorry for the long break. actually got involved in some other
>>>>>> priorities
>>>>>> i downloaded the installer and while installing i got following error
>>>>>>
>>>>>> [image: Inline image 1]
>>>>>>
>>>>>> do i need to make any configuration prior to installation ??
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>>> Here is the link
>>>>>>>
>>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>>
>>>>>>> Olivier
>>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>>> setup first using the url :
>>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>>
>>>>>>>> i don't think i am running DN on both machines
>>>>>>>> please find the attached log
>>>>>>>>
>>>>>>>> hi olivier
>>>>>>>>
>>>>>>>> can you please give me download link ?
>>>>>>>> let me try please
>>>>>>>>
>>>>>>>> regards
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>>>> your DN logs?
>>>>>>>>>
>>>>>>>>> Also, consider Oliver's suggestion. It's definitely a better
>>>>>>>>> option.
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>>
>>>>>>>>>> Irfu,
>>>>>>>>>>
>>>>>>>>>> If you want to quickly get Hadoop running on the Windows platform,
>>>>>>>>>> you may want to try our distribution for Windows. You will be able to find
>>>>>>>>>> the msi on our website.
>>>>>>>>>>
>>>>>>>>>> Regards
>>>>>>>>>> Olivier
>>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> thanks.
>>>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>>>> let me create two environments. 1: totally windows 2: totally
>>>>>>>>>>> Unix
>>>>>>>>>>>
>>>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>>>> works
>>>>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>>>>
>>>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>>>
>>>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>>>
>>>>>>>>>>> now, the exact problem is :
>>>>>>>>>>> 1: datanode is not getting started
>>>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>>>> should get replicated to all another available datanodes
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop
>>>>>>>>>>>> on windows. Not a good idea. Anyways, could you plz show me your log files?
>>>>>>>>>>>> I also need some additional info :
>>>>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>>>>>> -Your latest configuration files
>>>>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>>>>
>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>> Tariq
>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> ok. thanks
>>>>>>>>>>>>> now, i need to start with all windows setup first as our
>>>>>>>>>>>>> product will be based on windows
>>>>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>>>>
>>>>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards,
>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>>>> UBuntu etc)
>>>>>>>>>>>>>>>> and datanodes are the combination of any OS (windows ,
>>>>>>>>>>>>>>>> linux , unix etc )
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> however, my doubt is,  as the file systems of  both the
>>>>>>>>>>>>>>>> systems (win and linux ) are different ,  datanodes of these systems can
>>>>>>>>>>>>>>>> not be part of single cluster . i have to make windows cluster separate and
>>>>>>>>>>>>>>>> UNIX cluster separate ?
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are
>>>>>>>>>>>>>>>>> not the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> however, when i open the pid file for namenode then it is
>>>>>>>>>>>>>>>>>> not showing pid as : 4560. On the contrary, it should show : 2076
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked
>>>>>>>>>>>>>>>>>>> at this already.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs
>>>>>>>>>>>>>>>>>>>>> would be helpful.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that.
>>>>>>>>>>>>>>>>>>>>>> i have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>>>>> however, not able to browse url :
>>>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for
>>>>>>>>>>>>>>>>>>>>>>>> the configuration?Your *core-site.xml* is empty.
>>>>>>>>>>>>>>>>>>>>>>>> Remove the property *fs.default.name *from *
>>>>>>>>>>>>>>>>>>>>>>>> hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>>> Remove *mapred.job.tracker* as well. It is
>>>>>>>>>>>>>>>>>>>>>>>> required in *mapred-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed
>>>>>>>>>>>>>>>>>>>>>>>> setup first in order to get yourself familiar with the process and then
>>>>>>>>>>>>>>>>>>>>>>>> proceed to the distributed mode. You can visit this
>>>>>>>>>>>>>>>>>>>>>>>> link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
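Concretely, the split Tariq describes would look like this (the localhost host/port values are the usual Hadoop 1.x single-node defaults, shown only as placeholders):

```xml
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```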
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of
>>>>>>>>>>>>>>>>>>>>>>>>>> touch for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> me your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> stuck with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file ,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
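[Editor's note: pointing dfs.name.dir and dfs.data.dir at the same directory can work, but the more common layout keeps metadata and block storage in separate directories. An illustrative variant of the snippet above:]

```xml
<!-- Illustrative only: separate subdirectories for metadata and blocks -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp\\name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp\\data</value>
</property>
```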
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you wrote both paths as local ones, and you do not need to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copy Hadoop itself into HDFS; Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check in the browser after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting your single-node cluster:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need to copy hadoop there), because you are going to do processing on the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data in that text file. That is what hadoop is for; first get that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> clear in your mind, and then it will work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>
>
> --
> Olivier Renault
> Solution Engineer - Big Data - Hortonworks, Inc.
> +44 7500 933 036
> orenault@hortonworks.com
> www.hortonworks.com
>  <http://hortonworks.com/products/hortonworks-sandbox/>
>
>

Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Could you share the log files ( c:\hdp.log,
c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
well as your clusterproperties.txt ?
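[Editor's note: for reference, a clusterproperties.txt for a small setup looks roughly like the sketch below; the key names are illustrative and should be verified against the HDP for Windows install guide before use. The hostnames come from this thread.]

```
# Illustrative sketch only -- verify the key names against the HDP docs
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data
NAMENODE_HOST=DFS-DC
SECONDARY_NAMENODE_HOST=DFS-DC
JOBTRACKER_HOST=DFS-DC
SLAVE_HOSTS=DFS-1
```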

Thanks,
Olivier


On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:

> thanks. i followed the user manual for deployment and installed all
> pre-requisites
> i modified the command and still the issue persists. please suggest
>
> please refer below
>
>
> [image: Inline image 1]
>
> regards
> irfan
>
>
>
> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>wrote:
>
>> The command to install it is msiexec /i msifile /...
>>
>> You will find the correct syntax as part of doc.
>>
>> Happy reading
>> Olivier
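[Editor's note: the full command line follows roughly the shape below; the file names and values are illustrative, and the HDP documentation has the authoritative syntax.]

```
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "hdp.log" ^
  HDP_LAYOUT="C:\clusterproperties.txt" HDP_DIR="C:\hdp\hadoop" DESTROY_DATA="no"
```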
>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks.
>>> i referred the logs and manuals. i modified the clusterproperties file
>>> and then double-clicked on the msi file;
>>> however, it still failed.
>>> further i started the installation on command line by giving
>>> HDP_LAYOUT=clusterproperties file path,
>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>> redistributable package dependency
>>>
>>> i installed both and started again the installation.
>>> failed again with following error
>>> [image: Inline image 1]
>>>
>>> when i searched for the logs mentioned in the error, i could not find them.
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Correct, you need to define the cluster configuration as part of a
>>>> file. You will find some information on the configuration file as part of
>>>> the documentation.
>>>>
>>>>
>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>
>>>> You should make sure to have also installed the pre requisite.
>>>>
>>>> Thanks
>>>> Olivier
>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>> thanks. sorry for the long break. actually got involved in some other
>>>>> priorities
>>>>> i downloaded the installer and while installing i got following error
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> do i need to make any configuration prior to installation ??
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>>> Here is the link
>>>>>>
>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>
>>>>>> Olivier
>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>> setup first using the url :
>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>
>>>>>>> i don't think i am running the DN on both machines
>>>>>>> please find the attached log
>>>>>>>
>>>>>>> hi olivier
>>>>>>>
>>>>>>> can you please give me download link ?
>>>>>>> let me try please
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>
>>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>>> your DN logs?
>>>>>>>>
>>>>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>>> Irfu,
>>>>>>>>>
>>>>>>>>> If you want to quickly get Hadoop running on the Windows platform,
>>>>>>>>> you may want to try our distribution for Windows. You will be able to find the
>>>>>>>>> msi on our website.
>>>>>>>>>
>>>>>>>>> Regards
>>>>>>>>> Olivier
>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>>
>>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>>> works
>>>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>>>
>>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>>
>>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>>
>>>>>>>>>> now, the exact problem is :
>>>>>>>>>> 1: datanode is not getting started
>>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>>> should get replicated to all another available datanodes
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>>>> Windows. Not a good idea. Anyway, could you please show me your log files? I
>>>>>>>>>>> also need some additional info:
>>>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>>>> -Your cluster summary (no. of nodes etc.)
>>>>>>>>>>> -Your latest configuration files
>>>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>>>
>>>>>>>>>>> Warm Regards,
>>>>>>>>>>> Tariq
>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> ok. thanks
>>>>>>>>>>>> now, i need to start with all windows setup first as our
>>>>>>>>>>>> product will be based on windows
>>>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>>>
>>>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>>>
>>>>>>>>>>>> regards,
>>>>>>>>>>>> irfan
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>>> UBuntu etc)
>>>>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux
>>>>>>>>>>>>>>> , unix etc )
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> however, my doubt is: since the file systems of the two
>>>>>>>>>>>>>>> OSes (Windows and Linux) are different, can datanodes from both be part of
>>>>>>>>>>>>>>> a single cluster, or do i have to keep the Windows cluster and the
>>>>>>>>>>>>>>> UNIX cluster separate?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> however, when i open the pid file for the namenode, it shows
>>>>>>>>>>>>>>>>> the pid as 4560; on the contrary, it should show 2076
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>>> the datanode.
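[Editor's note: that cleanup can be sketched as below, assuming Hadoop's default pid directory; set HADOOP_PID_DIR if your setup differs.]

```shell
# Remove any stale datanode/namenode pid files left behind by a crash,
# then restart with start-dfs.sh. /tmp is Hadoop's default pid directory.
PID_DIR=${HADOOP_PID_DIR:-/tmp}
rm -f "$PID_DIR"/hadoop-*-datanode.pid "$PID_DIR"/hadoop-*-namenode.pid
```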
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked
>>>>>>>>>>>>>>>>>> at this already.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that.
>>>>>>>>>>>>>>>>>>>>> i have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node);
>>>>>>>>>>>>>>>>>>>>> however, i am not able to browse url :
>>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for
>>>>>>>>>>>>>>>>>>>>>>> the configuration?Your *core-site.xml* is empty.
>>>>>>>>>>>>>>>>>>>>>>> Remove the property *fs.default.name *from *
>>>>>>>>>>>>>>>>>>>>>>> hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>> Remove *mapred.job.tracker* as well. It is required
>>>>>>>>>>>>>>>>>>>>>>> in *mapred-site.xml*.
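[Editor's note: in Hadoop 1.x terms, the placement advised above comes out as below; the localhost addresses are the usual pseudo-distributed defaults, shown only as an example.]

```xml
<!-- core-site.xml: filesystem URI (example value) -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml: jobtracker address (example value) -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```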
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> me your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
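The directory preparation described above can be sketched as a couple of shell commands (the wksp_name/wksp_data names and the scratch base path are illustrative; substitute the values from your own hdfs-site.xml):

```shell
# Create the directories backing dfs.name.dir and dfs.data.dir,
# then give them 755 permissions as suggested above.
# BASE is a throwaway scratch location here; on a real node these
# would be stable paths on the local filesystem.
BASE=$(mktemp -d)
mkdir -p "$BASE/wksp_name" "$BASE/wksp_data"
chmod 755 "$BASE/wksp_name" "$BASE/wksp_data"

ls -ld "$BASE/wksp_name" "$BASE/wksp_data"
```

Once these exist on every node and hdfs-site.xml points at them, the namenode can be formatted and dfs started.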
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> stuck with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file ,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because both of the paths you wrote
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> are local. And you need not copy hadoop into hdfs... Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check it out in the browser after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting your single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need to copy hadoop there), because you are going to do processing on the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data in that text file. That's what hadoop is used for; first you need to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make that clear in your mind, and then you will do it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
>


-- 
Olivier Renault
Solution Engineer - Big Data - Hortonworks, Inc.
+44 7500 933 036
orenault@hortonworks.com
www.hortonworks.com
<http://hortonworks.com/products/hortonworks-sandbox/>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Could you share the log files ( c:\hdp.log,
c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
well as your clusterproperties.txt ?

Thanks,
Olivier


On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:

> thanks. i followed the user manual for deployment and installed all
> pre-requisites
> i modified the command and still the issue persist. please suggest
>
> please refer below
>
>
> [image: Inline image 1]
>
> regards
> irfan
>
>
>
> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>wrote:
>
>> The command to install it is msiexec /i msifile /...
>>
>> You will find the correct syntax as part of doc.
>>
>> Happy reading
>> Olivier
>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks.
>>> i referred the logs and manuals. i modified the clusterproperties file
>>> and then double click on the msi file
>>> however, it still failed.
>>> further i started the installation on command line by giving
>>> HDP_LAYOUT=clusterproperties file path,
>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>> redistributable package dependency
>>>
>>> i installed both and started again the installation.
>>> failed again with following error
>>> [image: Inline image 1]
>>>
>>> when i search for the logs mentioned in the error , i never found that
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Correct, you need to define the cluster configuration as part of a
>>>> file. You will find some information on the configuration file as part of
>>>> the documentation.
>>>>
>>>>
>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>
>>>> You should make sure to have also installed the pre requisite.
>>>>
>>>> Thanks
>>>> Olivier
>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>> thanks. sorry for the long break. actually got involved in some other
>>>>> priorities
>>>>> i downloaded the installer and while installing i got following error
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> do i need to make any configuration prior to installation ??
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>>> Here is the link
>>>>>>
>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>
>>>>>> Olivier
>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>> setup first using the url :
>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>
>>>>>>> i don't think so i am running DN on both machine
>>>>>>> please find the attached log
>>>>>>>
>>>>>>> hi olivier
>>>>>>>
>>>>>>> can you please give me download link ?
>>>>>>> let me try please
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>
>>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>>> your DN logs?
>>>>>>>>
>>>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>>> Irfu,
>>>>>>>>>
>>>>>>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>>>> msi on our website.
>>>>>>>>>
>>>>>>>>> Regards
>>>>>>>>> Olivier
>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>>
>>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>>> works
>>>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>>>
>>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>>
>>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>>
>>>>>>>>>> now, the exact problem is :
>>>>>>>>>> 1: datanode is not getting started
>>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>>> should get replicated to all another available datanodes
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>>>>> also need some additional info :
>>>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>>>>> -Your latest configuration files
>>>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>>>
>>>>>>>>>>> Warm Regards,
>>>>>>>>>>> Tariq
>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> ok. thanks
>>>>>>>>>>>> now, i need to start with all windows setup first as our
>>>>>>>>>>>> product will be based on windows
>>>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>>>
>>>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>>>
>>>>>>>>>>>> regards,
>>>>>>>>>>>> irfan
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>>> UBuntu etc)
>>>>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux
>>>>>>>>>>>>>>> , unix etc )
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> however, my doubt is,  as the file systems of  both the
>>>>>>>>>>>>>>> systems (win and linux ) are different ,  datanodes of these systems can
>>>>>>>>>>>>>>> not be part of single cluster . i have to make windows cluster separate and
>>>>>>>>>>>>>>> UNIX cluster separate ?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> however, when i open the pid file for namenode then it is
>>>>>>>>>>>>>>>>> not showing pid as : 4560. on the contrary, it shud show : 2076
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>>> the datanode.
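The stale-pid cleanup suggested above can be sketched as follows (the pid directory and file name are illustrative; on a default layout the files live under /tmp, or \tmp via Cygwin, and embed the user name as hadoop-&lt;user&gt;-datanode.pid):

```shell
# Simulate removing a stale datanode pid file before a restart.
# PID_DIR stands in for /tmp; the file name follows the
# hadoop-<user>-datanode.pid pattern.
PID_DIR=$(mktemp -d)
echo 4076 > "$PID_DIR/hadoop-administrator-datanode.pid"   # stale entry

# Delete any leftover datanode pid files
rm -f "$PID_DIR"/hadoop-*-datanode.pid

# After this, start-dfs.sh would write a fresh pid file
# for the newly started datanode process.
```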
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked
>>>>>>>>>>>>>>>>>> at this already.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that.
>>>>>>>>>>>>>>>>>>>>> i have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>>>> however, not able to browse url :
>>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for
>>>>>>>>>>>>>>>>>>>>>>> the configuration? Your *core-site.xml* is empty.
>>>>>>>>>>>>>>>>>>>>>>> Remove the property *fs.default.name *from *
>>>>>>>>>>>>>>>>>>>>>>> hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>> Remove *mapred.job.tracker* as well. It is required
>>>>>>>>>>>>>>>>>>>>>>> in *mapred-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> I would suggest you do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
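The property placement Tariq describes follows the standard Hadoop 1.x layout; a minimal sketch, with placeholder host/port values rather than anything taken from this thread:

```xml
<!-- core-site.xml : the filesystem URI belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml : the jobtracker address belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```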
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for the datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, in the web browser i can see one live datanode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> me your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
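On a Unix-style shell (including Cygwin), preparing the two directories comes down to a couple of commands; the paths below are placeholders for whatever dfs.name.dir and dfs.data.dir are set to:

```shell
# Placeholder paths; substitute the values from your hdfs-site.xml.
mkdir -p /tmp/hdfs/name /tmp/hdfs/data
# 755 as suggested above: owner rwx, group/other r-x.
chmod 755 /tmp/hdfs/name /tmp/hdfs/data
```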
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> stuck with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file ,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
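One thing to note about the hdfs-site.xml snippet above: dfs.data.dir and dfs.name.dir point at the same c:\\wksp directory, while data and metadata should live in distinct directories (as is done elsewhere in the thread with wksp_data and wksp_name). A sketch, with illustrative paths:

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```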
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you gave both the paths as
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local paths. Also, you need not copy hadoop into hdfs... Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check it out in the browser after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then follow the "browse the filesystem"
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory there, then make a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need to copy hadoop there), because u are going to do processing on the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data in that text file. That's what hadoop is used for. First u need to make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> that clear in ur mind, then u will be able to do it.. otherwise not possible..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am a newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need a windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>> CONFIDENTIALITY NOTICE
>> NOTICE: This message is intended for the use of the individual or entity
>> to which it is addressed and may contain information that is confidential,
>> privileged and exempt from disclosure under applicable law. If the reader
>> of this message is not the intended recipient, you are hereby notified that
>> any printing, copying, dissemination, distribution, disclosure or
>> forwarding of this communication is strictly prohibited. If you have
>> received this communication in error, please contact the sender immediately
>> and delete it from your system. Thank You.
>>
>
>


-- 
Olivier Renault
Solution Engineer - Big Data - Hortonworks, Inc.
+44 7500 933 036
orenault@hortonworks.com
www.hortonworks.com
<http://hortonworks.com/products/hortonworks-sandbox/>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Could you share the log files ( c:\hdp.log,
c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
well as your clusterproperties.txt ?

Thanks,
Olivier


On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:

> thanks. i followed the user manual for deployment and installed all
> pre-requisites
> i modified the command and still the issue persists. please suggest
>
> please refer below
>
>
> [image: Inline image 1]
>
> regards
> irfan
>
>
>
> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>wrote:
>
>> The command to install it is msiexec /i msifile /...
>>
>> You will find the correct syntax as part of doc.
>>
>> Happy reading
>> Olivier
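The shape of that command line is roughly as follows; the msi filename, log name, and layout path here are placeholders, and the authoritative switch list is in the HDP for Windows install doc. The snippet only assembles and prints the command, since it can only actually run on the Windows node:

```shell
# Placeholders: substitute your real msi file and clusterproperties path.
MSI='hdp-1.3.0.0.winpkg.msi'
LAYOUT='C:\hdp\clusterproperties.txt'
# /i = install, /lv = verbose log file; HDP_LAYOUT hands msiexec the cluster config.
CMD="msiexec /i $MSI /lv hdp.winpkg.install.log HDP_LAYOUT=$LAYOUT"
echo "$CMD"
```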
>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks.
>>> i referred to the logs and manuals. i modified the clusterproperties file
>>> and then double-clicked the msi file
>>> however, it still failed.
>>> further i started the installation on command line by giving
>>> HDP_LAYOUT=clusterproperties file path,
>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>> redistributable package dependency
>>>
>>> i installed both and started again the installation.
>>> failed again with following error
>>> [image: Inline image 1]
>>>
>>> when i searched for the logs mentioned in the error , i could not find them
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Correct, you need to define the cluster configuration as part of a
>>>> file. You will find some information on the configuration file as part of
>>>> the documentation.
>>>>
>>>>
>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>
>>>> You should make sure to have also installed the pre requisite.
>>>>
>>>> Thanks
>>>> Olivier
>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>> thanks. sorry for the long break. actually got involved in some other
>>>>> priorities
>>>>> i downloaded the installer and while installing i got following error
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> do i need to make any configuration prior to installation ??
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>>> Here is the link
>>>>>>
>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>
>>>>>> Olivier
>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>> setup first using the url :
>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>
>>>>>>> i don't think i am running DN on both machines
>>>>>>> please find the attached log
>>>>>>>
>>>>>>> hi olivier
>>>>>>>
>>>>>>> can you please give me download link ?
>>>>>>> let me try please
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>
>>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>>> your DN logs?
>>>>>>>>
>>>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>>> Irfu,
>>>>>>>>>
>>>>>>>>> If you want to quickly get Hadoop running on the windows platform, you
>>>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>>>> msi on our website.
>>>>>>>>>
>>>>>>>>> Regards
>>>>>>>>> Olivier
>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>>
>>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>>> works
>>>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>>>
>>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>>
>>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>>
>>>>>>>>>> now, the exact problem is :
>>>>>>>>>> 1: datanode is not getting started
>>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>>>> windows. Not a good idea. Anyway, could you please show me your log files? I
>>>>>>>>>>> also need some additional info :
>>>>>>>>>>> - The exact problem which you are facing right now
>>>>>>>>>>> - Your cluster summary (no. of nodes etc.)
>>>>>>>>>>> - Your latest configuration files
>>>>>>>>>>> - Your /etc/hosts file
>>>>>>>>>>>
>>>>>>>>>>> Warm Regards,
>>>>>>>>>>> Tariq
>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> ok. thanks
>>>>>>>>>>>> now, i need to start with all windows setup first as our
>>>>>>>>>>>> product will be based on windows
>>>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>>>
>>>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>>>
>>>>>>>>>>>> regards,
>>>>>>>>>>>> irfan
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>>> UBuntu etc)
>>>>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux
>>>>>>>>>>>>>>> , unix etc )
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> however, my doubt is : as the file systems of both the
>>>>>>>>>>>>>>> systems (win and linux) are different , can datanodes of these systems
>>>>>>>>>>>>>>> be part of a single cluster , or do i have to make the windows cluster
>>>>>>>>>>>>>>> separate and the UNIX cluster separate ?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>>> progress for native Windows support, however there are no official releases
>>>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a release
>>>>>>>>>>>>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> however, when i open the pid file for the namenode, it
>>>>>>>>>>>>>>>>> shows the pid as 4560. on the contrary, it should show 2076
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked
>>>>>>>>>>>>>>>>>> at this already.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
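[Editor's note] Arpit's stale-pid suggestion above can be sketched as a small shell check. This is a minimal sketch, assuming the pid files live under /tmp (the usual default) and are named hadoop-<user>-<daemon>.pid; adjust PID_DIR to wherever your HADOOP_PID_DIR points.

```shell
# Minimal sketch: remove Hadoop pid files whose process is no longer alive.
# Assumption: pid files live in $PID_DIR (default /tmp) and are named
# hadoop-<user>-<daemon>.pid; adjust for your HADOOP_PID_DIR.
PID_DIR="${HADOOP_PID_DIR:-/tmp}"
for f in "$PID_DIR"/hadoop-*-namenode.pid "$PID_DIR"/hadoop-*-datanode.pid; do
  [ -e "$f" ] || continue              # glob matched nothing
  pid=$(cat "$f")
  if kill -0 "$pid" 2>/dev/null; then
    echo "live pid file: $f (pid $pid)"
  else
    echo "stale pid file: $f (pid $pid) - removing"
    rm -f "$f"
  fi
done
```

After removing stale files, start-dfs.sh should no longer report "running as process N. Stop it first." for daemons that are actually down.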
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that.
>>>>>>>>>>>>>>>>>>>>> i have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>>>> however, not able to browse url :
>>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for
>>>>>>>>>>>>>>>>>>>>>>> the configuration? Your *core-site.xml* is empty.
>>>>>>>>>>>>>>>>>>>>>>> Remove the property *fs.default.name* from
>>>>>>>>>>>>>>>>>>>>>>> *hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>> Remove *mapred.job.tracker* as well. It is required
>>>>>>>>>>>>>>>>>>>>>>> in *mapred-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
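[Editor's note] Tariq's property-placement advice above could look like the following minimal config sketch. The host names and ports are illustrative placeholders, not values taken from this thread; substitute your own namenode and jobtracker addresses.

```xml
<!-- core-site.xml: fs.default.name belongs here, not in hdfs-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- placeholder namenode URI; use your namenode's host:port -->
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: mapred.job.tracker belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <!-- placeholder jobtracker host:port -->
    <value>localhost:9001</value>
  </property>
</configuration>
```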
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> me your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
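[Editor's note] The directory preparation Tariq describes can be sketched as two commands. The paths below are placeholders (this thread uses wksp_name / wksp_data on Windows); substitute the exact values you put in dfs.name.dir and dfs.data.dir in hdfs-site.xml.

```shell
# Minimal sketch: create the dfs.name.dir and dfs.data.dir directories
# and set 755 permissions, as suggested above.
# Assumption: the paths under $HOME are placeholders for your configured values.
NAME_DIR="$HOME/hadoop/wksp_name"
DATA_DIR="$HOME/hadoop/wksp_data"
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
```

After formatting the namenode and starting dfs, block data lands under dfs.data.dir and the namespace metadata under dfs.name.dir.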
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> stuck with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operations like create, move, copy etc. are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file ,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you wrote both the paths as
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local paths, and you need not copy hadoop into hdfs... Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need to copy hadoop there), because you are going to do the processing on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the data in that text file. that is what hadoop is used for; first you
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need to make that clear in your mind, and then you will be able to do it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
>


-- 
Olivier Renault
Solution Engineer - Big Data - Hortonworks, Inc.
+44 7500 933 036
orenault@hortonworks.com
www.hortonworks.com
<http://hortonworks.com/products/hortonworks-sandbox/>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Could you share the log files ( c:\hdp.log,
c:\HadoopInstallFiles\HadoopSetupTools\hdp-1.3.0.0.winpkg.install.log )  as
well as your clusterproperties.txt ?

Thanks,
Olivier


On 5 September 2013 12:33, Irfan Sayed <ir...@gmail.com> wrote:

> thanks. i followed the user manual for deployment and installed all
> prerequisites
> i modified the command and still the issue persists. please suggest
>
> please refer below
>
>
> [image: Inline image 1]
>
> regards
> irfan
>
>
>
> On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>wrote:
>
>> The command to install it is msiexec /i msifile /...
>>
>> You will find the correct syntax as part of doc.
>>
>> Happy reading
>> Olivier
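[Editor's note] For reference, a command-line install of the HDP MSI generally takes the shape sketched below. The msi file name, log path, and HDP_LAYOUT value are illustrative placeholders, not values taken from this thread; check the exact syntax against the HDP for Windows documentation.

```
msiexec /i "hdp-1.3.0.0.winpkg.msi" /lv "hdp.winpkg.install.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```

Here /i selects the package to install and /lv writes a verbose log, which is the first place to look when the installer fails.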
>> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks.
>>> i referred the logs and manuals. i modified the clusterproperties file
>>> and then double click on the msi file
>>> however, it still failed.
>>> further i started the installation on command line by giving
>>> HDP_LAYOUT=clusterproperties file path,
>>> installation went ahead and it failed for .NET framework 4.0 and VC++
>>> redistributable package dependency
>>>
>>> i installed both and started again the installation.
>>> failed again with following error
>>> [image: Inline image 1]
>>>
>>> when i search for the logs mentioned in the error , i never found that
>>> please suggest
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Correct, you need to define the cluster configuration as part of a
>>>> file. You will find some information on the configuration file as part of
>>>> the documentation.
>>>>
>>>>
>>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>>
>>>> You should also make sure you have installed the prerequisites.
>>>>
>>>> Thanks
>>>> Olivier
>>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>> thanks. sorry for the long break. actually got involved in some other
>>>>> priorities
>>>>> i downloaded the installer and while installing i got following error
>>>>>
>>>>> [image: Inline image 1]
>>>>>
>>>>> do i need to make any configuration prior to installation ??
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>>> Here is the link
>>>>>>
>>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>>
>>>>>> Olivier
>>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>>> setup first using the url :
>>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>>
>>>>>>> i don't think i am running a DN on both machines
>>>>>>> please find the attached log
>>>>>>>
>>>>>>> hi olivier
>>>>>>>
>>>>>>> can you please give me download link ?
>>>>>>> let me try please
>>>>>>>
>>>>>>> regards
>>>>>>> irfan
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>>>>>>
>>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>>> your DN logs?
>>>>>>>>
>>>>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>>
>>>>>>>>> Irfu,
>>>>>>>>>
>>>>>>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>>>> msi on our website.
>>>>>>>>>
>>>>>>>>> Regards
>>>>>>>>> Olivier
>>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> thanks.
>>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>>
>>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>>> works
>>>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>>>
>>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>>
>>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>>
>>>>>>>>>> now, the exact problem is :
>>>>>>>>>> 1: datanode is not getting started
>>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>>>>> also need some additional info :
>>>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>>>> -Your cluster summary (no. of nodes etc.)
>>>>>>>>>>> -Your latest configuration files
>>>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>>>
>>>>>>>>>>> Warm Regards,
>>>>>>>>>>> Tariq
>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> ok. thanks
>>>>>>>>>>>> now, i need to start with all windows setup first as our
>>>>>>>>>>>> product will be based on windows
>>>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>>>
>>>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>>>
>>>>>>>>>>>> regards,
>>>>>>>>>>>> irfan
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>>
>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>>> UBuntu etc)
>>>>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux
>>>>>>>>>>>>>>> , unix etc )
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> however, my doubt is, as the file systems of both the
>>>>>>>>>>>>>>> systems (win and linux) are different, datanodes of these systems can
>>>>>>>>>>>>>>> not be part of a single cluster . do i have to make a windows cluster and a
>>>>>>>>>>>>>>> UNIX cluster separately ?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> however, when i open the pid file for namenode then it is
>>>>>>>>>>>>>>>>> not showing pid as : 4560. on the contrary, it should show : 2076
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked
>>>>>>>>>>>>>>>>>> at this already.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> -Arpit
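The stale-pid cleanup Arpit suggests can be sketched as follows; the pid directory and file-name pattern are assumptions (Hadoop 1.x defaults to /tmp and hadoop-<user>-<daemon>.pid), so point PID_DIR at your real pid directory:

```shell
#!/bin/sh
# Remove a datanode pid file only when no live process owns that pid.
PID_DIR=${PID_DIR:-/tmp}
for pidfile in "$PID_DIR"/hadoop-*-datanode.pid; do
  [ -e "$pidfile" ] || continue          # skip if the glob matched nothing
  pid=$(cat "$pidfile")
  # kill -0 only tests whether the process exists; it sends no signal.
  if ! kill -0 "$pid" 2>/dev/null; then
    echo "removing stale $pidfile (pid $pid is not running)"
    rm -f "$pidfile"
  fi
done
```

After the stale file is gone, restarting the datanode with start-dfs.sh should no longer report "datanode running as process NNNN".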
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mentioned in it.
>>>>>>>>>>>>>>>>>>>>> i have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>>>> however, not able to browse url :
>>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for
>>>>>>>>>>>>>>>>>>>>>>> the configuration? Your *core-site.xml* is empty.
>>>>>>>>>>>>>>>>>>>>>>> Remove the property *fs.default.name *from *
>>>>>>>>>>>>>>>>>>>>>>> hdfs-site.xml* and add it to *core-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>> Remove *mapred.job.tracker* as well. It is required
>>>>>>>>>>>>>>>>>>>>>>> in *mapred-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
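Tariq's fix maps onto the config files roughly as below; the host and port values are placeholders for a pseudo-distributed setup, not values taken from this thread:

```xml
<!-- core-site.xml: the filesystem URI lives here, not in hdfs-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value> <!-- placeholder host:port -->
  </property>
</configuration>

<!-- mapred-site.xml: the job tracker address lives here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value> <!-- placeholder host:port -->
  </property>
</configuration>
```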
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> me your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
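The directory/property pairing described above can be sketched as the following hdfs-site.xml fragment; the paths are placeholders (on Windows/Cygwin they would be c:-style paths), and both directories must exist with permissions 755 before starting dfs:

```xml
<!-- hdfs-site.xml: point name and data dirs at two separate,
     pre-created directories -->
<property>
  <name>dfs.name.dir</name>
  <value>/home/hadoop/wksp_name</value> <!-- namenode metadata -->
</property>
<property>
  <name>dfs.data.dir</name>
  <value>/home/hadoop/wksp_data</value> <!-- datanode blocks -->
</property>
```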
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> stuck with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
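Since the webUI is read-only, the create/move/copy operations mentioned above go through the FileSystem shell instead; a rough Hadoop 1.x session (all paths are placeholders) looks like:

```
$ bin/hadoop fs -mkdir /wksp                   # create a directory in HDFS
$ bin/hadoop fs -mv /wksp/a.txt /wksp/b.txt    # move / rename
$ bin/hadoop fs -cp /wksp/b.txt /wksp/c.txt    # copy
```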
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file ,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you had written both the paths
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> as local, and you need not copy hadoop into hdfs... Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need to copy hadoop there), because u are going to do processing on that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data in the text file. That's what hadoop is used for; first u need to make it
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> clear in ur mind. Then and only then will u do it... otherwise not possible..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
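With concrete paths filled in, the corrected call from the prompt shown above would look like the session below; sample.txt is an assumed file name, and /wksp must already exist in HDFS:

```
$ cd /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
$ ./bin/hadoop dfs -copyFromLocal /cygdrive/c/Users/Administrator/Desktop/sample.txt /wksp
$ ./bin/hadoop dfs -ls /wksp
```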
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
>


-- 
Olivier Renault
Solution Engineer - Big Data - Hortonworks, Inc.
+44 7500 933 036
orenault@hortonworks.com
www.hortonworks.com
<http://hortonworks.com/products/hortonworks-sandbox/>


Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks. i followed the user manual for deployment and installed all
pre-requisites.
i modified the command, but the issue still persists. please suggest

please refer below


[image: Inline image 1]

regards
irfan



On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com> wrote:

> The command to install it is msiexec /i msifile /...
>
> You will find the correct syntax as part of doc.
>
> Happy reading
> Olivier
> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> thanks.
>> i referred to the logs and manuals. i modified the clusterproperties file
>> and then double-clicked the msi file.
>> however, it still failed.
>> further, i started the installation from the command line, passing
>> HDP_LAYOUT=<clusterproperties file path>;
>> installation went ahead but then failed on the .NET Framework 4.0 and VC++
>> redistributable package dependencies
>>
>> i installed both and started the installation again.
>> it failed again with the following error
>> [image: Inline image 1]
>>
>> when i searched for the logs mentioned in the error, i could not find them
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Correct, you need to define the cluster configuration as part of a file.
>>> You will find some information on the configuration file as part of the
>>> documentation.
>>>
>>>
>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>
>>> You should also make sure to have installed the prerequisites.
>>>
>>> Thanks
>>> Olivier
>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks. sorry for the long break. actually got involved in some other
>>>> priorities
>>>> i downloaded the installer and while installing i got following error
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> do i need to make any configuration prior to installation ??
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Here is the link
>>>>>
>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>
>>>>> Olivier
>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks.
>>>>>> i just followed the instructions to set up the pseudo-distributed
>>>>>> mode first, using the url :
>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>
>>>>>> i don't think i am running DN on both machines
>>>>>> please find the attached log
>>>>>>
>>>>>> hi olivier
>>>>>>
>>>>>> can you please give me download link ?
>>>>>> let me try please
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>>>>>
>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>> your DN logs?
>>>>>>>
>>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>>> Irfu,
>>>>>>>>
>>>>>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>>> msi on our website.
>>>>>>>>
>>>>>>>> Regards
>>>>>>>> Olivier
>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>
>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>> works
>>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>>
>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>
>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>
>>>>>>>>> now, the exact problem is :
>>>>>>>>> 1: datanode is not getting started
>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>> should get replicated to all another available datanodes
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>>> Windows? Not a good idea. Anyway, could you please show me your log files? I
>>>>>>>>>> also need some additional info :
>>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>>> -Your cluster summary (no. of nodes etc)
>>>>>>>>>> -Your latest configuration files
>>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> ok. thanks
>>>>>>>>>>> now, i need to start with an all-windows setup first, as our product
>>>>>>>>>>> will be based on windows.
>>>>>>>>>>> so, please tell me how to resolve the issue:
>>>>>>>>>>>
>>>>>>>>>>> the datanode is not starting. please suggest
>>>>>>>>>>>
>>>>>>>>>>> regards,
>>>>>>>>>>> irfan
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>
>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>> Tariq
>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>> can i have a setup like this :
>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>> Ubuntu etc)
>>>>>>>>>>>>>> and datanodes are a combination of any OS (windows, linux,
>>>>>>>>>>>>>> unix etc)
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> however, my doubt is, as the file systems of the two
>>>>>>>>>>>>>> systems (win and linux) are different, datanodes of these systems can
>>>>>>>>>>>>>> not be part of a single cluster. do i have to make the windows cluster and the
>>>>>>>>>>>>>> UNIX cluster separate ?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a release
>>>>>>>>>>>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> however, when i open the pid file for the namenode, it
>>>>>>>>>>>>>>>> shows the pid as 4560. on the contrary, it should show 2076
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> -Arpit
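
The stale-pid cleanup Arpit describes can be sketched as a few shell commands. This is a minimal sketch: the pid directory and file name below are illustrative assumptions (real installs keep pid files under HADOOP_PID_DIR, which defaults to /tmp), and the final restart needs an actual Hadoop install, so it is left commented.

```shell
# Simulate a stale datanode pid file, then delete it before restarting.
# /tmp/hadoop-pid-demo stands in for HADOOP_PID_DIR (an assumption).
PID_DIR=/tmp/hadoop-pid-demo
mkdir -p "$PID_DIR"
echo 4076 > "$PID_DIR/hadoop-administrator-datanode.pid"  # stale entry
rm -f "$PID_DIR"/hadoop-*-datanode.pid                    # remove stale pid files
ls -A "$PID_DIR"                                          # directory is empty again
# ./start-dfs.sh   # then restart DFS (requires a real Hadoop install)
```

After the stale file is gone, start-dfs.sh no longer sees a leftover pid and will actually launch the daemon instead of printing "datanode running as process N. Stop it first."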
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> these two outputs contradict each other
>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mentioned in it. i
>>>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>>> however, i am not able to browse url :
>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>>>> the property *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as
>>>>>>>>>>>>>>>>>>>>>> well. It belongs in *mapred-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> I would suggest you do a pseudo-distributed setup
>>>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process, and then proceed
>>>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link
>>>>>>>>>>>>>>>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>
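
The property split Tariq describes can be sketched as below. This is only a sketch: the conf directory under /tmp and the localhost:9000 / localhost:9001 values are placeholders, not taken from the thread; in a real Hadoop 1.x install these files live under the conf/ directory.

```shell
# Write fs.default.name into core-site.xml and mapred.job.tracker into
# mapred-site.xml, as suggested above. Host/port values are illustrative.
CONF_DIR=/tmp/hadoop-conf-demo
mkdir -p "$CONF_DIR"
cat > "$CONF_DIR/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF
cat > "$CONF_DIR/mapred-site.xml" <<'EOF'
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF
grep -l 'fs.default.name' "$CONF_DIR"/*.xml   # only core-site.xml should match
```

The point of the split is that the filesystem URI is a core setting read by all daemons, while the JobTracker address is read only by the MapReduce layer.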
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. I was out of touch
>>>>>>>>>>>>>>>>>>>>>>>> for some time because of Ramzan and Eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dirs need to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dirs?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
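
The directory setup Tariq describes can be sketched like this. The wksp_name/wksp_data names mirror the directories mentioned later in the thread, but the /tmp base path is an assumption, and the final format step needs a real Hadoop install, so it is left commented.

```shell
# Create the local directories backing dfs.name.dir and dfs.data.dir,
# then give them 755 permissions as suggested above.
BASE=/tmp/hdfs-dirs-demo
mkdir -p "$BASE/wksp_name" "$BASE/wksp_data"
chmod 755 "$BASE/wksp_name" "$BASE/wksp_data"
ls -ld "$BASE"/wksp_*
# bin/hadoop namenode -format   # then format HDFS (requires hadoop on PATH)
```

Once HDFS is running, blocks written to the cluster land under the dfs.data.dir path and the namespace metadata under dfs.name.dir, exactly as described in the message above.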
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you wrote both the paths as
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local, and you do not need to copy hadoop into hdfs... Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check it out in the browser after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting your single-node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then follow the "browse the filesystem"
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link in it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to copy hadoop there), because you are going to do processing on the data in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> that text file. that is what hadoop is used for; first you need to make this clear in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your mind, and then you will get it working... otherwise it is not possible.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
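
Manish's corrected invocation can be filled out roughly as below. The paths are placeholders (not taken from the thread), and the hadoop commands themselves need a running cluster, so they are shown commented.

```shell
# Prepare a small local text file to upload, then (against a live cluster)
# copy it into an HDFS directory with copyFromLocal.
echo "sample input data" > /tmp/sample-input.txt
cat /tmp/sample-input.txt
# From the Hadoop install directory, with HDFS running:
# ./bin/hadoop dfs -mkdir /wksp
# ./bin/hadoop dfs -copyFromLocal /tmp/sample-input.txt /wksp
# ./bin/hadoop dfs -ls /wksp        # verify the upload
```

Note the shape: the first argument is a full local path, the second an HDFS path; the earlier failures happened because the local path pointed at a file that did not exist.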
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes, i am a newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need a windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks. i followed the user manual for deployment and installed all
prerequisites.
i modified the command and the issue still persists. please suggest

please refer below


[image: Inline image 1]

regards
irfan



On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>wrote:

> The command to install it is msiexec /i msifile /...
>
> You will find the correct syntax as part of doc.
>
> Happy reading
> Olivier
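As a reference for that command shape, here is a sketch of a typical invocation. Only msiexec's standard `/i` (install) and `/lv` (verbose log) switches and the HDP_LAYOUT property mentioned later in this thread are assumed; the msi, log, and clusterproperties file names are placeholders, and the authoritative syntax is in the HDP for Windows documentation.

```
msiexec /i "hdp-1.3.0.winpkg.msi" /lv "hdp-install.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```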
> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> thanks.
>> i referred the logs and manuals. i modified the clusterproperties file
>> and then double-clicked the msi file.
>> however, it still failed.
>> further, i started the installation on the command line by giving
>> HDP_LAYOUT=<path to clusterproperties file>;
>> installation went ahead and it failed on the .NET framework 4.0 and VC++
>> redistributable package dependencies
>>
>> i installed both and started the installation again.
>> it failed again with the following error
>> [image: Inline image 1]
>>
>> when i searched for the logs mentioned in the error, i could not find them.
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Correct, you need to define the cluster configuration as part of a file.
>>> You will find some information on the configuration file as part of the
>>> documentation.
>>>
>>>
>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>
>>> You should also make sure to have installed the prerequisites.
>>>
>>> Thanks
>>> Olivier
>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks. sorry for the long break. actually got involved in some other
>>>> priorities
>>>> i downloaded the installer and while installing i got following error
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> do i need to make any configuration prior to installation ??
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Here is the link
>>>>>
>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>
>>>>> Olivier
>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks.
>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>> setup first using the url :
>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>
>>>>>> i don't think i am running a DN on both machines
>>>>>> please find the attached log
>>>>>>
>>>>>> hi olivier
>>>>>>
>>>>>> can you please give me download link ?
>>>>>> let me try please
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>
>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>> your DN logs?
>>>>>>>
>>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>>> Irfu,
>>>>>>>>
>>>>>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>>>>>> may want to try our distribution for Windows. You will be able to find
>>>>>>>> the msi on our website.
>>>>>>>>
>>>>>>>> Regards
>>>>>>>> Olivier
>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>
>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>> works
>>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>>
>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>
>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>
>>>>>>>>> now, the exact problem is :
>>>>>>>>> 1: datanode is not getting started
>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>> should get replicated to all another available datanodes
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>>>> also need some additional info :
>>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>>>> -Your latest configuration files
>>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> ok. thanks
>>>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>>>> will be based on windows
>>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>>
>>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>>
>>>>>>>>>>> regards,
>>>>>>>>>>> irfan
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>
>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>> Tariq
>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>> UBuntu etc)
>>>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux
>>>>>>>>>>>>>> , unix etc )
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> however, my doubt is: as the file systems of both the
>>>>>>>>>>>>>> systems (win and linux) are different, can datanodes of these systems
>>>>>>>>>>>>>> not be part of a single cluster? do i have to make the windows cluster
>>>>>>>>>>>>>> and the UNIX cluster separate?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>> progress for native Windows support, however there are no official releases
>>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> however, when i open the pid file for the namenode, it shows
>>>>>>>>>>>>>>>> the pid as 4560; it should show 2076
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>> the datanode.
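On a default 1.x install the pid files land under /tmp as hadoop-<user>-<daemon>.pid; the cleanup Arpit describes looks roughly like this (a transcript sketch run from the Hadoop install directory, paths assumed):

```
$ ./bin/stop-dfs.sh
$ rm /tmp/hadoop-*-datanode.pid /tmp/hadoop-*-namenode.pid
$ ./bin/start-dfs.sh
```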
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mentioned in it. i
>>>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Now, i am able to browse the url http://localhost:50070 (name node),
>>>>>>>>>>>>>>>>>>>> however, i am not able to browse the url
>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>>>> the property *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as
>>>>>>>>>>>>>>>>>>>>>> well. It is required in *mapred-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>if you need some help. Let me know if you face any issue.
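A minimal sketch of the property placement described above, for Hadoop 1.x; the host and port values are placeholders, not taken from this thread:

```xml
<!-- core-site.xml: the filesystem default belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value> <!-- placeholder host:port -->
  </property>
</configuration>

<!-- mapred-site.xml: the job tracker address belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value> <!-- placeholder host:port -->
  </property>
</configuration>
```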
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>>>> for some time because of Ramzan and Eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
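A minimal hdfs-site.xml sketch of the two-directory setup described above; the paths are placeholders:

```xml
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <!-- namenode metadata; create manually, chmod 755 -->
    <value>/home/hadoop/dfs/name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <!-- datanode block storage; create manually, chmod 755 -->
    <value>/home/hadoop/dfs/data</value>
  </property>
</configuration>
```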
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you gave local paths for both
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> arguments, and you do not need to copy hadoop into hdfs... Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check in the browser after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting your single node cluster:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then follow the "Browse the filesystem"
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link in it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to copy hadoop there), because you are going to do processing on the data in that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> text file. That is what hadoop is used for; first you need to make this clear in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your mind, and then you will be able to do it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks. i followed the user manual for deployment and installed all
prerequisites.
i modified the command and still the issue persists. please suggest

please refer below


[image: Inline image 1]

regards
irfan
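For context on the installs discussed below: the HDP for Windows MSI is driven by a cluster definition file passed via HDP_LAYOUT. A hedged single-node sketch of such a clusterproperties.txt (property names as I recall them from the HDP 1.x for Windows docs; hostnames and paths are illustrative):

```
#Log and data directories
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data

#Single-node layout: all roles on one host
NAMENODE_HOST=DFS-DC
SECONDARY_NAMENODE_HOST=DFS-DC
JOBTRACKER_HOST=DFS-DC
SLAVE_HOSTS=DFS-DC
```

The MSI is then launched roughly as `msiexec /i <msi file> /lv <log file> HDP_LAYOUT=<path to this file>`; the exact syntax is in the HDP documentation linked later in this thread.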



On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>wrote:

> The command to install it is msiexec /i msifile /...
>
> You will find the correct syntax as part of doc.
>
> Happy reading
> Olivier
> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> thanks.
>> i referred the logs and manuals. i modified the clusterproperties file
>> and then double clicked on the msi file;
>> however, it still failed.
>> further i started the installation on command line by giving
>> HDP_LAYOUT=clusterproperties file path,
>> installation went ahead and it failed for .NET framework 4.0 and VC++
>> redistributable package dependency
>>
>> i installed both and started again the installation.
>> failed again with following error
>> [image: Inline image 1]
>>
>> when i searched for the logs mentioned in the error, i never found them
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Correct, you need to define the cluster configuration as part of a file.
>>> You will find some information on the configuration file as part of the
>>> documentation.
>>>
>>>
>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>
>>> You should also make sure to have installed the prerequisites.
>>>
>>> Thanks
>>> Olivier
>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks. sorry for the long break. actually got involved in some other
>>>> priorities
>>>> i downloaded the installer and while installing i got following error
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> do i need to make any configuration prior to installation ??
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Here is the link
>>>>>
>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>
>>>>> Olivier
>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks.
>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>> setup first using the url :
>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>
>>>>>> i don't think i am running a DN on both machines
>>>>>> please find the attached log
>>>>>>
>>>>>> hi olivier
>>>>>>
>>>>>> can you please give me download link ?
>>>>>> let me try please
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>
>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>> your DN logs?
>>>>>>>
>>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>>> Irfu,
>>>>>>>>
>>>>>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>>> msi on our website.
>>>>>>>>
>>>>>>>> Regards
>>>>>>>> Olivier
>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>
>>>>>>>>> because, on windows , anyway i have to try and see how hadoop
>>>>>>>>> works
>>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>>
>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>
>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>
>>>>>>>>> now, the exact problem is :
>>>>>>>>> 1: datanode is not getting started
>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>>> windows. Not a good idea. Anyway, could you please show me your log files? I
>>>>>>>>>> also need some additional info:
>>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>>>> -Your latest configuration files
>>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> ok. thanks
>>>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>>>> will be based on windows
>>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>>
>>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>>
>>>>>>>>>>> regards,
>>>>>>>>>>> irfan
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>
>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>> Tariq
>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>> UBuntu etc)
>>>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux
>>>>>>>>>>>>>> , unix etc )
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> however, my doubt is: as the file systems of both the
>>>>>>>>>>>>>> systems (win and linux) are different, datanodes of these systems can
>>>>>>>>>>>>>> not be part of a single cluster. do i have to make a windows cluster separate and a
>>>>>>>>>>>>>> UNIX cluster separate?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> however, when i open the pid file for the namenode, it
>>>>>>>>>>>>>>>> shows the pid as 4560; on the contrary, it should show 2076
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> -Arpit
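The stale-pid situation Arpit describes can be illustrated with a small, self-contained sketch (this is not the actual Hadoop start script; the pid-file path and the `sleep` daemon are stand-ins):

```shell
# A pid file is "stale" when the process it names is no longer alive.
# The start scripts refuse to start a daemon while its pid file points
# at a live process, hence "... running as process N. Stop it first."
PID_FILE=/tmp/hadoop-demo-datanode.pid

sleep 1 &                    # stand-in for a daemon that will exit
DEMO_PID=$!
echo "$DEMO_PID" > "$PID_FILE"
wait "$DEMO_PID"             # daemon is gone, pid file left behind

if kill -0 "$DEMO_PID" 2>/dev/null; then
  echo "datanode running as process $DEMO_PID. Stop it first."
else
  rm -f "$PID_FILE"          # stale: safe to delete, then rerun start-dfs.sh
  echo "removed stale pid file"
fi
```

Deleting the stale file and restarting, as suggested above, is the usual fix.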
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mentioned in it. i
>>>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>>> however, not able to browse url :
>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>>>> the property *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as
>>>>>>>>>>>>>>>>>>>>>> well. It is required in *mapred-site.xml*.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
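The split Tariq describes maps to config fragments roughly like these (the values shown are the common Hadoop 1.x pseudo-distributed defaults, given here only as an illustration):

core-site.xml:

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

mapred-site.xml:

```xml
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```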
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
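A variant of the above that keeps the two properties in separate locations, as suggested elsewhere in the thread (the "wksp_data"/"wksp_name" names come up later in the discussion; treat the exact paths as illustrative):

```xml
<!-- hdfs-site.xml: separate directories for block data and namespace metadata -->
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>   <!-- illustrative path -->
</property>
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>   <!-- illustrative path -->
</property>
```

Pointing both properties at the same directory mixes block data with namespace metadata, which is why the thread later moves to two separate directories.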
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you gave local paths for both arguments, and you do not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need to copy Hadoop itself into HDFS; Hadoop is already running.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> After starting your single-node cluster, check in the browser:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then follow the "Browse the filesystem" link there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory, create one there; that is your HDFS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory. Then copy a text file into it (there is no need to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copy Hadoop itself), because that text file is the data you will
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> be processing. That is what Hadoop is for, so it helps to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> clear about this first.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks. i followed the user manual for deployment and installed all
pre-requisites.
i modified the command and still the issue persists. please suggest

please refer below


[image: Inline image 1]

regards
irfan



On Wed, Sep 4, 2013 at 5:13 PM, Olivier Renault <or...@hortonworks.com>wrote:

> The command to install it is msiexec /i msifile /...
>
> You will find the correct syntax as part of doc.
>
> Happy reading
> Olivier
> On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> thanks.
>> i referred to the logs and manuals. i modified the clusterproperties file
>> and then double-clicked on the msi file;
>> however, it still failed.
>> further, i started the installation on the command line by giving
>> HDP_LAYOUT=clusterproperties file path,
>> and the installation went ahead but failed on the .NET framework 4.0 and VC++
>> redistributable package dependencies
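For what it's worth, a typical command-line invocation looks roughly like the following; the msi and file names here are placeholders, not taken from the thread. `/i` installs the package and `/lv` writes a verbose log that is useful when the installer fails without a clear message:

```
msiexec /i "hdp-win.msi" /lv "hdp-install.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```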
>>
>> i installed both and started again the installation.
>> failed again with following error
>> [image: Inline image 1]
>>
>> when i searched for the logs mentioned in the error, i never found them
>> please suggest
>>
>> regards
>> irfan
>>
>>
>>
>> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Correct, you need to define the cluster configuration as part of a file.
>>> You will find some information on the configuration file as part of the
>>> documentation.
>>>
>>>
>>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>>
>>> You should make sure to have also installed the pre requisite.
>>>
>>> Thanks
>>> Olivier
>>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks. sorry for the long break. actually got involved in some other
>>>> priorities
>>>> i downloaded the installer and while installing i got following error
>>>>
>>>> [image: Inline image 1]
>>>>
>>>> do i need to make any configuration prior to installation ??
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Here is the link
>>>>>
>>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>>
>>>>> Olivier
>>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks.
>>>>>> i just followed the instructions to setup the pseudo distributed
>>>>>> setup first using the url :
>>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>>
>>>>>> i don't think i am running a DN on both machines
>>>>>> please find the attached log
>>>>>>
>>>>>> hi olivier
>>>>>>
>>>>>> can you please give me download link ?
>>>>>> let me try please
>>>>>>
>>>>>> regards
>>>>>> irfan
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>
>>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>>> your DN logs?
>>>>>>>
>>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>>> orenault@hortonworks.com> wrote:
>>>>>>>
>>>>>>>> Irfu,
>>>>>>>>
>>>>>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>>> msi on our website.
>>>>>>>>
>>>>>>>> Regards
>>>>>>>> Olivier
>>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>>
>>>>>>>>> thanks.
>>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>>
>>>>>>>>> because, on windows, anyway i have to try and see how hadoop
>>>>>>>>> works.
>>>>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>>>>
>>>>>>>>> so, on windows , here is the setup:
>>>>>>>>>
>>>>>>>>> namenode : windows 2012 R2
>>>>>>>>> datanode : windows 2012 R2
>>>>>>>>>
>>>>>>>>> now, the exact problem is :
>>>>>>>>> 1: datanode is not getting started
>>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>>> should get replicated to all another available datanodes
>>>>>>>>>
>>>>>>>>> regards
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>>> Windows. Not a good idea. Anyway, could you please show me your log
>>>>>>>>>> files? I also need some additional info:
>>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>>>> -Your latest configuration files
>>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> ok. thanks
>>>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>>>> will be based on windows
>>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>>
>>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>>
>>>>>>>>>>> regards,
>>>>>>>>>>> irfan
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from
>>>>>>>>>>>> doing that. But it is not a very wise setup.
>>>>>>>>>>>>
>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>> Tariq
>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS,
>>>>>>>>>>>>>> UBuntu etc)
>>>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux
>>>>>>>>>>>>>> , unix etc )
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> however, my doubt is,  as the file systems of  both the
>>>>>>>>>>>>>> systems (win and linux ) are different ,  datanodes of these systems can
>>>>>>>>>>>>>> not be part of single cluster . i have to make windows cluster separate and
>>>>>>>>>>>>>> UNIX cluster separate ?
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>>> progress for native Windows support; however, there are no official
>>>>>>>>>>>>>>> releases with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux
>>>>>>>>>>>>>>> if you are new to it.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> however, when i open the pid file for the namenode, it shows
>>>>>>>>>>>>>>>> the pid as 4560. on the contrary, it should show 2076
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>>> the datanode.
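The start scripts decide "running" purely from the presence of that pid file; they do not check whether the recorded process is still alive. A small POSIX-sh sketch of that check, and of clearing a stale file; the pid value and temp path are made up for the demonstration:

```shell
#!/bin/sh
# Simulate a stale pid file: the file exists but the recorded process is gone.
PID_FILE=$(mktemp)            # stands in for \tmp\hadoop-<user>-datanode.pid
echo 4999999 > "$PID_FILE"    # larger than Linux's maximum pid, so never alive

if kill -0 "$(cat "$PID_FILE")" 2>/dev/null; then
  # This is the case the scripts report as
  # "datanode running as process N. Stop it first."
  echo "process alive; stop it first"
else
  echo "stale pid file; safe to delete and restart the datanode"
  rm -f "$PID_FILE"
fi
```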
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously
>>>>>>>>>>>>>>>>>> but fails
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792.
>>>>>>>>>>>>>>>>>> Stop it first.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that. i
>>>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>>> however, not able to browse url :
>>>>>>>>>>>>>>>>>>>> http://localhost:50030
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>>>> the property *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as
>>>>>>>>>>>>>>>>>>>>>> well; it is required in *mapred-site.xml*.
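A minimal sketch of that split; the host/port values are the usual single-node examples, assumptions rather than values from this thread:

```xml
<!-- core-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>  <!-- assumed example value -->
</property>

<!-- mapred-site.xml -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>         <!-- assumed example value -->
</property>
```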
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link
>>>>>>>>>>>>>>>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>>>>>>>>>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you please show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use JPS
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> updated on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
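The directory preparation Tariq describes above can be sketched in shell. This is a minimal illustration using a throwaway scratch path (/tmp/wksp_demo), not the real dfs.name.dir/dfs.data.dir values from any config in this thread:

```shell
# Create the two local directories that dfs.name.dir and dfs.data.dir
# will point at, then set their permissions to 755 as suggested above.
WKSP="/tmp/wksp_demo"                    # stand-in for the real base path
mkdir -p "$WKSP/wksp_name" "$WKSP/wksp_data"
chmod 755 "$WKSP/wksp_name" "$WKSP/wksp_data"
ls -ld "$WKSP/wksp_name" "$WKSP/wksp_data"
```

With real paths, the two properties in hdfs-site.xml must then be pointed at these directories and the namenode formatted so the metadata directory gets initialized.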
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
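One issue with the snippet above: dfs.name.dir and dfs.data.dir point at the same directory. They should be two distinct paths, as is in fact done later in the thread with the wksp_name/wksp_data directories. An illustrative sketch (the exact paths are placeholders, not taken from any working config):

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp\\wksp_data</value>
</property>
```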
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manish dunani <ma...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you gave both paths as local ones, and you
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need not copy hadoop into hdfs... Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check in your browser after starting your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> single node cluster:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> link in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need to copy
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> hadoop there), because you are going to process the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data in that text file. That is what hadoop is used
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> for; first make it clear in your mind, then you will
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> be able to do it... otherwise it is not possible.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
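The two copyFromLocal failures quoted above happened because the *local* source path did not exist, not because of anything on the HDFS side. A small sketch of checking the local side first; the file name and HDFS path are illustrative, and the hadoop invocation is commented out since it needs a running cluster:

```shell
# Verify the local file exists before attempting the HDFS copy.
LOCALFILE="/tmp/sample_input.txt"
echo "some input data" > "$LOCALFILE"    # create a demo file to copy
if [ -f "$LOCALFILE" ]; then
    echo "local file present, safe to copy"
    # ./bin/hadoop dfs -copyFromLocal "$LOCALFILE" /wksp
else
    echo "local file missing: $LOCALFILE" >&2
fi
```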
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>>
>

Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
The command to install it is msiexec /i msifile /...

You will find the correct syntax in the documentation.

Happy reading
Olivier
On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:

> thanks.
> i referred the logs and manuals. i modified the clusterproperties file and
> then double click on the msi file
> however, it still failed.
> further i started the installation on command line by giving
> HDP_LAYOUT=clusterproperties file path,
> installation went ahead and it failed for .NET framework 4.0 and VC++
> redistributable package dependency
>
> i installed both and started again the installation.
> failed again with following error
> [image: Inline image 1]
>
> when i search for the logs mentioned in the error , i never found that
> please suggest
>
> regards
> irfan
>
>
>
> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Correct, you need to define the cluster configuration as part of a file.
>> You will find some information on the configuration file as part of the
>> documentation.
>>
>>
>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>
>> You should make sure to have also installed the prerequisites.
>>
>> Thanks
>> Olivier
>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks. sorry for the long break. actually got involved in some other
>>> priorities
>>> i downloaded the installer and while installing i got following error
>>>
>>> [image: Inline image 1]
>>>
>>> do i need to make any configuration prior to installation ??
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Here is the link
>>>>
>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>
>>>> Olivier
>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>> thanks.
>>>>> i just followed the instructions to setup the pseudo distributed setup
>>>>> first using the url :
>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>
>>>>> i don't think so i am running DN on both machine
>>>>> please find the attached log
>>>>>
>>>>> hi olivier
>>>>>
>>>>> can you please give me download link ?
>>>>> let me try please
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>
>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>> your DN logs?
>>>>>>
>>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>>> Irfu,
>>>>>>>
>>>>>>> If you want to quickly get Hadoop running on windows platform. You
>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>> msi on our website.
>>>>>>>
>>>>>>> Regards
>>>>>>> Olivier
>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>
>>>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>
>>>>>>>> so, on windows , here is the setup:
>>>>>>>>
>>>>>>>> namenode : windows 2012 R2
>>>>>>>> datanode : windows 2012 R2
>>>>>>>>
>>>>>>>> now, the exact problem is :
>>>>>>>> 1: datanode is not getting started
>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>> should get replicated to all another available datanodes
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>> Windows. Not a good idea. Anyway, could you please show me your log
>>>>>>>>> files? I also need some additional info:
>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>>> -Your latest configuration files
>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <irfu.sayed@gmail.com
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> ok. thanks
>>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>>> will be based on windows
>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>
>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>
>>>>>>>>>> regards,
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>>>
>>>>>>>>>>> Warm Regards,
>>>>>>>>>>> Tariq
>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> please suggest
>>>>>>>>>>>>
>>>>>>>>>>>> regards
>>>>>>>>>>>> irfan
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>>>> etc)
>>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>>>> unix etc )
>>>>>>>>>>>>>
>>>>>>>>>>>>> however, my doubt is,  as the file systems of  both the
>>>>>>>>>>>>> systems (win and linux ) are different ,  datanodes of these systems can
>>>>>>>>>>>>> not be part of single cluster . i have to make windows cluster separate and
>>>>>>>>>>>>> UNIX cluster separate ?
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> however, when i open the pid file for namenode then it is
>>>>>>>>>>>>>>> not showing pid as : 4560. on the contrary, it should show : 2076
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> -Arpit
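Arpit's stale-pid check above can be sketched as follows. The pid file path is invented for illustration (the real one lives under the location named in his message), and a deliberately impossible pid simulates the stale state:

```shell
# Simulate a stale pid file left behind by a crashed datanode.
PIDFILE="/tmp/hadoop-demo-datanode.pid"   # illustrative path only
echo 9999999 > "$PIDFILE"                 # a pid that cannot be running
# If the recorded pid is no longer alive, the file is stale: remove it.
if [ -f "$PIDFILE" ] && ! kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
    echo "removing stale pid file $PIDFILE"
    rm -f "$PIDFILE"
fi
```

After clearing the stale file, restarting the datanode writes a fresh pid file.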
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that. i
>>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>>> the property *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well.
>>>>>>>>>>>>>>>>>>>>> It is required in *mapred-site.xml*.
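A sketch of where those two properties belong, with placeholder host/port values rather than anything taken from this cluster:

```xml
<!-- core-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```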
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>>> for some time because of Ramzan and Eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK, we'll start fresh. Could you please show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use jps to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
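The directory setup described above can be sketched as a couple of shell commands. The paths below are placeholders, not taken from the thread; substitute whatever your dfs.name.dir and dfs.data.dir actually point to.

```shell
# Sketch of the setup described above. The paths are placeholders --
# use the values of dfs.name.dir and dfs.data.dir from hdfs-site.xml.
NAME_DIR=/tmp/hdfs_demo/wksp_name
DATA_DIR=/tmp/hdfs_demo/wksp_data

# Create both directories and give them 755 permissions.
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"

# Show the result.
ls -ld "$NAME_DIR" "$DATA_DIR"
```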
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> The HDFS web UI doesn't give us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create a file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc., but operations like create, move and copy are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
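As an aside on the snippet above: both properties point at the same directory (c:\\wksp). The usual layout keeps name metadata and block data apart; a hedged sketch, using the wksp_name/wksp_data names that come up later in the thread:

```xml
<!-- Hypothetical layout: name metadata and block data in two
     distinct directories rather than sharing c:\\wksp. -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```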
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because the paths you wrote are both
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local, and you don't need to copy Hadoop itself into HDFS... Hadoop is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> already working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory there, make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> one.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your HDFS directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file into it (no need
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to copy Hadoop there), because you are going to do the processing on the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data in that text file. That is what Hadoop is used for; first you need to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> make that clear in your mind, and then you will be able to do it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
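The failures earlier in the thread were plain "File ... does not exist" errors for the local path, so a quick existence check before copying helps. A minimal sketch; the file name and HDFS target are made up, and the hadoop invocation is left commented out because it needs a running cluster:

```shell
# Verify the local source exists before calling copyFromLocal; the
# earlier errors were "File ... does not exist" for exactly this reason.
LOCAL_FILE=/tmp/sample_input.txt
echo "some text to process" > "$LOCAL_FILE"

if [ -f "$LOCAL_FILE" ]; then
  echo "local file present: $LOCAL_FILE"
  # bin/hadoop dfs -copyFromLocal "$LOCAL_FILE" /wksp   # needs a running cluster
else
  echo "missing: $LOCAL_FILE" >&2
fi
```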
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
The command to install it is msiexec /i msifile /...

You will find the correct syntax as part of doc.

Happy reading
Olivier
On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:

> thanks.
> i referred the logs and manuals. i modified the clusterproperties file and
> then double click on the msi file
> however, it still failed.
> further i started the installation on command line by giving
> HDP_LAYOUT=clusterproperties file path,
> installation went ahead and it failed for .NET framework 4.0 and VC++
> redistributable package dependency
>
> i installed both and started again the installation.
> failed again with following error
> [image: Inline image 1]
>
> when i search for the logs mentioned in the error , i never found that
> please suggest
>
> regards
> irfan
>
>
>
> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Correct, you need to define the cluster configuration as part of a file.
>> You will find some information on the configuration file as part of the
>> documentation.
>>
>>
>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>
>> You should make sure to have also installed the pre requisite.
>>
>> Thanks
>> Olivier
>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks. sorry for the long break. actually got involved in some other
>>> priorities
>>> i downloaded the installer and while installing i got following error
>>>
>>> [image: Inline image 1]
>>>
>>> do i need to make any configuration prior to installation ??
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Here is the link
>>>>
>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>
>>>> Olivier
>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>> thanks.
>>>>> i just followed the instructions to setup the pseudo distributed setup
>>>>> first using the url :
>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>
>>>>> i don't think so i am running DN on both machine
>>>>> please find the attached log
>>>>>
>>>>> hi olivier
>>>>>
>>>>> can you please give me download link ?
>>>>> let me try please
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>
>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>> your DN logs?
>>>>>>
>>>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>>> Irfu,
>>>>>>>
>>>>>>> If you want to quickly get Hadoop running on windows platform. You
>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>> msi on our website.
>>>>>>>
>>>>>>> Regards
>>>>>>> Olivier
>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>
>>>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>
>>>>>>>> so, on windows , here is the setup:
>>>>>>>>
>>>>>>>> namenode : windows 2012 R2
>>>>>>>> datanode : windows 2012 R2
>>>>>>>>
>>>>>>>> now, the exact problem is :
>>>>>>>> 1: datanode is not getting started
>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>> should get replicated to all other available datanodes
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>> Windows. Not a good idea. Anyway, could you please show me your log
>>>>>>>>> files? I also need some additional info:
>>>>>>>>> - The exact problem which you are facing right now
>>>>>>>>> - Your cluster summary (no. of nodes etc.)
>>>>>>>>> - Your latest configuration files
>>>>>>>>> - Your /etc/hosts file
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <irfu.sayed@gmail.com
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> ok. thanks
>>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>>> will be based on windows
>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>
>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>
>>>>>>>>>> regards,
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>>>
>>>>>>>>>>> Warm Regards,
>>>>>>>>>>> Tariq
>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> please suggest
>>>>>>>>>>>>
>>>>>>>>>>>> regards
>>>>>>>>>>>> irfan
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>>>> etc)
>>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>>>> unix etc )
>>>>>>>>>>>>>
>>>>>>>>>>>>> however, my doubt is,  as the file systems of  both the
>>>>>>>>>>>>> systems (win and linux ) are different ,  datanodes of these systems can
>>>>>>>>>>>>> not be part of single cluster . i have to make windows cluster separate and
>>>>>>>>>>>>> UNIX cluster separate ?
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> however, when i open the pid file for the namenode, it is
>>>>>>>>>>>>>>> now showing the pid as 4560. on the contrary, it should show 2076
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> -Arpit
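Arpit's stale-pid cleanup can be sketched as below. The pid directory and file name are assumptions (Hadoop 1.x writes pid files under HADOOP_PID_DIR, typically /tmp, with names like hadoop-<user>-datanode.pid); this demo uses a scratch directory instead of touching real daemons.

```shell
# Demo of clearing a stale datanode pid file. PID_DIR stands in for
# HADOOP_PID_DIR (default /tmp); the file name follows the usual
# hadoop-<user>-<daemon>.pid convention.
PID_DIR=/tmp/pid_demo
mkdir -p "$PID_DIR"
echo 4076 > "$PID_DIR/hadoop-Administrator-datanode.pid"   # fake stale entry

# Remove any leftover datanode pid files, then restart the daemon
# (e.g. ./start-dfs.sh) so a fresh pid file gets written.
rm -f "$PID_DIR"/hadoop-*-datanode.pid

ls -A "$PID_DIR"
```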
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that. i
>>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>>> the property *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well;
>>>>>>>>>>>>>>>>>>>>> it is required in *mapred-site.xml*.
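The placement rules above amount to something like the following sketch; the host and port values are placeholders for illustration, not taken from the thread.

```xml
<!-- core-site.xml: the filesystem URI lives here, not in hdfs-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml: the job tracker address lives here -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```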
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>>> for some time because of Ramzan and Eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
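The directory preparation described above can be sketched as follows. The paths are illustrative stand-ins for the values of dfs.name.dir and dfs.data.dir in hdfs-site.xml, not paths from the thread:

```shell
# Illustrative local paths standing in for the values of
# dfs.name.dir and dfs.data.dir (substitute your own).
NAME_DIR=/tmp/hadoop_demo/wksp_name
DATA_DIR=/tmp/hadoop_demo/wksp_data

# Create both directories and give them 755 permissions,
# as suggested above.
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
ls -ld "$NAME_DIR" "$DATA_DIR"
```

After this, formatting the namenode and starting DFS will populate these directories with metadata and block data respectively.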
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create files or directories. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operations like create, move, copy etc. are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
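Note that the entries above point dfs.data.dir and dfs.name.dir at the same directory. Elsewhere in the thread separate directories ("wksp_name" and "wksp_data") are created; pointed at explicitly, the entries would look roughly like this (the drive paths are illustrative):

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```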
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you wrote both paths as
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local paths. And you do not need to copy Hadoop into HDFS; Hadoop is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> already working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check in the browser after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting your single-node cluster:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then follow the "Browse the filesystem" link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory, then make a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there; that is your HDFS directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to copy Hadoop there), because you are going to do processing on the data in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> that text file. That is what Hadoop is used for; first make that clear in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your mind, and then you will be able to do it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
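The failures above came from a non-existent local path, so a quick sanity check before copying helps. A sketch, with a made-up local file name and the /wksp HDFS path from the thread:

```shell
# Create a throwaway local file and confirm it exists before
# attempting copyFromLocal (the earlier errors meant the *local*
# path was wrong, not the HDFS one).
LOCAL_FILE=/tmp/sample_input.txt
echo "hello hadoop" > "$LOCAL_FILE"
test -f "$LOCAL_FILE" && echo "local file exists"

# With a running cluster you would then copy it into HDFS:
#   ./bin/hadoop dfs -copyFromLocal "$LOCAL_FILE" /wksp
```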
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
The command to install it is msiexec /i msifile /...

You will find the correct syntax as part of doc.

Happy reading
Olivier
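Combining the hint above with the HDP_LAYOUT property that comes up later in the thread, the invocation looks roughly like this. The msi name, log name, and layout path are placeholders; the HDP documentation has the authoritative syntax:

```
msiexec /i "hdp-1.3.0.winpkg.msi" /lv "hdp-install.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```

Here /i names the package to install and /lv writes a verbose install log, which is useful when the installer fails.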
On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:

> thanks.
> i referred the logs and manuals. i modified the clusterproperties file and
> then double click on the msi file
> however, it still failed.
> further i started the installation on command line by giving
> HDP_LAYOUT=clusterproperties file path,
> installation went ahead and it failed for .NET framework 4.0 and VC++
> redistributable package dependency
>
> i installed both and started again the installation.
> failed again with following error
> [image: Inline image 1]
>
> when i searched for the logs mentioned in the error, i could not find them
> please suggest
>
> regards
> irfan
>
>
>
> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Correct, you need to define the cluster configuration as part of a file.
>> You will find some information on the configuration file as part of the
>> documentation.
>>
>>
>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>
>> You should make sure to have also installed the pre requisite.
>>
>> Thanks
>> Olivier
>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks. sorry for the long break. actually got involved in some other
>>> priorities
>>> i downloaded the installer and while installing i got following error
>>>
>>> [image: Inline image 1]
>>>
>>> do i need to make any configuration prior to installation ??
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Here is the link
>>>>
>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>
>>>> Olivier
>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>> thanks.
>>>>> i just followed the instructions to setup the pseudo distributed setup
>>>>> first using the url :
>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>
>>>>> i don't think so i am running DN on both machine
>>>>> please find the attached log
>>>>>
>>>>> hi olivier
>>>>>
>>>>> can you please give me download link ?
>>>>> let me try please
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>
>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>> your DN logs?
>>>>>>
>>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>>> Irfu,
>>>>>>>
>>>>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>>>>> may want to try our distribution for Windows. You will be able to find
>>>>>>> the msi on our website.
>>>>>>>
>>>>>>> Regards
>>>>>>> Olivier
>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>
>>>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>
>>>>>>>> so, on windows , here is the setup:
>>>>>>>>
>>>>>>>> namenode : windows 2012 R2
>>>>>>>> datanode : windows 2012 R2
>>>>>>>>
>>>>>>>> now, the exact problem is :
>>>>>>>> 1: datanode is not getting started
>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>> should get replicated to all another available datanodes
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>> Windows. Not a good idea. Anyway, could you please show me your log
>>>>>>>>> files? I also need some additional info:
>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>>> -Your latest configuration files
>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <irfu.sayed@gmail.com
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> ok. thanks
>>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>>> will be based on windows
>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>
>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>
>>>>>>>>>> regards,
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>>>
>>>>>>>>>>> Warm Regards,
>>>>>>>>>>> Tariq
>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> please suggest
>>>>>>>>>>>>
>>>>>>>>>>>> regards
>>>>>>>>>>>> irfan
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>>>> etc)
>>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>>>> unix etc )
>>>>>>>>>>>>>
>>>>>>>>>>>>> however, my doubt is,  as the file systems of  both the
>>>>>>>>>>>>> systems (win and linux ) are different ,  datanodes of these systems can
>>>>>>>>>>>>> not be part of single cluster . i have to make windows cluster separate and
>>>>>>>>>>>>> UNIX cluster separate ?
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>> progress for native Windows support; however, there are no official
>>>>>>>>>>>>>> releases with Windows support yet. It may be easier to get familiar with
>>>>>>>>>>>>>> a release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on
>>>>>>>>>>>>>> Linux if you are new to it.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> however, when i open the pid file for the namenode, it is
>>>>>>>>>>>>>>> showing the pid as 4560; on the contrary, it should show 2076
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> -Arpit
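The stale-pid cleanup described above can be sketched like this. The pid file path is an example; by default Hadoop 1.x writes pid files under the directory set by HADOOP_PID_DIR (/tmp if unset):

```shell
# Simulate a stale datanode pid file left behind by an unclean stop,
# then remove it so start-dfs.sh no longer refuses to start the daemon.
PID_FILE=/tmp/hadoop-demo-datanode.pid
echo 99999 > "$PID_FILE"      # stale pid from a dead process
rm -f "$PID_FILE"
test ! -e "$PID_FILE" && echo "stale pid file removed"
```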
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mentioned in it. i
>>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Now, i am able to browse the url http://localhost:50070 (namenode);
>>>>>>>>>>>>>>>>>>> however, i am not able to browse http://localhost:50030
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>>> the property *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well;
>>>>>>>>>>>>>>>>>>>>> it is required in *mapred-site.xml*.
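The property placement described above, sketched for a pseudo-distributed Hadoop 1.x setup. The localhost host/port values are the conventional defaults, not taken from the thread:

```xml
<!-- core-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```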
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> I would suggest doing a pseudo-distributed setup
>>>>>>>>>>>>>>>>>>>>> first, in order to get yourself familiar with the process, and then
>>>>>>>>>>>>>>>>>>>>> proceed to the distributed mode. You can visit this link
>>>>>>>>>>>>>>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>>>>>>>>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use JPS to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
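Tariq's two steps can be sketched like this; the paths are placeholders (a temp directory stands in for the real location, e.g. /hadoop or c:\wksp, so the sketch is safe to run anywhere):

```shell
# Create the local directories that will back dfs.name.dir and dfs.data.dir,
# then restrict them to 755 as suggested.
WKSP=$(mktemp -d)
mkdir -p "$WKSP/wksp_name" "$WKSP/wksp_data"
chmod 755 "$WKSP/wksp_name" "$WKSP/wksp_data"
stat -c '%a' "$WKSP/wksp_name"   # prints 755 on GNU coreutils
```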
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you gave local paths for both
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> arguments. Also, you do not need to copy the Hadoop tarball into HDFS;
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hadoop is already running.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to copy hadoop there), because you are going to do processing on the data in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> that text file. That is what hadoop is used for; first you need to make that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> clear in your mind, and then you can do it. Otherwise it is not possible.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
The command to install it is msiexec /i msifile /...

You will find the correct syntax as part of doc.

Happy reading
Olivier
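Spelling Olivier's command out a little; the msi and file names below are placeholders, and only HDP_LAYOUT is confirmed by this thread (/i and /lv are the standard msiexec flags for install and verbose logging; check the HDP docs for the exact properties):

```bat
msiexec /i "hdp-winpkg.msi" /lv "hdp-install.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt"
```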
On 4 Sep 2013 12:37, "Irfan Sayed" <ir...@gmail.com> wrote:

> thanks.
> i referred the logs and manuals. i modified the clusterproperties file and
> then double click on the msi file
> however, it still failed.
> further i started the installation on command line by giving
> HDP_LAYOUT=clusterproperties file path,
> installation went ahead and it failed for .NET framework 4.0 and VC++
> redistributable package dependency
>
> i installed both and started again the installation.
> failed again with following error
> [image: Inline image 1]
>
> when i search for the logs mentioned in the error , i never found that
> please suggest
>
> regards
> irfan
>
>
>
> On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Correct, you need to define the cluster configuration as part of a file.
>> You will find some information on the configuration file as part of the
>> documentation.
>>
>>
>> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>>
>> You should also make sure you have installed the prerequisites.
>>
>> Thanks
>> Olivier
>> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks. sorry for the long break. actually got involved in some other
>>> priorities
>>> i downloaded the installer and while installing i got following error
>>>
>>> [image: Inline image 1]
>>>
>>> do i need to make any configuration prior to installation ??
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>>> orenault@hortonworks.com> wrote:
>>>
>>>> Here is the link
>>>>
>>>> http://download.hortonworks.com/products/hdp-windows/
>>>>
>>>> Olivier
>>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>
>>>>> thanks.
>>>>> i just followed the instructions to setup the pseudo distributed setup
>>>>> first using the url :
>>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>>
>>>>> i don't think so i am running DN on both machine
>>>>> please find the attached log
>>>>>
>>>>> hi olivier
>>>>>
>>>>> can you please give me download link ?
>>>>> let me try please
>>>>>
>>>>> regards
>>>>> irfan
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>
>>>>>> Are you running DN on both the machines? Could you please show me
>>>>>> your DN logs?
>>>>>>
>>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>>
>>>>>>
>>>>>>
>>>>>> Warm Regards,
>>>>>> Tariq
>>>>>> cloudfront.blogspot.com
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>>> orenault@hortonworks.com> wrote:
>>>>>>
>>>>>>> Irfu,
>>>>>>>
>>>>>>> If you want to quickly get Hadoop running on windows platform. You
>>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>>> msi on our website.
>>>>>>>
>>>>>>> Regards
>>>>>>> Olivier
>>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>>
>>>>>>>> thanks.
>>>>>>>> ok. i think i need to change the plan over here
>>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>>
>>>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>>
>>>>>>>> so, on windows , here is the setup:
>>>>>>>>
>>>>>>>> namenode : windows 2012 R2
>>>>>>>> datanode : windows 2012 R2
>>>>>>>>
>>>>>>>> now, the exact problem is :
>>>>>>>> 1: datanode is not getting started
>>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>>> should get replicated to all another available datanodes
>>>>>>>>
>>>>>>>> regards
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>>> also need some additional info :
>>>>>>>>> -The exact problem which you are facing right now
>>>>>>>>> -Your cluster summary (no. of nodes etc)
>>>>>>>>> -Your latest configuration files
>>>>>>>>> -Your /etc/hosts file
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <irfu.sayed@gmail.com
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> ok. thanks
>>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>>> will be based on windows
>>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>>
>>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>>
>>>>>>>>>> regards,
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>>>
>>>>>>>>>>> Warm Regards,
>>>>>>>>>>> Tariq
>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> please suggest
>>>>>>>>>>>>
>>>>>>>>>>>> regards
>>>>>>>>>>>> irfan
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>>>> etc)
>>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>>>> unix etc )
>>>>>>>>>>>>>
>>>>>>>>>>>>> however, my doubt is,  as the file systems of  both the
>>>>>>>>>>>>> systems (win and linux ) are different ,  datanodes of these systems can
>>>>>>>>>>>>> not be part of single cluster . i have to make windows cluster separate and
>>>>>>>>>>>>> UNIX cluster separate ?
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a release
>>>>>>>>>>>>>> <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using
>>>>>>>>>>>>>>> ./stop-dfs.sh command
>>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> however, when i open the pid file for namenode, it is showing
>>>>>>>>>>>>>>> pid as : 4560. on the contrary, it should show : 2076
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> -Arpit
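The check behind those "running as process N. Stop it first." messages can be sketched as follows: the start scripts only consult the pid file, so a leftover file from a dead daemon blocks a restart. The pid value below is just an illustration:

```shell
# Simulate a stale pid file: the recorded pid no longer belongs to a live
# process, so it is safe to delete the file and restart the daemon.
PIDFILE=$(mktemp)
echo 99999999 > "$PIDFILE"          # a pid that is almost certainly not alive
if kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
  echo "daemon running; stop it first"
else
  echo "stale pid file; deleting"
  rm -f "$PIDFILE"
fi
```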
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would
>>>>>>>>>>>>>>>>>> be helpful.
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that. i
>>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>>> the property *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well.
>>>>>>>>>>>>>>>>>>>>> It is required in *mapred-site.xml*.
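For reference, the split described above looks roughly like this. The host and port values are placeholders for illustration only; substitute whatever matches your own setup:

```xml
<!-- core-site.xml: the default filesystem URI belongs here.
     hdfs://localhost:9000 is a placeholder value. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: the JobTracker address belongs here. -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```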
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the
>>>>>>>>>>>>>>>>>>>>>> config files in my setup .
>>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> system through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> corresponding to the values of dfs.name.dir and dfs.data.dir properties and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> change the permissions of these directories to 755. When you start pushing
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data into your HDFS, data will start going inside the directory specified
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> by dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
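A minimal sketch of the two steps suggested above, with placeholder paths (substitute the values your own hdfs-site.xml points at):

```shell
# Create the directories backing dfs.name.dir and dfs.data.dir up
# front, then open them to 755 as suggested. The /tmp/hadoop paths
# below are placeholders for illustration.
NAME_DIR=/tmp/hadoop/wksp_name
DATA_DIR=/tmp/hadoop/wksp_data

mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"

# Confirm the permissions before formatting the namenode.
ls -ld "$NAME_DIR" "$DATA_DIR"
```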
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ability to create file or directory. You can browse HDFS, view files,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> download files etc. But operation like create, move, copy etc are not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> option there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
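One likely issue with the snippet above: dfs.name.dir and dfs.data.dir both point at c:\\wksp. The namenode metadata directory and the datanode block directory should normally be kept separate. A hedged sketch, with placeholder paths:

```xml
<!-- Separate directories for namenode metadata and datanode blocks.
     The c:\\wksp_name and c:\\wksp_data paths are placeholders. -->
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```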
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because,You had wrote both the paths
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local and You need not to copy hadoop into hdfs...Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there(no need
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to copy hadoop there).beacause u are going to do processing on that data in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> text file.That's why hadoop is used for ,first u need to make it clear in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ur mind.Then and then u will do it...otherwise not possible..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>
>
>


Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
i referred the logs and manuals. i modified the clusterproperties file and
then double click on the msi file
however, it still failed.
further i started the installation on command line by giving
HDP_LAYOUT=clusterproperties file path,
installation went ahead and it failed for .NET framework 4.0 and VC++
redistributable package dependency

i installed both and started again the installation.
failed again with following error
[image: Inline image 1]

when i search for the logs mentioned in the error , i never found that
please suggest

regards
irfan
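For the command-line attempt described above, a typical unattended HDP-for-Windows install passes the cluster properties file via HDP_LAYOUT on the msiexec command line. The sketch below is from memory of the HDP install guide; the MSI filename, log path, and directory paths are placeholders, so check them against the documentation before running:

```shell
msiexec /i "hdp-1.3.0.winpkg.msi" /lv "hdp.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt" HDP_DIR="C:\hdp\hadoop" DESTROY_DATA="no"
```

The /lv log file is usually the first place to look when the installer fails, since the dialog-box error often points at a log that was never written.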



On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault
<or...@hortonworks.com>wrote:

> Correct, you need to define the cluster configuration as part of a file.
> You will find some information on the configuration file as part of the
> documentation.
>
>
> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>
> You should also make sure to have installed the prerequisites.
>
> Thanks
> Olivier
> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> thanks. sorry for the long break. actually got involved in some other
>> priorities
>> i downloaded the installer and while installing i got following error
>>
>> [image: Inline image 1]
>>
>> do i need to make any configuration prior to installation ??
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Here is the link
>>>
>>> http://download.hortonworks.com/products/hdp-windows/
>>>
>>> Olivier
>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks.
>>>> i just followed the instructions to setup the pseudo distributed setup
>>>> first using the url :
>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>
>>>> i don't think so i am running DN on both machine
>>>> please find the attached log
>>>>
>>>> hi olivier
>>>>
>>>> can you please give me download link ?
>>>> let me try please
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>
>>>>> Are you running DN on both the machines? Could you please show me your
>>>>> DN logs?
>>>>>
>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>
>>>>>
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>>> Irfu,
>>>>>>
>>>>>> If you want to quickly get Hadoop running on windows platform. You
>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>> msi on our website.
>>>>>>
>>>>>> Regards
>>>>>> Olivier
>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> ok. i think i need to change the plan over here
>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>
>>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>
>>>>>>> so, on windows , here is the setup:
>>>>>>>
>>>>>>> namenode : windows 2012 R2
>>>>>>> datanode : windows 2012 R2
>>>>>>>
>>>>>>> now, the exact problem is :
>>>>>>> 1: datanode is not getting started
>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>> should get replicated to all another available datanodes
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>
>>>>>>>> Seriously??You are planning to develop something using Hadoop on
>>>>>>>> windows. Not a good idea. Anyways, cold you plz show me your log files?I
>>>>>>>> also need some additional info :
>>>>>>>> -The exact problem which you are facing right now
>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>> -Your latest configuration files
>>>>>>>> -Your /etc.hosts file
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>
>>>>>>>>> ok. thanks
>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>> will be based on windows
>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>
>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> please suggest
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>> irfan
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> thanks.
>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>>> etc)
>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>>> unix etc )
>>>>>>>>>>>>
>>>>>>>>>>>> however, my doubt is,  as the file systems of  both the systems
>>>>>>>>>>>> (win and linux ) are different ,  datanodes of these systems can not be
>>>>>>>>>>>> part of single cluster . i have to make windows cluster separate and UNIX
>>>>>>>>>>>> cluster separate ?
>>>>>>>>>>>>
>>>>>>>>>>>> regards
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>>>>>>>>> command
>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> however, when i open the pid file for namenode then it is not
>>>>>>>>>>>>>> showing pid as : 4560. on the contrary, it shud show : 2076
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>>>>>>>>>> helpful.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that. i
>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (namenode)
>>>>>>>>>>>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>> the property *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well.
>>>>>>>>>>>>>>>>>>>> It belongs in *mapred-site.xml*.
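The split described above looks roughly like this; the host/port values are placeholders, not taken from the thread:

```xml
<!-- core-site.xml -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- mapred-site.xml -->
<property>
  <name>mapred.job.tracker</name>
  <value>localhost:9001</value>
</property>
```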
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the config
>>>>>>>>>>>>>>>>>>>>> files in my setup .
>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>> <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use JPS to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file system
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually corresponding
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to the values of dfs.name.dir and dfs.data.dir properties and change the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> permissions of these directories to 755. When you start pushing data into
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your HDFS, data will start going inside the directory specified by
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
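The two directories and their permissions can be set up with a short sketch; both paths are placeholders, so substitute the values from your own hdfs-site.xml:

```shell
#!/bin/sh
# Create the directories backing dfs.name.dir and dfs.data.dir and give them
# 755 permissions, as described above. Both paths are placeholders.
NAME_DIR="${NAME_DIR:-/tmp/hadoop-demo/dfs/name}"
DATA_DIR="${DATA_DIR:-/tmp/hadoop-demo/dfs/data}"
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"
ls -ld "$NAME_DIR" "$DATA_DIR"
```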
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the ability
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to create files or directories. You can browse HDFS, view files, download
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> files etc. But operations like create, move, copy etc. are not supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory option
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you wrote both the paths as
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local paths. Also, you do not need to copy hadoop into hdfs... Hadoop is
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> already working..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to copy hadoop there), because you are going to do processing on the data in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> that text file. That is what hadoop is used for; first you need to make that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> clear in your mind, and then you will get it working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
i referred to the logs and manuals, modified the clusterproperties file, and
then double-clicked the msi file.
however, it still failed.
i then started the installation from the command line, passing
HDP_LAYOUT=<clusterproperties file path>.
the installation went further, but failed on the .NET framework 4.0 and VC++
redistributable package dependencies.

i installed both and started the installation again.
it failed again with the following error
[image: Inline image 1]

when i searched for the logs mentioned in the error, i could not find them
please suggest

regards
irfan
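For what it's worth, the installer can also be driven with an explicit layout file and a verbose log. The sketch below only assembles and prints the command line; the msi file name and paths in it are placeholders to adapt, and the /lv flag writes an install log you can inspect when the installer's own error points at logs you cannot find:

```shell
#!/bin/sh
# Assemble the msiexec command line for the HDP MSI (run it from a Windows
# command prompt). The msi name and the clusterproperties path are placeholders.
MSI='hdp-1.3.0.winpkg.msi'
LAYOUT='C:\hdp\clusterproperties.txt'
CMD="msiexec /i $MSI /lv hdp.winpkg.install.log HDP_LAYOUT=$LAYOUT DESTROY_DATA=no"
echo "$CMD"
```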



On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault
<or...@hortonworks.com>wrote:

> Correct, you need to define the cluster configuration as part of a file.
> You will find some information on the configuration file as part of the
> documentation.
>
>
> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>
> You should make sure to have also installed the pre requisite.
>
> Thanks
> Olivier
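As a rough illustration of such a configuration file (the field names vary between HDP versions and the hostnames below are placeholders, so check the linked documentation for the authoritative list):

```
#Log and data directories
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data

#Hosts (single-node example: every role on one machine)
NAMENODE_HOST=WIN-HOST1
SECONDARY_NAMENODE_HOST=WIN-HOST1
JOBTRACKER_HOST=WIN-HOST1
SLAVE_HOSTS=WIN-HOST1
```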
> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> thanks. sorry for the long break. actually got involved in some other
>> priorities
>> i downloaded the installer and while installing i got following error
>>
>> [image: Inline image 1]
>>
>> do i need to make any configuration prior to installation ??
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Here is the link
>>>
>>> http://download.hortonworks.com/products/hdp-windows/
>>>
>>> Olivier
>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks.
>>>> i just followed the instructions to setup the pseudo distributed setup
>>>> first using the url :
>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>
>>>> i don't think i am running the DN on both machines
>>>> please find the attached log
>>>>
>>>> hi olivier
>>>>
>>>> can you please give me download link ?
>>>> let me try please
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>
>>>>> Are you running DN on both the machines? Could you please show me your
>>>>> DN logs?
>>>>>
>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>
>>>>>
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>>> Irfu,
>>>>>>
>>>>>> If you want to quickly get Hadoop running on the windows platform, you
>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>> msi on our website.
>>>>>>
>>>>>> Regards
>>>>>> Olivier
>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> ok. i think i need to change the plan over here
>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>
>>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>
>>>>>>> so, on windows , here is the setup:
>>>>>>>
>>>>>>> namenode : windows 2012 R2
>>>>>>> datanode : windows 2012 R2
>>>>>>>
>>>>>>> now, the exact problem is :
>>>>>>> 1: datanode is not getting started
>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>> should get replicated to all another available datanodes
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>
>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>> also need some additional info :
>>>>>>>> -The exact problem which you are facing right now
>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>> -Your latest configuration files
>>>>>>>> -Your /etc/hosts file
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>
>>>>>>>>> ok. thanks
>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>> will be based on windows
>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>
>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> please suggest
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>> irfan
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> thanks.
>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>>> etc)
>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>>> unix etc )
>>>>>>>>>>>>
>>>>>>>>>>>> however, my doubt is: as the file systems of the two systems
>>>>>>>>>>>> (win and linux) are different, can datanodes on these systems not be
>>>>>>>>>>>> part of a single cluster? do i have to make the windows cluster and the
>>>>>>>>>>>> UNIX cluster separate?
>>>>>>>>>>>>
>>>>>>>>>>>> regards
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>>>>>>>>> command
>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> however, when i open the pid file for namenode then it is not
>>>>>>>>>>>>>> showing pid as : 4560. on the contrary, it shud show : 2076
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>>>>>>>>>> helpful.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that. i
>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Now, i am able to browse the url http://localhost:50070 (namenode);
>>>>>>>>>>>>>>>>>> however, i am not able to browse http://localhost:50030 (jobtracker)
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>> configuration?Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>> the property *fs.default.name *from *hdfs-site.xml*and add it to
>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well.
>>>>>>>>>>>>>>>>>>>> It is required in *mapred-site.xml*.
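As a sketch, the split Tariq describes looks like this for Hadoop 1.x; the host, port, and directory values are placeholder assumptions:

```shell
# Put each property in its proper file: fs.default.name in core-site.xml,
# dfs.* properties in hdfs-site.xml, and mapred.job.tracker in
# mapred-site.xml. Values below are placeholders for a pseudo-distributed setup.
CONF=/tmp/hadoop-conf-demo
mkdir -p "$CONF"

cat > "$CONF/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

cat > "$CONF/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF

cat > "$CONF/mapred-site.xml" <<'EOF'
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF

grep -l 'fs.default.name' "$CONF"/*.xml   # should list only core-site.xml
```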
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link<http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the config
>>>>>>>>>>>>>>>>>>>>> files in my setup .
>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>> <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file system
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually corresponding
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to the values of dfs.name.dir and dfs.data.dir properties and change the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> permissions of these directories to 755. When you start pushing data into
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your HDFS, data will start going inside the directory specified by
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
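A minimal sketch of those two steps on a Unix-style path (the directory names are illustrative, not the thread's actual c:\\wksp values):

```shell
# Create separate directories for dfs.name.dir (metadata) and dfs.data.dir
# (block storage), then set their permissions to 755 as suggested above.
NAME_DIR=/tmp/hdfs-demo/wksp_name
DATA_DIR=/tmp/hdfs-demo/wksp_data
mkdir -p "$NAME_DIR" "$DATA_DIR"
chmod 755 "$NAME_DIR" "$DATA_DIR"

# Verify: both directories exist with mode 755.
stat -c '%a %n' "$NAME_DIR" "$DATA_DIR"
```

The same two paths would then go into hdfs-site.xml as the values of dfs.name.dir and dfs.data.dir before formatting the namenode.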
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the ability
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to create file or directory. You can browse HDFS, view files, download
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> files etc. But operation like create, move, copy etc are not supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory option
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you wrote both paths as local ones; and you need not
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copy hadoop into hdfs... hadoop is already working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to copy hadoop there), because you are going to do the processing on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> the data in that text file. That is what hadoop is used for; you first
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> need to make that clear in your mind, and then you will be able to do it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
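Since both failures above report that the *local* source file does not exist, a quick existence check before the copy narrows the problem down. In this sketch the hadoop invocation is only echoed, and the file and HDFS target are made-up examples:

```shell
# The copyFromLocal errors in this thread come from the local path, so
# verify the local source exists before invoking hadoop at all.
SRC=/tmp/demo-input.txt
DEST=/wksp                        # assumed HDFS target directory
echo "hello hdfs" > "$SRC"        # stand-in for a real input file

if [ -f "$SRC" ]; then
    # On a live cluster this would be: ./bin/hadoop dfs -copyFromLocal "$SRC" "$DEST"
    echo "would run: hadoop dfs -copyFromLocal $SRC $DEST"
else
    echo "copyFromLocal would fail: $SRC does not exist" >&2
fi
```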
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
i referred to the logs and manuals, modified the clusterproperties file, and
then double-clicked the msi file; however, it still failed.
i then started the installation from the command line, passing
HDP_LAYOUT=<clusterproperties file path>; the installation went further but
failed on the .NET Framework 4.0 and VC++ redistributable package
dependencies.

i installed both and started the installation again.
it failed again with the following error:
[image: Inline image 1]

when i searched for the logs mentioned in the error, i could not find them.
please suggest

regards
irfan
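One way to sanity-check the clusterproperties file before re-running the MSI with HDP_LAYOUT is to confirm the expected keys are present. The key names below are assumptions drawn from the HDP-for-Windows documentation, so adjust them to the actual file:

```shell
# Check that a clusterproperties file defines the keys the HDP installer
# expects before launching the MSI with HDP_LAYOUT=<path to this file>.
# The file contents and key names here are illustrative assumptions.
CP=/tmp/clusterproperties.txt
printf 'HDP_LOG_DIR=c:\\hadoop\\logs\nNAMENODE_HOST=DFS-DC\nSLAVE_HOSTS=DFS-1\n' > "$CP"

missing=0
for key in HDP_LOG_DIR NAMENODE_HOST SLAVE_HOSTS JOBTRACKER_HOST; do
    if grep -q "^$key=" "$CP"; then
        echo "$key present"
    else
        echo "$key MISSING"
        missing=1
    fi
done
echo "missing=$missing"   # any MISSING key is a likely install failure
```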



On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault
<or...@hortonworks.com>wrote:

> Correct, you need to define the cluster configuration as part of a file.
> You will find some information on the configuration file as part of the
> documentation.
>
>
> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>
> You should also make sure you have installed the prerequisites.
>
> Thanks
> Olivier
> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> thanks. sorry for the long break. actually got involved in some other
>> priorities
>> i downloaded the installer and while installing i got following error
>>
>> [image: Inline image 1]
>>
>> do i need to make any configuration prior to installation ??
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Here is the link
>>>
>>> http://download.hortonworks.com/products/hdp-windows/
>>>
>>> Olivier
>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks.
>>>> i just followed the instructions to setup the pseudo distributed setup
>>>> first using the url :
>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>
>>>> i don't think i am running a DN on both machines
>>>> please find the attached log
>>>>
>>>> hi olivier
>>>>
>>>> can you please give me download link ?
>>>> let me try please
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>
>>>>> Are you running DN on both the machines? Could you please show me your
>>>>> DN logs?
>>>>>
>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>
>>>>>
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>>> Irfu,
>>>>>>
>>>>>> If you want to quickly get Hadoop running on the Windows platform, you
>>>>>> may want to try our distribution for Windows. You will be able to find
>>>>>> the msi on our website.
>>>>>>
>>>>>> Regards
>>>>>> Olivier
>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> ok. i think i need to change the plan over here
>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>
>>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>
>>>>>>> so, on windows , here is the setup:
>>>>>>>
>>>>>>> namenode : windows 2012 R2
>>>>>>> datanode : windows 2012 R2
>>>>>>>
>>>>>>> now, the exact problem is :
>>>>>>> 1: datanode is not getting started
>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>> should get replicated to all another available datanodes
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>
>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>> windows? Not a good idea. Anyway, could you please show me your log
>>>>>>>> files? I also need some additional info :
>>>>>>>> -The exact problem which you are facing right now
>>>>>>>> -Your cluster summary (no. of nodes etc.)
>>>>>>>> -Your latest configuration files
>>>>>>>> -Your /etc/hosts file
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>
>>>>>>>>> ok. thanks
>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>> will be based on windows
>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>
>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> please suggest
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>> irfan
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> thanks.
>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>>> etc)
>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>>> unix etc )
>>>>>>>>>>>>
>>>>>>>>>>>> however, my doubt is,  as the file systems of  both the systems
>>>>>>>>>>>> (win and linux ) are different ,  datanodes of these systems can not be
>>>>>>>>>>>> part of single cluster . i have to make windows cluster separate and UNIX
>>>>>>>>>>>> cluster separate ?
>>>>>>>>>>>>
>>>>>>>>>>>> regards
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/>on Linux if you are new to it.
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>>>>>>>>> command
>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> however, when i open the pid file for namenode then it is not
>>>>>>>>>>>>>> showing pid as : 4560. on the contrary, it shud show : 2076
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>>>>>>>>>> helpful.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> i followed the url and did the steps mentioned in it. i
>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>> the property *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well.
>>>>>>>>>>>>>>>>>>>> It is required in *mapred-site.xml*.
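As a hedged sketch, the split described above usually ends up looking like the following in a Hadoop 1.x pseudo-distributed setup (the localhost:9000 and localhost:9001 values are the conventional defaults, not values taken from this thread):

```xml
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```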
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the config
>>>>>>>>>>>>>>>>>>>>> files in my setup .
>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>> <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use JPS to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file system
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually corresponding
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to the values of dfs.name.dir and dfs.data.dir properties and change the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> permissions of these directories to 755. When you start pushing data into
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your HDFS, data will start going inside the directory specified by
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
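A sketch of that preparation on a Unix-style host. The wksp_name / wksp_data names and the $HOME/hadoop base below are hypothetical; substitute whatever paths you actually put in dfs.name.dir and dfs.data.dir.

```shell
# Create the local directories backing dfs.name.dir and dfs.data.dir,
# then set the 755 permissions the daemons expect.
BASE="${BASE:-$HOME/hadoop}"
mkdir -p "$BASE/wksp_name" "$BASE/wksp_data"
chmod 755 "$BASE/wksp_name" "$BASE/wksp_data"
ls -ld "$BASE/wksp_name" "$BASE/wksp_data"   # verify owner and mode
```

Format the namenode only after dfs.name.dir exists and is writable; the format step writes the initial fsimage there.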
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the ability
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to create file or directory. You can browse HDFS, view files, download
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> files etc. But operation like create, move, copy etc are not supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory option
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
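One thing worth flagging in the snippet above: dfs.name.dir and dfs.data.dir both point at c:\\wksp. They should be two distinct directories, since one holds namenode metadata and the other datanode blocks. A sketch with separate (hypothetical) paths:

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```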
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you wrote both paths as
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local paths, and you need not copy hadoop into hdfs... Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to copy hadoop there), because you are going to do processing on the data
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in that text file. that's what hadoop is used for. first you need to make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> it clear in your mind; then you will be able to do it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>>
>

Re: about replication

Posted by Irfan Sayed <ir...@gmail.com>.
thanks.
i referred the logs and manuals. i modified the clusterproperties file and
then double-clicked the msi file;
however, it still failed.
i then started the installation from the command line, passing
HDP_LAYOUT=<clusterproperties file path>;
installation went further but failed on the .NET Framework 4.0 and VC++
redistributable package dependencies
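For reference, the command-line form being described is typically along the lines of msiexec /i "hdp-1.3.0.winpkg.msi" /lv "hdp_install.log" HDP_LAYOUT="C:\hdp\clusterproperties.txt" (the file names here are placeholders). A minimal clusterproperties sketch follows; the property names follow the HDP-for-Windows documentation of that release, and the host names and paths are assumptions to adapt, not values from this thread:

```ini
#Log and data directories
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data

#Hosts (single-node sketch)
NAMENODE_HOST=DFS-DC
SECONDARY_NAMENODE_HOST=DFS-DC
JOBTRACKER_HOST=DFS-DC
SLAVE_HOSTS=DFS-1
```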

i installed both and started the installation again.
it failed again with the following error:
[image: Inline image 1]

when i searched for the logs mentioned in the error, i could not find them.
please suggest

regards
irfan



On Tue, Sep 3, 2013 at 12:58 PM, Olivier Renault
<or...@hortonworks.com>wrote:

> Correct, you need to define the cluster configuration as part of a file.
> You will find some information on the configuration file as part of the
> documentation.
>
>
> http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html
>
> You should make sure to have also installed the pre requisite.
>
> Thanks
> Olivier
> On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:
>
>> thanks. sorry for the long break. actually got involved in some other
>> priorities
>> i downloaded the installer and while installing i got following error
>>
>> [image: Inline image 1]
>>
>> do i need to make any configuration prior to installation ??
>>
>> regards
>> irfan
>>
>>
>>
>> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <
>> orenault@hortonworks.com> wrote:
>>
>>> Here is the link
>>>
>>> http://download.hortonworks.com/products/hdp-windows/
>>>
>>> Olivier
>>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>
>>>> thanks.
>>>> i just followed the instructions to setup the pseudo distributed setup
>>>> first using the url :
>>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>>
>>>> i don't think i am running the DN on both machines
>>>> please find the attached log
>>>>
>>>> hi olivier
>>>>
>>>> can you please give me download link ?
>>>> let me try please
>>>>
>>>> regards
>>>> irfan
>>>>
>>>>
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>
>>>>> Are you running DN on both the machines? Could you please show me your
>>>>> DN logs?
>>>>>
>>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>>
>>>>>
>>>>>
>>>>> Warm Regards,
>>>>> Tariq
>>>>> cloudfront.blogspot.com
>>>>>
>>>>>
>>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>>> orenault@hortonworks.com> wrote:
>>>>>
>>>>>> Irfu,
>>>>>>
>>>>>> If you want to quickly get Hadoop running on the windows platform, you
>>>>>> may want to try our distribution for Windows. You will be able to find the
>>>>>> msi on our website.
>>>>>>
>>>>>> Regards
>>>>>> Olivier
>>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>>
>>>>>>> thanks.
>>>>>>> ok. i think i need to change the plan over here
>>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>>
>>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>>
>>>>>>> so, on windows , here is the setup:
>>>>>>>
>>>>>>> namenode : windows 2012 R2
>>>>>>> datanode : windows 2012 R2
>>>>>>>
>>>>>>> now, the exact problem is :
>>>>>>> 1: datanode is not getting started
>>>>>>> 2: replication : if i put any file/folder on any datanode , it
>>>>>>> should get replicated to all another available datanodes
>>>>>>>
>>>>>>> regards
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>>
>>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>>> also need some additional info :
>>>>>>>> -The exact problem which you are facing right now
>>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>>> -Your latest configuration files
>>>>>>>> -Your /etc/hosts file
>>>>>>>>
>>>>>>>> Warm Regards,
>>>>>>>> Tariq
>>>>>>>> cloudfront.blogspot.com
>>>>>>>>
>>>>>>>>
>>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>>
>>>>>>>>> ok. thanks
>>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>>> will be based on windows
>>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>>
>>>>>>>>> datanode is not starting . please suggest
>>>>>>>>>
>>>>>>>>> regards,
>>>>>>>>> irfan
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <
>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>>
>>>>>>>>>> Warm Regards,
>>>>>>>>>> Tariq
>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> please suggest
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>> irfan
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> thanks.
>>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>>> etc)
>>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>>> unix etc )
>>>>>>>>>>>>
>>>>>>>>>>>> however, my doubt is: as the file systems of the two systems
>>>>>>>>>>>> (win and linux) are different, can datanodes of these systems not be
>>>>>>>>>>>> part of a single cluster? do i have to make a windows cluster and a UNIX
>>>>>>>>>>>> cluster separately?
>>>>>>>>>>>>
>>>>>>>>>>>> regards
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not
>>>>>>>>>>>>> the same as Cygwin PIDs so that may be causing the discrepancy. I don't
>>>>>>>>>>>>> know how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> thanks
>>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>>>>>>>>> command
>>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> however, when i open the pid file for the namenode, it is
>>>>>>>>>>>>>> showing the pid as 4560. on the contrary, it should show 2076
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
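Arpit's stale-pid cleanup above can be sketched as a small script. This assumes the Hadoop 1.x default pid location of `/tmp` and that the `bin`/`sbin` scripts are on the `PATH`; adjust both to your install (under Cygwin the pid directory may differ).

```shell
# Sketch of the stale-pid cleanup: stop the daemons, delete leftover pid
# files, restart, then confirm with jps. Paths assume HADOOP_PID_DIR=/tmp.
cat > clean-stale-pids.sh <<'EOF'
#!/bin/sh
stop-dfs.sh                        # stop NameNode/DataNode/SecondaryNameNode
rm -f /tmp/hadoop-*-datanode.pid   # remove stale pid files left behind
rm -f /tmp/hadoop-*-namenode.pid
start-dfs.sh                       # restart; fresh pid files are written
jps                                # DataNode should now appear in the list
EOF
chmod +x clean-stale-pids.sh
```

Run the script on the affected node; if `jps` still shows no DataNode, the DataNode log is the next place to look.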
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>>>>>>>>>> helpful.
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that. i
>>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove
>>>>>>>>>>>>>>>>>>>> the property *fs.default.name* from *hdfs-site.xml* and add it to
>>>>>>>>>>>>>>>>>>>> *core-site.xml*. Remove *mapred.job.tracker* as well;
>>>>>>>>>>>>>>>>>>>> it belongs in *mapred-site.xml*.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>
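The property split Tariq describes comes out to something like the following for a Hadoop 1.x pseudo-distributed setup. The `localhost` URIs and ports are illustrative defaults, not values taken from this thread:

```shell
mkdir -p conf
# core-site.xml gets the default filesystem URI (moved out of hdfs-site.xml)
cat > conf/core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF
# mapred-site.xml gets the JobTracker address (moved out of hdfs-site.xml)
cat > conf/mapred-site.xml <<'EOF'
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF
```

With this split, `hdfs-site.xml` keeps only the HDFS-specific properties such as `dfs.name.dir` and `dfs.data.dir`.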
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the config
>>>>>>>>>>>>>>>>>>>>> files in my setup .
>>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch
>>>>>>>>>>>>>>>>>>>>>> for sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will
>>>>>>>>>>>>>>>>>>>>>>> like it..
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>> <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "hdfs-site.xml" file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file system
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually corresponding
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to the values of dfs.name.dir and dfs.data.dir properties and change the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> permissions of these directories to 755. When you start pushing data into
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your HDFS, data will start going inside the directory specified by
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
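The two directories Tariq describes can be sketched like this. The `/tmp/hdfs/...` paths are illustrative (on the Windows/Cygwin setup in this thread they would live under something like `c:\wksp`); note they should be two separate directories, not one shared path:

```shell
# Create the directories backing dfs.name.dir and dfs.data.dir, mode 755
mkdir -p /tmp/hdfs/name /tmp/hdfs/data
chmod 755 /tmp/hdfs/name /tmp/hdfs/data
# After pointing hdfs-site.xml at these paths, format the NameNode once
# (needs a Hadoop install, so shown here rather than executed):
#   hadoop namenode -format
ls -ld /tmp/hdfs/name /tmp/hdfs/data
```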
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the ability
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to create file or directory. You can browse HDFS, view files, download
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> files etc. But operation like create, move, copy etc are not supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Linux machine(if possible). Or at least use a VM. I personally feel that
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> using Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory option
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because,You had wrote both the paths
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local and You need not to copy hadoop into hdfs...Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there(no need
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to copy hadoop there).beacause u are going to do processing on that data in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> text file.That's why hadoop is used for ,first u need to make it clear in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ur mind.Then and then u will do it...otherwise not possible..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ .bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
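Concretely, the corrected `copyFromLocal` usage manish describes looks like this; the `/wksp` HDFS directory and sample file name are illustrative, and the `hadoop dfs` steps need a running cluster, so they are shown as comments:

```shell
# Make a small local text file to push into HDFS
echo "hello hadoop" > /tmp/sample.txt
# Create the HDFS target directory, copy the local file in, then list it:
#   hadoop dfs -mkdir /wksp
#   hadoop dfs -copyFromLocal /tmp/sample.txt /wksp
#   hadoop dfs -ls /wksp
cat /tmp/sample.txt
```

The key point is that the source path must be a file that exists on the local filesystem and the destination must be an HDFS path, not another local path.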
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>>
>>>>
>>>
>>
>>
>

Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Correct, you need to define the cluster configuration in a file. You will
find information on the configuration file in the documentation:

http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html

You should also make sure the prerequisites are installed.

Thanks
Olivier
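As a rough sketch, the configuration file Olivier refers to is a cluster-properties file consumed by the HDP for Windows MSI. The property names and values below are assumptions for a hypothetical single-node layout; check them against the exact names in the linked documentation before installing:

```shell
# Hypothetical single-node clusterproperties.txt for the HDP Windows MSI.
# Every field here (names, paths, hostname) is illustrative, not canonical.
cat > clusterproperties.txt <<'EOF'
#Log and data directories
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data
#Single-node layout: every role on this host (hypothetical hostname)
NAMENODE_HOST=WIN-HOST1
SECONDARY_NAMENODE_HOST=WIN-HOST1
JOBTRACKER_HOST=WIN-HOST1
SLAVE_HOSTS=WIN-HOST1
EOF
```

The installer is then pointed at this file, so errors like the one in the screenshot often come down to a missing or malformed cluster-properties file or an uninstalled prerequisite.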
On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:

> thanks. sorry for the long break. actually got involved in some other
> priorities
> i downloaded the installer and while installing i got following error
>
> [image: Inline image 1]
>
> do i need to make any configuration prior to installation ??
>
> regards
> irfan
>
>
>
> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Here is the link
>>
>> http://download.hortonworks.com/products/hdp-windows/
>>
>> Olivier
>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks.
>>> i just followed the instructions to setup the pseudo distributed setup
>>> first using the url :
>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>
>>> i don't think so i am running DN on both machine
>>> please find the attached log
>>>
>>> hi olivier
>>>
>>> can you please give me download link ?
>>> let me try please
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>>
>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> Are you running DN on both the machines? Could you please show me your
>>>> DN logs?
>>>>
>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>
>>>>
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Irfu,
>>>>>
>>>>> If you want to quickly get Hadoop running on windows platform. You may
>>>>> want to try our distribution for Windows. You will be able to find the msi
>>>>> on our website.
>>>>>
>>>>> Regards
>>>>> Olivier
>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks.
>>>>>> ok. i think i need to change the plan over here
>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>
>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>
>>>>>> so, on windows , here is the setup:
>>>>>>
>>>>>> namenode : windows 2012 R2
>>>>>> datanode : windows 2012 R2
>>>>>>
>>>>>> now, the exact problem is :
>>>>>> 1: datanode is not getting started
>>>>>> 2: replication : if i put any file/folder on any datanode , it should
>>>>>> get replicated to all another available datanodes
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>
>>>>>>> Seriously??You are planning to develop something using Hadoop on
>>>>>>> windows. Not a good idea. Anyways, cold you plz show me your log files?I
>>>>>>> also need some additional info :
>>>>>>> -The exact problem which you are facing right now
>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>> -Your latest configuration files
>>>>>>> -Your /etc.hosts file
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> ok. thanks
>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>> will be based on windows
>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>
>>>>>>>> datanode is not starting . please suggest
>>>>>>>>
>>>>>>>> regards,
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <irfu.sayed@gmail.com
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> thanks.
>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>> etc)
>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>> unix etc )
>>>>>>>>>>>
>>>>>>>>>>> however, my doubt is,  as the file systems of  both the systems
>>>>>>>>>>> (win and linux ) are different ,  datanodes of these systems can not be
>>>>>>>>>>> part of single cluster . i have to make windows cluster separate and UNIX
>>>>>>>>>>> cluster separate ?
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>>>>> same as Cygwin PIDs so that may be causing the discrepancy. I don't know
>>>>>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> thanks
>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>>>>>>>> command
>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>
>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>
>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>
>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>
>>>>>>>>>>>>> however, when i open the pid file for namenode then it is not
>>>>>>>>>>>>> showing pid as : 4560. on the contrary, it shud show : 2076
>>>>>>>>>>>>>
>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>>>>>>>>> helpful.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that. i
>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (namenode)
>>>>>>>>>>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the
>>>>>>>>>>>>>>>>>>> property *fs.default.name *from *hdfs-site.xml* and add
>>>>>>>>>>>>>>>>>>> it to *core-site.xml*. Remove *mapred.job.tracker* as
>>>>>>>>>>>>>>>>>>> well. It is required in *mapred-site.xml*.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>
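Tariq's advice can be written out as configuration fragments. The host:port values below are placeholders for your namenode and jobtracker addresses, not values from this thread:

```xml
<!-- core-site.xml: fs.default.name belongs here, not in hdfs-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value> <!-- placeholder address -->
  </property>
</configuration>

<!-- mapred-site.xml: mapred.job.tracker belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value> <!-- placeholder address -->
  </property>
</configuration>
```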
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the config
>>>>>>>>>>>>>>>>>>>> files in my setup .
>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch for
>>>>>>>>>>>>>>>>>>>>> sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will like
>>>>>>>>>>>>>>>>>>>>>> it..
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use JPS to
>>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in "hdfs-site.xml"
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file system
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually corresponding
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to the values of dfs.name.dir and dfs.data.dir properties and change the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> permissions of these directories to 755. When you start pushing data into
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your HDFS, data will start going inside the directory specified by
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
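The two directories Tariq describes can be created as below. BASE is a scratch path so the sketch runs anywhere; in practice you would use the exact paths you put into dfs.name.dir and dfs.data.dir.

```shell
# Create separate name (metadata) and data (blocks) directories with 755
# permissions, as described above. BASE stands in for your real location.
BASE=$(mktemp -d)
mkdir -p "$BASE/wksp_name" "$BASE/wksp_data"
chmod 755 "$BASE/wksp_name" "$BASE/wksp_data"
ls -ld "$BASE/wksp_name" "$BASE/wksp_data"
```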
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the ability
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to create file or directory. You can browse HDFS, view files, download
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> files etc. But operation like create, move, copy etc are not supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a Linux
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> machine(if possible). Or at least use a VM. I personally feel that using
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory option
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
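A note on the hdfs-site.xml snippet above: dfs.name.dir and dfs.data.dir both point at c:\\wksp. They should be two separate directories (the wksp_name/wksp_data split mentioned elsewhere in this thread), roughly:

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value> <!-- namenode metadata -->
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value> <!-- datanode blocks -->
</property>
```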
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because,You had wrote both the paths
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local and You need not to copy hadoop into hdfs...Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there(no need to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copy hadoop there).beacause u are going to do processing on that data in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> text file.That's why hadoop is used for ,first u need to make it clear in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> ur mind.Then and then u will do it...otherwise not possible..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
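Both copyFromLocal failures above report that the *local* source file does not exist, so it helps to check the local path before calling hadoop. The path below is the one from the failing command, and the hadoop invocation itself is commented out so the sketch runs without a cluster.

```shell
# Verify the local source exists before attempting copyFromLocal.
SRC=/cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz
if [ -f "$SRC" ]; then
  MSG="source present: $SRC"
  # ./bin/hadoop dfs -copyFromLocal "$SRC" /wksp
else
  MSG="source missing locally: $SRC"
fi
echo "$MSG"
```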
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Correct, you need to define the cluster configuration in a configuration
file. You will find the details of this file in the documentation.

http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html

You should make sure to have also installed the pre requisite.

Thanks
Olivier
On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:

> thanks. sorry for the long break. actually got involved in some other
> priorities
> i downloaded the installer and while installing i got following error
>
> [image: Inline image 1]
>
> do i need to make any configuration prior to installation ??
>
> regards
> irfan
>
>
>
> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Here is the link
>>
>> http://download.hortonworks.com/products/hdp-windows/
>>
>> Olivier
>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks.
>>> i just followed the instructions to setup the pseudo distributed setup
>>> first using the url :
>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>
>>> i don't think so i am running DN on both machine
>>> please find the attached log
>>>
>>> hi olivier
>>>
>>> can you please give me download link ?
>>> let me try please
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>>
>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> Are you running DN on both the machines? Could you please show me your
>>>> DN logs?
>>>>
>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>
>>>>
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Irfu,
>>>>>
>>>>> If you want to quickly get Hadoop running on windows platform. You may
>>>>> want to try our distribution for Windows. You will be able to find the msi
>>>>> on our website.
>>>>>
>>>>> Regards
>>>>> Olivier
>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks.
>>>>>> ok. i think i need to change the plan over here
>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>
>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>> on UNIX, it is already known that ,  it is working fine.
>>>>>>
>>>>>> so, on windows , here is the setup:
>>>>>>
>>>>>> namenode : windows 2012 R2
>>>>>> datanode : windows 2012 R2
>>>>>>
>>>>>> now, the exact problem is :
>>>>>> 1: datanode is not getting started
>>>>>> 2: replication : if i put any file/folder on any datanode , it should
>>>>>> get replicated to all another available datanodes
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
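One correction to point 2 above: HDFS does not copy a file to every datanode. Each block is replicated to dfs.replication nodes (the default is 3), set in hdfs-site.xml, for example:

```xml
<property>
  <name>dfs.replication</name>
  <value>2</value> <!-- e.g. two copies per block on a two-datanode cluster -->
</property>
```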
>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>
>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>> also need some additional info :
>>>>>>> -The exact problem which you are facing right now
>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>> -Your latest configuration files
>>>>>>> -Your /etc/hosts file
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> ok. thanks
>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>> will be based on windows
>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>
>>>>>>>> datanode is not starting . please suggest
>>>>>>>>
>>>>>>>> regards,
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <irfu.sayed@gmail.com
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> thanks.
>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>> etc)
>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>> unix etc )
>>>>>>>>>>>
>>>>>>>>>>> however, my doubt is,  as the file systems of  both the systems
>>>>>>>>>>> (win and linux ) are different ,  datanodes of these systems can not be
>>>>>>>>>>> part of single cluster . i have to make windows cluster separate and UNIX
>>>>>>>>>>> cluster separate ?
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>>>>> same as Cygwin PIDs so that may be causing the discrepancy. I don't know
>>>>>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> thanks
>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>>>>>>>> command
>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>
>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>
>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>
>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>
>>>>>>>>>>>>> however, when i open the pid file for namenode then it is not
>>>>>>>>>>>>> showing pid as : 4560. on the contrary, it shud show : 2076
>>>>>>>>>>>>>
>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>>>>>>>>> helpful.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that. i
>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the
>>>>>>>>>>>>>>>>>>> property *fs.default.name* from *hdfs-site.xml* and add
>>>>>>>>>>>>>>>>>>> it to *core-site.xml*. Remove *mapred.job.tracker* as
>>>>>>>>>>>>>>>>>>> well. It belongs in *mapred-site.xml*.
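The split described above can be sketched as config fragments. The hostnames and ports are the usual Hadoop 1.x single-node defaults, assumed here rather than taken from the thread:

```xml
<!-- core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```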
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the config
>>>>>>>>>>>>>>>>>>>> files in my setup .
>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch for
>>>>>>>>>>>>>>>>>>>>> sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will like
>>>>>>>>>>>>>>>>>>>>>> it..
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1 /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use JPS to
>>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in "hdfs-site.xml"
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file system
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually corresponding
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to the values of dfs.name.dir and dfs.data.dir properties and change the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> permissions of these directories to 755. When you start pushing data into
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your HDFS, data will start going inside the directory specified by
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
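The two steps described above (create the directories, set their permissions to 755) look like this in a shell; the base path and directory names are illustrative stand-ins for whatever dfs.name.dir and dfs.data.dir point to:

```shell
# Create the name/data directories and give them 755 permissions.
BASE=/tmp/demo-hdfs                  # stand-in for the configured base path
mkdir -p "$BASE/wksp_name" "$BASE/wksp_data"
chmod 755 "$BASE/wksp_name" "$BASE/wksp_data"

# HDFS metadata lands under dfs.name.dir, block data under dfs.data.dir.
ls -ld "$BASE/wksp_name" "$BASE/wksp_data"
```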
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the ability
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to create file or directory. You can browse HDFS, view files, download
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> files etc. But operations like create, move, copy etc are not supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a Linux
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> machine(if possible). Or at least use a VM. I personally feel that using
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory option
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
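As written, both properties above point at the same directory (c:\\wksp). A sketch with separate directories, in line with the create-two-directories advice given elsewhere in this thread (the directory names are illustrative):

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp_name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp_data</value>
</property>
```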
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you wrote both paths as local
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> paths, and you need not copy hadoop into hdfs... Hadoop is already
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> working..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copy hadoop there), because you are going to do processing on the data in
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> that text file. That's what hadoop is used for; first you need to make it
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> clear in your mind. Then and only then will it work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
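Both copyFromLocal failures quoted above were local-path errors: the source file did not exist on the local filesystem. A small sketch of the precondition (the file path is illustrative, and hadoop itself is not invoked here):

```shell
# copyFromLocal copies FROM the local FS INTO HDFS, so the local source
# must exist first.
SRC=/tmp/demo-copy/input.txt
mkdir -p "$(dirname "$SRC")"
echo "sample input" > "$SRC"

if [ -f "$SRC" ]; then
  # against a running cluster this would be something like:
  #   ./bin/hadoop dfs -copyFromLocal "$SRC" /wksp
  echo "local source exists"
fi
```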
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Correct, you need to define the cluster configuration in a file. You will
find information on the configuration file in the documentation.

http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html

You should also make sure you have installed the prerequisites.

Thanks
Olivier
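For reference, the installer reads a single-node/multi-node cluster definition roughly like the following. The property names are paraphrased from the linked HDP documentation and should be verified against it; the hostnames reuse the machine names mentioned earlier in this thread and are otherwise illustrative:

```
# clusterproperties.txt (illustrative sketch -- verify names against the docs)
HDP_LOG_DIR=c:\hadoop\logs
HDP_DATA_DIR=c:\hdp\data
NAMENODE_HOST=DFS-DC
SECONDARY_NAMENODE_HOST=DFS-DC
JOBTRACKER_HOST=DFS-DC
SLAVE_HOSTS=DFS-1
```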
On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:

> thanks. sorry for the long break. actually got involved in some other
> priorities
> i downloaded the installer and while installing i got following error
>
> [image: Inline image 1]
>
> do i need to make any configuration prior to installation ??
>
> regards
> irfan
>
>
>
> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Here is the link
>>
>> http://download.hortonworks.com/products/hdp-windows/
>>
>> Olivier
>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks.
>>> i just followed the instructions to setup the pseudo distributed setup
>>> first using the url :
>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>
>>> i don't think i am running DN on both machines
>>> please find the attached log
>>>
>>> hi olivier
>>>
>>> can you please give me download link ?
>>> let me try please
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>>
>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> Are you running DN on both the machines? Could you please show me your
>>>> DN logs?
>>>>
>>>> Also, consider Olivier's suggestion. It's definitely a better option.
>>>>
>>>>
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Irfu,
>>>>>
>>>>> If you want to quickly get Hadoop running on windows platform. You may
>>>>> want to try our distribution for Windows. You will be able to find the msi
>>>>> on our website.
>>>>>
>>>>> Regards
>>>>> Olivier
>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks.
>>>>>> ok. i think i need to change the plan over here
>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>
>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>
>>>>>> so, on windows , here is the setup:
>>>>>>
>>>>>> namenode : windows 2012 R2
>>>>>> datanode : windows 2012 R2
>>>>>>
>>>>>> now, the exact problem is :
>>>>>> 1: datanode is not getting started
>>>>>> 2: replication : if i put any file/folder on any datanode , it should
>>>>>> get replicated to all other available datanodes
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>
>>>>>>> Seriously?? You are planning to develop something using Hadoop on
>>>>>>> windows. Not a good idea. Anyways, could you please show me your log
>>>>>>> files? I also need some additional info :
>>>>>>> -The exact problem which you are facing right now
>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>> -Your latest configuration files
>>>>>>> -Your /etc/hosts file
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> ok. thanks
>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>> will be based on windows
>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>
>>>>>>>> datanode is not starting . please suggest
>>>>>>>>
>>>>>>>> regards,
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <irfu.sayed@gmail.com
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> thanks.
>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>> etc)
>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>> unix etc )
>>>>>>>>>>>
>>>>>>>>>>> however, my doubt is,  as the file systems of  both the systems
>>>>>>>>>>> (win and linux ) are different ,  datanodes of these systems can not be
>>>>>>>>>>> part of single cluster . i have to make windows cluster separate and UNIX
>>>>>>>>>>> cluster separate ?
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>>>>> same as Cygwin PIDs so that may be causing the discrepancy. I don't know
>>>>>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> thanks
>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>>>>>>>> command
>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>
>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>
>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>
>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>
>>>>>>>>>>>>> however, when i open the pid file for the namenode, it is not
>>>>>>>>>>>>> showing the pid as 4560. on the contrary, it should show 2076
>>>>>>>>>>>>>
>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>> the datanode.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>>>>>>>>> helpful.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that. i
>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>> configuration?Your *core-site.xml* is empty. Remove the
>>>>>>>>>>>>>>>>>>> property *fs.default.name *from *hdfs-site.xml* and add
>>>>>>>>>>>>>>>>>>> it to *core-site.xml*. Remove *mapred.job.tracker* as
>>>>>>>>>>>>>>>>>>> well. It is required in *mapred-site.xml*.
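The split described above typically looks like this in a Hadoop 1.x pseudo-distributed setup; the localhost:9000/9001 values are the usual tutorial defaults, not values taken from this thread:

```xml
<!-- core-site.xml: the filesystem URI belongs here -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: the job tracker address belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```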
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> I would suggest you to do a pseudo distributed setup
>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process and then proceed
>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the config
>>>>>>>>>>>>>>>>>>>> files in my setup .
>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch for
>>>>>>>>>>>>>>>>>>>>> sometime because of ramzan and eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will like
>>>>>>>>>>>>>>>>>>>>>> it..
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK. we'll start fresh. Could you plz show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine?Use JPS to
>>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in "hdfs-site.xml"
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file system
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> these dir needs to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further,  hdfs-site.xml needs to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dir?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually corresponding
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to the values of dfs.name.dir and dfs.data.dir properties and change the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> permissions of these directories to 755. When you start pushing data into
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your HDFS, data will start going inside the directory specified by
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
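The directory preparation described above can be sketched as follows; the /tmp/hdfs-demo paths are illustrative stand-ins for whatever values you put in dfs.name.dir and dfs.data.dir:

```shell
# Create the directories backing dfs.name.dir and dfs.data.dir, then set
# their permissions to 755 as suggested. The paths are placeholders.
mkdir -p /tmp/hdfs-demo/name /tmp/hdfs-demo/data
chmod 755 /tmp/hdfs-demo/name /tmp/hdfs-demo/data
ls -ld /tmp/hdfs-demo/name /tmp/hdfs-demo/data
```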
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now , at this stage , c:\\wksp is the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system OR do i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS webUI doesn't provide us the ability
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to create file or directory. You can browse HDFS, view files, download
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> files etc. But operation like create, move, copy etc are not supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though. Try getting a Linux
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> machine(if possible). Or at least use a VM. I personally feel that using
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory option
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   </property>
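One caution on the snippet above: many setups keep the two directories separate, since dfs.name.dir holds namenode metadata and dfs.data.dir holds datanode blocks, and a namenode format touches dfs.name.dir. A variant with split paths might look like this (the sub-directory names are illustrative, not from the thread):

```xml
<property>
  <name>dfs.name.dir</name>
  <value>c:\\wksp\\name</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:\\wksp\\data</value>
</property>
```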
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you had written both the paths as local paths. You need not copy
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> hadoop into hdfs... Hadoop is already working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check out in browser by after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting ur single node cluster :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then go for browse the filesystem link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it..
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory then make
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need to copy hadoop there), because you
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> are going to do processing on the data in that text file. That's what
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> hadoop is used for. First you need to make it clear in your mind, and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then you will do it... otherwise it's not possible.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes , i am newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>
>


Re: about replication

Posted by Olivier Renault <or...@hortonworks.com>.
Correct, you need to define the cluster configuration in a file. You will
find information on the configuration file in the documentation:

http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-Win-1.3.0/bk_installing_hdp_for_windows/content/win-getting-ready-6.html

You should also make sure that the prerequisites are installed.

Thanks
Olivier
On 3 Sep 2013 06:51, "Irfan Sayed" <ir...@gmail.com> wrote:

> thanks. sorry for the long break. actually got involved in some other
> priorities
> i downloaded the installer and while installing i got following error
>
> [image: Inline image 1]
>
> do i need to make any configuration prior to installation ??
>
> regards
> irfan
>
>
>
> On Fri, Aug 23, 2013 at 4:10 PM, Olivier Renault <orenault@hortonworks.com
> > wrote:
>
>> Here is the link
>>
>> http://download.hortonworks.com/products/hdp-windows/
>>
>> Olivier
>> On 23 Aug 2013 10:55, "Irfan Sayed" <ir...@gmail.com> wrote:
>>
>>> thanks.
>>> i just followed the instructions to setup the pseudo distributed setup
>>> first using the url :
>>> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I
>>>
>>> i don't think so i am running DN on both machine
>>> please find the attached log
>>>
>>> hi olivier
>>>
>>> can you please give me download link ?
>>> let me try please
>>>
>>> regards
>>> irfan
>>>
>>>
>>>
>>>
>>> On Fri, Aug 23, 2013 at 1:08 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> Are you running DN on both the machines? Could you please show me your
>>>> DN logs?
>>>>
>>>> Also, consider Oliver's suggestion. It's definitely a better option.
>>>>
>>>>
>>>>
>>>> Warm Regards,
>>>> Tariq
>>>> cloudfront.blogspot.com
>>>>
>>>>
>>>> On Fri, Aug 23, 2013 at 12:57 PM, Olivier Renault <
>>>> orenault@hortonworks.com> wrote:
>>>>
>>>>> Irfu,
>>>>>
>>>>> If you want to quickly get Hadoop running on windows platform. You may
>>>>> want to try our distribution for Windows. You will be able to find the msi
>>>>> on our website.
>>>>>
>>>>> Regards
>>>>> Olivier
>>>>> On 23 Aug 2013 05:15, "Irfan Sayed" <ir...@gmail.com> wrote:
>>>>>
>>>>>> thanks.
>>>>>> ok. i think i need to change the plan over here
>>>>>> let me create two environments. 1: totally windows 2: totally Unix
>>>>>>
>>>>>> because, on windows , anyway i have to try and see how hadoop works
>>>>>> on UNIX, it is already known that it is working fine.
>>>>>>
>>>>>> so, on windows , here is the setup:
>>>>>>
>>>>>> namenode : windows 2012 R2
>>>>>> datanode : windows 2012 R2
>>>>>>
>>>>>> now, the exact problem is :
>>>>>> 1: datanode is not getting started
>>>>>> 2: replication : if i put any file/folder on any datanode , it should
>>>>>> get replicated to all another available datanodes
>>>>>>
>>>>>> regards
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Fri, Aug 23, 2013 at 2:42 AM, Mohammad Tariq <do...@gmail.com>wrote:
>>>>>>
>>>>>>> Seriously??You are planning to develop something using Hadoop on
>>>>>>> windows. Not a good idea. Anyways, could you plz show me your log files? I
>>>>>>> also need some additional info :
>>>>>>> -The exact problem which you are facing right now
>>>>>>> -Your cluster summary(no. of nodes etc)
>>>>>>> -Your latest configuration files
>>>>>>> -Your /etc/hosts file
>>>>>>>
>>>>>>> Warm Regards,
>>>>>>> Tariq
>>>>>>> cloudfront.blogspot.com
>>>>>>>
>>>>>>>
>>>>>>> On Fri, Aug 23, 2013 at 1:42 AM, Irfan Sayed <ir...@gmail.com>wrote:
>>>>>>>
>>>>>>>> ok. thanks
>>>>>>>> now, i need to start with all windows setup first as our product
>>>>>>>> will be based on windows
>>>>>>>> so, now, please tell me how to resolve the issue
>>>>>>>>
>>>>>>>> datanode is not starting . please suggest
>>>>>>>>
>>>>>>>> regards,
>>>>>>>> irfan
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On Thu, Aug 22, 2013 at 7:56 PM, Mohammad Tariq <dontariq@gmail.com
>>>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> It is possible. Theoretically Hadoop doesn't stop you from doing
>>>>>>>>> that. But it is not a very wise setup.
>>>>>>>>>
>>>>>>>>> Warm Regards,
>>>>>>>>> Tariq
>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Thu, Aug 22, 2013 at 5:01 PM, Irfan Sayed <irfu.sayed@gmail.com
>>>>>>>>> > wrote:
>>>>>>>>>
>>>>>>>>>> please suggest
>>>>>>>>>>
>>>>>>>>>> regards
>>>>>>>>>> irfan
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>> On Thu, Aug 22, 2013 at 11:49 AM, Irfan Sayed <
>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> thanks.
>>>>>>>>>>> can i have setup like this :
>>>>>>>>>>> namenode will be on linux (flavour may be RHEL, CentOS, UBuntu
>>>>>>>>>>> etc)
>>>>>>>>>>> and datanodes are the combination of any OS (windows , linux ,
>>>>>>>>>>> unix etc )
>>>>>>>>>>>
>>>>>>>>>>> however, my doubt is,  as the file systems of  both the systems
>>>>>>>>>>> (win and linux ) are different ,  datanodes of these systems can not be
>>>>>>>>>>> part of single cluster . i have to make windows cluster separate and UNIX
>>>>>>>>>>> cluster separate ?
>>>>>>>>>>>
>>>>>>>>>>> regards
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Thu, Aug 22, 2013 at 11:26 AM, Arpit Agarwal <
>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> I just noticed you are on Cygwin. IIRC Windows PIDs are not the
>>>>>>>>>>>> same as Cygwin PIDs so that may be causing the discrepancy. I don't know
>>>>>>>>>>>> how well Hadoop works in Cygwin as I have never tried it. Work is in
>>>>>>>>>>>> progress for native Windows support however there are no official releases
>>>>>>>>>>>> with Windows support yet. It may be easier to get familiar with a
>>>>>>>>>>>> release <https://www.apache.org/dyn/closer.cgi/hadoop/common/> on Linux if you are new to it.
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>> On Wed, Aug 21, 2013 at 10:05 PM, Irfan Sayed <
>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>>> thanks
>>>>>>>>>>>>> here is what i did .
>>>>>>>>>>>>> i stopped all the namenodes and datanodes using ./stop-dfs.sh
>>>>>>>>>>>>> command
>>>>>>>>>>>>> then deleted all pid files for namenodes and datanodes
>>>>>>>>>>>>>
>>>>>>>>>>>>> started dfs again with command : "./start-dfs.sh"
>>>>>>>>>>>>>
>>>>>>>>>>>>> when i ran the "Jps" command . it shows
>>>>>>>>>>>>>
>>>>>>>>>>>>> Administrator@DFS-DC /cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>> 4536 Jps
>>>>>>>>>>>>> 2076 NameNode
>>>>>>>>>>>>>
>>>>>>>>>>>>> however, when i open the pid file for the namenode, it shows the pid as
>>>>>>>>>>>>> 4560. on the contrary, it should show 2076
>>>>>>>>>>>>>
>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>
>>>>>>>>>>>>> regards
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Thu, Aug 22, 2013 at 9:59 AM, Arpit Agarwal <
>>>>>>>>>>>>> aagarwal@hortonworks.com> wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Most likely there is a stale pid file. Something like
>>>>>>>>>>>>>> \tmp\hadoop-*datanode.pid. You could try deleting it and then restarting
>>>>>>>>>>>>>> the datanode.
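Arpit's stale-pid cleanup can be sketched as below. The /tmp path is the Hadoop 1.x default pid directory (check HADOOP_PID_DIR in hadoop-env.sh for yours), and the user name in the simulated file is an illustrative stand-in:

```shell
# Simulate a stale datanode pid file, then clear it the way you would on
# the node before restarting the datanode.
touch /tmp/hadoop-Administrator-datanode.pid   # stand-in for the stale file
rm -f /tmp/hadoop-*-datanode.pid               # remove any leftover pid files
ls /tmp/hadoop-*-datanode.pid 2>/dev/null || echo "no stale pid files"
```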
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> I haven't read the entire thread so you may have looked at
>>>>>>>>>>>>>> this already.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> -Arpit
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 9:22 PM, Irfan Sayed <
>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> datanode is trying to connect to namenode continuously but
>>>>>>>>>>>>>>> fails
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> when i try to run "jps" command it says :
>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>> 4584 NameNode
>>>>>>>>>>>>>>> 4016 Jps
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> and when i ran the "./start-dfs.sh" then it says :
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> $ ./start-dfs.sh
>>>>>>>>>>>>>>> namenode running as process 3544. Stop it first.
>>>>>>>>>>>>>>> DFS-1: datanode running as process 4076. Stop it first.
>>>>>>>>>>>>>>> localhost: secondarynamenode running as process 4792. Stop
>>>>>>>>>>>>>>> it first.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> both these logs are contradictory
>>>>>>>>>>>>>>> please find the attached logs
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> should i attach the conf files as well ?
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:28 PM, Mohammad Tariq <
>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Your DN is still not running. Showing me the logs would be
>>>>>>>>>>>>>>>> helpful.
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>> On Wed, Aug 21, 2013 at 5:11 PM, Irfan Sayed <
>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> i followed the url and did the steps mention in that. i
>>>>>>>>>>>>>>>>> have deployed on the windows platform
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> Now, i am able to browse url : http://localhost:50070 (name node)
>>>>>>>>>>>>>>>>> however, not able to browse url : http://localhost:50030
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> i have modified all the config files as mentioned and
>>>>>>>>>>>>>>>>> formatted the hdfs file system as well
>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 4:14 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> thanks. i followed this url :
>>>>>>>>>>>>>>>>>> http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html
>>>>>>>>>>>>>>>>>> let me follow the url which you gave for pseudo
>>>>>>>>>>>>>>>>>> distributed setup and then will switch to distributed mode
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 3:23 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> You are welcome. Which link have you followed for the
>>>>>>>>>>>>>>>>>>> configuration? Your *core-site.xml* is empty. Remove the
>>>>>>>>>>>>>>>>>>> property *fs.default.name* from *hdfs-site.xml* and add
>>>>>>>>>>>>>>>>>>> it to *core-site.xml*. Remove *mapred.job.tracker* as
>>>>>>>>>>>>>>>>>>> well; it belongs in *mapred-site.xml*.
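The split described above can be sketched as follows for a Hadoop 1.x pseudo-distributed setup. The host/port values are common defaults, not values taken from this thread; adjust them to your environment.

```xml
<!-- core-site.xml: the filesystem URI belongs here, not in hdfs-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- mapred-site.xml: the JobTracker address belongs here -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
```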
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> I would suggest you do a pseudo-distributed setup
>>>>>>>>>>>>>>>>>>> first in order to get yourself familiar with the process, and then proceed
>>>>>>>>>>>>>>>>>>> to the distributed mode. You can visit this link
>>>>>>>>>>>>>>>>>>> <http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html#.UhM8d2T0-4I>
>>>>>>>>>>>>>>>>>>> if you need some help. Let me know if you face any issue.
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> HTH
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 2:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> thanks tariq for response.
>>>>>>>>>>>>>>>>>>>> as discussed last time, i have sent you all the config
>>>>>>>>>>>>>>>>>>>> files in my setup .
>>>>>>>>>>>>>>>>>>>> can you please go through that ?
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> please let me know
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>> On Tue, Aug 20, 2013 at 1:22 PM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> I'm sorry for being unresponsive. Was out of touch for
>>>>>>>>>>>>>>>>>>>>> some time because of Ramzan and Eid. Resuming work today.
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> What's the current status?
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 7:18 PM, manish dunani <
>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> First of all read the concepts ..I hope you will like
>>>>>>>>>>>>>>>>>>>>>> it..
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> https://www.frcrc.org/sites/default/files/HadoopTutorialPart1.pdf
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> On Mon, Aug 19, 2013 at 9:45 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 13, 2013 at 12:56 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> hey Tariq,
>>>>>>>>>>>>>>>>>>>>>>>> i am still stuck ..
>>>>>>>>>>>>>>>>>>>>>>>> can you please suggest
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>> irfan
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>> On Thu, Aug 8, 2013 at 5:56 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:49 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> attachment got quarantined
>>>>>>>>>>>>>>>>>>>>>>>>>> resending in txt format. please rename it to
>>>>>>>>>>>>>>>>>>>>>>>>>> conf.rar
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 9:41 AM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> if i run the jps command on namenode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>> 3164 NameNode
>>>>>>>>>>>>>>>>>>>>>>>>>>> 1892 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> same command on datanode :
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-1/cygdrive/c/Java/jdk1.7.0_25/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./jps.exe
>>>>>>>>>>>>>>>>>>>>>>>>>>> 3848 Jps
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> jps does not list any process for datanode.
>>>>>>>>>>>>>>>>>>>>>>>>>>> however, on web browser i can see one live data node
>>>>>>>>>>>>>>>>>>>>>>>>>>> please find the attached conf rar file of
>>>>>>>>>>>>>>>>>>>>>>>>>>> namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>> On Wed, Aug 7, 2013 at 1:52 AM, Mohammad Tariq <
>>>>>>>>>>>>>>>>>>>>>>>>>>> dontariq@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> OK, we'll start fresh. Could you please show me
>>>>>>>>>>>>>>>>>>>>>>>>>>>> your latest config files?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> BTW, are your daemons running fine? Use jps to
>>>>>>>>>>>>>>>>>>>>>>>>>>>> verify that.
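A minimal sketch of the jps check suggested above. The sample output is the namenode-side output posted elsewhere in this thread; on a live machine you would capture the real output with `jps_output="$(jps)"`.

```shell
# Inspect jps output and confirm a DataNode process is listed.
# Sample output hard-coded here for illustration (from the thread).
jps_output="3164 NameNode
1892 Jps"

if printf '%s\n' "$jps_output" | grep -q 'DataNode'; then
  echo "DataNode is running"
else
  # This is the case Irfan hit: NameNode up, DataNode absent.
  echo "DataNode is NOT running - check the DataNode logs"
fi
```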
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:59 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i have created these dir "wksp_data" and
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> "wksp_name" on both datanode and namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> made the respective changes in "hdfs-site.xml"
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> file
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> formatted the namenode
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> started the dfs
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> but still, not able to browse the file system
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> through web browser
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please refer below
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> anything still missing ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 10:35 PM, Irfan Sayed <
>>>>>>>>>>>>>>>>>>>>>>>>>>>>> irfu.sayed@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> do these dirs need to be created on all
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> datanodes and namenodes?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, does hdfs-site.xml need to be updated
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> on both datanodes and namenodes for these new dirs?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:30 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Create 2 directories manually corresponding
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to the values of dfs.name.dir and dfs.data.dir properties and change the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> permissions of these directories to 755. When you start pushing data into
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> your HDFS, data will start going inside the directory specified by
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dfs.data.dir and the associated metadata will go inside dfs.name.dir.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Remember, you store data in HDFS, but it eventually gets stored in your
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> local/native FS. But you cannot see this data directly on your local/native
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> FS.
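The directory setup described above can be sketched as below. The paths are examples (hypothetical), not values mandated by Hadoop; point the dfs.name.dir and dfs.data.dir properties in hdfs-site.xml at whatever directories you actually create.

```shell
# Create separate directories for HDFS metadata (name) and block data
# (data), then set their permissions to 755 as suggested above.
mkdir -p /tmp/hdfs-sketch/wksp_name /tmp/hdfs-sketch/wksp_data
chmod 755 /tmp/hdfs-sketch/wksp_name /tmp/hdfs-sketch/wksp_data

# Show the resulting permissions.
ls -ld /tmp/hdfs-sketch/wksp_name /tmp/hdfs-sketch/wksp_data
```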
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:26 PM, Irfan Sayed
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need this to be working on
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> windows environment as project requirement.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i will add/work on Linux later
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> so, now, at this stage, is c:\\wksp the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> HDFS file system, or do i need to create it from the command line?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:19 PM, Mohammad
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq <do...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hello Irfan,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sorry for being unresponsive. Got stuck
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> with some imp work.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> The HDFS web UI doesn't provide the ability
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to create a file or directory. You can browse HDFS, view files, download
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> files, etc., but operations like create, move and copy are not supported.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> These values look fine to me.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> One suggestion though: try getting a Linux
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> machine (if possible), or at least use a VM. I personally feel that using
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Hadoop on windows is always messy.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Warm Regards,
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Tariq
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> cloudfront.blogspot.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 5:09 PM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> when i browse the file system , i am
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> getting following :
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i haven't seen any make directory option
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> there
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> i need to create it from command line ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> further, in the hdfs-site.xml file , i
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> have given following entries. are they correct ?
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.data.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> </property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> <property>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <name>dfs.name.dir</name>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>   <value>c:\\wksp</value>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> </property>
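For reference, a sketch of hdfs-site.xml with the two properties pointing at separate directories, in line with the "wksp_name"/"wksp_data" split adopted later in the thread. The paths are illustrative only; pointing dfs.data.dir and dfs.name.dir at the same directory mixes block data and metadata in one place.

```xml
<!-- hdfs-site.xml: separate directories for metadata and block data -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>c:\\wksp_name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>c:\\wksp_data</value>
  </property>
</configuration>
```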
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> please suggest
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> [image: Inline image 1]
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 12:40 PM, manish
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> dunani <ma...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *You are wrong at this:*
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2/bin
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz /wksp
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copyFromLocal: File
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /cygdrive/c/Users/Administrator/Desktop/hadoop-1.1.2.tar.gz does not exist.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Because you gave local paths for both the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> source and the destination. Also, you need not copy hadoop into
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> hdfs; hadoop is already working.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Just check in the browser after
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> starting your single-node cluster:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> localhost:50070
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> then follow the "Browse the filesystem" link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> in it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> If there is no directory there, then make a
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> directory there.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> That is your hdfs directory.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Then copy any text file there (no need to
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> copy hadoop there), because you are going to do the processing on the
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> data in that text file. That is what hadoop is used for; first you need
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> to make that clear in your mind, and then you will be able to do it.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> *Try this: *
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Administrator@DFS-DC/cygdrive/c/hadoop-1.1.2/hadoop-1.1.2
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> $ ./bin/hadoop dfs -copyFromLocal
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> /full/local/path/to/ur/file /hdfs/directory/path
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> On Tue, Aug 6, 2013 at 11:49 AM, Irfan
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Sayed <ir...@gmail.com> wrote:
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> thanks. yes, i am a newbie.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> however, i need windows setup.
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> let me surely refer the doc and link
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> which u sent but i need this to be working ...
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> can you please help
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> regards
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> MANISH DUNANI
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> -THANX
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> +91 9426881954,+91 8460656443
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>> manishd207@gmail.com
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> --
>>>>>>>>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>> *Manish Dunani*
>>>>>>>>>>>>>>>>>>>>>> *Contact No* : +91 9408329137
>>>>>>>>>>>>>>>>>>>>>> *skype id* : manish.dunani*
>>>>>>>>>>>>>>>>>>>>>> *
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> CONFIDENTIALITY NOTICE
>>>>>>>>>>>>>> NOTICE: This message is intended for the use of the
>>>>>>>>>>>>>> individual or entity to which it is addressed and may contain information
>>>>>>>>>>>>>> that is confidential, privileged and exempt from disclosure under
>>>>>>>>>>>>>> applicable law. If the reader of this message is not the intended
>>>>>>>>>>>>>> recipient, you are hereby notified that any printing, copying,
>>>>>>>>>>>>>> dissemination, distribution, disclosure or forwarding of this communication
>>>>>>>>>>>>>> is strictly prohibited. If you have received this communication in error,
>>>>>>>>>>>>>> please contact the sender immediately and delete it from your system. Thank
>>>>>>>>>>>>>> You.
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>
>
