Posted to user@hive.apache.org by Babak Bastan <ba...@gmail.com> on 2012/06/05 19:13:59 UTC

Error while Creating Table in Hive

Hello Experts ,

I'm new to Hive. When I try to create a test table in Hive I get an error. I
want to run this command:
CREATE TABLE Test (DateT STRING, Url STRING, Content STRING);
but this error occurred:
FAILED: Error in metadata: MetaException(message:Got exception:
java.io.FileNotFoundException File file:/user/hive/warehouse/test does not
exist.)
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask
How can I solve this problem?
Thank you so much

Re: Error while Creating Table in Hive

Posted by Bejoy Ks <be...@yahoo.com>.
HI Babak

It looks like a Hadoop configuration problem to me. Have you configured 'fs.default.name' in core-site.xml to point to hdfs:// instead of file:///? If not, that is likely the issue. Also, if you are using HDFS for the first time, there are other steps to perform first, such as formatting the namenode.

http://hadoop.apache.org/common/docs/r0.20.2/quickstart.html

Ensure your hadoop installation is fully working before you get on to hive.
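A quick way to illustrate the check described above is to grep core-site.xml for the scheme of fs.default.name. This is only a sketch: it writes a sample file under /tmp with the hdfs://localhost:9000 value used elsewhere in this thread; your real path and port may differ.

```shell
# Sketch: write a sample core-site.xml like the one discussed in this
# thread, then check which filesystem fs.default.name points to.
cat > /tmp/core-site-example.xml <<'EOF'
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000/</value>
  </property>
</configuration>
EOF

# If this prints an hdfs:// value, table paths resolve against HDFS;
# a file:/// value here is what produces the
# file:/user/hive/warehouse/... error reported above.
grep -A1 'fs.default.name' /tmp/core-site-example.xml | grep -o 'hdfs://[^<]*'
```

Run the same grep against your real $HADOOP_HOME/conf/core-site.xml.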



Regards
Bejoy KS





________________________________
 From: Babak Bastan <ba...@gmail.com>
To: user@hive.apache.org 
Sent: Tuesday, June 5, 2012 10:43 PM
Subject: Error while Creating Table in Hive
 

Hello Experts ,

I'm new to Hive. When I try to create a test table in Hive I get an error. I want to run this command:
CREATE TABLE Test (DateT STRING, Url STRING, Content STRING);
but this error occurred:
FAILED: Error in metadata: MetaException(message:Got exception: java.io.FileNotFoundException File file:/user/hive/warehouse/test does not exist.)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
How can I solve this problem?
Thank you so much

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
check your /var/log/hadoop/..also when you do something wrong you
will find your terminal full of error messages, you can use them
as well..and by the way, learning something new requires a great deal of
patience
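To dig the most recent errors out of the logs mentioned above, something like the following sketch works. The log directory is an assumption (it creates a demo directory with a fake log so the commands are self-contained); on the setup in this thread it would be $HADOOP_HOME/logs or /var/log/hadoop.

```shell
# Sketch: list the most recently modified .log files and show their
# last lines. LOGDIR is an assumed location; substitute your own.
LOGDIR=${LOGDIR:-/tmp/hadoop-logs-demo}

# Create a fake log entry so this sketch runs anywhere.
mkdir -p "$LOGDIR"
echo "2012-06-06 20:05:20 ERROR example: something went wrong" \
  > "$LOGDIR/hadoop-demo-namenode.log"

# Newest logs first, then the tail of each one.
ls -t "$LOGDIR"/*.log | head -n 3
tail -n 20 "$LOGDIR"/*.log
```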

Regards,
    Mohammad Tariq


On Thu, Jun 7, 2012 at 1:25 AM, Babak Bastan <ba...@gmail.com> wrote:
> What the hell is that? I see no log folder there
>
>
> On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>
>> go to your HADOOP_HOME i.e your hadoop directory(that includes bin,
>> conf etc)..you can find logs directory there..
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <ba...@gmail.com> wrote:
>> > how can I get my logs, Mohammad?
>> >
>> >
>> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <do...@gmail.com>
>> > wrote:
>> >>
>> >> could you post your logs???that would help me in understanding the
>> >> problem properly.
>> >>
>> >> Regards,
>> >>     Mohammad Tariq
>> >>
>> >>
>> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <ba...@gmail.com>
>> >> wrote:
>> >> > Thank you very much Mohammad for your attention. I followed the steps
>> >> > but
>> >> > the
>> >> > error is the same as the last time.
>> >> > and there is my hosts file:
>> >> >
>> >> > 127.0.0.1       localhost
>> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>> >> >
>> >> >
>> >> > # The following lines are desirable for IPv6 capable hosts
>> >> >
>> >> > #::1     ip6-localhost ip6-loopback
>> >> > #fe00::0 ip6-localnet
>> >> > #ff00::0 ip6-mcastprefix
>> >> > #ff02::1 ip6-allnodes
>> >> > #ff02::2 ip6-allrouters
>> >> >
>> >> > but no effect :(
>> >> >
>> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq <do...@gmail.com>
>> >> > wrote:
>> >> >>
>> >> >> also change the permissions of these directories to 777.
>> >> >>
>> >> >> Regards,
>> >> >>     Mohammad Tariq
>> >> >>
>> >> >>
>> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <do...@gmail.com>
>> >> >> wrote:
>> >> >> > create a directory "/home/username/hdfs" (or at some place of your
>> >> >> > choice)..inside this hdfs directory create three sub directories -
>> >> >> > name, data, and temp, then follow these steps :
>> >> >> >
>> >> >> > add following properties in your core-site.xml -
>> >> >> >
>> >> >> > <property>
>> >> >> >          <name>fs.default.name</name>
>> >> >> >          <value>hdfs://localhost:9000/</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> >        <property>
>> >> >> >          <name>hadoop.tmp.dir</name>
>> >> >> >          <value>/home/mohammad/hdfs/temp</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> > then add following two properties in your hdfs-site.xml -
>> >> >> >
>> >> >> > <property>
>> >> >> >                <name>dfs.replication</name>
>> >> >> >                <value>1</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> >        <property>
>> >> >> >                <name>dfs.name.dir</name>
>> >> >> >                <value>/home/mohammad/hdfs/name</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> >        <property>
>> >> >> >                <name>dfs.data.dir</name>
>> >> >> >                <value>/home/mohammad/hdfs/data</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> > finally add this property in your mapred-site.xml -
>> >> >> >
>> >> >> >       <property>
>> >> >> >          <name>mapred.job.tracker</name>
>> >> >> >          <value>hdfs://localhost:9001</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> > NOTE: you can give any name to these directories of your choice,
>> >> >> > just
>> >> >> > keep in mind you have to give same names as values of
>> >> >> >           above specified properties in your configuration files.
>> >> >> > (give full path of these directories, not just the name of the
>> >> >> > directory)
>> >> >> >
>> >> >> > After this  follow the steps provided in the previous reply.
>> >> >> >
>> >> >> > Regards,
>> >> >> >     Mohammad Tariq
>> >> >> >
>> >> >> >
>> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <ba...@gmail.com>
>> >> >> > wrote:
>> >> >> >> thanks, Mohammad
>> >> >> >>
>> >> >> >> with this command:
>> >> >> >>
>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>> >> >> >>
>> >> >> >> this is my output:
>> >> >> >>
>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>> >> >> >> /************************************************************
>> >> >> >> STARTUP_MSG: Starting NameNode
>> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>> >> >> >> STARTUP_MSG:   args = [-format]
>> >> >> >> STARTUP_MSG:   version = 0.20.2
>> >> >> >> STARTUP_MSG:   build =
>> >> >> >>
>> >> >> >> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>> >> >> >> -r
>> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>> >> >> >> ************************************************************/
>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >> >> >>
>> >> >> >>
>> >> >> >> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >> >> >> supergroup=supergroup
>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >> >> >> isPermissionEnabled=true
>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95
>> >> >> >> saved
>> >> >> >> in 0
>> >> >> >> seconds.
>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
>> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>> >> >> >> /************************************************************
>> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>> >> >> >> ************************************************************/
>> >> >> >>
>> >> >> >> by this command:
>> >> >> >>
>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>> >> >> >>
>> >> >> >> this is the output
>> >> >> >>
>> >> >> >> mkdir: cannot create directory
>> >> >> >> '/home/babak/Downloads/hadoop/bin/../logs':
>> >> >> >> Permission denied
>> >> >> >>
>> >> >> >> this output means there is no permission to create this folder
>> >> >> >>
>> >> >> >>
>> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq
>> >> >> >> <do...@gmail.com>
>> >> >> >> wrote:
>> >> >> >>>
>> >> >> >>> once we are done with the configuration, we need to format the
>> >> >> >>> file
>> >> >> >>> system..use this command to do that-
>> >> >> >>> bin/hadoop namenode -format
>> >> >> >>>
>> >> >> >>> after this, hadoop daemon processes should be started using
>> >> >> >>> following
>> >> >> >>> commands -
>> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
>> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
>> >> >> >>>
>> >> >> >>> after this use jps to check if everything is alright or point
>> >> >> >>> your
>> >> >> >>> browser to localhost:50070..if you further find any problem
>> >> >> >>> provide
>> >> >> >>> us
>> >> >> >>> with the error logs..:)
>> >> >> >>>
>> >> >> >>> Regards,
>> >> >> >>>     Mohammad Tariq
>> >> >> >>>
>> >> >> >>>
>> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan
>> >> >> >>> <ba...@gmail.com>
>> >> >> >>> wrote:
>> >> >> >>> > were you able to format hdfs properly???
>> >> >> >>> > I didn't get your question. Do you mean HADOOP_HOME, or where
>> >> >> >>> > did I install Hadoop?
>> >> >> >>> >
>> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
>> >> >> >>> > <do...@gmail.com>
>> >> >> >>> > wrote:
>> >> >> >>> >>
>> >> >> >>> >> if you are getting only this, it means your hadoop is not
>> >> >> >>> >> running..were you able to format hdfs properly???
>> >> >> >>> >>
>> >> >> >>> >> Regards,
>> >> >> >>> >>     Mohammad Tariq
>> >> >> >>> >>
>> >> >> >>> >>
>> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
>> >> >> >>> >> <ba...@gmail.com>
>> >> >> >>> >> wrote:
>> >> >> >>> >> > Hi Mohammad, if I run jps in my shell I can see this result:
>> >> >> >>> >> > 2213 Jps
>> >> >> >>> >> >
>> >> >> >>> >> >
>> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
>> >> >> >>> >> > <do...@gmail.com>
>> >> >> >>> >> > wrote:
>> >> >> >>> >> >>
>> >> >> >>> >> >> you can also use "jps" command at your shell to see
>> >> >> >>> >> >> whether
>> >> >> >>> >> >> Hadoop
>> >> >> >>> >> >> processes are running or not.
>> >> >> >>> >> >>
>> >> >> >>> >> >> Regards,
>> >> >> >>> >> >>     Mohammad Tariq
>> >> >> >>> >> >>
>> >> >> >>> >> >>
>> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
>> >> >> >>> >> >> <do...@gmail.com>
>> >> >> >>> >> >> wrote:
>> >> >> >>> >> >> > Hi Babak,
>> >> >> >>> >> >> >
>> >> >> >>> >> >> >  You have to type it in you web browser..Hadoop provides
>> >> >> >>> >> >> > us
>> >> >> >>> >> >> > a
>> >> >> >>> >> >> > web
>> >> >> >>> >> >> > GUI
>> >> >> >>> >> >> > that not only allows us to browse through the file
>> >> >> >>> >> >> > system,
>> >> >> >>> >> >> > but
>> >> >> >>> >> >> > to
>> >> >> >>> >> >> > download the files as well..Apart from that it also
>> >> >> >>> >> >> > provides a
>> >> >> >>> >> >> > web
>> >> >> >>> >> >> > GUI
>> >> >> >>> >> >> > that can be used to see the status of Jobtracker and
>> >> >> >>> >> >> > Tasktracker..When
>> >> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can
>> >> >> >>> >> >> > point
>> >> >> >>> >> >> > your
>> >> >> >>> >> >> > browser to http://localhost:50030 to see the status and
>> >> >> >>> >> >> > logs
>> >> >> >>> >> >> > of
>> >> >> >>> >> >> > your
>> >> >> >>> >> >> > job.
>> >> >> >>> >> >> >
>> >> >> >>> >> >> > Regards,
>> >> >> >>> >> >> >     Mohammad Tariq
>> >> >> >>> >> >> >
>> >> >> >>> >> >> >
>> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
>> >> >> >>> >> >> > <ba...@gmail.com>
>> >> >> >>> >> >> > wrote:
>> >> >> >>> >> >> >> Thank you shashwat for the answer,
>> >> >> >>> >> >> >> where should I type http://localhost:50070?
>> >> >> >>> >> >> >> I typed it here: hive>http://localhost:50070 but got
>> >> >> >>> >> >> >> nothing as a result
>> >> >> >>> >> >> >>
>> >> >> >>> >> >> >>
>> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> first type http://localhost:50070 whether this is
>> >> >> >>> >> >> >>> opening
>> >> >> >>> >> >> >>> or
>> >> >> >>> >> >> >>> not
>> >> >> >>> >> >> >>> and
>> >> >> >>> >> >> >>> check
>> >> >> >>> >> >> >>> how many nodes are available, check some of the hadoop
>> >> >> >>> >> >> >>> shell
>> >> >> >>> >> >> >>> commands
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>> >> >> >>> >> >> >>> run
>> >> >> >>> >> >> >>> example mapreduce task on hadoop take example from
>> >> >> >>> >> >> >>> here
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> : http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> if all the above you can do sucessfully means hadoop
>> >> >> >>> >> >> >>> is
>> >> >> >>> >> >> >>> configured
>> >> >> >>> >> >> >>> correctly
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> Regards
>> >> >> >>> >> >> >>> Shashwat
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>> >> >> >>> >> >> >>> <ba...@gmail.com>
>> >> >> >>> >> >> >>> wrote:
>> >> >> >>> >> >> >>>>
>> >> >> >>> >> >> >>>> no I'm not working on CDH.Is there a way to test if
>> >> >> >>> >> >> >>>> my
>> >> >> >>> >> >> >>>> Hadoop
>> >> >> >>> >> >> >>>> works
>> >> >> >>> >> >> >>>> fine
>> >> >> >>> >> >> >>>> or not?
>> >> >> >>> >> >> >>>>
>> >> >> >>> >> >> >>>>
>> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
>> >> >> >>> >> >> >>>> <be...@yahoo.com>
>> >> >> >>> >> >> >>>> wrote:
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> Hi Babak
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> You gotta follow those instructions in the apace
>> >> >> >>> >> >> >>>>> site
>> >> >> >>> >> >> >>>>> to
>> >> >> >>> >> >> >>>>> set
>> >> >> >>> >> >> >>>>> up
>> >> >> >>> >> >> >>>>> hadoop
>> >> >> >>> >> >> >>>>> from scratch and ensure that hdfs is working first.
>> >> >> >>> >> >> >>>>> You
>> >> >> >>> >> >> >>>>> should
>> >> >> >>> >> >> >>>>> be
>> >> >> >>> >> >> >>>>> able to
>> >> >> >>> >> >> >>>>> read and write files to hdfs before you do your next
>> >> >> >>> >> >> >>>>> steps.
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> Are you on CDH or apache distribution of hadoop? If
>> >> >> >>> >> >> >>>>> it
>> >> >> >>> >> >> >>>>> is
>> >> >> >>> >> >> >>>>> CDH
>> >> >> >>> >> >> >>>>> there
>> >> >> >>> >> >> >>>>> are
>> >> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> Regards
>> >> >> >>> >> >> >>>>> Bejoy KS
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
>> >> >> >>> >> >> >>>>> ________________________________
>> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
>> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
>> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
>> >> >> >>> >> >> >>>>> core-site.xml
>> >> >> >>> >> >> >>>>> and
>> >> >> >>> >> >> >>>>> I
>> >> >> >>> >> >> >>>>> did
>> >> >> >>> >> >> >>>>> all
>> >> >> >>> >> >> >>>>> of
>> >> >> >>> >> >> >>>>> thing that was mentioned in the reference but no
>> >> >> >>> >> >> >>>>> effect
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>> >> >> >>> >> >> >>>>> <ba...@gmail.com>
>> >> >> >>> >> >> >>>>> wrote:
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>> OK, sorry, that was my mistake. I thought it worked,
>> >> >> >>> >> >> >>>>>> but no.
>> >> >> >>> >> >> >>>>>> I wrote the command without ; and then I thought it
>> >> >> >>> >> >> >>>>>> worked, but with ; at the end of the command
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>> doesn't work
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
>> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>> inside configuration. all properties will be
>> >> >> >>> >> >> >>>>>>> inside
>> >> >> >>> >> >> >>>>>>> the
>> >> >> >>> >> >> >>>>>>> configuration
>> >> >> >>> >> >> >>>>>>> tags
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
>> >> >> >>> >> >> >>>>>>> wrote:
>> >> >> >>> >> >> >>>>>>>>
>> >> >> >>> >> >> >>>>>>>> Thank you so much my friend, your idea works
>> >> >> >>> >> >> >>>>>>>> fine (no error), you are the best :)
>> >> >> >>> >> >> >>>>>>>>
>> >> >> >>> >> >> >>>>>>>>
>> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
>> >> >> >>> >> >> >>>>>>>> wrote:
>> >> >> >>> >> >> >>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>> It must be inside the
>> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
>> >> >> >>> >> >> >>>>>>>>> or
>> >> >> >>> >> >> >>>>>>>>> outside
>> >> >> >>> >> >> >>>>>>>>> this?
>> >> >> >>> >> >> >>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat
>> >> >> >>> >> >> >>>>>>>>> shriparv
>> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
>> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
>> >> >> >>> >> >> >>>>>>>>>> wrote:
>> >> >> >>> >> >> >>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>> Thanks Shashwat, and where is this
>> >> >> >>> >> >> >>>>>>>>>>> hive-site.xml?
>> >> >> >>> >> >> >>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat
>> >> >> >>> >> >> >>>>>>>>>>> shriparv
>> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> set
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> <property>
>> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
>> >> >> >>> >> >> >>>>>>>>>>>> </property>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
>> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>> >> >> >>> >> >> >>>>>>>>>>>>                <description>location of
>> >> >> >>> >> >> >>>>>>>>>>>> default
>> >> >> >>> >> >> >>>>>>>>>>>> database
>> >> >> >>> >> >> >>>>>>>>>>>> for
>> >> >> >>> >> >> >>>>>>>>>>>> the
>> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
>> >> >> >>> >> >> >>>>>>>>>>>>        </property>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
>> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>> >> >> >>> >> >> >>>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
>> >> >> >>> >> >> >>>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a test
>> >> >> >>> >> >> >>>>>>>>>>>>> Table
>> >> >> >>> >> >> >>>>>>>>>>>>> in
>> >> >> >>> >> >> >>>>>>>>>>>>> Hive
>> >> >> >>> >> >> >>>>>>>>>>>>> I
>> >> >> >>> >> >> >>>>>>>>>>>>> get
>> >> >> >>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
>> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING,
>> >> >> >>> >> >> >>>>>>>>>>>>> Content
>> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
>> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
>> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
>> >> >> >>> >> >> >>>>>>>>>>>>> exception:
>> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
>> >> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> --
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> ∞
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>> --
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>> ∞
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>> --
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>> ∞
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> --
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> ∞
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> Shashwat Shriparv
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>
>> >> >> >>> >> >
>> >> >> >>> >> >
>> >> >> >>> >
>> >> >> >>> >
>> >> >> >>
>> >> >> >>
>> >> >
>> >> >
>> >
>> >
>
>

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
no need to worry.. I am also a student..just keep calm..start fresh
and follow these steps -

1 - download hadoop from apache using this link -
http://apache.techartifact.com/mirror/hadoop/common/hadoop-0.20.205.0/hadoop-0.20.205.0.tar.gz

2 - untar it - right click+extract here

3 - set JAVA_HOME in your hadoop-env.sh file and save it

4 - add the properties specified in previous replies in your
core-site.xml, hdfs-site.xml and mapred-site.xml (if it still doesn't
work, I'll send you the configured hadoop site.xml files)

5 - format HDFS

6 - start the hadoop processes
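The steps above assume the name/data/temp directories referenced by the dfs.name.dir, dfs.data.dir and hadoop.tmp.dir values quoted earlier in this thread already exist. A sketch of preparing them (the base path is an arbitrary example; only the full paths must match the config values):

```shell
# Sketch: create the name/data/temp directories referenced by the
# hdfs-site.xml and core-site.xml snippets in this thread. BASE is an
# arbitrary example path; use your own and keep the config in sync.
BASE=/tmp/hdfs-demo
mkdir -p "$BASE"/name "$BASE"/data "$BASE"/temp

# The daemons must be able to write here; 755 is enough when they run
# as the same user that owns the directories (777 only as a last resort).
chmod 755 "$BASE" "$BASE"/name "$BASE"/data "$BASE"/temp

ls "$BASE"
```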

also your hosts file should look like this -

127.0.0.1	localhost
127.0.0.1	ubuntu.ubuntu-domain	ubuntu

# The following lines are desirable for IPv6 capable hosts
::1     ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
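A quick way to check that a hosts file like the one above maps both localhost and the machine's fully-qualified name (the failing `hostname --fqdn` earlier in this thread) is to grep it. This sketch checks a copy written to /tmp rather than the live /etc/hosts; the 'ubuntu' names are the example hostnames from this thread.

```shell
# Sketch: verify that a hosts file maps both localhost and the
# machine's own name (the 'ubuntu' example used in this thread).
cat > /tmp/hosts-example <<'EOF'
127.0.0.1 localhost
127.0.0.1 ubuntu.ubuntu-domain ubuntu
EOF

# Both greps must succeed; otherwise hostname --fqdn and the Hadoop
# daemons will fail to resolve the host.
grep -q 'localhost' /tmp/hosts-example && echo "localhost ok"
grep -q 'ubuntu.ubuntu-domain' /tmp/hosts-example && echo "fqdn ok"
```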

it'll work..if you face any further problems, I'll send you a
configured copy of hadoop.

Regards,
    Mohammad Tariq


On Thu, Jun 7, 2012 at 1:45 AM, Babak Bastan <ba...@gmail.com> wrote:
> I checked it but there is no hadoop folder :(
> yes you are right. I'm a student and I want to make a very very simple
> program in Hive, but until now hmmmmmmmmm
>
>
> On Wed, Jun 6, 2012 at 10:12 PM, Babak Bastan <ba...@gmail.com> wrote:
>>
>> not just one error:
>> i.e. if I run this one
>>
>> hostname --fqdn
>>
>> with the configuration that I sent to you:
>>
>> 127.0.0.1       localhost
>> #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>> # The following lines are desirable for IPv6 capable hosts
>> #::1     ip6-localhost ip6-loopback
>> #fe00::0 ip6-localnet
>> #ff00::0 ip6-mcastprefix
>> #ff02::1 ip6-allnodes
>> #ff02::2 ip6-allrouters
>>
>> I get this error:
>>
>> hostname: Name or service not known
>>
>> Or in the second step by this command:
>>
>> babak@ubuntu:~/Downloads/hadoop/bin$ start-hdfs.sh
>>
>> these lines of error:
>>
>>
>> mkdir: cannot create directory '/home/babak/Downloads/hadoop/bin/../logs':
>> Permission denied
>> starting namenode, logging to
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out
>> /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out:
>> No such file or directory
>> head: cannot open
>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out'
>> for reading: No such file or directory
>> localhost: mkdir: cannot create directory
>> '/home/babak/Downloads/hadoop/bin/../logs': Permission denied
>> localhost: starting datanode, logging to
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out
>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out:
>> No such file or directory
>> localhost: head: cannot open
>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out'
>> for reading: No such file or directory
>> localhost: mkdir: cannot create directory
>> '/home/babak/Downloads/hadoop/bin/../logs': Permission denied
>> localhost: starting secondarynamenode, logging to
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out
>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out:
>> No such file or directory
>> localhost: head: cannot open
>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out'
>> for reading: No such file or directory
>>
>> they said there is no permission to make logs in this
>> path: /home/babak/Downloads/hadoop/bin/../logs
>>
>> and generally I can't create a table in Hive and get this one:
>>
>> FAILED: Error in metadata: MetaException(message:Got exception:
>> java.io.FileNotFoundException File file:/user/hive/warehouse/test does not
>> exist.)
>> FAILED: Execution Error, return code 1 from
>> org.apache.hadoop.hive.ql.exec.DDLTask
>>
>> On Wed, Jun 6, 2012 at 10:02 PM, shashwat shriparv
>> <dw...@gmail.com> wrote:
>>>
>>> what's the error, Babak???
>>>
>>>
>>> On Thu, Jun 7, 2012 at 1:25 AM, Babak Bastan <ba...@gmail.com> wrote:
>>>>
>>>> What the hell is that? I see no log folder there
>>>>
>>>>
>>>> On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>>>
>>>>> go to your HADOOP_HOME i.e your hadoop directory(that includes bin,
>>>>> conf etc)..you can find logs directory there..
>>>>>
>>>>> Regards,
>>>>>     Mohammad Tariq
>>>>>
>>>>>
>>>>> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <ba...@gmail.com>
>>>>> wrote:
>>>>> > how can I get my logs, Mohammad?
>>>>> >
>>>>> >
>>>>> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <do...@gmail.com>
>>>>> > wrote:
>>>>> >>
>>>>> >> could you post your logs???that would help me in understanding the
>>>>> >> problem properly.
>>>>> >>
>>>>> >> Regards,
>>>>> >>     Mohammad Tariq
>>>>> >>
>>>>> >>
>>>>> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <ba...@gmail.com>
>>>>> >> wrote:
>>>>> >> > Thank you very much Mohammad for your attention. I followed the
>>>>> >> > steps but the error is the same as the last time.
>>>>> >> > and there is my hosts file:
>>>>> >> >
>>>>> >> > 127.0.0.1       localhost
>>>>> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>>>>> >> >
>>>>> >> >
>>>>> >> > # The following lines are desirable for IPv6 capable hosts
>>>>> >> >
>>>>> >> > #::1     ip6-localhost ip6-loopback
>>>>> >> > #fe00::0 ip6-localnet
>>>>> >> > #ff00::0 ip6-mcastprefix
>>>>> >> > #ff02::1 ip6-allnodes
>>>>> >> > #ff02::2 ip6-allrouters
>>>>> >> >
>>>>> >> > but no effect :(
>>>>> >> >
>>>>> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq
>>>>> >> > <do...@gmail.com>
>>>>> >> > wrote:
>>>>> >> >>
>>>>> >> >> also change the permissions of these directories to 777.
>>>>> >> >>
>>>>> >> >> Regards,
>>>>> >> >>     Mohammad Tariq
>>>>> >> >>
>>>>> >> >>
>>>>> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq
>>>>> >> >> <do...@gmail.com>
>>>>> >> >> wrote:
>>>>> >> >> > create a directory "/home/username/hdfs" (or at some place of
>>>>> >> >> > your
>>>>> >> >> > choice)..inside this hdfs directory create three sub
>>>>> >> >> > directories -
>>>>> >> >> > name, data, and temp, then follow these steps :
>>>>> >> >> >
>>>>> >> >> > add following properties in your core-site.xml -
>>>>> >> >> >
>>>>> >> >> > <property>
>>>>> >> >> >          <name>fs.default.name</name>
>>>>> >> >> >          <value>hdfs://localhost:9000/</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> >        <property>
>>>>> >> >> >          <name>hadoop.tmp.dir</name>
>>>>> >> >> >          <value>/home/mohammad/hdfs/temp</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> > then add following two properties in your hdfs-site.xml -
>>>>> >> >> >
>>>>> >> >> > <property>
>>>>> >> >> >                <name>dfs.replication</name>
>>>>> >> >> >                <value>1</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> >        <property>
>>>>> >> >> >                <name>dfs.name.dir</name>
>>>>> >> >> >                <value>/home/mohammad/hdfs/name</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> >        <property>
>>>>> >> >> >                <name>dfs.data.dir</name>
>>>>> >> >> >                <value>/home/mohammad/hdfs/data</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> > finally add this property in your mapred-site.xml -
>>>>> >> >> >
>>>>> >> >> >       <property>
>>>>> >> >> >          <name>mapred.job.tracker</name>
>>>>> >> >> >          <value>hdfs://localhost:9001</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> > NOTE: you can give any name to these directories of your
>>>>> >> >> > choice, just
>>>>> >> >> > keep in mind you have to give same names as values of
>>>>> >> >> >           above specified properties in your configuration
>>>>> >> >> > files.
>>>>> >> >> > (give full path of these directories, not just the name of the
>>>>> >> >> > directory)
>>>>> >> >> >
>>>>> >> >> > After this  follow the steps provided in the previous reply.
>>>>> >> >> >
>>>>> >> >> > Regards,
>>>>> >> >> >     Mohammad Tariq
>>>>> >> >> >
>>>>> >> >> >
>>>>> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan
>>>>> >> >> > <ba...@gmail.com>
>>>>> >> >> > wrote:
>>>>> >> >> >> thanks, Mohammad
>>>>> >> >> >>
>>>>> >> >> >> with this command:
>>>>> >> >> >>
>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>>>>> >> >> >>
>>>>> >> >> >> this is my output:
>>>>> >> >> >>
>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>>>>> >> >> >> /************************************************************
>>>>> >> >> >> STARTUP_MSG: Starting NameNode
>>>>> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>>>>> >> >> >> STARTUP_MSG:   args = [-format]
>>>>> >> >> >> STARTUP_MSG:   version = 0.20.2
>>>>> >> >> >> STARTUP_MSG:   build =
>>>>> >> >> >>
>>>>> >> >> >> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>>>>> >> >> >> -r
>>>>> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>>>>> >> >> >> ************************************************************/
>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>>> >> >> >>
>>>>> >> >> >>
>>>>> >> >> >> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>>> >> >> >> supergroup=supergroup
>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>>> >> >> >> isPermissionEnabled=true
>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95
>>>>> >> >> >> saved
>>>>> >> >> >> in 0
>>>>> >> >> >> seconds.
>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
>>>>> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>>>>> >> >> >> /************************************************************
>>>>> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>>>>> >> >> >> ************************************************************/
>>>>> >> >> >>
>>>>> >> >> >> by this command:
>>>>> >> >> >>
>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>>>>> >> >> >>
>>>>> >> >> >> this is the out put
>>>>> >> >> >>
>>>>> >> >> >> mkdir: kann Verzeichnis
>>>>> >> >> >> „/home/babak/Downloads/hadoop/bin/../logs“
>>>>> >> >> >> nicht
>>>>> >> >> >> anlegen: Keine Berechtigung
>>>>> >> >> >>
>>>>> >> >> >> this out put(it's in german and it means no right to make this
>>>>> >> >> >> folder)
>>>>> >> >> >>
>>>>> >> >> >>
>>>>> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq
>>>>> >> >> >> <do...@gmail.com>
>>>>> >> >> >> wrote:
>>>>> >> >> >>>
>>>>> >> >> >>> once we are done with the configuration, we need to format
>>>>> >> >> >>> the file
>>>>> >> >> >>> system..use this command to do that-
>>>>> >> >> >>> bin/hadoop namenode -format
>>>>> >> >> >>>
>>>>> >> >> >>> after this, hadoop daemon processes should be started using
>>>>> >> >> >>> following
>>>>> >> >> >>> commands -
>>>>> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
>>>>> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
>>>>> >> >> >>>
>>>>> >> >> >>> after this use jps to check if everything is alright or point
>>>>> >> >> >>> your
>>>>> >> >> >>> browser to localhost:50070..if you further find any problem
>>>>> >> >> >>> provide
>>>>> >> >> >>> us
>>>>> >> >> >>> with the error logs..:)
>>>>> >> >> >>>
>>>>> >> >> >>> Regards,
>>>>> >> >> >>>     Mohammad Tariq
>>>>> >> >> >>>
>>>>> >> >> >>>
>>>>> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan
>>>>> >> >> >>> <ba...@gmail.com>
>>>>> >> >> >>> wrote:
>>>>> >> >> >>> > were you able to format hdfs properly???
>>>>> >> >> >>> > I did'nt get your question,Do you mean HADOOP_HOME? or
>>>>> >> >> >>> > where did
>>>>> >> >> >>> > I
>>>>> >> >> >>> > install
>>>>> >> >> >>> > Hadoop?
>>>>> >> >> >>> >
>>>>> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
>>>>> >> >> >>> > <do...@gmail.com>
>>>>> >> >> >>> > wrote:
>>>>> >> >> >>> >>
>>>>> >> >> >>> >> if you are getting only this, it means your hadoop is not
>>>>> >> >> >>> >> running..were you able to format hdfs properly???
>>>>> >> >> >>> >>
>>>>> >> >> >>> >> Regards,
>>>>> >> >> >>> >>     Mohammad Tariq
>>>>> >> >> >>> >>
>>>>> >> >> >>> >>
>>>>> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
>>>>> >> >> >>> >> <ba...@gmail.com>
>>>>> >> >> >>> >> wrote:
>>>>> >> >> >>> >> > Hi Mohammad, if I run jps in my shell I can see this result:
>>>>> >> >> >>> >> > 2213 Jps
>>>>> >> >> >>> >> >
>>>>> >> >> >>> >> >
>>>>> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
>>>>> >> >> >>> >> > <do...@gmail.com>
>>>>> >> >> >>> >> > wrote:
>>>>> >> >> >>> >> >>
>>>>> >> >> >>> >> >> you can also use "jps" command at your shell to see
>>>>> >> >> >>> >> >> whether
>>>>> >> >> >>> >> >> Hadoop
>>>>> >> >> >>> >> >> processes are running or not.
>>>>> >> >> >>> >> >>
>>>>> >> >> >>> >> >> Regards,
>>>>> >> >> >>> >> >>     Mohammad Tariq
>>>>> >> >> >>> >> >>
>>>>> >> >> >>> >> >>
>>>>> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
>>>>> >> >> >>> >> >> <do...@gmail.com>
>>>>> >> >> >>> >> >> wrote:
>>>>> >> >> >>> >> >> > Hi Babak,
>>>>> >> >> >>> >> >> >
>>>>> >> >> >>> >> >> >  You have to type it in you web browser..Hadoop
>>>>> >> >> >>> >> >> > provides us
>>>>> >> >> >>> >> >> > a
>>>>> >> >> >>> >> >> > web
>>>>> >> >> >>> >> >> > GUI
>>>>> >> >> >>> >> >> > that not only allows us to browse through the file
>>>>> >> >> >>> >> >> > system,
>>>>> >> >> >>> >> >> > but
>>>>> >> >> >>> >> >> > to
>>>>> >> >> >>> >> >> > download the files as well..Apart from that it also
>>>>> >> >> >>> >> >> > provides a
>>>>> >> >> >>> >> >> > web
>>>>> >> >> >>> >> >> > GUI
>>>>> >> >> >>> >> >> > that can be used to see the status of Jobtracker and
>>>>> >> >> >>> >> >> > Tasktracker..When
>>>>> >> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can
>>>>> >> >> >>> >> >> > point
>>>>> >> >> >>> >> >> > your
>>>>> >> >> >>> >> >> > browser to http://localhost:50030 to see the status
>>>>> >> >> >>> >> >> > and
>>>>> >> >> >>> >> >> > logs
>>>>> >> >> >>> >> >> > of
>>>>> >> >> >>> >> >> > your
>>>>> >> >> >>> >> >> > job.
>>>>> >> >> >>> >> >> >
>>>>> >> >> >>> >> >> > Regards,
>>>>> >> >> >>> >> >> >     Mohammad Tariq
>>>>> >> >> >>> >> >> >
>>>>> >> >> >>> >> >> >
>>>>> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
>>>>> >> >> >>> >> >> > <ba...@gmail.com>
>>>>> >> >> >>> >> >> > wrote:
>>>>> >> >> >>> >> >> >> Thank you shashwat for the answer,
>>>>> >> >> >>> >> >> >> where should I type http://localhost:50070?
>>>>> >> >> >>> >> >> >> I typed here: hive>http://localhost:50070 but
>>>>> >> >> >>> >> >> >> nothing as
>>>>> >> >> >>> >> >> >> result
>>>>> >> >> >>> >> >> >>
>>>>> >> >> >>> >> >> >>
>>>>> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>>>>> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> first type http://localhost:50070 whether this is
>>>>> >> >> >>> >> >> >>> opening
>>>>> >> >> >>> >> >> >>> or
>>>>> >> >> >>> >> >> >>> not
>>>>> >> >> >>> >> >> >>> and
>>>>> >> >> >>> >> >> >>> check
>>>>> >> >> >>> >> >> >>> how many nodes are available, check some of the
>>>>> >> >> >>> >> >> >>> hadoop
>>>>> >> >> >>> >> >> >>> shell
>>>>> >> >> >>> >> >> >>> commands
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>>>>> >> >> >>> >> >> >>> run
>>>>> >> >> >>> >> >> >>> example mapreduce task on hadoop take example from
>>>>> >> >> >>> >> >> >>> here
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> : http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> if all the above you can do sucessfully means
>>>>> >> >> >>> >> >> >>> hadoop is
>>>>> >> >> >>> >> >> >>> configured
>>>>> >> >> >>> >> >> >>> correctly
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> Regards
>>>>> >> >> >>> >> >> >>> Shashwat
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>>>>> >> >> >>> >> >> >>> <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>> wrote:
>>>>> >> >> >>> >> >> >>>>
>>>>> >> >> >>> >> >> >>>> no I'm not working on CDH.Is there a way to test
>>>>> >> >> >>> >> >> >>>> if my
>>>>> >> >> >>> >> >> >>>> Hadoop
>>>>> >> >> >>> >> >> >>>> works
>>>>> >> >> >>> >> >> >>>> fine
>>>>> >> >> >>> >> >> >>>> or not?
>>>>> >> >> >>> >> >> >>>>
>>>>> >> >> >>> >> >> >>>>
>>>>> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
>>>>> >> >> >>> >> >> >>>> <be...@yahoo.com>
>>>>> >> >> >>> >> >> >>>> wrote:
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> Hi Babak
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> You gotta follow those instructions in the apace
>>>>> >> >> >>> >> >> >>>>> site
>>>>> >> >> >>> >> >> >>>>> to
>>>>> >> >> >>> >> >> >>>>> set
>>>>> >> >> >>> >> >> >>>>> up
>>>>> >> >> >>> >> >> >>>>> hadoop
>>>>> >> >> >>> >> >> >>>>> from scratch and ensure that hdfs is working
>>>>> >> >> >>> >> >> >>>>> first. You
>>>>> >> >> >>> >> >> >>>>> should
>>>>> >> >> >>> >> >> >>>>> be
>>>>> >> >> >>> >> >> >>>>> able to
>>>>> >> >> >>> >> >> >>>>> read and write files to hdfs before you do your
>>>>> >> >> >>> >> >> >>>>> next
>>>>> >> >> >>> >> >> >>>>> steps.
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> Are you on CDH or apache distribution of hadoop?
>>>>> >> >> >>> >> >> >>>>> If it
>>>>> >> >> >>> >> >> >>>>> is
>>>>> >> >> >>> >> >> >>>>> CDH
>>>>> >> >> >>> >> >> >>>>> there
>>>>> >> >> >>> >> >> >>>>> are
>>>>> >> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> Regards
>>>>> >> >> >>> >> >> >>>>> Bejoy KS
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
>>>>> >> >> >>> >> >> >>>>> ________________________________
>>>>> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>>>>> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
>>>>> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
>>>>> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
>>>>> >> >> >>> >> >> >>>>> core-site.xml
>>>>> >> >> >>> >> >> >>>>> and
>>>>> >> >> >>> >> >> >>>>> I
>>>>> >> >> >>> >> >> >>>>> did
>>>>> >> >> >>> >> >> >>>>> all
>>>>> >> >> >>> >> >> >>>>> of
>>>>> >> >> >>> >> >> >>>>> thing that was mentioned in the reference but no
>>>>> >> >> >>> >> >> >>>>> effect
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>>>>> >> >> >>> >> >> >>>>> <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>>>> wrote:
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>> Ok sorry but that was my Mistake .I thought it
>>>>> >> >> >>> >> >> >>>>>> works
>>>>> >> >> >>> >> >> >>>>>> but
>>>>> >> >> >>> >> >> >>>>>> no.
>>>>> >> >> >>> >> >> >>>>>> I wrote the command without ; and then I think
>>>>> >> >> >>> >> >> >>>>>> It
>>>>> >> >> >>> >> >> >>>>>> works
>>>>> >> >> >>> >> >> >>>>>> but
>>>>> >> >> >>> >> >> >>>>>> with
>>>>> >> >> >>> >> >> >>>>>> ;
>>>>> >> >> >>> >> >> >>>>>> at
>>>>> >> >> >>> >> >> >>>>>> the end of command
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>> does'nt work
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat
>>>>> >> >> >>> >> >> >>>>>> shriparv
>>>>> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>> inside configuration. all properties will be
>>>>> >> >> >>> >> >> >>>>>>> inside
>>>>> >> >> >>> >> >> >>>>>>> the
>>>>> >> >> >>> >> >> >>>>>>> configuration
>>>>> >> >> >>> >> >> >>>>>>> tags
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>>>>> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>>>>>> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>> Thank you so much my friend your idee works
>>>>> >> >> >>> >> >> >>>>>>>> fine(no
>>>>> >> >> >>> >> >> >>>>>>>> error)
>>>>> >> >> >>> >> >> >>>>>>>> you
>>>>> >> >> >>> >> >> >>>>>>>> are
>>>>> >> >> >>> >> >> >>>>>>>> the best :)
>>>>> >> >> >>> >> >> >>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>>>>> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>>>>>>> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>> It must be inside the
>>>>> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
>>>>> >> >> >>> >> >> >>>>>>>>> or
>>>>> >> >> >>> >> >> >>>>>>>>> outside
>>>>> >> >> >>> >> >> >>>>>>>>> this?
>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat
>>>>> >> >> >>> >> >> >>>>>>>>> shriparv
>>>>> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak
>>>>> >> >> >>> >> >> >>>>>>>>>> Bastan
>>>>> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>>>>>>>>> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>> Thanks Shashwat, and where is this
>>>>> >> >> >>> >> >> >>>>>>>>>>> hive-site.xml
>>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat
>>>>> >> >> >>> >> >> >>>>>>>>>>> shriparv
>>>>> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> set
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in
>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive-site.xml
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> <property>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> </property>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
>>>>> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <description>location of
>>>>> >> >> >>> >> >> >>>>>>>>>>>> default
>>>>> >> >> >>> >> >> >>>>>>>>>>>> database
>>>>> >> >> >>> >> >> >>>>>>>>>>>> for
>>>>> >> >> >>> >> >> >>>>>>>>>>>> the
>>>>> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>        </property>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak
>>>>> >> >> >>> >> >> >>>>>>>>>>>> Bastan
>>>>> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> test
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Table
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> in
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hive
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> get
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING,
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Content
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exception:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> from
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> --
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> ∞
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>> --
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>> ∞
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>> --
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>> ∞
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> --
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> ∞
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> Shashwat Shriparv
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>
>>>>> >> >> >>> >> >
>>>>> >> >> >>> >> >
>>>>> >> >> >>> >
>>>>> >> >> >>> >
>>>>> >> >> >>
>>>>> >> >> >>
>>>>> >> >
>>>>> >> >
>>>>> >
>>>>> >
>>>>
>>>>
>>>
>>>
>>>
>>> --
>>>
>>>
>>> ∞
>>>
>>> Shashwat Shriparv
>>>
>>>
>>
>

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
follow this link -
http://hadoop.apache.org/common/docs/r0.20.203.0/single_node_setup.html
- it worked for most of us without any problem.

do all the things required to configure hadoop on linux in
pseudo-distributed mode as given in that link. start with a simple setup
as shown there; then we'll add more properties.
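once the config files from that guide are in place, the whole startup boils down to a sequence like this (a sketch only; it assumes hadoop's bin/ directory is on your PATH and is guarded so it does nothing on a machine where hadoop is not installed):

```shell
#!/bin/sh
# Sketch of the pseudo-distributed startup sequence from the linked guide.
# Assumes hadoop's bin/ directory is on PATH; guarded so the script is a
# harmless no-op on machines where hadoop is not installed.
if command -v hadoop >/dev/null 2>&1; then
    hadoop namenode -format   # one-time: format the HDFS namespace
    start-dfs.sh              # starts NameNode and DataNode
    start-mapred.sh           # starts JobTracker and TaskTracker
    jps                       # should now list NameNode, DataNode, etc.
else
    echo "hadoop not found on PATH - finish the installation first"
fi
```

if jps lists only itself, the daemons died on startup, and the logs directory under your hadoop directory is the first place to look.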

if you need detailed help you can also visit -
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
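the "Keine Berechtigung" (permission denied) errors quoted below all come from start-dfs.sh trying to create a logs directory that the current user cannot write to. a minimal sketch of the fix follows; the HADOOP_DIR path here is a stand-in for this sketch, not your real install path:

```shell
#!/bin/sh
# Stand-in path for this sketch; on the real install it would be the
# hadoop directory itself (e.g. /home/babak/Downloads/hadoop).
HADOOP_DIR="$HOME/hadoop-perm-demo"

mkdir -p "$HADOOP_DIR"
chmod -R u+rwX "$HADOOP_DIR"   # owner gets read/write, execute on dirs

# start-dfs.sh effectively does this, and fails without write permission:
mkdir -p "$HADOOP_DIR/logs"
ls -ld "$HADOOP_DIR/logs"
```

on a real install, giving your user ownership of the hadoop directory (sudo chown -R youruser on it, as the blog's instructions do) achieves the same thing.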

Regards,
    Mohammad Tariq


On Thu, Jun 7, 2012 at 3:04 AM, Babak Bastan <ba...@gmail.com> wrote:
> I tried to install another one from a blog; most of the steps were
> performed without problem, but at this step
>
> sudo mkdir /hadoop && sudo chown hdfs:hdfs /hadoop && sudo chmod 777 /hadoop
>
> I get this error:
>
> Error:  0: couldn't open source file </hadoop.ui>
>
> and in this step:
> mkdir /usr/lib/hadoop-0.20/.ssh
> this error:
> mkdir: kann Verzeichnis „/usr/lib/hadoop-0.20/.ssh“ nicht anlegen: Keine
> Berechtigung
> ---> no permission to make a directory
>
> On Wed, Jun 6, 2012 at 11:21 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>
>> ok..we'll give it a final shot..then i'll email configured hadoop to
>> your email address..delete the hdfs directory which contains tmp, data
>> and name..recreate it..format hdfs again and then start the processes.
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Thu, Jun 7, 2012 at 2:22 AM, Babak Bastan <ba...@gmail.com> wrote:
>> > I 've performed the steps but the same error in this step as before:
>> > bin/start-dfs.sh
>> > It is about my permission to make directory
>> >
>> > On Wed, Jun 6, 2012 at 10:33 PM, Mohammad Tariq <do...@gmail.com>
>> > wrote:
>> >>
>> >> actually this blog post explains how to install cloudera's hadoop
>> >> distribution...if you have followed this post and installed cloudera's
>> >> distribution then your logs should ideally be inside
>> >> /usr/lib/hadoop/logs (if everything was fine)..anyway try the steps I
>> >> have given and let me know.
>> >>
>> >> Regards,
>> >>     Mohammad Tariq
>> >>
>> >>
>> >> On Thu, Jun 7, 2012 at 1:52 AM, Babak Bastan <ba...@gmail.com>
>> >> wrote:
>> >> > by the way,you are a very nice man my friend:Thank you so much :)
>> >> >
>> >> > what do you mean about this post on stackoverflow?
>> >> >
>> >> > I am assuming that is your first installation of hadoop.
>> >> >
>> >> > At the beginning please check if your daemons are working. To do that
>> >> > use
>> >> > (in terminal):
>> >> >
>> >> > jps
>> >> >
>> >> > If only jps appears that means all daemons are down. Please check the
>> >> > log
>> >> > files. Especially the namenode. Log folder is probably somewhere
>> >> > there
>> >> > /usr/lib/hadoop/logs
>> >> >
>> >> > If you have some permission problems. Use this guide during the
>> >> > installation.
>> >> >
>> >> > Good installation guide
>> >> >
>> >> > I am shooting with this explanations but these are most common
>> >> > problems.
>> >> >
>> >> >
>> >> > On Wed, Jun 6, 2012 at 10:15 PM, Babak Bastan <ba...@gmail.com>
>> >> > wrote:
>> >> >>
>> >> >> I checked it but no hadoop folder :(
>> >> >> yes you are right. I'm a student and I want to make a very very
>> >> >> simple
>> >> >> program in hive but until now hmmmmmmmmm
>> >> >>
>> >> >>
>> >> >> On Wed, Jun 6, 2012 at 10:12 PM, Babak Bastan <ba...@gmail.com>
>> >> >> wrote:
>> >> >>>
>> >> >>> no one error:
>> >> >>> i.e if I run this one
>> >> >>>
>> >> >>> hostname --fqdn
>> >> >>>
>> >> >>>  with the condition that I send to you :
>> >> >>>
>> >> >>> 127.0.0.1       localhost
>> >> >>> #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>> >> >>> # The following lines are desirable for IPv6 capable hosts
>> >> >>> #::1     ip6-localhost ip6-loopback
>> >> >>> #fe00::0 ip6-localnet
>> >> >>> #ff00::0 ip6-mcastprefix
>> >> >>> #ff02::1 ip6-allnodes
>> >> >>> #ff02::2 ip6-allrouters
>> >> >>>
>> >> >>> I get this error:
>> >> >>>
>> >> >>> hostname: Name or service not known
>> >> >>>
>> >> >>> Or in the second step by this command:
>> >> >>>
>> >> >>> babak@ubuntu:~/Downloads/hadoop/bin$ start-hdfs.sh
>> >> >>>
>> >> >>> these lines of error:
>> >> >>>
>> >> >>>
>> >> >>> mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“
>> >> >>> nicht
>> >> >>> anlegen: Keine Berechtigung
>> >> >>> starting namenode, logging to
>> >> >>>
>> >> >>>
>> >> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out
>> >> >>> /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile 117:
>> >> >>>
>> >> >>>
>> >> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out:
>> >> >>> Datei oder Verzeichnis nicht gefunden
>> >> >>> head:
>> >> >>>
>> >> >>>
>> >> >>> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out“
>> >> >>> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht
>> >> >>> gefunden
>> >> >>> localhost: mkdir: kann Verzeichnis
>> >> >>> „/home/babak/Downloads/hadoop/bin/../logs“ nicht anlegen: Keine
>> >> >>> Berechtigung
>> >> >>> localhost: starting datanode, logging to
>> >> >>>
>> >> >>>
>> >> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out
>> >> >>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile
>> >> >>> 117:
>> >> >>>
>> >> >>>
>> >> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out:
>> >> >>> Datei oder Verzeichnis nicht gefunden
>> >> >>> localhost: head:
>> >> >>>
>> >> >>>
>> >> >>> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out“
>> >> >>> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht
>> >> >>> gefunden
>> >> >>> localhost: mkdir: kann Verzeichnis
>> >> >>> „/home/babak/Downloads/hadoop/bin/../logs“ nicht anlegen: Keine
>> >> >>> Berechtigung
>> >> >>> localhost: starting secondarynamenode, logging to
>> >> >>>
>> >> >>>
>> >> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out
>> >> >>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile
>> >> >>> 117:
>> >> >>>
>> >> >>>
>> >> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out:
>> >> >>> Datei oder Verzeichnis nicht gefunden
>> >> >>> localhost: head:
>> >> >>>
>> >> >>>
>> >> >>> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out“
>> >> >>> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht
>> >> >>> gefunden
>> >> >>>
>> >> >>> they said no permission to make logs in this
>> >> >>> path:/home/babak/Downloads/hadoop/bin/../logs
>> >> >>>
>> >> >>>  and generally I cant create a table in hive and get this one:
>> >> >>>
>> >> >>> FAILED: Error in metadata: MetaException(message:Got exception:
>> >> >>> java.io.FileNotFoundException File file:/user/hive/warehouse/test
>> >> >>> does
>> >> >>> not
>> >> >>> exist.)
>> >> >>> FAILED: Execution Error, return code 1 from
>> >> >>> org.apache.hadoop.hive.ql.exec.DDLTask
>> >> >>>
>> >> >>> On Wed, Jun 6, 2012 at 10:02 PM, shashwat shriparv
>> >> >>> <dw...@gmail.com> wrote:
>> >> >>>>
>> >> >>>> whats the error babak ???
>> >> >>>>
>> >> >>>>
>> >> >>>> On Thu, Jun 7, 2012 at 1:25 AM, Babak Bastan <ba...@gmail.com>
>> >> >>>> wrote:
>> >> >>>>>
>> >> >>>>> What the hell is that?I see no log folder there
>> >> >>>>>
>> >> >>>>>
>> >> >>>>> On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq
>> >> >>>>> <do...@gmail.com>
>> >> >>>>> wrote:
>> >> >>>>>>
>> >> >>>>>> go to your HADOOP_HOME i.e your hadoop directory(that includes
>> >> >>>>>> bin,
>> >> >>>>>> conf etc)..you can find logs directory there..
>> >> >>>>>>
>> >> >>>>>> Regards,
>> >> >>>>>>     Mohammad Tariq
>> >> >>>>>>
>> >> >>>>>>
>> >> >>>>>> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan
>> >> >>>>>> <ba...@gmail.com>
>> >> >>>>>> wrote:
>> >> >>>>>> > how can I get my log mohammad?
>> >> >>>>>> >
>> >> >>>>>> >
>> >> >>>>>> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq
>> >> >>>>>> > <do...@gmail.com>
>> >> >>>>>> > wrote:
>> >> >>>>>> >>
>> >> >>>>>> >> could you post your logs???that would help me in
>> >> >>>>>> >> understanding
>> >> >>>>>> >> the
>> >> >>>>>> >> problem properly.
>> >> >>>>>> >>
>> >> >>>>>> >> Regards,
>> >> >>>>>> >>     Mohammad Tariq
>> >> >>>>>> >>
>> >> >>>>>> >>
>> >> >>>>>> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan
>> >> >>>>>> >> <ba...@gmail.com>
>> >> >>>>>> >> wrote:
>> >> >>>>>> >> > Thank you very much mohamad for your attention.I followed
>> >> >>>>>> >> > the
>> >> >>>>>> >> > steps but
>> >> >>>>>> >> > the
>> >> >>>>>> >> > error is the same as the last time.
>> >> >>>>>> >> > and there is my hosts file:
>> >> >>>>>> >> >
>> >> >>>>>> >> > 127.0.0.1       localhost
>> >> >>>>>> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>> >> >>>>>> >> >
>> >> >>>>>> >> >
>> >> >>>>>> >> > # The following lines are desirable for IPv6 capable hosts
>> >> >>>>>> >> >
>> >> >>>>>> >> > #::1     ip6-localhost ip6-loopback
>> >> >>>>>> >> > #fe00::0 ip6-localnet
>> >> >>>>>> >> > #ff00::0 ip6-mcastprefix
>> >> >>>>>> >> > #ff02::1 ip6-allnodes
>> >> >>>>>> >> > #ff02::2 ip6-allrouters
>> >> >>>>>> >> >
>> >> >>>>>> >> > but no effect :(
>> >> >>>>>> >> >
>> >> >>>>>> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq
>> >> >>>>>> >> > <do...@gmail.com>
>> >> >>>>>> >> > wrote:
>> >> >>>>>> >> >>
>> >> >>>>>> >> >> also change the permissions of these directories to 777.
>> >> >>>>>> >> >>
>> >> >>>>>> >> >> Regards,
>> >> >>>>>> >> >>     Mohammad Tariq
>> >> >>>>>> >> >>
>> >> >>>>>> >> >>
>> >> >>>>>> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq
>> >> >>>>>> >> >> <do...@gmail.com>
>> >> >>>>>> >> >> wrote:
>> >> >>>>>> >> >> > create a directory "/home/username/hdfs" (or at some
>> >> >>>>>> >> >> > place
>> >> >>>>>> >> >> > of
>> >> >>>>>> >> >> > your
>> >> >>>>>> >> >> > choice)..inside this hdfs directory create three sub
>> >> >>>>>> >> >> > directories -
>> >> >>>>>> >> >> > name, data, and temp, then follow these steps :
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> > add following properties in your core-site.xml -
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> > <property>
>> >> >>>>>> >> >> >          <name>fs.default.name</name>
>> >> >>>>>> >> >> >          <value>hdfs://localhost:9000/</value>
>> >> >>>>>> >> >> >        </property>
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> >        <property>
>> >> >>>>>> >> >> >          <name>hadoop.tmp.dir</name>
>> >> >>>>>> >> >> >          <value>/home/mohammad/hdfs/temp</value>
>> >> >>>>>> >> >> >        </property>
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> > then add following two properties in your hdfs-site.xml
>> >> >>>>>> >> >> > -
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> > <property>
>> >> >>>>>> >> >> >                <name>dfs.replication</name>
>> >> >>>>>> >> >> >                <value>1</value>
>> >> >>>>>> >> >> >        </property>
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> >        <property>
>> >> >>>>>> >> >> >                <name>dfs.name.dir</name>
>> >> >>>>>> >> >> >                <value>/home/mohammad/hdfs/name</value>
>> >> >>>>>> >> >> >        </property>
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> >        <property>
>> >> >>>>>> >> >> >                <name>dfs.data.dir</name>
>> >> >>>>>> >> >> >                <value>/home/mohammad/hdfs/data</value>
>> >> >>>>>> >> >> >        </property>
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> > finally add this property in your mapred-site.xml -
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> >       <property>
>> >> >>>>>> >> >> >          <name>mapred.job.tracker</name>
>> >> >>>>>> >> >> >          <value>hdfs://localhost:9001</value>
>> >> >>>>>> >> >> >        </property>
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> > NOTE: you can give any name to these directories of your
>> >> >>>>>> >> >> > choice, just
>> >> >>>>>> >> >> > keep in mind you have to give same names as values of
>> >> >>>>>> >> >> >           above specified properties in your
>> >> >>>>>> >> >> > configuration
>> >> >>>>>> >> >> > files.
>> >> >>>>>> >> >> > (give full path of these directories, not just the name
>> >> >>>>>> >> >> > of
>> >> >>>>>> >> >> > the
>> >> >>>>>> >> >> > directory)
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> > After this  follow the steps provided in the previous
>> >> >>>>>> >> >> > reply.
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> > Regards,
>> >> >>>>>> >> >> >     Mohammad Tariq
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> >
>> >> >>>>>> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan
>> >> >>>>>> >> >> > <ba...@gmail.com>
>> >> >>>>>> >> >> > wrote:
>> >> >>>>>> >> >> >> thank's Mohammad
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> with this command:
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode
>> >> >>>>>> >> >> >> -format
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> this is my output:
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> /************************************************************
>> >> >>>>>> >> >> >> STARTUP_MSG: Starting NameNode
>> >> >>>>>> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>> >> >>>>>> >> >> >> STARTUP_MSG:   args = [-format]
>> >> >>>>>> >> >> >> STARTUP_MSG:   version = 0.20.2
>> >> >>>>>> >> >> >> STARTUP_MSG:   build =
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>> >> >>>>>> >> >> >> -r
>> >> >>>>>> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34
>> >> >>>>>> >> >> >> UTC
>> >> >>>>>> >> >> >> 2010
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> ************************************************************/
>> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >> >>>>>> >> >> >> supergroup=supergroup
>> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >> >>>>>> >> >> >> isPermissionEnabled=true
>> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of
>> >> >>>>>> >> >> >> size
>> >> >>>>>> >> >> >> 95
>> >> >>>>>> >> >> >> saved
>> >> >>>>>> >> >> >> in 0
>> >> >>>>>> >> >> >> seconds.
>> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage
>> >> >>>>>> >> >> >> directory
>> >> >>>>>> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully
>> >> >>>>>> >> >> >> formatted.
>> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> /************************************************************
>> >> >>>>>> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at
>> >> >>>>>> >> >> >> ubuntu/127.0.1.1
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> ************************************************************/
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> by this command:
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> this is the output
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> mkdir: kann Verzeichnis
>> >> >>>>>> >> >> >> „/home/babak/Downloads/hadoop/bin/../logs“
>> >> >>>>>> >> >> >> nicht
>> >> >>>>>> >> >> >> anlegen: Keine Berechtigung
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> this output (it's in German and it means no right to
>> >> >>>>>> >> >> >> make
>> >> >>>>>> >> >> >> this
>> >> >>>>>> >> >> >> folder)
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq
>> >> >>>>>> >> >> >> <do...@gmail.com>
>> >> >>>>>> >> >> >> wrote:
>> >> >>>>>> >> >> >>>
>> >> >>>>>> >> >> >>> once we are done with the configuration, we need to
>> >> >>>>>> >> >> >>> format
>> >> >>>>>> >> >> >>> the file
>> >> >>>>>> >> >> >>> system..use this command to do that-
>> >> >>>>>> >> >> >>> bin/hadoop namenode -format
>> >> >>>>>> >> >> >>>
>> >> >>>>>> >> >> >>> after this, hadoop daemon processes should be started
>> >> >>>>>> >> >> >>> using
>> >> >>>>>> >> >> >>> following
>> >> >>>>>> >> >> >>> commands -
>> >> >>>>>> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
>> >> >>>>>> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
>> >> >>>>>> >> >> >>>
>> >> >>>>>> >> >> >>> after this use jps to check if everything is alright
>> >> >>>>>> >> >> >>> or
>> >> >>>>>> >> >> >>> point your
>> >> >>>>>> >> >> >>> browser to localhost:50070..if you further find any
>> >> >>>>>> >> >> >>> problem
>> >> >>>>>> >> >> >>> provide
>> >> >>>>>> >> >> >>> us
>> >> >>>>>> >> >> >>> with the error logs..:)
>> >> >>>>>> >> >> >>>
>> >> >>>>>> >> >> >>> Regards,
>> >> >>>>>> >> >> >>>     Mohammad Tariq
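[Editor's sketch] Once the daemons are up, jps should list five Hadoop processes besides Jps itself. A small check along those lines, run here against a captured sample rather than a live cluster (swap the here-string for real `jps` output on an actual box; the PIDs are made up):

```shell
# Hypothetical sample of healthy `jps` output; on Babak's machine only
# "2213 Jps" appeared, i.e. every daemon below would report MISSING.
sample_jps='2213 Jps
2001 NameNode
2105 DataNode
2210 SecondaryNameNode
2301 JobTracker
2399 TaskTracker'
for d in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
  if echo "$sample_jps" | grep -q "$d"; then
    echo "$d running"
  else
    echo "$d MISSING"
  fi
done | tee /tmp/jps-check-demo.txt
```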
>> >> >>>>>> >> >> >>>
>> >> >>>>>> >> >> >>>
>> >> >>>>>> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan
>> >> >>>>>> >> >> >>> <ba...@gmail.com>
>> >> >>>>>> >> >> >>> wrote:
>> >> >>>>>> >> >> >>> > were you able to format hdfs properly???
>> >> >>>>>> >> >> >>> > I didn't get your question. Do you mean HADOOP_HOME?
>> >> >>>>>> >> >> >>> > or
>> >> >>>>>> >> >> >>> > where did
>> >> >>>>>> >> >> >>> > I
>> >> >>>>>> >> >> >>> > install
>> >> >>>>>> >> >> >>> > Hadoop?
>> >> >>>>>> >> >> >>> >
>> >> >>>>>> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
>> >> >>>>>> >> >> >>> > <do...@gmail.com>
>> >> >>>>>> >> >> >>> > wrote:
>> >> >>>>>> >> >> >>> >>
>> >> >>>>>> >> >> >>> >> if you are getting only this, it means your hadoop
>> >> >>>>>> >> >> >>> >> is
>> >> >>>>>> >> >> >>> >> not
>> >> >>>>>> >> >> >>> >> running..were you able to format hdfs properly???
>> >> >>>>>> >> >> >>> >>
>> >> >>>>>> >> >> >>> >> Regards,
>> >> >>>>>> >> >> >>> >>     Mohammad Tariq
>> >> >>>>>> >> >> >>> >>
>> >> >>>>>> >> >> >>> >>
>> >> >>>>>> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
>> >> >>>>>> >> >> >>> >> <ba...@gmail.com>
>> >> >>>>>> >> >> >>> >> wrote:
>> >> >>>>>> >> >> >>> >> > Hi Mohammad, if I run jps in my shell I can see this
>> >> >>>>>> >> >> >>> >> > result:
>> >> >>>>>> >> >> >>> >> > 2213 Jps
>> >> >>>>>> >> >> >>> >> >
>> >> >>>>>> >> >> >>> >> >
>> >> >>>>>> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
>> >> >>>>>> >> >> >>> >> > <do...@gmail.com>
>> >> >>>>>> >> >> >>> >> > wrote:
>> >> >>>>>> >> >> >>> >> >>
>> >> >>>>>> >> >> >>> >> >> you can also use "jps" command at your shell to
>> >> >>>>>> >> >> >>> >> >> see
>> >> >>>>>> >> >> >>> >> >> whether
>> >> >>>>>> >> >> >>> >> >> Hadoop
>> >> >>>>>> >> >> >>> >> >> processes are running or not.
>> >> >>>>>> >> >> >>> >> >>
>> >> >>>>>> >> >> >>> >> >> Regards,
>> >> >>>>>> >> >> >>> >> >>     Mohammad Tariq
>> >> >>>>>> >> >> >>> >> >>
>> >> >>>>>> >> >> >>> >> >>
>> >> >>>>>> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
>> >> >>>>>> >> >> >>> >> >> <do...@gmail.com>
>> >> >>>>>> >> >> >>> >> >> wrote:
>> >> >>>>>> >> >> >>> >> >> > Hi Babak,
>> >> >>>>>> >> >> >>> >> >> >
>> >> >>>>>> >> >> >>> >> >> >  You have to type it in you web
>> >> >>>>>> >> >> >>> >> >> > browser..Hadoop
>> >> >>>>>> >> >> >>> >> >> > provides us
>> >> >>>>>> >> >> >>> >> >> > a
>> >> >>>>>> >> >> >>> >> >> > web
>> >> >>>>>> >> >> >>> >> >> > GUI
>> >> >>>>>> >> >> >>> >> >> > that not only allows us to browse through the
>> >> >>>>>> >> >> >>> >> >> > file
>> >> >>>>>> >> >> >>> >> >> > system,
>> >> >>>>>> >> >> >>> >> >> > but
>> >> >>>>>> >> >> >>> >> >> > to
>> >> >>>>>> >> >> >>> >> >> > download the files as well..Apart from that it
>> >> >>>>>> >> >> >>> >> >> > also
>> >> >>>>>> >> >> >>> >> >> > provides a
>> >> >>>>>> >> >> >>> >> >> > web
>> >> >>>>>> >> >> >>> >> >> > GUI
>> >> >>>>>> >> >> >>> >> >> > that can be used to see the status of
>> >> >>>>>> >> >> >>> >> >> > Jobtracker
>> >> >>>>>> >> >> >>> >> >> > and
>> >> >>>>>> >> >> >>> >> >> > Tasktracker..When
>> >> >>>>>> >> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job,
>> >> >>>>>> >> >> >>> >> >> > you
>> >> >>>>>> >> >> >>> >> >> > can point
>> >> >>>>>> >> >> >>> >> >> > your
>> >> >>>>>> >> >> >>> >> >> > browser to http://localhost:50030 to see the
>> >> >>>>>> >> >> >>> >> >> > status
>> >> >>>>>> >> >> >>> >> >> > and
>> >> >>>>>> >> >> >>> >> >> > logs
>> >> >>>>>> >> >> >>> >> >> > of
>> >> >>>>>> >> >> >>> >> >> > your
>> >> >>>>>> >> >> >>> >> >> > job.
>> >> >>>>>> >> >> >>> >> >> >
>> >> >>>>>> >> >> >>> >> >> > Regards,
>> >> >>>>>> >> >> >>> >> >> >     Mohammad Tariq
>> >> >>>>>> >> >> >>> >> >> >
>> >> >>>>>> >> >> >>> >> >> >
>> >> >>>>>> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
>> >> >>>>>> >> >> >>> >> >> > <ba...@gmail.com>
>> >> >>>>>> >> >> >>> >> >> > wrote:
>> >> >>>>>> >> >> >>> >> >> >> Thank you shashwat for the answer,
>> >> >>>>>> >> >> >>> >> >> >> where should I type http://localhost:50070?
>> >> >>>>>> >> >> >>> >> >> >> I typed here: hive>http://localhost:50070 but
>> >> >>>>>> >> >> >>> >> >> >> got nothing as a
>> >> >>>>>> >> >> >>> >> >> >> result
>> >> >>>>>> >> >> >>> >> >> >>
>> >> >>>>>> >> >> >>> >> >> >>
>> >> >>>>>> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat
>> >> >>>>>> >> >> >>> >> >> >> shriparv
>> >> >>>>>> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>> first type http://localhost:50070 whether
>> >> >>>>>> >> >> >>> >> >> >>> this
>> >> >>>>>> >> >> >>> >> >> >>> is
>> >> >>>>>> >> >> >>> >> >> >>> opening
>> >> >>>>>> >> >> >>> >> >> >>> or
>> >> >>>>>> >> >> >>> >> >> >>> not
>> >> >>>>>> >> >> >>> >> >> >>> and
>> >> >>>>>> >> >> >>> >> >> >>> check
>> >> >>>>>> >> >> >>> >> >> >>> how many nodes are available, check some of
>> >> >>>>>> >> >> >>> >> >> >>> the
>> >> >>>>>> >> >> >>> >> >> >>> hadoop
>> >> >>>>>> >> >> >>> >> >> >>> shell
>> >> >>>>>> >> >> >>> >> >> >>> commands
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>> >> >>>>>> >> >> >>> >> >> >>> run
>> >> >>>>>> >> >> >>> >> >> >>> example mapreduce task on hadoop take
>> >> >>>>>> >> >> >>> >> >> >>> example
>> >> >>>>>> >> >> >>> >> >> >>> from
>> >> >>>>>> >> >> >>> >> >> >>> here
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>> : http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>> if all the above you can do sucessfully
>> >> >>>>>> >> >> >>> >> >> >>> means
>> >> >>>>>> >> >> >>> >> >> >>> hadoop is
>> >> >>>>>> >> >> >>> >> >> >>> configured
>> >> >>>>>> >> >> >>> >> >> >>> correctly
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>> Regards
>> >> >>>>>> >> >> >>> >> >> >>> Shashwat
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>> >> >>>>>> >> >> >>> >> >> >>> <ba...@gmail.com>
>> >> >>>>>> >> >> >>> >> >> >>> wrote:
>> >> >>>>>> >> >> >>> >> >> >>>>
>> >> >>>>>> >> >> >>> >> >> >>>> no, I'm not working on CDH. Is there a way to
>> >> >>>>>> >> >> >>> >> >> >>>> test
>> >> >>>>>> >> >> >>> >> >> >>>> if my
>> >> >>>>>> >> >> >>> >> >> >>>> Hadoop
>> >> >>>>>> >> >> >>> >> >> >>>> works
>> >> >>>>>> >> >> >>> >> >> >>>> fine
>> >> >>>>>> >> >> >>> >> >> >>>> or not?
>> >> >>>>>> >> >> >>> >> >> >>>>
>> >> >>>>>> >> >> >>> >> >> >>>>
>> >> >>>>>> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
>> >> >>>>>> >> >> >>> >> >> >>>> <be...@yahoo.com>
>> >> >>>>>> >> >> >>> >> >> >>>> wrote:
>> >> >>>>>> >> >> >>> >> >> >>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>> Hi Babak
>> >> >>>>>> >> >> >>> >> >> >>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>> You gotta follow those instructions in the
>> >> >>>>>> >> >> >>> >> >> >>>>> apace
>> >> >>>>>> >> >> >>> >> >> >>>>> site
>> >> >>>>>> >> >> >>> >> >> >>>>> to
>> >> >>>>>> >> >> >>> >> >> >>>>> set
>> >> >>>>>> >> >> >>> >> >> >>>>> up
>> >> >>>>>> >> >> >>> >> >> >>>>> hadoop
>> >> >>>>>> >> >> >>> >> >> >>>>> from scratch and ensure that hdfs is
>> >> >>>>>> >> >> >>> >> >> >>>>> working
>> >> >>>>>> >> >> >>> >> >> >>>>> first. You
>> >> >>>>>> >> >> >>> >> >> >>>>> should
>> >> >>>>>> >> >> >>> >> >> >>>>> be
>> >> >>>>>> >> >> >>> >> >> >>>>> able to
>> >> >>>>>> >> >> >>> >> >> >>>>> read and write files to hdfs before you do
>> >> >>>>>> >> >> >>> >> >> >>>>> your
>> >> >>>>>> >> >> >>> >> >> >>>>> next
>> >> >>>>>> >> >> >>> >> >> >>>>> steps.
>> >> >>>>>> >> >> >>> >> >> >>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>> Are you on CDH or apache distribution of
>> >> >>>>>> >> >> >>> >> >> >>>>> hadoop?
>> >> >>>>>> >> >> >>> >> >> >>>>> If it
>> >> >>>>>> >> >> >>> >> >> >>>>> is
>> >> >>>>>> >> >> >>> >> >> >>>>> CDH
>> >> >>>>>> >> >> >>> >> >> >>>>> there
>> >> >>>>>> >> >> >>> >> >> >>>>> are
>> >> >>>>>> >> >> >>> >> >> >>>>> detailed instructions on Cloudera web
>> >> >>>>>> >> >> >>> >> >> >>>>> site.
>> >> >>>>>> >> >> >>> >> >> >>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>> Regards
>> >> >>>>>> >> >> >>> >> >> >>>>> Bejoy KS
>> >> >>>>>> >> >> >>> >> >> >>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
>> >> >>>>>> >> >> >>> >> >> >>>>> ________________________________
>> >> >>>>>> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>> >> >>>>>> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>> >> >>>>>> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
>> >> >>>>>> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
>> >> >>>>>> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in
>> >> >>>>>> >> >> >>> >> >> >>>>> Hive
>> >> >>>>>> >> >> >>> >> >> >>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
>> >> >>>>>> >> >> >>> >> >> >>>>> core-site.xml
>> >> >>>>>> >> >> >>> >> >> >>>>> and
>> >> >>>>>> >> >> >>> >> >> >>>>> I
>> >> >>>>>> >> >> >>> >> >> >>>>> did
>> >> >>>>>> >> >> >>> >> >> >>>>> all
>> >> >>>>>> >> >> >>> >> >> >>>>> of
>> >> >>>>>> >> >> >>> >> >> >>>>> thing that was mentioned in the reference
>> >> >>>>>> >> >> >>> >> >> >>>>> but
>> >> >>>>>> >> >> >>> >> >> >>>>> no
>> >> >>>>>> >> >> >>> >> >> >>>>> effect
>> >> >>>>>> >> >> >>> >> >> >>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak
>> >> >>>>>> >> >> >>> >> >> >>>>> Bastan
>> >> >>>>>> >> >> >>> >> >> >>>>> <ba...@gmail.com>
>> >> >>>>>> >> >> >>> >> >> >>>>> wrote:
>> >> >>>>>> >> >> >>> >> >> >>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>> OK, sorry, but that was my mistake. I
>> >> >>>>>> >> >> >>> >> >> >>>>>> thought
>> >> >>>>>> >> >> >>> >> >> >>>>>> it
>> >> >>>>>> >> >> >>> >> >> >>>>>> works
>> >> >>>>>> >> >> >>> >> >> >>>>>> but
>> >> >>>>>> >> >> >>> >> >> >>>>>> no.
>> >> >>>>>> >> >> >>> >> >> >>>>>> I wrote the command without ; and then I
>> >> >>>>>> >> >> >>> >> >> >>>>>> think
>> >> >>>>>> >> >> >>> >> >> >>>>>> It
>> >> >>>>>> >> >> >>> >> >> >>>>>> works
>> >> >>>>>> >> >> >>> >> >> >>>>>> but
>> >> >>>>>> >> >> >>> >> >> >>>>>> with
>> >> >>>>>> >> >> >>> >> >> >>>>>> ;
>> >> >>>>>> >> >> >>> >> >> >>>>>> at
>> >> >>>>>> >> >> >>> >> >> >>>>>> the end of command
>> >> >>>>>> >> >> >>> >> >> >>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>> >> >>>>>> >> >> >>> >> >> >>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>> doesn't work
>> >> >>>>>> >> >> >>> >> >> >>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat
>> >> >>>>>> >> >> >>> >> >> >>>>>> shriparv
>> >> >>>>>> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
>> >> >>>>>> >> >> >>> >> >> >>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>> inside configuration. all properties
>> >> >>>>>> >> >> >>> >> >> >>>>>>> will
>> >> >>>>>> >> >> >>> >> >> >>>>>>> be
>> >> >>>>>> >> >> >>> >> >> >>>>>>> inside
>> >> >>>>>> >> >> >>> >> >> >>>>>>> the
>> >> >>>>>> >> >> >>> >> >> >>>>>>> configuration
>> >> >>>>>> >> >> >>> >> >> >>>>>>> tags
>> >> >>>>>> >> >> >>> >> >> >>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak
>> >> >>>>>> >> >> >>> >> >> >>>>>>> Bastan
>> >> >>>>>> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
>> >> >>>>>> >> >> >>> >> >> >>>>>>> wrote:
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>> Thank you so much my friend, your idea
>> >> >>>>>> >> >> >>> >> >> >>>>>>>> works
>> >> >>>>>> >> >> >>> >> >> >>>>>>>> fine (no
>> >> >>>>>> >> >> >>> >> >> >>>>>>>> error)
>> >> >>>>>> >> >> >>> >> >> >>>>>>>> you
>> >> >>>>>> >> >> >>> >> >> >>>>>>>> are
>> >> >>>>>> >> >> >>> >> >> >>>>>>>> the best :)
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak
>> >> >>>>>> >> >> >>> >> >> >>>>>>>> Bastan
>> >> >>>>>> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>> wrote:
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>> It must be inside the
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>> or
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>> outside
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>> this?
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM,
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>> shashwat
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>> shriparv
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM,
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> Babak
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> Bastan
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> wrote:
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> Thanks Shashwat, and where is this
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> hive-site.xml
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM,
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> shashwat
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> shriparv
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> set
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive-site.xml
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <property>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> </property>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>  <description>location
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> of
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> default
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> database
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> for
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> the
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>        </property>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM,
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> Babak
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> Bastan
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I'm new to Hive. When I try to
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> create a
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> test
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Table
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> in
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hive
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> get
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> an error.I want to run this
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> command:
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING,
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Url
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING,
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Content
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exception:
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> does
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> not
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> code
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> 1
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> from
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> --
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> ∞
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> --
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> ∞
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>> --
>> >> >>>>>> >> >> >>> >> >> >>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>> ∞
>> >> >>>>>> >> >> >>> >> >> >>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
>> >> >>>>>> >> >> >>> >> >> >>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>>
>> >> >>>>>> >> >> >>> >> >> >>>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>> --
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>> ∞
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>> Shashwat Shriparv
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>>
>> >> >>>>>> >> >> >>> >> >> >>
>> >> >>>>>> >> >> >>> >> >
>> >> >>>>>> >> >> >>> >> >
>> >> >>>>>> >> >> >>> >
>> >> >>>>>> >> >> >>> >
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >> >>
>> >> >>>>>> >> >
>> >> >>>>>> >> >
>> >> >>>>>> >
>> >> >>>>>> >
>> >> >>>>>
>> >> >>>>>
>> >> >>>>
>> >> >>>>
>> >> >>>>
>> >> >>>> --
>> >> >>>>
>> >> >>>>
>> >> >>>> ∞
>> >> >>>>
>> >> >>>> Shashwat Shriparv
>> >> >>>>
>> >> >>>>
>> >> >>>
>> >> >>
>> >> >
>> >
>> >
>
>

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
I tried to install another one from the blog; most of the steps were performed without
problems, but at this step

 udo mkdir /hadoop && sudo chown hdfs:hdfs /hadoop && sudo chmod 777 /hadoop

I get this error:

Error:  0: couldn't open source file </hadoop.ui>

and in this step:
mkdir /usr/lib/hadoop-0.20/.ssh
this error:
mkdir: kann Verzeichnis „/usr/lib/hadoop-0.20/.ssh“ nicht anlegen: Keine
Berechtigung
---> no permission to make a directory

On Wed, Jun 6, 2012 at 11:21 PM, Mohammad Tariq <do...@gmail.com> wrote:

> ok..we'll give it a final shot..then i'll email configured hadoop to
> your email address..delete the hdfs directory which contains tmp, data
> and name..recreate it..format hdfs again and then start the processes.
>
> Regards,
>     Mohammad Tariq
>
>
> On Thu, Jun 7, 2012 at 2:22 AM, Babak Bastan <ba...@gmail.com> wrote:
> > I've performed the steps, but I get the same error at this step as before:
> > bin/start-dfs.sh
> > It is about my permission to make a directory
> >
> > On Wed, Jun 6, 2012 at 10:33 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
> >>
> >> actually this blog post explains how to install cloudera's hadoop
> >> distribution...if you have followed this post and installed cloudera's
> >> distribution then your logs should ideally be inside
> >> /usr/lib/hadoop/logs (if everything was fine)..anyway try the steps I
> >> have given and let me know.
> >>
> >> Regards,
> >>     Mohammad Tariq
> >>
> >>
> >> On Thu, Jun 7, 2012 at 1:52 AM, Babak Bastan <ba...@gmail.com>
> wrote:
> >> > by the way, you are a very nice man, my friend. Thank you so much :)
> >> >
> >> > what do you mean about this post on stackoverflow?
> >> >
> >> > I am assuming that is your first installation of hadoop.
> >> >
> >> > At the beginning please check if your daemons are working. To do that
> >> > use
> >> > (in terminal):
> >> >
> >> > jps
> >> >
> >> > If only jps appears that means all daemons are down. Please check the
> >> > log
> >> > files. Especially the namenode. Log folder is probably somewhere there
> >> > /usr/lib/hadoop/logs
> >> >
> >> > If you have some permission problems. Use this guide during the
> >> > installation.
> >> >
> >> > Good installation guide
> >> >
> >> > I am shooting in the dark with these explanations, but these are the most common
> problems.
> >> >
> >> >
> >> > On Wed, Jun 6, 2012 at 10:15 PM, Babak Bastan <ba...@gmail.com>
> >> > wrote:
> >> >>
> >> >> I checked it but no hadoop folder :(
> >> >> yes, you are right. I'm a student and I want to make a very simple
> >> >> program with Hive, but until now hmmmmmmmmm
> >> >>
> >> >>
> >> >> On Wed, Jun 6, 2012 at 10:12 PM, Babak Bastan <ba...@gmail.com>
> >> >> wrote:
> >> >>>
> >> >>> not just one error:
> >> >>> e.g. if I run this one
> >> >>>
> >> >>> hostname --fqdn
> >> >>>
> >> >>> with the configuration that I sent to you:
> >> >>>
> >> >>> 127.0.0.1       localhost
> >> >>> #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
> >> >>> # The following lines are desirable for IPv6 capable hosts
> >> >>> #::1     ip6-localhost ip6-loopback
> >> >>> #fe00::0 ip6-localnet
> >> >>> #ff00::0 ip6-mcastprefix
> >> >>> #ff02::1 ip6-allnodes
> >> >>> #ff02::2 ip6-allrouters
> >> >>>
> >> >>> I get this error:
> >> >>>
> >> >>> hostname: Name or service not known
> >> >>>
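> >> >>>
[Editor's sketch] The "Name or service not known" result is consistent with the hosts file above: the only line mapping the machine's hostname to an address is commented out, so `hostname --fqdn` has nothing to resolve. A sketch of a candidate fixed file, written to /tmp rather than /etc/hosts; the 127.0.1.1 address is an assumption taken from the "host = ubuntu/127.0.1.1" line in the namenode output earlier in the thread:

```shell
# Candidate /etc/hosts with the FQDN line uncommented; check that some entry
# carries a dotted (fully qualified) name before copying it into place.
cat > /tmp/hosts-demo <<'EOF'
127.0.0.1       localhost
127.0.1.1       ubuntu.ubuntu-domain    ubuntu
EOF
awk '$2 ~ /\./ { print "fqdn mapping found:", $2 }' /tmp/hosts-demo
# → fqdn mapping found: ubuntu.ubuntu-domain
```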
> >> >>> Or in the second step by this command:
> >> >>>
> >> >>> babak@ubuntu:~/Downloads/hadoop/bin$ start-hdfs.sh
> >> >>>
> >> >>> these lines of error:
> >> >>>
> >> >>>
> >> >>> mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“
> >> >>> nicht
> >> >>> anlegen: Keine Berechtigung
> >> >>> starting namenode, logging to
> >> >>>
> >> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out
> >> >>> /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile 117:
> >> >>>
> >> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out:
> >> >>> Datei oder Verzeichnis nicht gefunden
> >> >>> head:
> >> >>>
> >> >>>
> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out“
> >> >>> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht
> >> >>> gefunden
> >> >>> localhost: mkdir: kann Verzeichnis
> >> >>> „/home/babak/Downloads/hadoop/bin/../logs“ nicht anlegen: Keine
> >> >>> Berechtigung
> >> >>> localhost: starting datanode, logging to
> >> >>>
> >> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out
> >> >>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile
> >> >>> 117:
> >> >>>
> >> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out:
> >> >>> Datei oder Verzeichnis nicht gefunden
> >> >>> localhost: head:
> >> >>>
> >> >>>
> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out“
> >> >>> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht
> >> >>> gefunden
> >> >>> localhost: mkdir: kann Verzeichnis
> >> >>> „/home/babak/Downloads/hadoop/bin/../logs“ nicht anlegen: Keine
> >> >>> Berechtigung
> >> >>> localhost: starting secondarynamenode, logging to
> >> >>>
> >> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out
> >> >>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile
> >> >>> 117:
> >> >>>
> >> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out:
> >> >>> Datei oder Verzeichnis nicht gefunden
> >> >>> localhost: head:
> >> >>>
> >> >>>
> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out“
> >> >>> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht
> >> >>> gefunden
> >> >>>
> >> >>> they say no permission to make logs in this
> >> >>> path: /home/babak/Downloads/hadoop/bin/../logs
> >> >>>
> >> >>> and generally I can't create a table in Hive and get this one:
> >> >>>
> >> >>> FAILED: Error in metadata: MetaException(message:Got exception:
> >> >>> java.io.FileNotFoundException File file:/user/hive/warehouse/test
> does
> >> >>> not
> >> >>> exist.)
> >> >>> FAILED: Execution Error, return code 1 from
> >> >>> org.apache.hadoop.hive.ql.exec.DDLTask
> >> >>>
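> >> >>>
[Editor's sketch] Note the `file:` scheme in the exception: with fs.default.name still pointing at the local filesystem, Hive resolves its warehouse under file:/user/hive/warehouse on local disk. Fixing fs.default.name, as Bejoy suggested at the start of the thread, is the real cure, but until then pre-creating a writable warehouse directory unblocks the DDL. Sketched here under /tmp instead of the real /user/hive/warehouse:

```shell
# Stand-in for /user/hive/warehouse on the local filesystem; the sticky,
# world-writable mode mirrors the usual Hive warehouse permissions.
WAREHOUSE=/tmp/hive-demo/user/hive/warehouse
mkdir -p "$WAREHOUSE"
chmod 1777 "$WAREHOUSE"
ls -ld "$WAREHOUSE"
```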
> >> >>> On Wed, Jun 6, 2012 at 10:02 PM, shashwat shriparv
> >> >>> <dw...@gmail.com> wrote:
> >> >>>>
> >> >>>> whats the error babak ???
> >> >>>>
> >> >>>>
> >> >>>> On Thu, Jun 7, 2012 at 1:25 AM, Babak Bastan <ba...@gmail.com>
> >> >>>> wrote:
> >> >>>>>
> >> >>>>> What the hell is that? I see no log folder there
> >> >>>>>
> >> >>>>>
> >> >>>>> On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq <
> dontariq@gmail.com>
> >> >>>>> wrote:
> >> >>>>>>
> >> >>>>>> go to your HADOOP_HOME i.e your hadoop directory(that includes
> bin,
> >> >>>>>> conf etc)..you can find logs directory there..
> >> >>>>>>
> >> >>>>>> Regards,
> >> >>>>>>     Mohammad Tariq
> >> >>>>>>
> >> >>>>>>
> >> >>>>>> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <babakbsn@gmail.com
> >
> >> >>>>>> wrote:
> >> >>>>>> > how can I get my logs, Mohammad?
> >> >>>>>> >
> >> >>>>>> >
> >> >>>>>> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq
> >> >>>>>> > <do...@gmail.com>
> >> >>>>>> > wrote:
> >> >>>>>> >>
> >> >>>>>> >> could you post your logs???that would help me in understanding
> >> >>>>>> >> the
> >> >>>>>> >> problem properly.
> >> >>>>>> >>
> >> >>>>>> >> Regards,
> >> >>>>>> >>     Mohammad Tariq
> >> >>>>>> >>
> >> >>>>>> >>
> >> >>>>>> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan
> >> >>>>>> >> <ba...@gmail.com>
> >> >>>>>> >> wrote:
> >> >>>>>> >> > Thank you very much Mohammad for your attention. I followed
> the
> >> >>>>>> >> > steps but
> >> >>>>>> >> > the
> >> >>>>>> >> > error is the same as the last time.
> >> >>>>>> >> > and there is my hosts file:
> >> >>>>>> >> >
> >> >>>>>> >> > 127.0.0.1       localhost
> >> >>>>>> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
> >> >>>>>> >> >
> >> >>>>>> >> >
> >> >>>>>> >> > # The following lines are desirable for IPv6 capable hosts
> >> >>>>>> >> >
> >> >>>>>> >> > #::1     ip6-localhost ip6-loopback
> >> >>>>>> >> > #fe00::0 ip6-localnet
> >> >>>>>> >> > #ff00::0 ip6-mcastprefix
> >> >>>>>> >> > #ff02::1 ip6-allnodes
> >> >>>>>> >> > #ff02::2 ip6-allrouters
> >> >>>>>> >> >
> >> >>>>>> >> > but no effect :(
> >> >>>>>> >> >
> >> >>>>>> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq
> >> >>>>>> >> > <do...@gmail.com>
> >> >>>>>> >> > wrote:
> >> >>>>>> >> >>
> >> >>>>>> >> >> also change the permissions of these directories to 777.
> >> >>>>>> >> >>
> >> >>>>>> >> >> Regards,
> >> >>>>>> >> >>     Mohammad Tariq
> >> >>>>>> >> >>
> >> >>>>>> >> >>
> >> >>>>>> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq
> >> >>>>>> >> >> <do...@gmail.com>
> >> >>>>>> >> >> wrote:
> >> >>>>>> >> >> > create a directory "/home/username/hdfs" (or at some
> place
> >> >>>>>> >> >> > of
> >> >>>>>> >> >> > your
> >> >>>>>> >> >> > choice)..inside this hdfs directory create three sub
> >> >>>>>> >> >> > directories -
> >> >>>>>> >> >> > name, data, and temp, then follow these steps :
> >> >>>>>> >> >> >
> >> >>>>>> >> >> > add following properties in your core-site.xml -
> >> >>>>>> >> >> >
> >> >>>>>> >> >> > <property>
> >> >>>>>> >> >> >          <name>fs.default.name</name>
> >> >>>>>> >> >> >          <value>hdfs://localhost:9000/</value>
> >> >>>>>> >> >> >        </property>
> >> >>>>>> >> >> >
> >> >>>>>> >> >> >        <property>
> >> >>>>>> >> >> >          <name>hadoop.tmp.dir</name>
> >> >>>>>> >> >> >          <value>/home/mohammad/hdfs/temp</value>
> >> >>>>>> >> >> >        </property>
> >> >>>>>> >> >> >
> >> >>>>>> >> >> > then add following two properties in your hdfs-site.xml -
> >> >>>>>> >> >> >
> >> >>>>>> >> >> > <property>
> >> >>>>>> >> >> >                <name>dfs.replication</name>
> >> >>>>>> >> >> >                <value>1</value>
> >> >>>>>> >> >> >        </property>
> >> >>>>>> >> >> >
> >> >>>>>> >> >> >        <property>
> >> >>>>>> >> >> >                <name>dfs.name.dir</name>
> >> >>>>>> >> >> >                <value>/home/mohammad/hdfs/name</value>
> >> >>>>>> >> >> >        </property>
> >> >>>>>> >> >> >
> >> >>>>>> >> >> >        <property>
> >> >>>>>> >> >> >                <name>dfs.data.dir</name>
> >> >>>>>> >> >> >                <value>/home/mohammad/hdfs/data</value>
> >> >>>>>> >> >> >        </property>
> >> >>>>>> >> >> >
> >> >>>>>> >> >> > finally add this property in your mapred-site.xml -
> >> >>>>>> >> >> >
> >> >>>>>> >> >> >       <property>
> >> >>>>>> >> >> >          <name>mapred.job.tracker</name>
> >> >>>>>> >> >> >          <value>hdfs://localhost:9001</value>
> >> >>>>>> >> >> >        </property>
> >> >>>>>> >> >> >
> >> >>>>>> >> >> > NOTE: you can give any name to these directories of your
> >> >>>>>> >> >> > choice, just
> >> >>>>>> >> >> > keep in mind you have to give same names as values of
> >> >>>>>> >> >> >           above specified properties in your
> configuration
> >> >>>>>> >> >> > files.
> >> >>>>>> >> >> > (give full path of these directories, not just the name
> of
> >> >>>>>> >> >> > the
> >> >>>>>> >> >> > directory)
> >> >>>>>> >> >> >
> >> >>>>>> >> >> > After this  follow the steps provided in the previous
> >> >>>>>> >> >> > reply.
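
The three sub-directories those properties point at can be created in one go. A minimal sketch, assuming the /home/.../hdfs layout from the mail above (HDFS_ROOT is a stand-in for your own path — use the same path in the XML values; 777 matches the permissions suggested earlier in this thread):

```shell
# Create the local directories that back HDFS, matching the property
# values above. HDFS_ROOT here is an assumed example path.
HDFS_ROOT="${HDFS_ROOT:-$HOME/hdfs}"
mkdir -p "$HDFS_ROOT/name" "$HDFS_ROOT/data" "$HDFS_ROOT/temp"
chmod 777 "$HDFS_ROOT/name" "$HDFS_ROOT/data" "$HDFS_ROOT/temp"
ls "$HDFS_ROOT"
```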
> >> >>>>>> >> >> >
> >> >>>>>> >> >> > Regards,
> >> >>>>>> >> >> >     Mohammad Tariq
> >> >>>>>> >> >> >
> >> >>>>>> >> >> >
> >> >>>>>> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan
> >> >>>>>> >> >> > <ba...@gmail.com>
> >> >>>>>> >> >> > wrote:
> >> >>>>>> >> >> >> thank's Mohammad
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >> with this command:
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode
> >> >>>>>> >> >> >> -format
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >> this is my output:
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >>
> /************************************************************
> >> >>>>>> >> >> >> STARTUP_MSG: Starting NameNode
> >> >>>>>> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
> >> >>>>>> >> >> >> STARTUP_MSG:   args = [-format]
> >> >>>>>> >> >> >> STARTUP_MSG:   version = 0.20.2
> >> >>>>>> >> >> >> STARTUP_MSG:   build =
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >>
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
> >> >>>>>> >> >> >> -r
> >> >>>>>> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC
> >> >>>>>> >> >> >> 2010
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >>
> ************************************************************/
> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >>
> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> >> >>>>>> >> >> >> supergroup=supergroup
> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> >> >>>>>> >> >> >> isPermissionEnabled=true
> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of
> size
> >> >>>>>> >> >> >> 95
> >> >>>>>> >> >> >> saved
> >> >>>>>> >> >> >> in 0
> >> >>>>>> >> >> >> seconds.
> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
> >> >>>>>> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully
> >> >>>>>> >> >> >> formatted.
> >> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >>
> /************************************************************
> >> >>>>>> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/
> 127.0.1.1
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >>
> ************************************************************/
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >> by this command:
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >> this is the out put
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >> mkdir: kann Verzeichnis
> >> >>>>>> >> >> >> „/home/babak/Downloads/hadoop/bin/../logs“
> >> >>>>>> >> >> >> nicht
> >> >>>>>> >> >> >> anlegen: Keine Berechtigung
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >> this out put(it's in german and it means no right to
> make
> >> >>>>>> >> >> >> this
> >> >>>>>> >> >> >> folder)
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq
> >> >>>>>> >> >> >> <do...@gmail.com>
> >> >>>>>> >> >> >> wrote:
> >> >>>>>> >> >> >>>
> >> >>>>>> >> >> >>> once we are done with the configuration, we need to
> >> >>>>>> >> >> >>> format
> >> >>>>>> >> >> >>> the file
> >> >>>>>> >> >> >>> system..use this command to do that-
> >> >>>>>> >> >> >>> bin/hadoop namenode -format
> >> >>>>>> >> >> >>>
> >> >>>>>> >> >> >>> after this, hadoop daemon processes should be started
> >> >>>>>> >> >> >>> using
> >> >>>>>> >> >> >>> following
> >> >>>>>> >> >> >>> commands -
> >> >>>>>> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
> >> >>>>>> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
> >> >>>>>> >> >> >>>
> >> >>>>>> >> >> >>> after this use jps to check if everything is alright or
> >> >>>>>> >> >> >>> point your
> >> >>>>>> >> >> >>> browser to localhost:50070..if you further find any
> >> >>>>>> >> >> >>> problem
> >> >>>>>> >> >> >>> provide
> >> >>>>>> >> >> >>> us
> >> >>>>>> >> >> >>> with the error logs..:)
> >> >>>>>> >> >> >>>
> >> >>>>>> >> >> >>> Regards,
> >> >>>>>> >> >> >>>     Mohammad Tariq
> >> >>>>>> >> >> >>>
> >> >>>>>> >> >> >>>
> >> >>>>>> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan
> >> >>>>>> >> >> >>> <ba...@gmail.com>
> >> >>>>>> >> >> >>> wrote:
> >> >>>>>> >> >> >>> > were you able to format hdfs properly???
> >> >>>>>> >> >> >>> > I did'nt get your question,Do you mean HADOOP_HOME?
> or
> >> >>>>>> >> >> >>> > where did
> >> >>>>>> >> >> >>> > I
> >> >>>>>> >> >> >>> > install
> >> >>>>>> >> >> >>> > Hadoop?
> >> >>>>>> >> >> >>> >
> >> >>>>>> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
> >> >>>>>> >> >> >>> > <do...@gmail.com>
> >> >>>>>> >> >> >>> > wrote:
> >> >>>>>> >> >> >>> >>
> >> >>>>>> >> >> >>> >> if you are getting only this, it means your hadoop
> is
> >> >>>>>> >> >> >>> >> not
> >> >>>>>> >> >> >>> >> running..were you able to format hdfs properly???
> >> >>>>>> >> >> >>> >>
> >> >>>>>> >> >> >>> >> Regards,
> >> >>>>>> >> >> >>> >>     Mohammad Tariq
> >> >>>>>> >> >> >>> >>
> >> >>>>>> >> >> >>> >>
> >> >>>>>> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
> >> >>>>>> >> >> >>> >> <ba...@gmail.com>
> >> >>>>>> >> >> >>> >> wrote:
> >> >>>>>> >> >> >>> >> > Hi MohammadmI irun jps in my shel I can see this
> >> >>>>>> >> >> >>> >> > result:
> >> >>>>>> >> >> >>> >> > 2213 Jps
> >> >>>>>> >> >> >>> >> >
> >> >>>>>> >> >> >>> >> >
> >> >>>>>> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
> >> >>>>>> >> >> >>> >> > <do...@gmail.com>
> >> >>>>>> >> >> >>> >> > wrote:
> >> >>>>>> >> >> >>> >> >>
> >> >>>>>> >> >> >>> >> >> you can also use "jps" command at your shell to
> see
> >> >>>>>> >> >> >>> >> >> whether
> >> >>>>>> >> >> >>> >> >> Hadoop
> >> >>>>>> >> >> >>> >> >> processes are running or not.
> >> >>>>>> >> >> >>> >> >>
> >> >>>>>> >> >> >>> >> >> Regards,
> >> >>>>>> >> >> >>> >> >>     Mohammad Tariq
> >> >>>>>> >> >> >>> >> >>
> >> >>>>>> >> >> >>> >> >>
> >> >>>>>> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
> >> >>>>>> >> >> >>> >> >> <do...@gmail.com>
> >> >>>>>> >> >> >>> >> >> wrote:
> >> >>>>>> >> >> >>> >> >> > Hi Babak,
> >> >>>>>> >> >> >>> >> >> >
> >> >>>>>> >> >> >>> >> >> >  You have to type it in you web browser..Hadoop
> >> >>>>>> >> >> >>> >> >> > provides us
> >> >>>>>> >> >> >>> >> >> > a
> >> >>>>>> >> >> >>> >> >> > web
> >> >>>>>> >> >> >>> >> >> > GUI
> >> >>>>>> >> >> >>> >> >> > that not only allows us to browse through the
> >> >>>>>> >> >> >>> >> >> > file
> >> >>>>>> >> >> >>> >> >> > system,
> >> >>>>>> >> >> >>> >> >> > but
> >> >>>>>> >> >> >>> >> >> > to
> >> >>>>>> >> >> >>> >> >> > download the files as well..Apart from that it
> >> >>>>>> >> >> >>> >> >> > also
> >> >>>>>> >> >> >>> >> >> > provides a
> >> >>>>>> >> >> >>> >> >> > web
> >> >>>>>> >> >> >>> >> >> > GUI
> >> >>>>>> >> >> >>> >> >> > that can be used to see the status of
> Jobtracker
> >> >>>>>> >> >> >>> >> >> > and
> >> >>>>>> >> >> >>> >> >> > Tasktracker..When
> >> >>>>>> >> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job,
> you
> >> >>>>>> >> >> >>> >> >> > can point
> >> >>>>>> >> >> >>> >> >> > your
> >> >>>>>> >> >> >>> >> >> > browser to http://localhost:50030 to see the
> >> >>>>>> >> >> >>> >> >> > status
> >> >>>>>> >> >> >>> >> >> > and
> >> >>>>>> >> >> >>> >> >> > logs
> >> >>>>>> >> >> >>> >> >> > of
> >> >>>>>> >> >> >>> >> >> > your
> >> >>>>>> >> >> >>> >> >> > job.
> >> >>>>>> >> >> >>> >> >> >
> >> >>>>>> >> >> >>> >> >> > Regards,
> >> >>>>>> >> >> >>> >> >> >     Mohammad Tariq
> >> >>>>>> >> >> >>> >> >> >
> >> >>>>>> >> >> >>> >> >> >
> >> >>>>>> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
> >> >>>>>> >> >> >>> >> >> > <ba...@gmail.com>
> >> >>>>>> >> >> >>> >> >> > wrote:
> >> >>>>>> >> >> >>> >> >> >> Thank you shashwat for the answer,
> >> >>>>>> >> >> >>> >> >> >> where should I type http://localhost:50070?
> >> >>>>>> >> >> >>> >> >> >> I typed here: hive>http://localhost:50070 but
> >> >>>>>> >> >> >>> >> >> >> nothing as
> >> >>>>>> >> >> >>> >> >> >> result
> >> >>>>>> >> >> >>> >> >> >>
> >> >>>>>> >> >> >>> >> >> >>
> >> >>>>>> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat
> >> >>>>>> >> >> >>> >> >> >> shriparv
> >> >>>>>> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>> first type http://localhost:50070 whether
> this
> >> >>>>>> >> >> >>> >> >> >>> is
> >> >>>>>> >> >> >>> >> >> >>> opening
> >> >>>>>> >> >> >>> >> >> >>> or
> >> >>>>>> >> >> >>> >> >> >>> not
> >> >>>>>> >> >> >>> >> >> >>> and
> >> >>>>>> >> >> >>> >> >> >>> check
> >> >>>>>> >> >> >>> >> >> >>> how many nodes are available, check some of
> the
> >> >>>>>> >> >> >>> >> >> >>> hadoop
> >> >>>>>> >> >> >>> >> >> >>> shell
> >> >>>>>> >> >> >>> >> >> >>> commands
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>> from
> http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
> >> >>>>>> >> >> >>> >> >> >>> run
> >> >>>>>> >> >> >>> >> >> >>> example mapreduce task on hadoop take example
> >> >>>>>> >> >> >>> >> >> >>> from
> >> >>>>>> >> >> >>> >> >> >>> here
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>> :
> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>> if all the above you can do sucessfully means
> >> >>>>>> >> >> >>> >> >> >>> hadoop is
> >> >>>>>> >> >> >>> >> >> >>> configured
> >> >>>>>> >> >> >>> >> >> >>> correctly
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>> Regards
> >> >>>>>> >> >> >>> >> >> >>> Shashwat
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
> >> >>>>>> >> >> >>> >> >> >>> <ba...@gmail.com>
> >> >>>>>> >> >> >>> >> >> >>> wrote:
> >> >>>>>> >> >> >>> >> >> >>>>
> >> >>>>>> >> >> >>> >> >> >>>> no I'm not working on CDH.Is there a way to
> >> >>>>>> >> >> >>> >> >> >>>> test
> >> >>>>>> >> >> >>> >> >> >>>> if my
> >> >>>>>> >> >> >>> >> >> >>>> Hadoop
> >> >>>>>> >> >> >>> >> >> >>>> works
> >> >>>>>> >> >> >>> >> >> >>>> fine
> >> >>>>>> >> >> >>> >> >> >>>> or not?
> >> >>>>>> >> >> >>> >> >> >>>>
> >> >>>>>> >> >> >>> >> >> >>>>
> >> >>>>>> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
> >> >>>>>> >> >> >>> >> >> >>>> <be...@yahoo.com>
> >> >>>>>> >> >> >>> >> >> >>>> wrote:
> >> >>>>>> >> >> >>> >> >> >>>>>
> >> >>>>>> >> >> >>> >> >> >>>>> Hi Babak
> >> >>>>>> >> >> >>> >> >> >>>>>
> >> >>>>>> >> >> >>> >> >> >>>>> You gotta follow those instructions in the
> >> >>>>>> >> >> >>> >> >> >>>>> apace
> >> >>>>>> >> >> >>> >> >> >>>>> site
> >> >>>>>> >> >> >>> >> >> >>>>> to
> >> >>>>>> >> >> >>> >> >> >>>>> set
> >> >>>>>> >> >> >>> >> >> >>>>> up
> >> >>>>>> >> >> >>> >> >> >>>>> hadoop
> >> >>>>>> >> >> >>> >> >> >>>>> from scratch and ensure that hdfs is
> working
> >> >>>>>> >> >> >>> >> >> >>>>> first. You
> >> >>>>>> >> >> >>> >> >> >>>>> should
> >> >>>>>> >> >> >>> >> >> >>>>> be
> >> >>>>>> >> >> >>> >> >> >>>>> able to
> >> >>>>>> >> >> >>> >> >> >>>>> read and write files to hdfs before you do
> >> >>>>>> >> >> >>> >> >> >>>>> your
> >> >>>>>> >> >> >>> >> >> >>>>> next
> >> >>>>>> >> >> >>> >> >> >>>>> steps.
> >> >>>>>> >> >> >>> >> >> >>>>>
> >> >>>>>> >> >> >>> >> >> >>>>> Are you on CDH or apache distribution of
> >> >>>>>> >> >> >>> >> >> >>>>> hadoop?
> >> >>>>>> >> >> >>> >> >> >>>>> If it
> >> >>>>>> >> >> >>> >> >> >>>>> is
> >> >>>>>> >> >> >>> >> >> >>>>> CDH
> >> >>>>>> >> >> >>> >> >> >>>>> there
> >> >>>>>> >> >> >>> >> >> >>>>> are
> >> >>>>>> >> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
> >> >>>>>> >> >> >>> >> >> >>>>>
> >> >>>>>> >> >> >>> >> >> >>>>> Regards
> >> >>>>>> >> >> >>> >> >> >>>>> Bejoy KS
> >> >>>>>> >> >> >>> >> >> >>>>>
> >> >>>>>> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
> >> >>>>>> >> >> >>> >> >> >>>>> ________________________________
> >> >>>>>> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
> >> >>>>>> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
> >> >>>>>> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
> >> >>>>>> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
> >> >>>>>> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in
> >> >>>>>> >> >> >>> >> >> >>>>> Hive
> >> >>>>>> >> >> >>> >> >> >>>>>
> >> >>>>>> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
> >> >>>>>> >> >> >>> >> >> >>>>> core-site.xml
> >> >>>>>> >> >> >>> >> >> >>>>> and
> >> >>>>>> >> >> >>> >> >> >>>>> I
> >> >>>>>> >> >> >>> >> >> >>>>> did
> >> >>>>>> >> >> >>> >> >> >>>>> all
> >> >>>>>> >> >> >>> >> >> >>>>> of
> >> >>>>>> >> >> >>> >> >> >>>>> thing that was mentioned in the reference
> but
> >> >>>>>> >> >> >>> >> >> >>>>> no
> >> >>>>>> >> >> >>> >> >> >>>>> effect
> >> >>>>>> >> >> >>> >> >> >>>>>
> >> >>>>>> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak
> Bastan
> >> >>>>>> >> >> >>> >> >> >>>>> <ba...@gmail.com>
> >> >>>>>> >> >> >>> >> >> >>>>> wrote:
> >> >>>>>> >> >> >>> >> >> >>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>> Ok sorry but that was my Mistake .I
> thought
> >> >>>>>> >> >> >>> >> >> >>>>>> it
> >> >>>>>> >> >> >>> >> >> >>>>>> works
> >> >>>>>> >> >> >>> >> >> >>>>>> but
> >> >>>>>> >> >> >>> >> >> >>>>>> no.
> >> >>>>>> >> >> >>> >> >> >>>>>> I wrote the command without ; and then I
> >> >>>>>> >> >> >>> >> >> >>>>>> think
> >> >>>>>> >> >> >>> >> >> >>>>>> It
> >> >>>>>> >> >> >>> >> >> >>>>>> works
> >> >>>>>> >> >> >>> >> >> >>>>>> but
> >> >>>>>> >> >> >>> >> >> >>>>>> with
> >> >>>>>> >> >> >>> >> >> >>>>>> ;
> >> >>>>>> >> >> >>> >> >> >>>>>> at
> >> >>>>>> >> >> >>> >> >> >>>>>> the end of command
> >> >>>>>> >> >> >>> >> >> >>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
> >> >>>>>> >> >> >>> >> >> >>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>> does'nt work
> >> >>>>>> >> >> >>> >> >> >>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat
> >> >>>>>> >> >> >>> >> >> >>>>>> shriparv
> >> >>>>>> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
> >> >>>>>> >> >> >>> >> >> >>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>> inside configuration. all properties will
> >> >>>>>> >> >> >>> >> >> >>>>>>> be
> >> >>>>>> >> >> >>> >> >> >>>>>>> inside
> >> >>>>>> >> >> >>> >> >> >>>>>>> the
> >> >>>>>> >> >> >>> >> >> >>>>>>> configuration
> >> >>>>>> >> >> >>> >> >> >>>>>>> tags
> >> >>>>>> >> >> >>> >> >> >>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak
> >> >>>>>> >> >> >>> >> >> >>>>>>> Bastan
> >> >>>>>> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
> >> >>>>>> >> >> >>> >> >> >>>>>>> wrote:
> >> >>>>>> >> >> >>> >> >> >>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>> Thank you so much my friend your idee
> >> >>>>>> >> >> >>> >> >> >>>>>>>> works
> >> >>>>>> >> >> >>> >> >> >>>>>>>> fine(no
> >> >>>>>> >> >> >>> >> >> >>>>>>>> error)
> >> >>>>>> >> >> >>> >> >> >>>>>>>> you
> >> >>>>>> >> >> >>> >> >> >>>>>>>> are
> >> >>>>>> >> >> >>> >> >> >>>>>>>> the best :)
> >> >>>>>> >> >> >>> >> >> >>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak
> >> >>>>>> >> >> >>> >> >> >>>>>>>> Bastan
> >> >>>>>> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
> >> >>>>>> >> >> >>> >> >> >>>>>>>> wrote:
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>> It must be inside the
> >> >>>>>> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>> or
> >> >>>>>> >> >> >>> >> >> >>>>>>>>> outside
> >> >>>>>> >> >> >>> >> >> >>>>>>>>> this?
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM,
> shashwat
> >> >>>>>> >> >> >>> >> >> >>>>>>>>> shriparv
> >> >>>>>> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> Bastan
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> wrote:
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> Thanks sShashwat, and where is this
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> hive-site.xml
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM,
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> shashwat
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> shriparv
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> set
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive-site.xml
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <property>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> </property>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> <property>
> <name>hive.metastore.warehouse.dir</name>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <description>location
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> of
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> default
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> database
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> for
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> the
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>        </property>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM,
> Babak
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> Bastan
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to
> create a
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> test
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Table
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> in
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hive
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> get
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> an error.I want to run this
> command:
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING,
> Url
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING,
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Content
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exception:
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> not
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return
> code
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> 1
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> from
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
> org.apache.hadoop.hive.ql.exec.DDLTask
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> --
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> ∞
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> --
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> ∞
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>> --
> >> >>>>>> >> >> >>> >> >> >>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>> ∞
> >> >>>>>> >> >> >>> >> >> >>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
> >> >>>>>> >> >> >>> >> >> >>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>>
> >> >>>>>> >> >> >>> >> >> >>>>>
> >> >>>>>> >> >> >>> >> >> >>>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>> --
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>> ∞
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>> Shashwat Shriparv
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>>
> >> >>>>>> >> >> >>> >> >> >>
> >> >>>>>> >> >> >>> >> >
> >> >>>>>> >> >> >>> >> >
> >> >>>>>> >> >> >>> >
> >> >>>>>> >> >> >>> >
> >> >>>>>> >> >> >>
> >> >>>>>> >> >> >>
> >> >>>>>> >> >
> >> >>>>>> >> >
> >> >>>>>> >
> >> >>>>>> >
> >> >>>>>
> >> >>>>>
> >> >>>>
> >> >>>>
> >> >>>>
> >> >>>> --
> >> >>>>
> >> >>>>
> >> >>>> ∞
> >> >>>>
> >> >>>> Shashwat Shriparv
> >> >>>>
> >> >>>>
> >> >>>
> >> >>
> >> >
> >
> >
>

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
OK, we'll give it a final shot; after that I'll email a configured Hadoop to
your email address. Delete the hdfs directory that contains tmp, data
and name, recreate it, format HDFS again, and then start the processes.
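
Spelled out, those steps look roughly like this. It is only a sketch: the paths are assumptions taken from the earlier mails (hdfs under the home directory, Hadoop unpacked in ~/Downloads/hadoop), and the block just prints the commands so they can be reviewed before running them one by one:

```shell
# Build and print the reset sequence described above (dry run only).
# HADOOP_HOME and HDFS_ROOT are assumed paths from earlier mails.
HADOOP_HOME="$HOME/Downloads/hadoop"
HDFS_ROOT="$HOME/hdfs"
RESET_CMDS="rm -rf $HDFS_ROOT
mkdir -p $HDFS_ROOT/name $HDFS_ROOT/data $HDFS_ROOT/temp
$HADOOP_HOME/bin/hadoop namenode -format
$HADOOP_HOME/bin/start-dfs.sh
$HADOOP_HOME/bin/start-mapred.sh
jps"
printf '%s\n' "$RESET_CMDS"
```

If jps afterwards shows NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker, the daemons are up.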

Regards,
    Mohammad Tariq


On Thu, Jun 7, 2012 at 2:22 AM, Babak Bastan <ba...@gmail.com> wrote:
> I've performed the steps, but I get the same error at this step as before:
> bin/start-dfs.sh
> It is about my permission to make the directory.
>
> On Wed, Jun 6, 2012 at 10:33 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>
>> Actually, that blog post explains how to install Cloudera's Hadoop
>> distribution. If you have followed that post and installed Cloudera's
>> distribution, then your logs should ideally be inside
>> /usr/lib/hadoop/logs (if everything went fine). Anyway, try the steps I
>> have given and let me know.
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Thu, Jun 7, 2012 at 1:52 AM, Babak Bastan <ba...@gmail.com> wrote:
>> > By the way, you are a very nice man, my friend. Thank you so much :)
>> >
>> > What do you mean about this post on Stack Overflow?
>> >
>> > I am assuming that is your first installation of hadoop.
>> >
>> > At the beginning please check if your daemons are working. To do that
>> > use
>> > (in terminal):
>> >
>> > jps
>> >
>> > If only jps appears that means all daemons are down. Please check the
>> > log
>> > files. Especially the namenode. Log folder is probably somewhere there
>> > /usr/lib/hadoop/logs
>> >
>> > If you have some permission problems. Use this guide during the
>> > installation.
>> >
>> > Good installation guide
>> >
>> > I am shooting in the dark with these explanations, but these are the most common problems.
>> >
>> >
>> > On Wed, Jun 6, 2012 at 10:15 PM, Babak Bastan <ba...@gmail.com>
>> > wrote:
>> >>
>> >> I checked it, but there is no hadoop folder :(
>> >> Yes, you are right. I'm a student and I want to build a very, very
>> >> simple program with Hive, but until now... hmmmmmmmmm
>> >>
>> >>
>> >> On Wed, Jun 6, 2012 at 10:12 PM, Babak Bastan <ba...@gmail.com>
>> >> wrote:
>> >>>
>> >>> No, not just one error. For example, if I run this command
>> >>>
>> >>> hostname --fqdn
>> >>>
>> >>> with the hosts file that I sent to you:
>> >>>
>> >>> 127.0.0.1       localhost
>> >>> #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>> >>> # The following lines are desirable for IPv6 capable hosts
>> >>> #::1     ip6-localhost ip6-loopback
>> >>> #fe00::0 ip6-localnet
>> >>> #ff00::0 ip6-mcastprefix
>> >>> #ff02::1 ip6-allnodes
>> >>> #ff02::2 ip6-allrouters
>> >>>
>> >>> I get this error:
>> >>>
>> >>> hostname: Name or service not known
>> >>>
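
That failure usually means the machine's hostname has no entry in /etc/hosts — in the file quoted above, the line mapping the host ubuntu is commented out. A sketch of a working /etc/hosts for this setup (assuming the hostname really is ubuntu; 127.0.1.1 matches the address the namenode startup log printed earlier):

```
127.0.0.1       localhost
127.0.1.1       ubuntu.ubuntu-domain    ubuntu
```

With that mapping restored, hostname --fqdn should resolve instead of failing.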
>> >>> Or, in the second step, with this command:
>> >>>
>> >>> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>> >>>
>> >>> these lines of error:
>> >>>
>> >>>
>> >>> mkdir: cannot create directory
>> >>> „/home/babak/Downloads/hadoop/bin/../logs“: Permission denied
>> >>> starting namenode, logging to
>> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out
>> >>> /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out:
>> >>> No such file or directory
>> >>> head:
>> >>> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out“
>> >>> cannot be opened for reading: No such file or directory
>> >>> localhost: mkdir: cannot create directory
>> >>> „/home/babak/Downloads/hadoop/bin/../logs“: Permission denied
>> >>> localhost: starting datanode, logging to
>> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out
>> >>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out:
>> >>> No such file or directory
>> >>> localhost: head:
>> >>> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out“
>> >>> cannot be opened for reading: No such file or directory
>> >>> localhost: mkdir: cannot create directory
>> >>> „/home/babak/Downloads/hadoop/bin/../logs“: Permission denied
>> >>> localhost: starting secondarynamenode, logging to
>> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out
>> >>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> >>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out:
>> >>> No such file or directory
>> >>> localhost: head:
>> >>> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out“
>> >>> cannot be opened for reading: No such file or directory
>> >>>
>> >>> They said there is no permission to create the logs directory in this
>> >>> path: /home/babak/Downloads/hadoop/bin/../logs
>> >>>
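
Those mkdir failures mean the user running the scripts cannot write into the unpacked hadoop directory. A minimal sketch of the fix, using the path from the error messages above (if the tree belongs to another user, a sudo chown -R on it would be needed first):

```shell
# Make the hadoop tree writable so start-dfs.sh can create its logs/
# directory. HADOOP_DIR is the path from the errors above (assumed).
HADOOP_DIR="${HADOOP_DIR:-$HOME/Downloads/hadoop}"
mkdir -p "$HADOOP_DIR"
chmod -R u+rwX "$HADOOP_DIR"   # grant the current user write access
mkdir -p "$HADOOP_DIR/logs"    # the directory the scripts could not create
ls -d "$HADOOP_DIR/logs"
```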
>> >>>  and generally I can't create a table in Hive; I get this:
>> >>>
>> >>> FAILED: Error in metadata: MetaException(message:Got exception:
>> >>> java.io.FileNotFoundException File file:/user/hive/warehouse/test does
>> >>> not
>> >>> exist.)
>> >>> FAILED: Execution Error, return code 1 from
>> >>> org.apache.hadoop.hive.ql.exec.DDLTask
>> >>>
>> >>> On Wed, Jun 6, 2012 at 10:02 PM, shashwat shriparv
>> >>> <dw...@gmail.com> wrote:
>> >>>>
>> >>>> What's the error, Babak?
>> >>>>
>> >>>>
>> >>>> On Thu, Jun 7, 2012 at 1:25 AM, Babak Bastan <ba...@gmail.com>
>> >>>> wrote:
>> >>>>>
>> >>>>> What the hell is that? I see no log folder there.
>> >>>>>
>> >>>>>
>> >>>>> On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq <do...@gmail.com>
>> >>>>> wrote:
>> >>>>>>
>> >>>>>> Go to your HADOOP_HOME, i.e. your Hadoop directory (that includes bin,
>> >>>>>> conf etc.). You can find the logs directory there.
>> >>>>>>
>> >>>>>> Regards,
>> >>>>>>     Mohammad Tariq
>> >>>>>>
>> >>>>>>
>> >>>>>> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <ba...@gmail.com>
>> >>>>>> wrote:
>> >>>>>> > How can I get my logs, Mohammad?
>> >>>>>> >
>> >>>>>> >
>> >>>>>> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq
>> >>>>>> > <do...@gmail.com>
>> >>>>>> > wrote:
>> >>>>>> >>
>> >>>>>> >> Could you post your logs? That would help me in understanding the
>> >>>>>> >> problem properly.
>> >>>>>> >>
>> >>>>>> >> Regards,
>> >>>>>> >>     Mohammad Tariq
>> >>>>>> >>
>> >>>>>> >>
>> >>>>>> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan
>> >>>>>> >> <ba...@gmail.com>
>> >>>>>> >> wrote:
>> >>>>>> >> > Thank you very much, Mohammad, for your attention. I followed the
>> >>>>>> >> > steps, but the error is the same as last time.
>> >>>>>> >> > and there is my hosts file:
>> >>>>>> >> >
>> >>>>>> >> > 127.0.0.1       localhost
>> >>>>>> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>> >>>>>> >> >
>> >>>>>> >> >
>> >>>>>> >> > # The following lines are desirable for IPv6 capable hosts
>> >>>>>> >> >
>> >>>>>> >> > #::1     ip6-localhost ip6-loopback
>> >>>>>> >> > #fe00::0 ip6-localnet
>> >>>>>> >> > #ff00::0 ip6-mcastprefix
>> >>>>>> >> > #ff02::1 ip6-allnodes
>> >>>>>> >> > #ff02::2 ip6-allrouters
>> >>>>>> >> >
>> >>>>>> >> > but no effect :(
>> >>>>>> >> >
>> >>>>>> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq
>> >>>>>> >> > <do...@gmail.com>
>> >>>>>> >> > wrote:
>> >>>>>> >> >>
>> >>>>>> >> >> also change the permissions of these directories to 777.
>> >>>>>> >> >>
>> >>>>>> >> >> Regards,
>> >>>>>> >> >>     Mohammad Tariq
>> >>>>>> >> >>
>> >>>>>> >> >>
>> >>>>>> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq
>> >>>>>> >> >> <do...@gmail.com>
>> >>>>>> >> >> wrote:
>> >>>>>> >> >> > Create a directory "/home/username/hdfs" (or at some place of
>> >>>>>> >> >> > your choice). Inside this hdfs directory create three
>> >>>>>> >> >> > subdirectories - name, data, and temp - then follow these steps:
>> >>>>>> >> >> >
>> >>>>>> >> >> > add following properties in your core-site.xml -
>> >>>>>> >> >> >
>> >>>>>> >> >> > <property>
>> >>>>>> >> >> >          <name>fs.default.name</name>
>> >>>>>> >> >> >          <value>hdfs://localhost:9000/</value>
>> >>>>>> >> >> >        </property>
>> >>>>>> >> >> >
>> >>>>>> >> >> >        <property>
>> >>>>>> >> >> >          <name>hadoop.tmp.dir</name>
>> >>>>>> >> >> >          <value>/home/mohammad/hdfs/temp</value>
>> >>>>>> >> >> >        </property>
>> >>>>>> >> >> >
>> >>>>>> >> >> > then add following two properties in your hdfs-site.xml -
>> >>>>>> >> >> >
>> >>>>>> >> >> > <property>
>> >>>>>> >> >> >                <name>dfs.replication</name>
>> >>>>>> >> >> >                <value>1</value>
>> >>>>>> >> >> >        </property>
>> >>>>>> >> >> >
>> >>>>>> >> >> >        <property>
>> >>>>>> >> >> >                <name>dfs.name.dir</name>
>> >>>>>> >> >> >                <value>/home/mohammad/hdfs/name</value>
>> >>>>>> >> >> >        </property>
>> >>>>>> >> >> >
>> >>>>>> >> >> >        <property>
>> >>>>>> >> >> >                <name>dfs.data.dir</name>
>> >>>>>> >> >> >                <value>/home/mohammad/hdfs/data</value>
>> >>>>>> >> >> >        </property>
>> >>>>>> >> >> >
>> >>>>>> >> >> > finally add this property in your mapred-site.xml -
>> >>>>>> >> >> >
>> >>>>>> >> >> >       <property>
>> >>>>>> >> >> >          <name>mapred.job.tracker</name>
>> >>>>>> >> >> >          <value>hdfs://localhost:9001</value>
>> >>>>>> >> >> >        </property>
>> >>>>>> >> >> >
>> >>>>>> >> >> > NOTE: you can give any name to these directories of your choice;
>> >>>>>> >> >> > just keep in mind you have to give the same names as the values
>> >>>>>> >> >> > of the above specified properties in your configuration files.
>> >>>>>> >> >> > (Give the full path of these directories, not just the name of
>> >>>>>> >> >> > the directory.)
>> >>>>>> >> >> >
>> >>>>>> >> >> > After this  follow the steps provided in the previous
>> >>>>>> >> >> > reply.
>> >>>>>> >> >> >
>> >>>>>> >> >> > Regards,
>> >>>>>> >> >> >     Mohammad Tariq
>> >>>>>> >> >> >
>> >>>>>> >> >> >
>> >>>>>> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan
>> >>>>>> >> >> > <ba...@gmail.com>
>> >>>>>> >> >> > wrote:
>> >>>>>> >> >> >> Thanks, Mohammad.
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> with this command:
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> this is my output:
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> /************************************************************
>> >>>>>> >> >> >> STARTUP_MSG: Starting NameNode
>> >>>>>> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>> >>>>>> >> >> >> STARTUP_MSG:   args = [-format]
>> >>>>>> >> >> >> STARTUP_MSG:   version = 0.20.2
>> >>>>>> >> >> >> STARTUP_MSG:   build =
>> >>>>>> >> >> >>
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>> >>>>>> >> >> >> -r
>> >>>>>> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC
>> >>>>>> >> >> >> 2010
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> ************************************************************/
>> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >>>>>> >> >> >>
>> >>>>>> >> >> >>
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >>>>>> >> >> >> supergroup=supergroup
>> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >>>>>> >> >> >> isPermissionEnabled=true
>> >>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size
>> >>>>>> >> >> >> 95
>> >>>>>> >> >> >> saved
>> >>>>>> >> >> >> in 0
>> >>>>>> >> >> >> seconds.
>> >>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
>> >>>>>> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully
>> >>>>>> >> >> >> formatted.
>> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> /************************************************************
>> >>>>>> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> ************************************************************/
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> by this command:
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> this is the output:
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“ nicht anlegen: Keine Berechtigung
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> this output (it's in German and it means no permission to make
>> >>>>>> >> >> >> this folder)
>> >>>>>> >> >> >>
>> >>>>>> >> >> >>
>> >>>>>> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq
>> >>>>>> >> >> >> <do...@gmail.com>
>> >>>>>> >> >> >> wrote:
>> >>>>>> >> >> >>>
>> >>>>>> >> >> >>> Once we are done with the configuration, we need to format
>> >>>>>> >> >> >>> the file system. Use this command to do that:
>> >>>>>> >> >> >>> bin/hadoop namenode -format
>> >>>>>> >> >> >>>
>> >>>>>> >> >> >>> After this, the Hadoop daemon processes should be started
>> >>>>>> >> >> >>> using the following commands:
>> >>>>>> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
>> >>>>>> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
>> >>>>>> >> >> >>>
>> >>>>>> >> >> >>> After this, use jps to check if everything is alright, or
>> >>>>>> >> >> >>> point your browser to localhost:50070. If you find any further
>> >>>>>> >> >> >>> problem, provide us with the error logs. :)
>> >>>>>> >> >> >>>
>> >>>>>> >> >> >>> Regards,
>> >>>>>> >> >> >>>     Mohammad Tariq
>> >>>>>> >> >> >>>
>> >>>>>> >> >> >>>
>> >>>>>> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan
>> >>>>>> >> >> >>> <ba...@gmail.com>
>> >>>>>> >> >> >>> wrote:
>> >>>>>> >> >> >>> > were you able to format hdfs properly???
>> >>>>>> >> >> >>> > I didn't get your question. Do you mean HADOOP_HOME, or
>> >>>>>> >> >> >>> > where did I install Hadoop?
>> >>>>>> >> >> >>> >
>> >>>>>> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
>> >>>>>> >> >> >>> > <do...@gmail.com>
>> >>>>>> >> >> >>> > wrote:
>> >>>>>> >> >> >>> >>
>> >>>>>> >> >> >>> >> If you are getting only this, it means your Hadoop is not
>> >>>>>> >> >> >>> >> running. Were you able to format HDFS properly?
>> >>>>>> >> >> >>> >>
>> >>>>>> >> >> >>> >> Regards,
>> >>>>>> >> >> >>> >>     Mohammad Tariq
>> >>>>>> >> >> >>> >>
>> >>>>>> >> >> >>> >>
>> >>>>>> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
>> >>>>>> >> >> >>> >> <ba...@gmail.com>
>> >>>>>> >> >> >>> >> wrote:
>> >>>>>> >> >> >>> >> > Hi Mohammad, if I run jps in my shell I can see this
>> >>>>>> >> >> >>> >> > result:
>> >>>>>> >> >> >>> >> > 2213 Jps
>> >>>>>> >> >> >>> >> >
>> >>>>>> >> >> >>> >> >
>> >>>>>> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
>> >>>>>> >> >> >>> >> > <do...@gmail.com>
>> >>>>>> >> >> >>> >> > wrote:
>> >>>>>> >> >> >>> >> >>
>> >>>>>> >> >> >>> >> >> You can also use the "jps" command in your shell to see
>> >>>>>> >> >> >>> >> >> whether Hadoop processes are running or not.
>> >>>>>> >> >> >>> >> >>
>> >>>>>> >> >> >>> >> >> Regards,
>> >>>>>> >> >> >>> >> >>     Mohammad Tariq
>> >>>>>> >> >> >>> >> >>
>> >>>>>> >> >> >>> >> >>
>> >>>>>> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
>> >>>>>> >> >> >>> >> >> <do...@gmail.com>
>> >>>>>> >> >> >>> >> >> wrote:
>> >>>>>> >> >> >>> >> >> > Hi Babak,
>> >>>>>> >> >> >>> >> >> >
>> >>>>>> >> >> >>> >> >> >  You have to type it in your web browser. Hadoop
>> >>>>>> >> >> >>> >> >> > provides us a web GUI that not only allows us to browse
>> >>>>>> >> >> >>> >> >> > through the file system, but to download the files as
>> >>>>>> >> >> >>> >> >> > well. Apart from that, it also provides a web GUI that
>> >>>>>> >> >> >>> >> >> > can be used to see the status of the Jobtracker and
>> >>>>>> >> >> >>> >> >> > Tasktracker. When you run a Hive, Pig, or MapReduce
>> >>>>>> >> >> >>> >> >> > job, you can point your browser to
>> >>>>>> >> >> >>> >> >> > http://localhost:50030 to see the status and logs of
>> >>>>>> >> >> >>> >> >> > your job.
>> >>>>>> >> >> >>> >> >> >
>> >>>>>> >> >> >>> >> >> > Regards,
>> >>>>>> >> >> >>> >> >> >     Mohammad Tariq
>> >>>>>> >> >> >>> >> >> >
>> >>>>>> >> >> >>> >> >> >
>> >>>>>> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
>> >>>>>> >> >> >>> >> >> > <ba...@gmail.com>
>> >>>>>> >> >> >>> >> >> > wrote:
>> >>>>>> >> >> >>> >> >> >> Thank you, Shashwat, for the answer.
>> >>>>>> >> >> >>> >> >> >> Where should I type http://localhost:50070?
>> >>>>>> >> >> >>> >> >> >> I typed it here: hive>http://localhost:50070 but got
>> >>>>>> >> >> >>> >> >> >> nothing as a result.
>> >>>>>> >> >> >>> >> >> >>
>> >>>>>> >> >> >>> >> >> >>
>> >>>>>> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat
>> >>>>>> >> >> >>> >> >> >> shriparv
>> >>>>>> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>> First type http://localhost:50070 to see whether it
>> >>>>>> >> >> >>> >> >> >>> opens or not, and check how many nodes are available.
>> >>>>>> >> >> >>> >> >> >>> Check some of the Hadoop shell commands
>> >>>>>> >> >> >>> >> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>> Run an example MapReduce task on Hadoop; take an
>> >>>>>> >> >> >>> >> >> >>> example from here
>> >>>>>> >> >> >>> >> >> >>> : http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>> If you can do all of the above successfully, it means
>> >>>>>> >> >> >>> >> >> >>> Hadoop is configured correctly.
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>> Regards
>> >>>>>> >> >> >>> >> >> >>> Shashwat
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>> >>>>>> >> >> >>> >> >> >>> <ba...@gmail.com>
>> >>>>>> >> >> >>> >> >> >>> wrote:
>> >>>>>> >> >> >>> >> >> >>>>
>> >>>>>> >> >> >>> >> >> >>>> No, I'm not working on CDH. Is there a way to test
>> >>>>>> >> >> >>> >> >> >>>> if my Hadoop works fine or not?
>> >>>>>> >> >> >>> >> >> >>>>
>> >>>>>> >> >> >>> >> >> >>>>
>> >>>>>> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
>> >>>>>> >> >> >>> >> >> >>>> <be...@yahoo.com>
>> >>>>>> >> >> >>> >> >> >>>> wrote:
>> >>>>>> >> >> >>> >> >> >>>>>
>> >>>>>> >> >> >>> >> >> >>>>> Hi Babak
>> >>>>>> >> >> >>> >> >> >>>>>
>> >>>>>> >> >> >>> >> >> >>>>> You gotta follow those instructions on the Apache
>> >>>>>> >> >> >>> >> >> >>>>> site to set up Hadoop from scratch and ensure that
>> >>>>>> >> >> >>> >> >> >>>>> HDFS is working first. You should be able to read
>> >>>>>> >> >> >>> >> >> >>>>> and write files to HDFS before you do your next
>> >>>>>> >> >> >>> >> >> >>>>> steps.
>> >>>>>> >> >> >>> >> >> >>>>>
>> >>>>>> >> >> >>> >> >> >>>>> Are you on CDH or the Apache distribution of
>> >>>>>> >> >> >>> >> >> >>>>> Hadoop? If it is CDH, there are detailed
>> >>>>>> >> >> >>> >> >> >>>>> instructions on the Cloudera web site.
>> >>>>>> >> >> >>> >> >> >>>>>
>> >>>>>> >> >> >>> >> >> >>>>> Regards
>> >>>>>> >> >> >>> >> >> >>>>> Bejoy KS
>> >>>>>> >> >> >>> >> >> >>>>>
>> >>>>>> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
>> >>>>>> >> >> >>> >> >> >>>>> ________________________________
>> >>>>>> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>> >>>>>> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>> >>>>>> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
>> >>>>>> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
>> >>>>>> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in
>> >>>>>> >> >> >>> >> >> >>>>> Hive
>> >>>>>> >> >> >>> >> >> >>>>>
>> >>>>>> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
>> >>>>>> >> >> >>> >> >> >>>>> core-site.xml and I did everything that was
>> >>>>>> >> >> >>> >> >> >>>>> mentioned in the reference, but no effect
>> >>>>>> >> >> >>> >> >> >>>>>
>> >>>>>> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>> >>>>>> >> >> >>> >> >> >>>>> <ba...@gmail.com>
>> >>>>>> >> >> >>> >> >> >>>>> wrote:
>> >>>>>> >> >> >>> >> >> >>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>> OK, sorry, but that was my mistake. I thought it
>> >>>>>> >> >> >>> >> >> >>>>>> works, but no.
>> >>>>>> >> >> >>> >> >> >>>>>> I wrote the command without the ; and then I think
>> >>>>>> >> >> >>> >> >> >>>>>> it works, but with the ; at the end of the command
>> >>>>>> >> >> >>> >> >> >>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>> >>>>>> >> >> >>> >> >> >>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>> doesn't work
>> >>>>>> >> >> >>> >> >> >>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat
>> >>>>>> >> >> >>> >> >> >>>>>> shriparv
>> >>>>>> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
>> >>>>>> >> >> >>> >> >> >>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>> inside configuration. all properties will
>> >>>>>> >> >> >>> >> >> >>>>>>> be
>> >>>>>> >> >> >>> >> >> >>>>>>> inside
>> >>>>>> >> >> >>> >> >> >>>>>>> the
>> >>>>>> >> >> >>> >> >> >>>>>>> configuration
>> >>>>>> >> >> >>> >> >> >>>>>>> tags
>> >>>>>> >> >> >>> >> >> >>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak
>> >>>>>> >> >> >>> >> >> >>>>>>> Bastan
>> >>>>>> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
>> >>>>>> >> >> >>> >> >> >>>>>>> wrote:
>> >>>>>> >> >> >>> >> >> >>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>> Thank you so much, my friend, your idea works
>> >>>>>> >> >> >>> >> >> >>>>>>>> fine (no error), you are the best :)
>> >>>>>> >> >> >>> >> >> >>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak
>> >>>>>> >> >> >>> >> >> >>>>>>>> Bastan
>> >>>>>> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
>> >>>>>> >> >> >>> >> >> >>>>>>>> wrote:
>> >>>>>> >> >> >>> >> >> >>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>> It must be inside the
>> >>>>>> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
>> >>>>>> >> >> >>> >> >> >>>>>>>>> or
>> >>>>>> >> >> >>> >> >> >>>>>>>>> outside
>> >>>>>> >> >> >>> >> >> >>>>>>>>> this?
>> >>>>>> >> >> >>> >> >> >>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat
>> >>>>>> >> >> >>> >> >> >>>>>>>>> shriparv
>> >>>>>> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak
>> >>>>>> >> >> >>> >> >> >>>>>>>>>> Bastan
>> >>>>>> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>> wrote:
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>> Thanks, Shashwat, and where is this
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>> hive-site.xml?
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM,
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>> shashwat
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>> shriparv
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> set
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive-site.xml
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <property>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> </property>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <description>location
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> of
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> default
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> database
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> for
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> the
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>        </property>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> Bastan
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> test
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Table
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> in
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hive
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> get
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING,
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Content
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exception:
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> not
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> 1
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> from
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> --
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> ∞
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>> --
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>> ∞
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>> --
>> >>>>>> >> >> >>> >> >> >>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>> ∞
>> >>>>>> >> >> >>> >> >> >>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
>> >>>>>> >> >> >>> >> >> >>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>>
>> >>>>>> >> >> >>> >> >> >>>>>
>> >>>>>> >> >> >>> >> >> >>>>
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>> --
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>> ∞
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>> Shashwat Shriparv
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>>
>> >>>>>> >> >> >>> >> >> >>
>> >>>>>> >> >> >>> >> >
>> >>>>>> >> >> >>> >> >
>> >>>>>> >> >> >>> >
>> >>>>>> >> >> >>> >
>> >>>>>> >> >> >>
>> >>>>>> >> >> >>
>> >>>>>> >> >
>> >>>>>> >> >
>> >>>>>> >
>> >>>>>> >
>> >>>>>
>> >>>>>
>> >>>>
>> >>>>
>> >>>>
>> >>>> --
>> >>>>
>> >>>>
>> >>>> ∞
>> >>>>
>> >>>> Shashwat Shriparv
>> >>>>
>> >>>>
>> >>>
>> >>
>> >
>
>

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
I've performed the steps, but I get the same error in this step as before:
bin/start-dfs.sh
It is about my permission to make the directory.
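As a sketch of the fix being discussed here: the start-dfs.sh failures all come down to the logs directory not being creatable inside the unpacked tarball. The path below is an assumption taken from this thread ($HOME/Downloads/hadoop); adjust it to wherever your Hadoop actually lives.

```shell
# Sketch of the permission fix discussed in this thread
# ("Keine Berechtigung" = permission denied).
# HADOOP_HOME below is an assumption; point it at wherever
# your Hadoop tarball was unpacked.
HADOOP_HOME="$HOME/Downloads/hadoop"

mkdir -p "$HADOOP_HOME/logs"            # create the missing logs directory
chown -R "$(id -un)" "$HADOOP_HOME"     # make sure you own the whole tree
chmod -R u+rwX "$HADOOP_HOME"           # and can write everywhere in it

ls -ld "$HADOOP_HOME/logs"              # verify the owner and the write bit
```

After the directory is writable, bin/hadoop namenode -format and bin/start-dfs.sh should be able to write their logs; check with jps that NameNode, DataNode and SecondaryNameNode are running before retrying the CREATE TABLE in Hive.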

On Wed, Jun 6, 2012 at 10:33 PM, Mohammad Tariq <do...@gmail.com> wrote:

> Actually, this blog post explains how to install Cloudera's Hadoop
> distribution. If you have followed this post and installed Cloudera's
> distribution, then your logs should ideally be inside
> /usr/lib/hadoop/logs (if everything was fine). Anyway, try the steps I
> have given and let me know.
>
> Regards,
>     Mohammad Tariq
>
>
> On Thu, Jun 7, 2012 at 1:52 AM, Babak Bastan <ba...@gmail.com> wrote:
> > By the way, you are a very nice man, my friend. Thank you so much :)
> >
> > What do you mean about this post on Stack Overflow?
> >
> > I am assuming that is your first installation of hadoop.
> >
> > At the beginning please check if your daemons are working. To do that use
> > (in terminal):
> >
> > jps
> >
> > If only jps appears that means all daemons are down. Please check the log
> > files. Especially the namenode. Log folder is probably somewhere there
> > /usr/lib/hadoop/logs
> >
> > If you have some permission problems. Use this guide during the
> > installation.
> >
> > Good installation guide
> >
> > I am shooting in the dark with these explanations, but these are the most
> > common problems.
> >
> >
> > On Wed, Jun 6, 2012 at 10:15 PM, Babak Bastan <ba...@gmail.com>
> wrote:
> >>
> >> I checked it but no hadoop folder :(
> >> Yes, you are right. I'm a student and I want to make a very, very simple
> >> program in Hive, but until now... hmm.
> >>
> >>
> >> On Wed, Jun 6, 2012 at 10:12 PM, Babak Bastan <ba...@gmail.com>
> wrote:
> >>>
> >>> Not just one error:
> >>> e.g. if I run this one
> >>>
> >>> hostname --fqdn
> >>>
> >>>  with the configuration that I sent to you:
> >>>
> >>> 127.0.0.1       localhost
> >>> #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
> >>> # The following lines are desirable for IPv6 capable hosts
> >>> #::1     ip6-localhost ip6-loopback
> >>> #fe00::0 ip6-localnet
> >>> #ff00::0 ip6-mcastprefix
> >>> #ff02::1 ip6-allnodes
> >>> #ff02::2 ip6-allrouters
> >>>
> >>> I get this error:
> >>>
> >>> hostname: Name or service not known
> >>>
> >>> Or in the second step by this command:
> >>>
> >>> babak@ubuntu:~/Downloads/hadoop/bin$ start-hdfs.sh
> >>>
> >>> these lines of error:
> >>>
> >>>
> >>> mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“
> nicht
> >>> anlegen: Keine Berechtigung
> >>> starting namenode, logging to
> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out
> >>> /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile 117:
> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out:
> >>> Datei oder Verzeichnis nicht gefunden
> >>> head:
> >>>
> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out“
> >>> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht
> gefunden
> >>> localhost: mkdir: kann Verzeichnis
> >>> „/home/babak/Downloads/hadoop/bin/../logs“ nicht anlegen: Keine
> Berechtigung
> >>> localhost: starting datanode, logging to
> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out
> >>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile
> 117:
> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out:
> >>> Datei oder Verzeichnis nicht gefunden
> >>> localhost: head:
> >>>
> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out“
> >>> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht
> gefunden
> >>> localhost: mkdir: kann Verzeichnis
> >>> „/home/babak/Downloads/hadoop/bin/../logs“ nicht anlegen: Keine
> Berechtigung
> >>> localhost: starting secondarynamenode, logging to
> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out
> >>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile
> 117:
> >>>
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out:
> >>> Datei oder Verzeichnis nicht gefunden
> >>> localhost: head:
> >>>
> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out“
> >>> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht
> gefunden
> >>>
> >>> It says there is no permission to create logs in this
> >>> path: /home/babak/Downloads/hadoop/bin/../logs
> >>>
> >>>  and generally I can't create a table in Hive and get this one:
> >>>
> >>> FAILED: Error in metadata: MetaException(message:Got exception:
> >>> java.io.FileNotFoundException File file:/user/hive/warehouse/test does
> not
> >>> exist.)
> >>> FAILED: Execution Error, return code 1 from
> >>> org.apache.hadoop.hive.ql.exec.DDLTask
> >>>
> >>> On Wed, Jun 6, 2012 at 10:02 PM, shashwat shriparv
> >>> <dw...@gmail.com> wrote:
> >>>>
> >>>> What's the error, Babak?
> >>>>
> >>>>
> >>>> On Thu, Jun 7, 2012 at 1:25 AM, Babak Bastan <ba...@gmail.com>
> wrote:
> >>>>>
> >>>>> What the hell is that? I see no log folder there.
> >>>>>
> >>>>>
> >>>>> On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq <do...@gmail.com>
> >>>>> wrote:
> >>>>>>
> >>>>>> go to your HADOOP_HOME i.e your hadoop directory(that includes bin,
> >>>>>> conf etc)..you can find logs directory there..
> >>>>>>
> >>>>>> Regards,
> >>>>>>     Mohammad Tariq
> >>>>>>
> >>>>>>
> >>>>>> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <ba...@gmail.com>
> >>>>>> wrote:
> >>>>>> > How can I get my logs, Mohammad?
> >>>>>> >
> >>>>>> >
> >>>>>> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <
> dontariq@gmail.com>
> >>>>>> > wrote:
> >>>>>> >>
> >>>>>> >> could you post your logs???that would help me in understanding
> the
> >>>>>> >> problem properly.
> >>>>>> >>
> >>>>>> >> Regards,
> >>>>>> >>     Mohammad Tariq
> >>>>>> >>
> >>>>>> >>
> >>>>>> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <babakbsn@gmail.com
> >
> >>>>>> >> wrote:
> >>>>>> >> > Thank you very much, Mohammad, for your attention. I followed the
> >>>>>> >> > steps, but the
> >>>>>> >> > error is the same as the last time.
> >>>>>> >> > and there is my hosts file:
> >>>>>> >> >
> >>>>>> >> > 127.0.0.1       localhost
> >>>>>> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
> >>>>>> >> >
> >>>>>> >> >
> >>>>>> >> > # The following lines are desirable for IPv6 capable hosts
> >>>>>> >> >
> >>>>>> >> > #::1     ip6-localhost ip6-loopback
> >>>>>> >> > #fe00::0 ip6-localnet
> >>>>>> >> > #ff00::0 ip6-mcastprefix
> >>>>>> >> > #ff02::1 ip6-allnodes
> >>>>>> >> > #ff02::2 ip6-allrouters
> >>>>>> >> >
> >>>>>> >> > but no effect :(
> >>>>>> >> >
> >>>>>> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq
> >>>>>> >> > <do...@gmail.com>
> >>>>>> >> > wrote:
> >>>>>> >> >>
> >>>>>> >> >> also change the permissions of these directories to 777.
> >>>>>> >> >>
> >>>>>> >> >> Regards,
> >>>>>> >> >>     Mohammad Tariq
> >>>>>> >> >>
> >>>>>> >> >>
> >>>>>> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq
> >>>>>> >> >> <do...@gmail.com>
> >>>>>> >> >> wrote:
> >>>>>> >> >> > create a directory "/home/username/hdfs" (or at some place
> of
> >>>>>> >> >> > your
> >>>>>> >> >> > choice)..inside this hdfs directory create three sub
> >>>>>> >> >> > directories -
> >>>>>> >> >> > name, data, and temp, then follow these steps :
> >>>>>> >> >> >
> >>>>>> >> >> > add following properties in your core-site.xml -
> >>>>>> >> >> >
> >>>>>> >> >> > <property>
> >>>>>> >> >> >          <name>fs.default.name</name>
> >>>>>> >> >> >          <value>hdfs://localhost:9000/</value>
> >>>>>> >> >> >        </property>
> >>>>>> >> >> >
> >>>>>> >> >> >        <property>
> >>>>>> >> >> >          <name>hadoop.tmp.dir</name>
> >>>>>> >> >> >          <value>/home/mohammad/hdfs/temp</value>
> >>>>>> >> >> >        </property>
> >>>>>> >> >> >
> >>>>>> >> >> > then add following two properties in your hdfs-site.xml -
> >>>>>> >> >> >
> >>>>>> >> >> > <property>
> >>>>>> >> >> >                <name>dfs.replication</name>
> >>>>>> >> >> >                <value>1</value>
> >>>>>> >> >> >        </property>
> >>>>>> >> >> >
> >>>>>> >> >> >        <property>
> >>>>>> >> >> >                <name>dfs.name.dir</name>
> >>>>>> >> >> >                <value>/home/mohammad/hdfs/name</value>
> >>>>>> >> >> >        </property>
> >>>>>> >> >> >
> >>>>>> >> >> >        <property>
> >>>>>> >> >> >                <name>dfs.data.dir</name>
> >>>>>> >> >> >                <value>/home/mohammad/hdfs/data</value>
> >>>>>> >> >> >        </property>
> >>>>>> >> >> >
> >>>>>> >> >> > finally add this property in your mapred-site.xml -
> >>>>>> >> >> >
> >>>>>> >> >> >       <property>
> >>>>>> >> >> >          <name>mapred.job.tracker</name>
> >>>>>> >> >> >          <value>hdfs://localhost:9001</value>
> >>>>>> >> >> >        </property>
> >>>>>> >> >> >
> >>>>>> >> >> > NOTE: you can give any name to these directories of your
> >>>>>> >> >> > choice, just
> >>>>>> >> >> > keep in mind you have to give same names as values of
> >>>>>> >> >> >           above specified properties in your configuration
> >>>>>> >> >> > files.
> >>>>>> >> >> > (give full path of these directories, not just the name of
> the
> >>>>>> >> >> > directory)
> >>>>>> >> >> >
> >>>>>> >> >> > After this  follow the steps provided in the previous reply.
> >>>>>> >> >> >
> >>>>>> >> >> > Regards,
> >>>>>> >> >> >     Mohammad Tariq
> >>>>>> >> >> >
> >>>>>> >> >> >
> >>>>>> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan
> >>>>>> >> >> > <ba...@gmail.com>
> >>>>>> >> >> > wrote:
> >>>>>> >> >> >> thank's Mohammad
> >>>>>> >> >> >>
> >>>>>> >> >> >> with this command:
> >>>>>> >> >> >>
> >>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode
> -format
> >>>>>> >> >> >>
> >>>>>> >> >> >> this is my output:
> >>>>>> >> >> >>
> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
> >>>>>> >> >> >>
> /************************************************************
> >>>>>> >> >> >> STARTUP_MSG: Starting NameNode
> >>>>>> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
> >>>>>> >> >> >> STARTUP_MSG:   args = [-format]
> >>>>>> >> >> >> STARTUP_MSG:   version = 0.20.2
> >>>>>> >> >> >> STARTUP_MSG:   build =
> >>>>>> >> >> >>
> >>>>>> >> >> >>
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
> >>>>>> >> >> >> -r
> >>>>>> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC
> 2010
> >>>>>> >> >> >>
> ************************************************************/
> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> >>>>>> >> >> >>
> >>>>>> >> >> >>
> >>>>>> >> >> >>
> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> >>>>>> >> >> >> supergroup=supergroup
> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> >>>>>> >> >> >> isPermissionEnabled=true
> >>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size
> 95
> >>>>>> >> >> >> saved
> >>>>>> >> >> >> in 0
> >>>>>> >> >> >> seconds.
> >>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
> >>>>>> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
> >>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
> >>>>>> >> >> >>
> /************************************************************
> >>>>>> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
> >>>>>> >> >> >>
> ************************************************************/
> >>>>>> >> >> >>
> >>>>>> >> >> >> by this command:
> >>>>>> >> >> >>
> >>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
> >>>>>> >> >> >>
> >>>>>> >> >> >> this is the out put
> >>>>>> >> >> >>
> >>>>>> >> >> >> mkdir: cannot create directory
> >>>>>> >> >> >> '/home/babak/Downloads/hadoop/bin/../logs':
> >>>>>> >> >> >> Permission denied
> >>>>>> >> >> >>
> >>>>>> >> >> >> This output is in German; it means I have no permission to make
> >>>>>> >> >> >> this
> >>>>>> >> >> >> folder.
> >>>>>> >> >> >>
> >>>>>> >> >> >>
> >>>>>> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq
> >>>>>> >> >> >> <do...@gmail.com>
> >>>>>> >> >> >> wrote:
> >>>>>> >> >> >>>
> >>>>>> >> >> >>> once we are done with the configuration, we need to format
> >>>>>> >> >> >>> the file
> >>>>>> >> >> >>> system..use this command to do that-
> >>>>>> >> >> >>> bin/hadoop namenode -format
> >>>>>> >> >> >>>
> >>>>>> >> >> >>> after this, hadoop daemon processes should be started
> using
> >>>>>> >> >> >>> following
> >>>>>> >> >> >>> commands -
> >>>>>> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
> >>>>>> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
> >>>>>> >> >> >>>
> >>>>>> >> >> >>> after this use jps to check if everything is alright or
> >>>>>> >> >> >>> point your
> >>>>>> >> >> >>> browser to localhost:50070..if you further find any
> problem
> >>>>>> >> >> >>> provide
> >>>>>> >> >> >>> us
> >>>>>> >> >> >>> with the error logs..:)
> >>>>>> >> >> >>>
> >>>>>> >> >> >>> Regards,
> >>>>>> >> >> >>>     Mohammad Tariq
> >>>>>> >> >> >>>
> >>>>>> >> >> >>>
> >>>>>> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan
> >>>>>> >> >> >>> <ba...@gmail.com>
> >>>>>> >> >> >>> wrote:
> >>>>>> >> >> >>> > were you able to format hdfs properly???
> >>>>>> >> >> >>> > I did'nt get your question,Do you mean HADOOP_HOME? or
> >>>>>> >> >> >>> > where did
> >>>>>> >> >> >>> > I
> >>>>>> >> >> >>> > install
> >>>>>> >> >> >>> > Hadoop?
> >>>>>> >> >> >>> >
> >>>>>> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
> >>>>>> >> >> >>> > <do...@gmail.com>
> >>>>>> >> >> >>> > wrote:
> >>>>>> >> >> >>> >>
> >>>>>> >> >> >>> >> if you are getting only this, it means your hadoop is
> not
> >>>>>> >> >> >>> >> running..were you able to format hdfs properly???
> >>>>>> >> >> >>> >>
> >>>>>> >> >> >>> >> Regards,
> >>>>>> >> >> >>> >>     Mohammad Tariq
> >>>>>> >> >> >>> >>
> >>>>>> >> >> >>> >>
> >>>>>> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
> >>>>>> >> >> >>> >> <ba...@gmail.com>
> >>>>>> >> >> >>> >> wrote:
> >>>>>> >> >> >>> >> >> > Hi Mohammad, if I run jps in my shell I can see this
> >>>>>> >> >> >>> >> >> > result:
> >>>>>> >> >> >>> >> > 2213 Jps
> >>>>>> >> >> >>> >> >
> >>>>>> >> >> >>> >> >
> >>>>>> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
> >>>>>> >> >> >>> >> > <do...@gmail.com>
> >>>>>> >> >> >>> >> > wrote:
> >>>>>> >> >> >>> >> >>
> >>>>>> >> >> >>> >> >> you can also use "jps" command at your shell to see
> >>>>>> >> >> >>> >> >> whether
> >>>>>> >> >> >>> >> >> Hadoop
> >>>>>> >> >> >>> >> >> processes are running or not.
> >>>>>> >> >> >>> >> >>
> >>>>>> >> >> >>> >> >> Regards,
> >>>>>> >> >> >>> >> >>     Mohammad Tariq
> >>>>>> >> >> >>> >> >>
> >>>>>> >> >> >>> >> >>
> >>>>>> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
> >>>>>> >> >> >>> >> >> <do...@gmail.com>
> >>>>>> >> >> >>> >> >> wrote:
> >>>>>> >> >> >>> >> >> > Hi Babak,
> >>>>>> >> >> >>> >> >> >
> >>>>>> >> >> >>> >> >> >  You have to type it in you web browser..Hadoop
> >>>>>> >> >> >>> >> >> > provides us
> >>>>>> >> >> >>> >> >> > a
> >>>>>> >> >> >>> >> >> > web
> >>>>>> >> >> >>> >> >> > GUI
> >>>>>> >> >> >>> >> >> > that not only allows us to browse through the file
> >>>>>> >> >> >>> >> >> > system,
> >>>>>> >> >> >>> >> >> > but
> >>>>>> >> >> >>> >> >> > to
> >>>>>> >> >> >>> >> >> > download the files as well..Apart from that it
> also
> >>>>>> >> >> >>> >> >> > provides a
> >>>>>> >> >> >>> >> >> > web
> >>>>>> >> >> >>> >> >> > GUI
> >>>>>> >> >> >>> >> >> > that can be used to see the status of Jobtracker
> and
> >>>>>> >> >> >>> >> >> > Tasktracker..When
> >>>>>> >> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you
> >>>>>> >> >> >>> >> >> > can point
> >>>>>> >> >> >>> >> >> > your
> >>>>>> >> >> >>> >> >> > browser to http://localhost:50030 to see the
> status
> >>>>>> >> >> >>> >> >> > and
> >>>>>> >> >> >>> >> >> > logs
> >>>>>> >> >> >>> >> >> > of
> >>>>>> >> >> >>> >> >> > your
> >>>>>> >> >> >>> >> >> > job.
> >>>>>> >> >> >>> >> >> >
> >>>>>> >> >> >>> >> >> > Regards,
> >>>>>> >> >> >>> >> >> >     Mohammad Tariq
> >>>>>> >> >> >>> >> >> >
> >>>>>> >> >> >>> >> >> >
> >>>>>> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
> >>>>>> >> >> >>> >> >> > <ba...@gmail.com>
> >>>>>> >> >> >>> >> >> > wrote:
> >>>>>> >> >> >>> >> >> >> Thank you shashwat for the answer,
> >>>>>> >> >> >>> >> >> >> where should I type http://localhost:50070?
> >>>>>> >> >> >>> >> >> >> I typed here: hive>http://localhost:50070 but
> >>>>>> >> >> >>> >> >> >> nothing as
> >>>>>> >> >> >>> >> >> >> result
> >>>>>> >> >> >>> >> >> >>
> >>>>>> >> >> >>> >> >> >>
> >>>>>> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
> >>>>>> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>> first type http://localhost:50070 whether this
> is
> >>>>>> >> >> >>> >> >> >>> opening
> >>>>>> >> >> >>> >> >> >>> or
> >>>>>> >> >> >>> >> >> >>> not
> >>>>>> >> >> >>> >> >> >>> and
> >>>>>> >> >> >>> >> >> >>> check
> >>>>>> >> >> >>> >> >> >>> how many nodes are available, check some of the
> >>>>>> >> >> >>> >> >> >>> hadoop
> >>>>>> >> >> >>> >> >> >>> shell
> >>>>>> >> >> >>> >> >> >>> commands
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>> from
> http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
> >>>>>> >> >> >>> >> >> >>> run
> >>>>>> >> >> >>> >> >> >>> example mapreduce task on hadoop take example
> from
> >>>>>> >> >> >>> >> >> >>> here
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>> :
> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>> if all the above you can do sucessfully means
> >>>>>> >> >> >>> >> >> >>> hadoop is
> >>>>>> >> >> >>> >> >> >>> configured
> >>>>>> >> >> >>> >> >> >>> correctly
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>> Regards
> >>>>>> >> >> >>> >> >> >>> Shashwat
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
> >>>>>> >> >> >>> >> >> >>> <ba...@gmail.com>
> >>>>>> >> >> >>> >> >> >>> wrote:
> >>>>>> >> >> >>> >> >> >>>>
> >>>>>> >> >> >>> >> >> >>>> no I'm not working on CDH.Is there a way to
> test
> >>>>>> >> >> >>> >> >> >>>> if my
> >>>>>> >> >> >>> >> >> >>>> Hadoop
> >>>>>> >> >> >>> >> >> >>>> works
> >>>>>> >> >> >>> >> >> >>>> fine
> >>>>>> >> >> >>> >> >> >>>> or not?
> >>>>>> >> >> >>> >> >> >>>>
> >>>>>> >> >> >>> >> >> >>>>
> >>>>>> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
> >>>>>> >> >> >>> >> >> >>>> <be...@yahoo.com>
> >>>>>> >> >> >>> >> >> >>>> wrote:
> >>>>>> >> >> >>> >> >> >>>>>
> >>>>>> >> >> >>> >> >> >>>>> Hi Babak
> >>>>>> >> >> >>> >> >> >>>>>
> >>>>>> >> >> >>> >> >> >>>>> You gotta follow those instructions in the
> apace
> >>>>>> >> >> >>> >> >> >>>>> site
> >>>>>> >> >> >>> >> >> >>>>> to
> >>>>>> >> >> >>> >> >> >>>>> set
> >>>>>> >> >> >>> >> >> >>>>> up
> >>>>>> >> >> >>> >> >> >>>>> hadoop
> >>>>>> >> >> >>> >> >> >>>>> from scratch and ensure that hdfs is working
> >>>>>> >> >> >>> >> >> >>>>> first. You
> >>>>>> >> >> >>> >> >> >>>>> should
> >>>>>> >> >> >>> >> >> >>>>> be
> >>>>>> >> >> >>> >> >> >>>>> able to
> >>>>>> >> >> >>> >> >> >>>>> read and write files to hdfs before you do
> your
> >>>>>> >> >> >>> >> >> >>>>> next
> >>>>>> >> >> >>> >> >> >>>>> steps.
> >>>>>> >> >> >>> >> >> >>>>>
> >>>>>> >> >> >>> >> >> >>>>> Are you on CDH or apache distribution of
> hadoop?
> >>>>>> >> >> >>> >> >> >>>>> If it
> >>>>>> >> >> >>> >> >> >>>>> is
> >>>>>> >> >> >>> >> >> >>>>> CDH
> >>>>>> >> >> >>> >> >> >>>>> there
> >>>>>> >> >> >>> >> >> >>>>> are
> >>>>>> >> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
> >>>>>> >> >> >>> >> >> >>>>>
> >>>>>> >> >> >>> >> >> >>>>> Regards
> >>>>>> >> >> >>> >> >> >>>>> Bejoy KS
> >>>>>> >> >> >>> >> >> >>>>>
> >>>>>> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
> >>>>>> >> >> >>> >> >> >>>>> ________________________________
> >>>>>> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
> >>>>>> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
> >>>>>> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
> >>>>>> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
> >>>>>> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in
> Hive
> >>>>>> >> >> >>> >> >> >>>>>
> >>>>>> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
> >>>>>> >> >> >>> >> >> >>>>> core-site.xml
> >>>>>> >> >> >>> >> >> >>>>> and
> >>>>>> >> >> >>> >> >> >>>>> I
> >>>>>> >> >> >>> >> >> >>>>> did
> >>>>>> >> >> >>> >> >> >>>>> all
> >>>>>> >> >> >>> >> >> >>>>> of
> >>>>>> >> >> >>> >> >> >>>>> thing that was mentioned in the reference but
> no
> >>>>>> >> >> >>> >> >> >>>>> effect
> >>>>>> >> >> >>> >> >> >>>>>
> >>>>>> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
> >>>>>> >> >> >>> >> >> >>>>> <ba...@gmail.com>
> >>>>>> >> >> >>> >> >> >>>>> wrote:
> >>>>>> >> >> >>> >> >> >>>>>>
> >>>>>> >> >> >>> >> >> >>>>>> Ok sorry but that was my Mistake .I thought
> it
> >>>>>> >> >> >>> >> >> >>>>>> works
> >>>>>> >> >> >>> >> >> >>>>>> but
> >>>>>> >> >> >>> >> >> >>>>>> no.
> >>>>>> >> >> >>> >> >> >>>>>> I wrote the command without ; and then I
> think
> >>>>>> >> >> >>> >> >> >>>>>> It
> >>>>>> >> >> >>> >> >> >>>>>> works
> >>>>>> >> >> >>> >> >> >>>>>> but
> >>>>>> >> >> >>> >> >> >>>>>> with
> >>>>>> >> >> >>> >> >> >>>>>> ;
> >>>>>> >> >> >>> >> >> >>>>>> at
> >>>>>> >> >> >>> >> >> >>>>>> the end of command
> >>>>>> >> >> >>> >> >> >>>>>>
> >>>>>> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
> >>>>>> >> >> >>> >> >> >>>>>>
> >>>>>> >> >> >>> >> >> >>>>>> does'nt work
> >>>>>> >> >> >>> >> >> >>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>
> >>>>>> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat
> >>>>>> >> >> >>> >> >> >>>>>> shriparv
> >>>>>> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
> >>>>>> >> >> >>> >> >> >>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>> inside configuration. all properties will be
> >>>>>> >> >> >>> >> >> >>>>>>> inside
> >>>>>> >> >> >>> >> >> >>>>>>> the
> >>>>>> >> >> >>> >> >> >>>>>>> configuration
> >>>>>> >> >> >>> >> >> >>>>>>> tags
> >>>>>> >> >> >>> >> >> >>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak
> Bastan
> >>>>>> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
> >>>>>> >> >> >>> >> >> >>>>>>> wrote:
> >>>>>> >> >> >>> >> >> >>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>> Thank you so much my friend your idee works
> >>>>>> >> >> >>> >> >> >>>>>>>> fine(no
> >>>>>> >> >> >>> >> >> >>>>>>>> error)
> >>>>>> >> >> >>> >> >> >>>>>>>> you
> >>>>>> >> >> >>> >> >> >>>>>>>> are
> >>>>>> >> >> >>> >> >> >>>>>>>> the best :)
> >>>>>> >> >> >>> >> >> >>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak
> Bastan
> >>>>>> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
> >>>>>> >> >> >>> >> >> >>>>>>>> wrote:
> >>>>>> >> >> >>> >> >> >>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>> It must be inside the
> >>>>>> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
> >>>>>> >> >> >>> >> >> >>>>>>>>> or
> >>>>>> >> >> >>> >> >> >>>>>>>>> outside
> >>>>>> >> >> >>> >> >> >>>>>>>>> this?
> >>>>>> >> >> >>> >> >> >>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat
> >>>>>> >> >> >>> >> >> >>>>>>>>> shriparv
> >>>>>> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak
> >>>>>> >> >> >>> >> >> >>>>>>>>>> Bastan
> >>>>>> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
> >>>>>> >> >> >>> >> >> >>>>>>>>>> wrote:
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>> Thanks sShashwat, and where is this
> >>>>>> >> >> >>> >> >> >>>>>>>>>>> hive-site.xml
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat
> >>>>>> >> >> >>> >> >> >>>>>>>>>>> shriparv
> >>>>>> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> set
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive-site.xml
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <property>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> </property>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <property>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <description>location of
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> default
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> database
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> for
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> the
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>        </property>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> Bastan
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> test
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Table
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> in
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hive
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> get
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING,
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Content
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exception:
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does
> not
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> from
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> --
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> ∞
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>> --
> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>> ∞
> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>> --
> >>>>>> >> >> >>> >> >> >>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>> ∞
> >>>>>> >> >> >>> >> >> >>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
> >>>>>> >> >> >>> >> >> >>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>>
> >>>>>> >> >> >>> >> >> >>>>>>
> >>>>>> >> >> >>> >> >> >>>>>
> >>>>>> >> >> >>> >> >> >>>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>> --
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>> ∞
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>> Shashwat Shriparv
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>>
> >>>>>> >> >> >>> >> >> >>
> >>>>>> >> >> >>> >> >
> >>>>>> >> >> >>> >> >
> >>>>>> >> >> >>> >
> >>>>>> >> >> >>> >
> >>>>>> >> >> >>
> >>>>>> >> >> >>
> >>>>>> >> >
> >>>>>> >> >
> >>>>>> >
> >>>>>> >
> >>>>>
> >>>>>
> >>>>
> >>>>
> >>>>
> >>>> --
> >>>>
> >>>>
> >>>> ∞
> >>>>
> >>>> Shashwat Shriparv
> >>>>
> >>>>
> >>>
> >>
> >
>
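The "Keine Berechtigung" (permission denied) failures quoted above all come down to one check: the user running start-dfs.sh must be able to create $HADOOP_HOME/logs. A minimal sketch of that check, using a throwaway temporary directory in place of a real HADOOP_HOME:

```shell
# Sketch: verify up front that the logs directory can be created,
# instead of letting the daemons die at startup. HADOOP_HOME here is a
# throwaway temp path, purely for illustration.
HADOOP_HOME="$(mktemp -d)/hadoop"
mkdir -p "$HADOOP_HOME"
if [ -w "$HADOOP_HOME" ]; then
    mkdir -p "$HADOOP_HOME/logs"
    echo "logs dir ready: $HADOOP_HOME/logs"
else
    echo "no write permission on $HADOOP_HOME; chown it to your user"
fi
```

When this check fails on a real installation under, say, /home/babak/Downloads/hadoop, the usual fix is to chown the Hadoop directory to the user who starts the daemons rather than opening it up with chmod 777.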

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
Actually, that blog post explains how to install Cloudera's Hadoop
distribution. If you have followed it and installed Cloudera's
distribution, your logs should be under /usr/lib/hadoop/logs (assuming
everything went fine). Anyway, try the steps I have given and let me
know.

Regards,
    Mohammad Tariq
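
The sequence discussed in this thread (create the local name/data/temp directories, make the tree writable, then format and start HDFS) can be sketched roughly as below; the $HOME/hdfs-demo paths are illustrative assumptions, not canonical locations:

```shell
# Sketch: prepare the local directories that core-site.xml /
# hdfs-site.xml point to (hadoop.tmp.dir, dfs.name.dir, dfs.data.dir),
# plus a writable logs directory, before running
# "hadoop namenode -format" and "start-dfs.sh".
BASE="$HOME/hdfs-demo"   # illustrative; use the paths from your config files
mkdir -p "$BASE/name" "$BASE/data" "$BASE/temp" "$BASE/logs"
# The daemons run as the invoking user, so that user must be able to
# write everywhere under this tree.
chmod -R 755 "$BASE"
ls "$BASE"
```

If start-dfs.sh still reports permission errors on $HADOOP_HOME/logs after this, the Hadoop installation directory itself is owned by another user; chown it (or at least its logs subdirectory) to the user starting the daemons.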


On Thu, Jun 7, 2012 at 1:52 AM, Babak Bastan <ba...@gmail.com> wrote:
> By the way, you are a very nice man, my friend. Thank you so much :)
>
> What do you mean about this post on Stack Overflow?
>
> I am assuming this is your first installation of Hadoop.
>
> At the beginning please check if your daemons are working. To do that use
> (in terminal):
>
> jps
>
> If only jps appears, that means all daemons are down. Please check the log
> files, especially the namenode's. The log folder is probably at
> /usr/lib/hadoop/logs
>
> If you have permission problems, use this guide during the
> installation.
>
> Good installation guide
>
> I am shooting in the dark with these explanations, but these are the most common problems.
>
>
> On Wed, Jun 6, 2012 at 10:15 PM, Babak Bastan <ba...@gmail.com> wrote:
>>
>> I checked it but no hadoop folder :(
>> Yes, you are right. I'm a student and I just want to build a very simple
>> Hive program, but no luck until now.
>>
>>
>> On Wed, Jun 6, 2012 at 10:12 PM, Babak Bastan <ba...@gmail.com> wrote:
>>>
>>> Not just one error. For example, if I run this one:
>>>
>>> hostname --fqdn
>>>
>>> with the hosts file that I sent to you:
>>>
>>> 127.0.0.1       localhost
>>> #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>>> # The following lines are desirable for IPv6 capable hosts
>>> #::1     ip6-localhost ip6-loopback
>>> #fe00::0 ip6-localnet
>>> #ff00::0 ip6-mcastprefix
>>> #ff02::1 ip6-allnodes
>>> #ff02::2 ip6-allrouters
>>>
>>> I get this error:
>>>
>>> hostname: Name or service not known
>>>
>>> Or, in the second step, with this command:
>>>
>>> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>>>
>>> I get these errors:
>>>
>>>
>>> mkdir: cannot create directory '/home/babak/Downloads/hadoop/bin/../logs':
>>> Permission denied
>>> starting namenode, logging to
>>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out
>>> /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out:
>>> No such file or directory
>>> head: cannot open
>>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out'
>>> for reading: No such file or directory
>>> localhost: mkdir: cannot create directory
>>> '/home/babak/Downloads/hadoop/bin/../logs': Permission denied
>>> localhost: starting datanode, logging to
>>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out
>>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out:
>>> No such file or directory
>>> localhost: head: cannot open
>>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out'
>>> for reading: No such file or directory
>>> localhost: mkdir: cannot create directory
>>> '/home/babak/Downloads/hadoop/bin/../logs': Permission denied
>>> localhost: starting secondarynamenode, logging to
>>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out
>>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out:
>>> No such file or directory
>>> localhost: head: cannot open
>>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out'
>>> for reading: No such file or directory
>>>
>>> They say there is no permission to create the logs in this
>>> path: /home/babak/Downloads/hadoop/bin/../logs
>>>
>>> And in general I can't create a table in Hive; I get this:
>>>
>>> FAILED: Error in metadata: MetaException(message:Got exception:
>>> java.io.FileNotFoundException File file:/user/hive/warehouse/test does not
>>> exist.)
>>> FAILED: Execution Error, return code 1 from
>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>
>>> On Wed, Jun 6, 2012 at 10:02 PM, shashwat shriparv
>>> <dw...@gmail.com> wrote:
>>>>
>>>> What's the error, Babak?
>>>>
>>>>
>>>> On Thu, Jun 7, 2012 at 1:25 AM, Babak Bastan <ba...@gmail.com> wrote:
>>>>>
>>>>> What the hell is that? I see no log folder there.
>>>>>
>>>>>
>>>>> On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>>>
>>>>>> Go to your HADOOP_HOME, i.e. your Hadoop directory (the one that
>>>>>> includes bin, conf, etc.). You can find the logs directory there.
>>>>>>
>>>>>> Regards,
>>>>>>     Mohammad Tariq
>>>>>>
>>>>>>
>>>>>> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <ba...@gmail.com>
>>>>>> wrote:
>>>>>> > How can I get my logs, Mohammad?
>>>>>> >
>>>>>> >
>>>>>> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <do...@gmail.com>
>>>>>> > wrote:
>>>>>> >>
>>>>>> >> Could you post your logs? That would help me understand the
>>>>>> >> problem properly.
>>>>>> >>
>>>>>> >> Regards,
>>>>>> >>     Mohammad Tariq
>>>>>> >>
>>>>>> >>
>>>>>> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <ba...@gmail.com>
>>>>>> >> wrote:
>>>>>> >> > Thank you very much, Mohammad, for your attention. I followed the
>>>>>> >> > steps, but the
>>>>>> >> > error is the same as the last time.
>>>>>> >> > and there is my hosts file:
>>>>>> >> >
>>>>>> >> > 127.0.0.1       localhost
>>>>>> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>>>>>> >> >
>>>>>> >> >
>>>>>> >> > # The following lines are desirable for IPv6 capable hosts
>>>>>> >> >
>>>>>> >> > #::1     ip6-localhost ip6-loopback
>>>>>> >> > #fe00::0 ip6-localnet
>>>>>> >> > #ff00::0 ip6-mcastprefix
>>>>>> >> > #ff02::1 ip6-allnodes
>>>>>> >> > #ff02::2 ip6-allrouters
>>>>>> >> >
>>>>>> >> > but no effect :(
>>>>>> >> >
>>>>>> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq
>>>>>> >> > <do...@gmail.com>
>>>>>> >> > wrote:
>>>>>> >> >>
>>>>>> >> >> also change the permissions of these directories to 777.
>>>>>> >> >>
>>>>>> >> >> Regards,
>>>>>> >> >>     Mohammad Tariq
>>>>>> >> >>
>>>>>> >> >>
>>>>>> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq
>>>>>> >> >> <do...@gmail.com>
>>>>>> >> >> wrote:
>>>>>> >> >> > create a directory "/home/username/hdfs" (or at some place of
>>>>>> >> >> > your
>>>>>> >> >> > choice)..inside this hdfs directory create three sub
>>>>>> >> >> > directories -
>>>>>> >> >> > name, data, and temp, then follow these steps :
>>>>>> >> >> >
>>>>>> >> >> > add following properties in your core-site.xml -
>>>>>> >> >> >
>>>>>> >> >> > <property>
>>>>>> >> >> >          <name>fs.default.name</name>
>>>>>> >> >> >          <value>hdfs://localhost:9000/</value>
>>>>>> >> >> >        </property>
>>>>>> >> >> >
>>>>>> >> >> >        <property>
>>>>>> >> >> >          <name>hadoop.tmp.dir</name>
>>>>>> >> >> >          <value>/home/mohammad/hdfs/temp</value>
>>>>>> >> >> >        </property>
>>>>>> >> >> >
>>>>>> >> >> > then add following two properties in your hdfs-site.xml -
>>>>>> >> >> >
>>>>>> >> >> > <property>
>>>>>> >> >> >                <name>dfs.replication</name>
>>>>>> >> >> >                <value>1</value>
>>>>>> >> >> >        </property>
>>>>>> >> >> >
>>>>>> >> >> >        <property>
>>>>>> >> >> >                <name>dfs.name.dir</name>
>>>>>> >> >> >                <value>/home/mohammad/hdfs/name</value>
>>>>>> >> >> >        </property>
>>>>>> >> >> >
>>>>>> >> >> >        <property>
>>>>>> >> >> >                <name>dfs.data.dir</name>
>>>>>> >> >> >                <value>/home/mohammad/hdfs/data</value>
>>>>>> >> >> >        </property>
>>>>>> >> >> >
>>>>>> >> >> > finally add this property in your mapred-site.xml -
>>>>>> >> >> >
>>>>>> >> >> >       <property>
>>>>>> >> >> >          <name>mapred.job.tracker</name>
>>>>>> >> >> >          <value>hdfs://localhost:9001</value>
>>>>>> >> >> >        </property>
>>>>>> >> >> >
>>>>>> >> >> > NOTE: you can give any name to these directories of your
>>>>>> >> >> > choice, just
>>>>>> >> >> > keep in mind you have to give same names as values of
>>>>>> >> >> >           above specified properties in your configuration
>>>>>> >> >> > files.
>>>>>> >> >> > (give full path of these directories, not just the name of the
>>>>>> >> >> > directory)
>>>>>> >> >> >
>>>>>> >> >> > After this  follow the steps provided in the previous reply.
>>>>>> >> >> >
>>>>>> >> >> > Regards,
>>>>>> >> >> >     Mohammad Tariq
>>>>>> >> >> >
>>>>>> >> >> >
>>>>>> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan
>>>>>> >> >> > <ba...@gmail.com>
>>>>>> >> >> > wrote:
>>>>>> >> >> >> thank's Mohammad
>>>>>> >> >> >>
>>>>>> >> >> >> with this command:
>>>>>> >> >> >>
>>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>>>>>> >> >> >>
>>>>>> >> >> >> this is my output:
>>>>>> >> >> >>
>>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>>>>>> >> >> >> /************************************************************
>>>>>> >> >> >> STARTUP_MSG: Starting NameNode
>>>>>> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>>>>>> >> >> >> STARTUP_MSG:   args = [-format]
>>>>>> >> >> >> STARTUP_MSG:   version = 0.20.2
>>>>>> >> >> >> STARTUP_MSG:   build =
>>>>>> >> >> >>
>>>>>> >> >> >> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>>>>>> >> >> >> -r
>>>>>> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>>>>>> >> >> >> ************************************************************/
>>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>>>> >> >> >>
>>>>>> >> >> >>
>>>>>> >> >> >> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>>>> >> >> >> supergroup=supergroup
>>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>>>> >> >> >> isPermissionEnabled=true
>>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95
>>>>>> >> >> >> saved
>>>>>> >> >> >> in 0
>>>>>> >> >> >> seconds.
>>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
>>>>>> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
>>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>>>>>> >> >> >> /************************************************************
>>>>>> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>>>>>> >> >> >> ************************************************************/
>>>>>> >> >> >>
>>>>>> >> >> >> by this command:
>>>>>> >> >> >>
>>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>>>>>> >> >> >>
>>>>>> >> >> >> this is the out put
>>>>>> >> >> >>
>>>>>> >> >> >> mkdir: cannot create directory
>>>>>> >> >> >> „/home/babak/Downloads/hadoop/bin/../logs“:
>>>>>> >> >> >> Permission denied
>>>>>> >> >> >>
>>>>>> >> >> >> this output was in German; it means there is no permission to
>>>>>> >> >> >> create this folder
>>>>>> >> >> >>
>>>>>> >> >> >>
>>>>>> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq
>>>>>> >> >> >> <do...@gmail.com>
>>>>>> >> >> >> wrote:
>>>>>> >> >> >>>
>>>>>> >> >> >>> once we are done with the configuration, we need to format
>>>>>> >> >> >>> the file
>>>>>> >> >> >>> system..use this command to do that-
>>>>>> >> >> >>> bin/hadoop namenode -format
>>>>>> >> >> >>>
>>>>>> >> >> >>> after this, hadoop daemon processes should be started using
>>>>>> >> >> >>> following
>>>>>> >> >> >>> commands -
>>>>>> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
>>>>>> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
>>>>>> >> >> >>>
>>>>>> >> >> >>> after this use jps to check if everything is alright or
>>>>>> >> >> >>> point your
>>>>>> >> >> >>> browser to localhost:50070..if you further find any problem
>>>>>> >> >> >>> provide
>>>>>> >> >> >>> us
>>>>>> >> >> >>> with the error logs..:)
>>>>>> >> >> >>>
>>>>>> >> >> >>> Regards,
>>>>>> >> >> >>>     Mohammad Tariq
>>>>>> >> >> >>>
>>>>>> >> >> >>>
>>>>>> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan
>>>>>> >> >> >>> <ba...@gmail.com>
>>>>>> >> >> >>> wrote:
>>>>>> >> >> >>> > were you able to format hdfs properly???
>>>>>> >> >> >>> > I didn't get your question. Do you mean HADOOP_HOME, or
>>>>>> >> >> >>> > where did I install Hadoop?
>>>>>> >> >> >>> >
>>>>>> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
>>>>>> >> >> >>> > <do...@gmail.com>
>>>>>> >> >> >>> > wrote:
>>>>>> >> >> >>> >>
>>>>>> >> >> >>> >> if you are getting only this, it means your hadoop is not
>>>>>> >> >> >>> >> running..were you able to format hdfs properly???
>>>>>> >> >> >>> >>
>>>>>> >> >> >>> >> Regards,
>>>>>> >> >> >>> >>     Mohammad Tariq
>>>>>> >> >> >>> >>
>>>>>> >> >> >>> >>
>>>>>> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
>>>>>> >> >> >>> >> <ba...@gmail.com>
>>>>>> >> >> >>> >> wrote:
>>>>>> >> >> >>> >> > Hi Mohammad, if I run jps in my shell I can see this
>>>>>> >> >> >>> >> > result:
>>>>>> >> >> >>> >> > 2213 Jps
>>>>>> >> >> >>> >> >
>>>>>> >> >> >>> >> >
>>>>>> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
>>>>>> >> >> >>> >> > <do...@gmail.com>
>>>>>> >> >> >>> >> > wrote:
>>>>>> >> >> >>> >> >>
>>>>>> >> >> >>> >> >> you can also use "jps" command at your shell to see
>>>>>> >> >> >>> >> >> whether
>>>>>> >> >> >>> >> >> Hadoop
>>>>>> >> >> >>> >> >> processes are running or not.
>>>>>> >> >> >>> >> >>
>>>>>> >> >> >>> >> >> Regards,
>>>>>> >> >> >>> >> >>     Mohammad Tariq
>>>>>> >> >> >>> >> >>
>>>>>> >> >> >>> >> >>
>>>>>> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
>>>>>> >> >> >>> >> >> <do...@gmail.com>
>>>>>> >> >> >>> >> >> wrote:
>>>>>> >> >> >>> >> >> > Hi Babak,
>>>>>> >> >> >>> >> >> >
>>>>>> >> >> >>> >> >> >  You have to type it in you web browser..Hadoop
>>>>>> >> >> >>> >> >> > provides us
>>>>>> >> >> >>> >> >> > a
>>>>>> >> >> >>> >> >> > web
>>>>>> >> >> >>> >> >> > GUI
>>>>>> >> >> >>> >> >> > that not only allows us to browse through the file
>>>>>> >> >> >>> >> >> > system,
>>>>>> >> >> >>> >> >> > but
>>>>>> >> >> >>> >> >> > to
>>>>>> >> >> >>> >> >> > download the files as well..Apart from that it also
>>>>>> >> >> >>> >> >> > provides a
>>>>>> >> >> >>> >> >> > web
>>>>>> >> >> >>> >> >> > GUI
>>>>>> >> >> >>> >> >> > that can be used to see the status of Jobtracker and
>>>>>> >> >> >>> >> >> > Tasktracker..When
>>>>>> >> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you
>>>>>> >> >> >>> >> >> > can point
>>>>>> >> >> >>> >> >> > your
>>>>>> >> >> >>> >> >> > browser to http://localhost:50030 to see the status
>>>>>> >> >> >>> >> >> > and
>>>>>> >> >> >>> >> >> > logs
>>>>>> >> >> >>> >> >> > of
>>>>>> >> >> >>> >> >> > your
>>>>>> >> >> >>> >> >> > job.
>>>>>> >> >> >>> >> >> >
>>>>>> >> >> >>> >> >> > Regards,
>>>>>> >> >> >>> >> >> >     Mohammad Tariq
>>>>>> >> >> >>> >> >> >
>>>>>> >> >> >>> >> >> >
>>>>>> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
>>>>>> >> >> >>> >> >> > <ba...@gmail.com>
>>>>>> >> >> >>> >> >> > wrote:
>>>>>> >> >> >>> >> >> >> Thank you shashwat for the answer,
>>>>>> >> >> >>> >> >> >> where should I type http://localhost:50070?
>>>>>> >> >> >>> >> >> >> I typed here: hive>http://localhost:50070 but
>>>>>> >> >> >>> >> >> >> nothing as
>>>>>> >> >> >>> >> >> >> result
>>>>>> >> >> >>> >> >> >>
>>>>>> >> >> >>> >> >> >>
>>>>>> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>>>>>> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>> first type http://localhost:50070 whether this is
>>>>>> >> >> >>> >> >> >>> opening
>>>>>> >> >> >>> >> >> >>> or
>>>>>> >> >> >>> >> >> >>> not
>>>>>> >> >> >>> >> >> >>> and
>>>>>> >> >> >>> >> >> >>> check
>>>>>> >> >> >>> >> >> >>> how many nodes are available, check some of the
>>>>>> >> >> >>> >> >> >>> hadoop
>>>>>> >> >> >>> >> >> >>> shell
>>>>>> >> >> >>> >> >> >>> commands
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>>>>>> >> >> >>> >> >> >>> run
>>>>>> >> >> >>> >> >> >>> example mapreduce task on hadoop take example from
>>>>>> >> >> >>> >> >> >>> here
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>> : http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>> if all the above you can do sucessfully means
>>>>>> >> >> >>> >> >> >>> hadoop is
>>>>>> >> >> >>> >> >> >>> configured
>>>>>> >> >> >>> >> >> >>> correctly
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>> Regards
>>>>>> >> >> >>> >> >> >>> Shashwat
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>>>>>> >> >> >>> >> >> >>> <ba...@gmail.com>
>>>>>> >> >> >>> >> >> >>> wrote:
>>>>>> >> >> >>> >> >> >>>>
>>>>>> >> >> >>> >> >> >>>> no, I'm not working on CDH. Is there a way to test
>>>>>> >> >> >>> >> >> >>>> if my
>>>>>> >> >> >>> >> >> >>>> Hadoop
>>>>>> >> >> >>> >> >> >>>> works
>>>>>> >> >> >>> >> >> >>>> fine
>>>>>> >> >> >>> >> >> >>>> or not?
>>>>>> >> >> >>> >> >> >>>>
>>>>>> >> >> >>> >> >> >>>>
>>>>>> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
>>>>>> >> >> >>> >> >> >>>> <be...@yahoo.com>
>>>>>> >> >> >>> >> >> >>>> wrote:
>>>>>> >> >> >>> >> >> >>>>>
>>>>>> >> >> >>> >> >> >>>>> Hi Babak
>>>>>> >> >> >>> >> >> >>>>>
>>>>>> >> >> >>> >> >> >>>>> You gotta follow those instructions on the Apache
>>>>>> >> >> >>> >> >> >>>>> site to set up Hadoop
>>>>>> >> >> >>> >> >> >>>>> from scratch and ensure that hdfs is working
>>>>>> >> >> >>> >> >> >>>>> first. You
>>>>>> >> >> >>> >> >> >>>>> should
>>>>>> >> >> >>> >> >> >>>>> be
>>>>>> >> >> >>> >> >> >>>>> able to
>>>>>> >> >> >>> >> >> >>>>> read and write files to hdfs before you do your
>>>>>> >> >> >>> >> >> >>>>> next
>>>>>> >> >> >>> >> >> >>>>> steps.
>>>>>> >> >> >>> >> >> >>>>>
>>>>>> >> >> >>> >> >> >>>>> Are you on CDH or apache distribution of hadoop?
>>>>>> >> >> >>> >> >> >>>>> If it
>>>>>> >> >> >>> >> >> >>>>> is
>>>>>> >> >> >>> >> >> >>>>> CDH
>>>>>> >> >> >>> >> >> >>>>> there
>>>>>> >> >> >>> >> >> >>>>> are
>>>>>> >> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
>>>>>> >> >> >>> >> >> >>>>>
>>>>>> >> >> >>> >> >> >>>>> Regards
>>>>>> >> >> >>> >> >> >>>>> Bejoy KS
>>>>>> >> >> >>> >> >> >>>>>
>>>>>> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
>>>>>> >> >> >>> >> >> >>>>> ________________________________
>>>>>> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>>>>>> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>>>>>> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
>>>>>> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
>>>>>> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
>>>>>> >> >> >>> >> >> >>>>>
>>>>>> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
>>>>>> >> >> >>> >> >> >>>>> core-site.xml
>>>>>> >> >> >>> >> >> >>>>> and
>>>>>> >> >> >>> >> >> >>>>> I
>>>>>> >> >> >>> >> >> >>>>> did
>>>>>> >> >> >>> >> >> >>>>> all
>>>>>> >> >> >>> >> >> >>>>> of
>>>>>> >> >> >>> >> >> >>>>> thing that was mentioned in the reference but no
>>>>>> >> >> >>> >> >> >>>>> effect
>>>>>> >> >> >>> >> >> >>>>>
>>>>>> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>>>>>> >> >> >>> >> >> >>>>> <ba...@gmail.com>
>>>>>> >> >> >>> >> >> >>>>> wrote:
>>>>>> >> >> >>> >> >> >>>>>>
>>>>>> >> >> >>> >> >> >>>>>> OK, sorry, but that was my mistake. I thought it
>>>>>> >> >> >>> >> >> >>>>>> worked, but no. I wrote the command without the
>>>>>> >> >> >>> >> >> >>>>>> ';' and then I thought it worked, but with the
>>>>>> >> >> >>> >> >> >>>>>> ';' at the end of the command
>>>>>> >> >> >>> >> >> >>>>>>
>>>>>> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>>>>>> >> >> >>> >> >> >>>>>>
>>>>>> >> >> >>> >> >> >>>>>> doesn't work
>>>>>> >> >> >>> >> >> >>>>>>
>>>>>> >> >> >>> >> >> >>>>>>
>>>>>> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat
>>>>>> >> >> >>> >> >> >>>>>> shriparv
>>>>>> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
>>>>>> >> >> >>> >> >> >>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>> inside configuration. all properties will be
>>>>>> >> >> >>> >> >> >>>>>>> inside
>>>>>> >> >> >>> >> >> >>>>>>> the
>>>>>> >> >> >>> >> >> >>>>>>> configuration
>>>>>> >> >> >>> >> >> >>>>>>> tags
>>>>>> >> >> >>> >> >> >>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>>>>>> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
>>>>>> >> >> >>> >> >> >>>>>>> wrote:
>>>>>> >> >> >>> >> >> >>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>> Thank you so much, my friend, your idea works
>>>>>> >> >> >>> >> >> >>>>>>>> fine (no error), you are the best :)
>>>>>> >> >> >>> >> >> >>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>>>>>> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
>>>>>> >> >> >>> >> >> >>>>>>>> wrote:
>>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>> It must be inside the
>>>>>> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
>>>>>> >> >> >>> >> >> >>>>>>>>> or
>>>>>> >> >> >>> >> >> >>>>>>>>> outside
>>>>>> >> >> >>> >> >> >>>>>>>>> this?
>>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat
>>>>>> >> >> >>> >> >> >>>>>>>>> shriparv
>>>>>> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
>>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak
>>>>>> >> >> >>> >> >> >>>>>>>>>> Bastan
>>>>>> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
>>>>>> >> >> >>> >> >> >>>>>>>>>> wrote:
>>>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>> Thanks, Shashwat, and where is this
>>>>>> >> >> >>> >> >> >>>>>>>>>>> hive-site.xml?
>>>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat
>>>>>> >> >> >>> >> >> >>>>>>>>>>> shriparv
>>>>>> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> set
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive-site.xml
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> <property>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> </property>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> <property>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <name>hive.metastore.warehouse.dir</name>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <description>location of
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> default
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> database
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> for
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> the
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>        </property>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> Bastan
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> test
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Table
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> in
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hive
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> get
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING,
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Content
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exception:
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> from
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> --
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> ∞
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>> --
>>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>> ∞
>>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
>>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>> --
>>>>>> >> >> >>> >> >> >>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>> ∞
>>>>>> >> >> >>> >> >> >>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
>>>>>> >> >> >>> >> >> >>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>>
>>>>>> >> >> >>> >> >> >>>>>>
>>>>>> >> >> >>> >> >> >>>>>
>>>>>> >> >> >>> >> >> >>>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>> --
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>> ∞
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>> Shashwat Shriparv
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>>
>>>>>> >> >> >>> >> >> >>
>>>>>> >> >> >>> >> >
>>>>>> >> >> >>> >> >
>>>>>> >> >> >>> >
>>>>>> >> >> >>> >
>>>>>> >> >> >>
>>>>>> >> >> >>
>>>>>> >> >
>>>>>> >> >
>>>>>> >
>>>>>> >
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>>
>>>>
>>>> ∞
>>>>
>>>> Shashwat Shriparv
>>>>
>>>>
>>>
>>
>

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
By the way, you are a very nice man, my friend. Thank you so much :)

What do you mean about this post on Stack Overflow?

I am assuming that this is your first installation of Hadoop.

At the beginning please check if your daemons are working. To do that use
(in terminal):

jps

If only Jps appears, that means all daemons are down. Please check the log
files, especially the namenode's. The log folder is probably at
/usr/lib/hadoop/logs

If you have permission problems, use this guide during the installation.

Good installation
guide<http://blog.timmattison.com/archives/2011/12/23/how-to-install-hadoop-on-debian-ubuntu/>

I am shooting in the dark with these explanations, but these are the most
common problems.
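The advice above (check the daemons with jps, read the namenode log, fix the permissions on the logs directory) can be sketched as shell commands. This is a rough sketch only; the HADOOP_HOME path is an assumption taken from the paths quoted in this thread (~/Downloads/hadoop), so adjust it to your own install:

```shell
# Assumption: Hadoop was unpacked to ~/Downloads/hadoop, as in this thread.
HADOOP_HOME="${HADOOP_HOME:-$HOME/Downloads/hadoop}"

# 1. List running Java processes. A healthy pseudo-distributed node shows
#    NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker in
#    addition to Jps; only "Jps" means every daemon is down.
jps 2>/dev/null || echo "jps not found - is the JDK on your PATH?"

# 2. "Keine Berechtigung" (permission denied) on mkdir means the start
#    scripts cannot create the logs directory; create it yourself with
#    write access for your user before running start-dfs.sh again.
mkdir -p "$HADOOP_HOME/logs"
chmod 755 "$HADOOP_HOME/logs"

# 3. The real reason a daemon did not start is usually in its log file.
tail -n 20 "$HADOOP_HOME"/logs/hadoop-*-namenode-*.log 2>/dev/null || true
```

If the hadoop directory itself is owned by another user, take ownership first with `sudo chown -R $USER ~/Downloads/hadoop` and then re-run bin/start-dfs.sh.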

On Wed, Jun 6, 2012 at 10:15 PM, Babak Bastan <ba...@gmail.com> wrote:

> I checked it but no hadoop folder :(
> yes, you are right. I'm a student and I want to build a very simple
> Hive program, but until now hmmmmmmmmm
>
>
> On Wed, Jun 6, 2012 at 10:12 PM, Babak Bastan <ba...@gmail.com> wrote:
>
>> not just one error:
>> e.g. if I run this one
>>
>> *hostname --fqdn*
>>
>> with the hosts file that I sent to you:
>>
>> *127.0.0.1       localhost*
>> *#127.0.0.1      ubuntu.ubuntu-domain    ubuntu*
>> *# The following lines are desirable for IPv6 capable hosts*
>> *#::1     ip6-localhost ip6-loopback*
>> *#fe00::0 ip6-localnet*
>> *#ff00::0 ip6-mcastprefix*
>> *#ff02::1 ip6-allnodes*
>> *#ff02::2 ip6-allrouters*
>>
>> I get this error:
>>
>> *hostname: Name or service not known*
>>
>> Or in the second step by this command:
>>
>> *babak@ubuntu:~/Downloads/hadoop/bin$ start-hdfs.sh*
>>
>> these lines of error:
>>
>>
>> mkdir: cannot create directory '/home/babak/Downloads/hadoop/bin/../logs':
>> Permission denied
>> starting namenode, logging to
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out
>> /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out:
>> No such file or directory
>> head: cannot open
>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out'
>> for reading: No such file or directory
>> localhost: mkdir: cannot create directory
>> '/home/babak/Downloads/hadoop/bin/../logs': Permission denied
>> localhost: starting datanode, logging to
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out
>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out:
>> No such file or directory
>> localhost: head: cannot open
>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out'
>> for reading: No such file or directory
>> localhost: mkdir: cannot create directory
>> '/home/babak/Downloads/hadoop/bin/../logs': Permission denied
>> localhost: starting secondarynamenode, logging to
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out
>> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: line 117:
>> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out:
>> No such file or directory
>> localhost: head: cannot open
>> '/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out'
>> for reading: No such file or directory
>>
>> they say there is no permission to make logs in this
>> path: /home/babak/Downloads/hadoop/bin/../logs
>>
>> and generally I can't create a table in Hive, and I get this error:
>>
>> FAILED: Error in metadata: MetaException(message:Got exception: java.io.FileNotFoundException File file:/user/hive/warehouse/test does not exist.)
>> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
>>
>> On Wed, Jun 6, 2012 at 10:02 PM, shashwat shriparv <
>> dwivedishashwat@gmail.com> wrote:
>>
>>> what's the error, Babak???
>>>
>>>
>>> On Thu, Jun 7, 2012 at 1:25 AM, Babak Bastan <ba...@gmail.com> wrote:
>>>
>>>> What the hell is that? I see no log folder there
>>>>
>>>>
>>>> On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>>>
>>>>> go to your HADOOP_HOME, i.e. your hadoop directory (the one that
>>>>> includes bin, conf etc.)..you can find the logs directory there..
>>>>>
>>>>> Regards,
>>>>>     Mohammad Tariq
>>>>>
>>>>>
>>>>> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <ba...@gmail.com>
>>>>> wrote:
>>>>> > how can I get my log, Mohammad?
>>>>> >
>>>>> >
>>>>> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <do...@gmail.com>
>>>>> wrote:
>>>>> >>
>>>>> >> could you post your logs???that would help me in understanding the
>>>>> >> problem properly.
>>>>> >>
>>>>> >> Regards,
>>>>> >>     Mohammad Tariq
>>>>> >>
>>>>> >>
>>>>> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <ba...@gmail.com>
>>>>> wrote:
>>>>> >> > Thank you very much mohamad for your attention.I followed the
>>>>> steps but
>>>>> >> > the
>>>>> >> > error is the same as the last time.
>>>>> >> > and there is my hosts file:
>>>>> >> >
>>>>> >> > 127.0.0.1       localhost
>>>>> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>>>>> >> >
>>>>> >> >
>>>>> >> > # The following lines are desirable for IPv6 capable hosts
>>>>> >> >
>>>>> >> > #::1     ip6-localhost ip6-loopback
>>>>> >> > #fe00::0 ip6-localnet
>>>>> >> > #ff00::0 ip6-mcastprefix
>>>>> >> > #ff02::1 ip6-allnodes
>>>>> >> > #ff02::2 ip6-allrouters
>>>>> >> >
>>>>> >> > but no effect :(
>>>>> >> >
>>>>> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq <
>>>>> dontariq@gmail.com>
>>>>> >> > wrote:
>>>>> >> >>
>>>>> >> >> also change the permissions of these directories to 777.
>>>>> >> >>
>>>>> >> >> Regards,
>>>>> >> >>     Mohammad Tariq
>>>>> >> >>
>>>>> >> >>
>>>>> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <
>>>>> dontariq@gmail.com>
>>>>> >> >> wrote:
>>>>> >> >> > create a directory "/home/username/hdfs" (or at some place of
>>>>> your
>>>>> >> >> > choice)..inside this hdfs directory create three sub
>>>>> directories -
>>>>> >> >> > name, data, and temp, then follow these steps :
>>>>> >> >> >
>>>>> >> >> > add following properties in your core-site.xml -
>>>>> >> >> >
>>>>> >> >> > <property>
>>>>> >> >> >          <name>fs.default.name</name>
>>>>> >> >> >          <value>hdfs://localhost:9000/</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> >        <property>
>>>>> >> >> >          <name>hadoop.tmp.dir</name>
>>>>> >> >> >          <value>/home/mohammad/hdfs/temp</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> > then add following two properties in your hdfs-site.xml -
>>>>> >> >> >
>>>>> >> >> > <property>
>>>>> >> >> >                <name>dfs.replication</name>
>>>>> >> >> >                <value>1</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> >        <property>
>>>>> >> >> >                <name>dfs.name.dir</name>
>>>>> >> >> >                <value>/home/mohammad/hdfs/name</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> >        <property>
>>>>> >> >> >                <name>dfs.data.dir</name>
>>>>> >> >> >                <value>/home/mohammad/hdfs/data</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> > finally add this property in your mapred-site.xml -
>>>>> >> >> >
>>>>> >> >> >       <property>
>>>>> >> >> >          <name>mapred.job.tracker</name>
>>>>> >> >> >          <value>hdfs://localhost:9001</value>
>>>>> >> >> >        </property>
>>>>> >> >> >
>>>>> >> >> > NOTE: you can give any name to these directories of your
>>>>> choice, just
>>>>> >> >> > keep in mind you have to give same names as values of
>>>>> >> >> >           above specified properties in your configuration
>>>>> files.
>>>>> >> >> > (give full path of these directories, not just the name of the
>>>>> >> >> > directory)
>>>>> >> >> >
>>>>> >> >> > After this  follow the steps provided in the previous reply.
>>>>> >> >> >
>>>>> >> >> > Regards,
>>>>> >> >> >     Mohammad Tariq
>>>>> >> >> >
>>>>> >> >> >
>>>>> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <
>>>>> babakbsn@gmail.com>
>>>>> >> >> > wrote:
>>>>> >> >> >> thank's Mohammad
>>>>> >> >> >>
>>>>> >> >> >> with this command:
>>>>> >> >> >>
>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>>>>> >> >> >>
>>>>> >> >> >> this is my output:
>>>>> >> >> >>
>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>>>>> >> >> >> /************************************************************
>>>>> >> >> >> STARTUP_MSG: Starting NameNode
>>>>> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>>>>> >> >> >> STARTUP_MSG:   args = [-format]
>>>>> >> >> >> STARTUP_MSG:   version = 0.20.2
>>>>> >> >> >> STARTUP_MSG:   build =
>>>>> >> >> >>
>>>>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>>>>> >> >> >> -r
>>>>> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>>>>> >> >> >> ************************************************************/
>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>>> >> >> >>
>>>>> >> >> >>
>>>>> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>>> supergroup=supergroup
>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>>> >> >> >> isPermissionEnabled=true
>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95
>>>>> saved
>>>>> >> >> >> in 0
>>>>> >> >> >> seconds.
>>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
>>>>> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
>>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>>>>> >> >> >> /************************************************************
>>>>> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>>>>> >> >> >> ************************************************************/
>>>>> >> >> >>
>>>>> >> >> >> by this command:
>>>>> >> >> >>
>>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>>>>> >> >> >>
>>>>> >> >> >> this is the out put
>>>>> >> >> >>
>>>>> >> >> >> mkdir: kann Verzeichnis
>>>>> „/home/babak/Downloads/hadoop/bin/../logs“
>>>>> >> >> >> nicht
>>>>> >> >> >> anlegen: Keine Berechtigung
>>>>> >> >> >>
>>>>> >> >> >> this out put(it's in german and it means no right to make this
>>>>> >> >> >> folder)
>>>>> >> >> >>
>>>>> >> >> >>
>>>>> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq <
>>>>> dontariq@gmail.com>
>>>>> >> >> >> wrote:
>>>>> >> >> >>>
>>>>> >> >> >>> once we are done with the configuration, we need to format
>>>>> the file
>>>>> >> >> >>> system..use this command to do that-
>>>>> >> >> >>> bin/hadoop namenode -format
>>>>> >> >> >>>
>>>>> >> >> >>> after this, hadoop daemon processes should be started using
>>>>> >> >> >>> following
>>>>> >> >> >>> commands -
>>>>> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
>>>>> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
>>>>> >> >> >>>
>>>>> >> >> >>> after this use jps to check if everything is alright or
>>>>> point your
>>>>> >> >> >>> browser to localhost:50070..if you further find any problem
>>>>> provide
>>>>> >> >> >>> us
>>>>> >> >> >>> with the error logs..:)
>>>>> >> >> >>>
>>>>> >> >> >>> Regards,
>>>>> >> >> >>>     Mohammad Tariq
>>>>> >> >> >>>
>>>>> >> >> >>>
>>>>> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <
>>>>> babakbsn@gmail.com>
>>>>> >> >> >>> wrote:
>>>>> >> >> >>> > were you able to format hdfs properly???
>>>>> >> >> >>> > I didn't get your question. Do you mean HADOOP_HOME? Or
>>>>> where did
>>>>> >> >> >>> > I
>>>>> >> >> >>> > install
>>>>> >> >> >>> > Hadoop?
>>>>> >> >> >>> >
>>>>> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
>>>>> >> >> >>> > <do...@gmail.com>
>>>>> >> >> >>> > wrote:
>>>>> >> >> >>> >>
>>>>> >> >> >>> >> if you are getting only this, it means your hadoop is not
>>>>> >> >> >>> >> running..were you able to format hdfs properly???
>>>>> >> >> >>> >>
>>>>> >> >> >>> >> Regards,
>>>>> >> >> >>> >>     Mohammad Tariq
>>>>> >> >> >>> >>
>>>>> >> >> >>> >>
>>>>> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
>>>>> >> >> >>> >> <ba...@gmail.com>
>>>>> >> >> >>> >> wrote:
>>>>> >> >> >>> >> > Hi Mohammad, if I run jps in my shell I can see this result:
>>>>> >> >> >>> >> > 2213 Jps
>>>>> >> >> >>> >> >
>>>>> >> >> >>> >> >
>>>>> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
>>>>> >> >> >>> >> > <do...@gmail.com>
>>>>> >> >> >>> >> > wrote:
>>>>> >> >> >>> >> >>
>>>>> >> >> >>> >> >> you can also use "jps" command at your shell to see
>>>>> whether
>>>>> >> >> >>> >> >> Hadoop
>>>>> >> >> >>> >> >> processes are running or not.
>>>>> >> >> >>> >> >>
>>>>> >> >> >>> >> >> Regards,
>>>>> >> >> >>> >> >>     Mohammad Tariq
>>>>> >> >> >>> >> >>
>>>>> >> >> >>> >> >>
>>>>> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
>>>>> >> >> >>> >> >> <do...@gmail.com>
>>>>> >> >> >>> >> >> wrote:
>>>>> >> >> >>> >> >> > Hi Babak,
>>>>> >> >> >>> >> >> >
>>>>> >> >> >>> >> >> >  You have to type it in your web browser..Hadoop
>>>>> provides us
>>>>> >> >> >>> >> >> > a
>>>>> >> >> >>> >> >> > web
>>>>> >> >> >>> >> >> > GUI
>>>>> >> >> >>> >> >> > that not only allows us to browse through the file
>>>>> system,
>>>>> >> >> >>> >> >> > but
>>>>> >> >> >>> >> >> > to
>>>>> >> >> >>> >> >> > download the files as well..Apart from that it also
>>>>> >> >> >>> >> >> > provides a
>>>>> >> >> >>> >> >> > web
>>>>> >> >> >>> >> >> > GUI
>>>>> >> >> >>> >> >> > that can be used to see the status of Jobtracker and
>>>>> >> >> >>> >> >> > Tasktracker..When
>>>>> >> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you
>>>>> can point
>>>>> >> >> >>> >> >> > your
>>>>> >> >> >>> >> >> > browser to http://localhost:50030 to see the status
>>>>> and
>>>>> >> >> >>> >> >> > logs
>>>>> >> >> >>> >> >> > of
>>>>> >> >> >>> >> >> > your
>>>>> >> >> >>> >> >> > job.
>>>>> >> >> >>> >> >> >
>>>>> >> >> >>> >> >> > Regards,
>>>>> >> >> >>> >> >> >     Mohammad Tariq
>>>>> >> >> >>> >> >> >
>>>>> >> >> >>> >> >> >
>>>>> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
>>>>> >> >> >>> >> >> > <ba...@gmail.com>
>>>>> >> >> >>> >> >> > wrote:
>>>>> >> >> >>> >> >> >> Thank you shashwat for the answer,
>>>>> >> >> >>> >> >> >> where should I type http://localhost:50070?
>>>>> >> >> >>> >> >> >> I typed here: hive>http://localhost:50070 but
>>>>> nothing as
>>>>> >> >> >>> >> >> >> result
>>>>> >> >> >>> >> >> >>
>>>>> >> >> >>> >> >> >>
>>>>> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>>>>> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> first type http://localhost:50070 whether this is
>>>>> opening
>>>>> >> >> >>> >> >> >>> or
>>>>> >> >> >>> >> >> >>> not
>>>>> >> >> >>> >> >> >>> and
>>>>> >> >> >>> >> >> >>> check
>>>>> >> >> >>> >> >> >>> how many nodes are available, check some of the
>>>>> hadoop
>>>>> >> >> >>> >> >> >>> shell
>>>>> >> >> >>> >> >> >>> commands
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> from
>>>>> http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>>>>> >> >> >>> >> >> >>> run
>>>>> >> >> >>> >> >> >>> example mapreduce task on hadoop take example from
>>>>> here
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> :
>>>>> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> if all the above you can do successfully means
>>>>> hadoop is
>>>>> >> >> >>> >> >> >>> configured
>>>>> >> >> >>> >> >> >>> correctly
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> Regards
>>>>> >> >> >>> >> >> >>> Shashwat
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>>>>> >> >> >>> >> >> >>> <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>> wrote:
>>>>> >> >> >>> >> >> >>>>
>>>>> >> >> >>> >> >> >>>> no I'm not working on CDH. Is there a way to test
>>>>> if my
>>>>> >> >> >>> >> >> >>>> Hadoop
>>>>> >> >> >>> >> >> >>>> works
>>>>> >> >> >>> >> >> >>>> fine
>>>>> >> >> >>> >> >> >>>> or not?
>>>>> >> >> >>> >> >> >>>>
>>>>> >> >> >>> >> >> >>>>
>>>>> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
>>>>> >> >> >>> >> >> >>>> <be...@yahoo.com>
>>>>> >> >> >>> >> >> >>>> wrote:
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> Hi Babak
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> You gotta follow those instructions on the apache
>>>>> site
>>>>> >> >> >>> >> >> >>>>> to
>>>>> >> >> >>> >> >> >>>>> set
>>>>> >> >> >>> >> >> >>>>> up
>>>>> >> >> >>> >> >> >>>>> hadoop
>>>>> >> >> >>> >> >> >>>>> from scratch and ensure that hdfs is working
>>>>> first. You
>>>>> >> >> >>> >> >> >>>>> should
>>>>> >> >> >>> >> >> >>>>> be
>>>>> >> >> >>> >> >> >>>>> able to
>>>>> >> >> >>> >> >> >>>>> read and write files to hdfs before you do your
>>>>> next
>>>>> >> >> >>> >> >> >>>>> steps.
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> Are you on CDH or apache distribution of hadoop?
>>>>> If it
>>>>> >> >> >>> >> >> >>>>> is
>>>>> >> >> >>> >> >> >>>>> CDH
>>>>> >> >> >>> >> >> >>>>> there
>>>>> >> >> >>> >> >> >>>>> are
>>>>> >> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> Regards
>>>>> >> >> >>> >> >> >>>>> Bejoy KS
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
>>>>> >> >> >>> >> >> >>>>> ________________________________
>>>>> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>>>>> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
>>>>> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
>>>>> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
>>>>> core-site.xml
>>>>> >> >> >>> >> >> >>>>> and
>>>>> >> >> >>> >> >> >>>>> I
>>>>> >> >> >>> >> >> >>>>> did
>>>>> >> >> >>> >> >> >>>>> all
>>>>> >> >> >>> >> >> >>>>> of
>>>>> >> >> >>> >> >> >>>>> thing that was mentioned in the reference but no
>>>>> effect
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>>>>> >> >> >>> >> >> >>>>> <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>>>> wrote:
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>> Ok sorry but that was my mistake. I thought it
>>>>> works
>>>>> >> >> >>> >> >> >>>>>> but
>>>>> >> >> >>> >> >> >>>>>> no.
>>>>> >> >> >>> >> >> >>>>>> I wrote the command without ; and then I think
>>>>> It
>>>>> >> >> >>> >> >> >>>>>> works
>>>>> >> >> >>> >> >> >>>>>> but
>>>>> >> >> >>> >> >> >>>>>> with
>>>>> >> >> >>> >> >> >>>>>> ;
>>>>> >> >> >>> >> >> >>>>>> at
>>>>> >> >> >>> >> >> >>>>>> the end of command
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>> doesn't work
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat
>>>>> shriparv
>>>>> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>> inside configuration. all properties will be
>>>>> inside
>>>>> >> >> >>> >> >> >>>>>>> the
>>>>> >> >> >>> >> >> >>>>>>> configuration
>>>>> >> >> >>> >> >> >>>>>>> tags
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>>>>> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>>>>>> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>> Thank you so much my friend your idee works
>>>>> fine(no
>>>>> >> >> >>> >> >> >>>>>>>> error)
>>>>> >> >> >>> >> >> >>>>>>>> you
>>>>> >> >> >>> >> >> >>>>>>>> are
>>>>> >> >> >>> >> >> >>>>>>>> the best :)
>>>>> >> >> >>> >> >> >>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>>>>> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>>>>>>> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>> It must be inside the
>>>>> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
>>>>> >> >> >>> >> >> >>>>>>>>> or
>>>>> >> >> >>> >> >> >>>>>>>>> outside
>>>>> >> >> >>> >> >> >>>>>>>>> this?
>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat
>>>>> shriparv
>>>>> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak
>>>>> Bastan
>>>>> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
>>>>> >> >> >>> >> >> >>>>>>>>>> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>> Thanks Shashwat, and where is this
>>>>> hive-site.xml
>>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat
>>>>> shriparv
>>>>> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> set
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in
>>>>> hive-site.xml
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> <property>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> </property>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
>>>>> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>                <description>location of
>>>>> default
>>>>> >> >> >>> >> >> >>>>>>>>>>>> database
>>>>> >> >> >>> >> >> >>>>>>>>>>>> for
>>>>> >> >> >>> >> >> >>>>>>>>>>>> the
>>>>> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>        </property>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak
>>>>> Bastan
>>>>> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
>>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a
>>>>> test
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Table
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> in
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hive
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> I
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> get
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url
>>>>> STRING,
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Content
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exception:
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1
>>>>> from
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>>>>> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> --
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> ∞
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>> --
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>> ∞
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>> --
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>> ∞
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>>
>>>>> >> >> >>> >> >> >>>>>>
>>>>> >> >> >>> >> >> >>>>>
>>>>> >> >> >>> >> >> >>>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> --
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> ∞
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>> Shashwat Shriparv
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>>
>>>>> >> >> >>> >> >> >>
>>>>> >> >> >>> >> >
>>>>> >> >> >>> >> >
>>>>> >> >> >>> >
>>>>> >> >> >>> >
>>>>> >> >> >>
>>>>> >> >> >>
>>>>> >> >
>>>>> >> >
>>>>> >
>>>>> >
>>>>>
>>>>
>>>>
>>>
>>>
>>> --
>>>
>>>
>>> ∞
>>> Shashwat Shriparv
>>>
>>>
>>>
>>
>

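The recurring mkdir error quoted above ("Keine Berechtigung", i.e. no permission) means the user running start-dfs.sh cannot write inside the Hadoop installation directory, so the start scripts fail before they can create bin/../logs. A minimal shell sketch of the usual remedy, assuming Hadoop is unpacked under ~/Downloads/hadoop as in the thread (the path is an assumption, substitute your own):

```shell
# Take ownership of the Hadoop tree so the start scripts can create
# their logs directory. HADOOP_DIR is an assumption from the thread.
HADOOP_DIR="$HOME/Downloads/hadoop"
sudo chown -R "$USER":"$USER" "$HADOOP_DIR"

# Pre-create the logs directory and confirm it is writable.
mkdir -p "$HADOOP_DIR/logs"
chmod 755 "$HADOOP_DIR/logs"
[ -w "$HADOOP_DIR/logs" ] && echo "logs directory writable"
```

After this, bin/start-dfs.sh should be able to write its *.out log files instead of failing with "Datei oder Verzeichnis nicht gefunden".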
Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
I checked it but there is no hadoop folder :(
Yes, you are right. I'm a student and I just want to build a very simple
Hive program, but until now hmmmmmmmmm
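(For reference: the original "FAILED: Error in metadata" usually disappears once HDFS itself is healthy and the warehouse path exists. A hedged sketch of the standard preparation commands — the paths are the Hive defaults discussed in the thread, not something verified on this machine, and they assume `jps` already shows the NameNode and DataNode running:)

```shell
# Create the directories Hive expects in HDFS and make them
# group-writable. Run only after the HDFS daemons are up.
hadoop fs -mkdir /tmp
hadoop fs -mkdir /user/hive/warehouse
hadoop fs -chmod g+w /tmp
hadoop fs -chmod g+w /user/hive/warehouse
```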

On Wed, Jun 6, 2012 at 10:12 PM, Babak Bastan <ba...@gmail.com> wrote:

> no one error:
> i.e. if I run this one
>
> *hostname --fqdn*
>
>  with the hosts file that I sent to you:
>
> *127.0.0.1       localhost*
> *#127.0.0.1      ubuntu.ubuntu-domain    ubuntu*
> *# The following lines are desirable for IPv6 capable hosts*
> *#::1     ip6-localhost ip6-loopback*
> *#fe00::0 ip6-localnet*
> *#ff00::0 ip6-mcastprefix*
> *#ff02::1 ip6-allnodes*
> *#ff02::2 ip6-allrouters*
>
> I get this error:
>
> *hostname: Name or service not known*
>
> Or in the second step by this command:
>
> *babak@ubuntu:~/Downloads/hadoop/bin$ start-hdfs.sh*
>
> these lines of error:
>
>
> mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“ nicht
> anlegen: Keine Berechtigung
> starting namenode, logging to
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out
> /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile 117:
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out:
> Datei oder Verzeichnis nicht gefunden
> head:
> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out“
> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht gefunden
> localhost: mkdir: kann Verzeichnis
> „/home/babak/Downloads/hadoop/bin/../logs“ nicht anlegen: Keine Berechtigung
> localhost: starting datanode, logging to
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out
> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile 117:
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out:
> Datei oder Verzeichnis nicht gefunden
> localhost: head:
> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out“
> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht gefunden
> localhost: mkdir: kann Verzeichnis
> „/home/babak/Downloads/hadoop/bin/../logs“ nicht anlegen: Keine Berechtigung
> localhost: starting secondarynamenode, logging to
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out
> localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile 117:
> /home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out:
> Datei oder Verzeichnis nicht gefunden
> localhost: head:
> „/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out“
> kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht gefunden
>
> they say there is no permission to create logs in this
> path: /home/babak/Downloads/hadoop/bin/../logs
>
>  and generally I can't create a table in Hive and get this error:
>
> FAILED: Error in metadata: MetaException(message:Got exception: java.io.FileNotFoundException File file:/user/hive/warehouse/test does not exist.)
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
>
> On Wed, Jun 6, 2012 at 10:02 PM, shashwat shriparv <
> dwivedishashwat@gmail.com> wrote:
>
>> what's the error, Babak???
>>
>>
>> On Thu, Jun 7, 2012 at 1:25 AM, Babak Bastan <ba...@gmail.com> wrote:
>>
>>> What the hell is that? I see no logs folder there
>>>
>>>
>>> On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>>
>>>> go to your HADOOP_HOME i.e your hadoop directory(that includes bin,
>>>> conf etc)..you can find logs directory there..
>>>>
>>>> Regards,
>>>>     Mohammad Tariq
>>>>
>>>>
>>>> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <ba...@gmail.com>
>>>> wrote:
>>>> > how can I get my logs, Mohammad?
>>>> >
>>>> >
>>>> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <do...@gmail.com>
>>>> wrote:
>>>> >>
>>>> >> could you post your logs???that would help me in understanding the
>>>> >> problem properly.
>>>> >>
>>>> >> Regards,
>>>> >>     Mohammad Tariq
>>>> >>
>>>> >>
>>>> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <ba...@gmail.com>
>>>> wrote:
>>>> >> > Thank you very much Mohammad for your attention. I followed the
>>>> steps but
>>>> >> > the
>>>> >> > error is the same as the last time.
>>>> >> > and there is my hosts file:
>>>> >> >
>>>> >> > 127.0.0.1       localhost
>>>> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>>>> >> >
>>>> >> >
>>>> >> > # The following lines are desirable for IPv6 capable hosts
>>>> >> >
>>>> >> > #::1     ip6-localhost ip6-loopback
>>>> >> > #fe00::0 ip6-localnet
>>>> >> > #ff00::0 ip6-mcastprefix
>>>> >> > #ff02::1 ip6-allnodes
>>>> >> > #ff02::2 ip6-allrouters
>>>> >> >
>>>> >> > but no effect :(
>>>> >> >
>>>> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq <dontariq@gmail.com
>>>> >
>>>> >> > wrote:
>>>> >> >>
>>>> >> >> also change the permissions of these directories to 777.
>>>> >> >>
>>>> >> >> Regards,
>>>> >> >>     Mohammad Tariq
>>>> >> >>
>>>> >> >>
>>>> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <
>>>> dontariq@gmail.com>
>>>> >> >> wrote:
>>>> >> >> > create a directory "/home/username/hdfs" (or at some place of
>>>> your
>>>> >> >> > choice)..inside this hdfs directory create three sub
>>>> directories -
>>>> >> >> > name, data, and temp, then follow these steps :
>>>> >> >> >
>>>> >> >> > add following properties in your core-site.xml -
>>>> >> >> >
>>>> >> >> > <property>
>>>> >> >> >          <name>fs.default.name</name>
>>>> >> >> >          <value>hdfs://localhost:9000/</value>
>>>> >> >> >        </property>
>>>> >> >> >
>>>> >> >> >        <property>
>>>> >> >> >          <name>hadoop.tmp.dir</name>
>>>> >> >> >          <value>/home/mohammad/hdfs/temp</value>
>>>> >> >> >        </property>
>>>> >> >> >
>>>> >> >> > then add following two properties in your hdfs-site.xml -
>>>> >> >> >
>>>> >> >> > <property>
>>>> >> >> >                <name>dfs.replication</name>
>>>> >> >> >                <value>1</value>
>>>> >> >> >        </property>
>>>> >> >> >
>>>> >> >> >        <property>
>>>> >> >> >                <name>dfs.name.dir</name>
>>>> >> >> >                <value>/home/mohammad/hdfs/name</value>
>>>> >> >> >        </property>
>>>> >> >> >
>>>> >> >> >        <property>
>>>> >> >> >                <name>dfs.data.dir</name>
>>>> >> >> >                <value>/home/mohammad/hdfs/data</value>
>>>> >> >> >        </property>
>>>> >> >> >
>>>> >> >> > finally add this property in your mapred-site.xml -
>>>> >> >> >
>>>> >> >> >       <property>
>>>> >> >> >          <name>mapred.job.tracker</name>
>>>> >> >> >          <value>hdfs://localhost:9001</value>
>>>> >> >> >        </property>
>>>> >> >> >
>>>> >> >> > NOTE: you can give any name to these directories of your
>>>> choice, just
>>>> >> >> > keep in mind you have to give same names as values of
>>>> >> >> >           above specified properties in your configuration
>>>> files.
>>>> >> >> > (give full path of these directories, not just the name of the
>>>> >> >> > directory)
>>>> >> >> >
>>>> >> >> > After this  follow the steps provided in the previous reply.
>>>> >> >> >
>>>> >> >> > Regards,
>>>> >> >> >     Mohammad Tariq
>>>> >> >> >
>>>> >> >> >
>>>> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <
>>>> babakbsn@gmail.com>
>>>> >> >> > wrote:
>>>> >> >> >> thank's Mohammad
>>>> >> >> >>
>>>> >> >> >> with this command:
>>>> >> >> >>
>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>>>> >> >> >>
>>>> >> >> >> this is my output:
>>>> >> >> >>
>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>>>> >> >> >> /************************************************************
>>>> >> >> >> STARTUP_MSG: Starting NameNode
>>>> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>>>> >> >> >> STARTUP_MSG:   args = [-format]
>>>> >> >> >> STARTUP_MSG:   version = 0.20.2
>>>> >> >> >> STARTUP_MSG:   build =
>>>> >> >> >>
>>>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>>>> >> >> >> -r
>>>> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>>>> >> >> >> ************************************************************/
>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>> >> >> >>
>>>> >> >> >>
>>>> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>> supergroup=supergroup
>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>>> >> >> >> isPermissionEnabled=true
>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95
>>>> saved
>>>> >> >> >> in 0
>>>> >> >> >> seconds.
>>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
>>>> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
>>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>>>> >> >> >> /************************************************************
>>>> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>>>> >> >> >> ************************************************************/
>>>> >> >> >>
>>>> >> >> >> by this command:
>>>> >> >> >>
>>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>>>> >> >> >>
>>>> >> >> >> this is the out put
>>>> >> >> >>
>>>> >> >> >> mkdir: kann Verzeichnis
>>>> „/home/babak/Downloads/hadoop/bin/../logs“
>>>> >> >> >> nicht
>>>> >> >> >> anlegen: Keine Berechtigung
>>>> >> >> >>
>>>> >> >> >> this output (it's in German and it means no permission to
>>>> >> >> >> create this folder)
>>>> >> >> >>
>>>> >> >> >>
>>>> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq <
>>>> dontariq@gmail.com>
>>>> >> >> >> wrote:
>>>> >> >> >>>
>>>> >> >> >>> once we are done with the configuration, we need to format
>>>> the file
>>>> >> >> >>> system..use this command to do that-
>>>> >> >> >>> bin/hadoop namenode -format
>>>> >> >> >>>
>>>> >> >> >>> after this, hadoop daemon processes should be started using
>>>> >> >> >>> following
>>>> >> >> >>> commands -
>>>> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
>>>> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
>>>> >> >> >>>
>>>> >> >> >>> after this use jps to check if everything is alright or point
>>>> your
>>>> >> >> >>> browser to localhost:50070..if you further find any problem
>>>> provide
>>>> >> >> >>> us
>>>> >> >> >>> with the error logs..:)
>>>> >> >> >>>
>>>> >> >> >>> Regards,
>>>> >> >> >>>     Mohammad Tariq
>>>> >> >> >>>
>>>> >> >> >>>
>>>> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <
>>>> babakbsn@gmail.com>
>>>> >> >> >>> wrote:
>>>> >> >> >>> > were you able to format hdfs properly???
>>>> >> >> >>> > I didn't get your question. Do you mean HADOOP_HOME? Or
>>>> where did
>>>> >> >> >>> > I
>>>> >> >> >>> > install
>>>> >> >> >>> > Hadoop?
>>>> >> >> >>> >
>>>> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
>>>> >> >> >>> > <do...@gmail.com>
>>>> >> >> >>> > wrote:
>>>> >> >> >>> >>
>>>> >> >> >>> >> if you are getting only this, it means your hadoop is not
>>>> >> >> >>> >> running..were you able to format hdfs properly???
>>>> >> >> >>> >>
>>>> >> >> >>> >> Regards,
>>>> >> >> >>> >>     Mohammad Tariq
>>>> >> >> >>> >>
>>>> >> >> >>> >>
>>>> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
>>>> >> >> >>> >> <ba...@gmail.com>
>>>> >> >> >>> >> wrote:
>>>> >> >> >>> >> > Hi Mohammad, if I run jps in my shell I can see this result:
>>>> >> >> >>> >> > 2213 Jps
>>>> >> >> >>> >> >
>>>> >> >> >>> >> >
>>>> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
>>>> >> >> >>> >> > <do...@gmail.com>
>>>> >> >> >>> >> > wrote:
>>>> >> >> >>> >> >>
>>>> >> >> >>> >> >> you can also use "jps" command at your shell to see
>>>> whether
>>>> >> >> >>> >> >> Hadoop
>>>> >> >> >>> >> >> processes are running or not.
>>>> >> >> >>> >> >>
>>>> >> >> >>> >> >> Regards,
>>>> >> >> >>> >> >>     Mohammad Tariq
>>>> >> >> >>> >> >>
>>>> >> >> >>> >> >>
>>>> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
>>>> >> >> >>> >> >> <do...@gmail.com>
>>>> >> >> >>> >> >> wrote:
>>>> >> >> >>> >> >> > Hi Babak,
>>>> >> >> >>> >> >> >
>>>> >> >> >>> >> >> >  You have to type it in your web browser..Hadoop
>>>> provides us
>>>> >> >> >>> >> >> > a
>>>> >> >> >>> >> >> > web
>>>> >> >> >>> >> >> > GUI
>>>> >> >> >>> >> >> > that not only allows us to browse through the file
>>>> system,
>>>> >> >> >>> >> >> > but
>>>> >> >> >>> >> >> > to
>>>> >> >> >>> >> >> > download the files as well..Apart from that it also
>>>> >> >> >>> >> >> > provides a
>>>> >> >> >>> >> >> > web
>>>> >> >> >>> >> >> > GUI
>>>> >> >> >>> >> >> > that can be used to see the status of Jobtracker and
>>>> >> >> >>> >> >> > Tasktracker..When
>>>> >> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can
>>>> point
>>>> >> >> >>> >> >> > your
>>>> >> >> >>> >> >> > browser to http://localhost:50030 to see the status
>>>> and
>>>> >> >> >>> >> >> > logs
>>>> >> >> >>> >> >> > of
>>>> >> >> >>> >> >> > your
>>>> >> >> >>> >> >> > job.
>>>> >> >> >>> >> >> >
>>>> >> >> >>> >> >> > Regards,
>>>> >> >> >>> >> >> >     Mohammad Tariq
>>>> >> >> >>> >> >> >
>>>> >> >> >>> >> >> >
>>>> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
>>>> >> >> >>> >> >> > <ba...@gmail.com>
>>>> >> >> >>> >> >> > wrote:
>>>> >> >> >>> >> >> >> Thank you shashwat for the answer,
>>>> >> >> >>> >> >> >> where should I type http://localhost:50070?
>>>> >> >> >>> >> >> >> I typed here: hive>http://localhost:50070 but
>>>> nothing as
>>>> >> >> >>> >> >> >> result
>>>> >> >> >>> >> >> >>
>>>> >> >> >>> >> >> >>
>>>> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>>>> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>> first type http://localhost:50070 whether this is
>>>> opening
>>>> >> >> >>> >> >> >>> or
>>>> >> >> >>> >> >> >>> not
>>>> >> >> >>> >> >> >>> and
>>>> >> >> >>> >> >> >>> check
>>>> >> >> >>> >> >> >>> how many nodes are available, check some of the
>>>> hadoop
>>>> >> >> >>> >> >> >>> shell
>>>> >> >> >>> >> >> >>> commands
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>> from
>>>> http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>>>> >> >> >>> >> >> >>> run
>>>> >> >> >>> >> >> >>> example mapreduce task on hadoop take example from
>>>> here
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>> :
>>>> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>> if all the above you can do successfully means
>>>> hadoop is
>>>> >> >> >>> >> >> >>> configured
>>>> >> >> >>> >> >> >>> correctly
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>> Regards
>>>> >> >> >>> >> >> >>> Shashwat
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>>>> >> >> >>> >> >> >>> <ba...@gmail.com>
>>>> >> >> >>> >> >> >>> wrote:
>>>> >> >> >>> >> >> >>>>
>>>> >> >> >>> >> >> >>>> no I'm not working on CDH. Is there a way to test
>>>> if my
>>>> >> >> >>> >> >> >>>> Hadoop
>>>> >> >> >>> >> >> >>>> works
>>>> >> >> >>> >> >> >>>> fine
>>>> >> >> >>> >> >> >>>> or not?
>>>> >> >> >>> >> >> >>>>
>>>> >> >> >>> >> >> >>>>
>>>> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
>>>> >> >> >>> >> >> >>>> <be...@yahoo.com>
>>>> >> >> >>> >> >> >>>> wrote:
>>>> >> >> >>> >> >> >>>>>
>>>> >> >> >>> >> >> >>>>> Hi Babak
>>>> >> >> >>> >> >> >>>>>
>>>> >> >> >>> >> >> >>>>> You gotta follow those instructions on the apache
>>>> site
>>>> >> >> >>> >> >> >>>>> to
>>>> >> >> >>> >> >> >>>>> set
>>>> >> >> >>> >> >> >>>>> up
>>>> >> >> >>> >> >> >>>>> hadoop
>>>> >> >> >>> >> >> >>>>> from scratch and ensure that hdfs is working
>>>> first. You
>>>> >> >> >>> >> >> >>>>> should
>>>> >> >> >>> >> >> >>>>> be
>>>> >> >> >>> >> >> >>>>> able to
>>>> >> >> >>> >> >> >>>>> read and write files to hdfs before you do your
>>>> next
>>>> >> >> >>> >> >> >>>>> steps.
>>>> >> >> >>> >> >> >>>>>
>>>> >> >> >>> >> >> >>>>> Are you on CDH or apache distribution of hadoop?
>>>> If it
>>>> >> >> >>> >> >> >>>>> is
>>>> >> >> >>> >> >> >>>>> CDH
>>>> >> >> >>> >> >> >>>>> there
>>>> >> >> >>> >> >> >>>>> are
>>>> >> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
>>>> >> >> >>> >> >> >>>>>
>>>> >> >> >>> >> >> >>>>> Regards
>>>> >> >> >>> >> >> >>>>> Bejoy KS
>>>> >> >> >>> >> >> >>>>>
>>>> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
>>>> >> >> >>> >> >> >>>>> ________________________________
>>>> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>>>> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>>>> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
>>>> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
>>>> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
>>>> >> >> >>> >> >> >>>>>
>>>> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
>>>> core-site.xml
>>>> >> >> >>> >> >> >>>>> and
>>>> >> >> >>> >> >> >>>>> I
>>>> >> >> >>> >> >> >>>>> did
>>>> >> >> >>> >> >> >>>>> all
>>>> >> >> >>> >> >> >>>>> of
>>>> >> >> >>> >> >> >>>>> thing that was mentioned in the reference but no
>>>> effect
>>>> >> >> >>> >> >> >>>>>
>>>> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>>>> >> >> >>> >> >> >>>>> <ba...@gmail.com>
>>>> >> >> >>> >> >> >>>>> wrote:
>>>> >> >> >>> >> >> >>>>>>
>>>> >> >> >>> >> >> >>>>>> Ok sorry but that was my Mistake .I thought it
>>>> works
>>>> >> >> >>> >> >> >>>>>> but
>>>> >> >> >>> >> >> >>>>>> no.
>>>> >> >> >>> >> >> >>>>>> I wrote the command without ; and then I think It
>>>> >> >> >>> >> >> >>>>>> works
>>>> >> >> >>> >> >> >>>>>> but
>>>> >> >> >>> >> >> >>>>>> with
>>>> >> >> >>> >> >> >>>>>> ;
>>>> >> >> >>> >> >> >>>>>> at
>>>> >> >> >>> >> >> >>>>>> the end of command
>>>> >> >> >>> >> >> >>>>>>
>>>> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>>>> >> >> >>> >> >> >>>>>>
>>>> >> >> >>> >> >> >>>>>> does'nt work
>>>> >> >> >>> >> >> >>>>>>
>>>> >> >> >>> >> >> >>>>>>
>>>> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
>>>> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
>>>> >> >> >>> >> >> >>>>>>>
>>>> >> >> >>> >> >> >>>>>>> inside configuration. all properties will be
>>>> inside
>>>> >> >> >>> >> >> >>>>>>> the
>>>> >> >> >>> >> >> >>>>>>> configuration
>>>> >> >> >>> >> >> >>>>>>> tags
>>>> >> >> >>> >> >> >>>>>>>
>>>> >> >> >>> >> >> >>>>>>>
>>>> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>>>> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
>>>> >> >> >>> >> >> >>>>>>> wrote:
>>>> >> >> >>> >> >> >>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>> Thank you so much my friend your idee works
>>>> fine(no
>>>> >> >> >>> >> >> >>>>>>>> error)
>>>> >> >> >>> >> >> >>>>>>>> you
>>>> >> >> >>> >> >> >>>>>>>> are
>>>> >> >> >>> >> >> >>>>>>>> the best :)
>>>> >> >> >>> >> >> >>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>>>> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
>>>> >> >> >>> >> >> >>>>>>>> wrote:
>>>> >> >> >>> >> >> >>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>> It must be inside the
>>>> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
>>>> >> >> >>> >> >> >>>>>>>>> or
>>>> >> >> >>> >> >> >>>>>>>>> outside
>>>> >> >> >>> >> >> >>>>>>>>> this?
>>>> >> >> >>> >> >> >>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat
>>>> shriparv
>>>> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>>>> >> >> >>> >> >> >>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
>>>> >> >> >>> >> >> >>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
>>>> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
>>>> >> >> >>> >> >> >>>>>>>>>> wrote:
>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>> Thanks sShashwat, and where is this
>>>> hive-site.xml
>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat
>>>> shriparv
>>>> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>> set
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in
>>>> hive-site.xml
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>> <property>
>>>> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>>>> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
>>>> >> >> >>> >> >> >>>>>>>>>>>> </property>
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>>>> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
>>>> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>>>> >> >> >>> >> >> >>>>>>>>>>>>                <description>location of
>>>> default
>>>> >> >> >>> >> >> >>>>>>>>>>>> database
>>>> >> >> >>> >> >> >>>>>>>>>>>> for
>>>> >> >> >>> >> >> >>>>>>>>>>>> the
>>>> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
>>>> >> >> >>> >> >> >>>>>>>>>>>>        </property>
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak
>>>> Bastan
>>>> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
>>>> >> >> >>> >> >> >>>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a test
>>>> >> >> >>> >> >> >>>>>>>>>>>>> Table
>>>> >> >> >>> >> >> >>>>>>>>>>>>> in
>>>> >> >> >>> >> >> >>>>>>>>>>>>> Hive
>>>> >> >> >>> >> >> >>>>>>>>>>>>> I
>>>> >> >> >>> >> >> >>>>>>>>>>>>> get
>>>> >> >> >>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
>>>> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url
>>>> STRING,
>>>> >> >> >>> >> >> >>>>>>>>>>>>> Content
>>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
>>>> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
>>>> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
>>>> >> >> >>> >> >> >>>>>>>>>>>>> exception:
>>>> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>>>> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>>>> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
>>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1
>>>> from
>>>> >> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>>>> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>> --
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>> ∞
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>> --
>>>> >> >> >>> >> >> >>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>> ∞
>>>> >> >> >>> >> >> >>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
>>>> >> >> >>> >> >> >>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>>
>>>> >> >> >>> >> >> >>>>>>>
>>>> >> >> >>> >> >> >>>>>>>
>>>> >> >> >>> >> >> >>>>>>>
>>>> >> >> >>> >> >> >>>>>>> --
>>>> >> >> >>> >> >> >>>>>>>
>>>> >> >> >>> >> >> >>>>>>>
>>>> >> >> >>> >> >> >>>>>>> ∞
>>>> >> >> >>> >> >> >>>>>>>
>>>> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
>>>> >> >> >>> >> >> >>>>>>>
>>>> >> >> >>> >> >> >>>>>>>
>>>> >> >> >>> >> >> >>>>>>
>>>> >> >> >>> >> >> >>>>>
>>>> >> >> >>> >> >> >>>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>> --
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>> ∞
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>> Shashwat Shriparv
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>>
>>>> >> >> >>> >> >> >>
>>>> >> >> >>> >> >
>>>> >> >> >>> >> >
>>>> >> >> >>> >
>>>> >> >> >>> >
>>>> >> >> >>
>>>> >> >> >>
>>>> >> >
>>>> >> >
>>>> >
>>>> >
>>>>
>>>
>>>
>>
>>
>> --
>>
>>
>> ∞
>> Shashwat Shriparv
>>
>>
>>
>

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
Not just one error. For example, if I run this:

*hostname --fqdn*

 with the hosts configuration that I sent to you:

*127.0.0.1       localhost*
*#127.0.0.1      ubuntu.ubuntu-domain    ubuntu*
*# The following lines are desirable for IPv6 capable hosts*
*#::1     ip6-localhost ip6-loopback*
*#fe00::0 ip6-localnet*
*#ff00::0 ip6-mcastprefix*
*#ff02::1 ip6-allnodes*
*#ff02::2 ip6-allrouters*

I get this error:

*hostname: Name or service not known*
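
The `hostname --fqdn` failure fits the hosts file above: the machine's
hostname (`ubuntu`) only appears in the commented-out line, so it cannot be
resolved. A minimal check, assuming a Linux system where `getent` is
available (uncommenting the `ubuntu.ubuntu-domain ubuntu` line is the usual
fix):

```shell
# Check whether the local hostname resolves; if it does not, /etc/hosts
# is missing an entry for it (e.g. the commented-out line above).
host_name="$(hostname)"
if getent hosts "$host_name" >/dev/null 2>&1; then
    status="resolves"
else
    status="does not resolve - add it to /etc/hosts"
fi
echo "hostname '$host_name': $status"
```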

Or, in the second step, with this command:

*babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh*

I get these error lines:


mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“ nicht
anlegen: Keine Berechtigung
starting namenode, logging to
/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out
/home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile 117:
/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out:
Datei oder Verzeichnis nicht gefunden
head:
„/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-namenode-ubuntu.out“
kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht gefunden
localhost: mkdir: kann Verzeichnis
„/home/babak/Downloads/hadoop/bin/../logs“ nicht anlegen: Keine Berechtigung
localhost: starting datanode, logging to
/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out
localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile 117:
/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out:
Datei oder Verzeichnis nicht gefunden
localhost: head:
„/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-datanode-ubuntu.out“
kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht gefunden
localhost: mkdir: kann Verzeichnis
„/home/babak/Downloads/hadoop/bin/../logs“ nicht anlegen: Keine Berechtigung
localhost: starting secondarynamenode, logging to
/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out
localhost: /home/babak/Downloads/hadoop/bin/hadoop-daemon.sh: Zeile 117:
/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out:
Datei oder Verzeichnis nicht gefunden
localhost: head:
„/home/babak/Downloads/hadoop/bin/../logs/hadoop-babak-secondarynamenode-ubuntu.out“
kann nicht zum Lesen geöffnet werden: Datei oder Verzeichnis nicht gefunden

They say there is no permission to create the logs directory at this
path: /home/babak/Downloads/hadoop/bin/../logs
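
The "Keine Berechtigung" (permission denied) lines mean the user cannot
create the log directory under the Hadoop install. A hedged sketch of a
pre-flight check, assuming the install lives at `~/Downloads/hadoop`
(adjust `HADOOP_HOME` to the real path):

```shell
# Verify the daemons will be able to create their log directory
# before running start-dfs.sh.
HADOOP_HOME="${HADOOP_HOME:-$HOME/Downloads/hadoop}"  # assumed install path
LOG_DIR="$HADOOP_HOME/logs"

if mkdir -p "$LOG_DIR" 2>/dev/null && [ -w "$LOG_DIR" ]; then
    echo "log directory is writable: $LOG_DIR"
else
    # Typical fix when the tarball was unpacked by another user:
    echo "no permission on $LOG_DIR; try: sudo chown -R \$USER \"$HADOOP_HOME\""
fi
```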

 And generally I can't create a table in Hive; I get this:

FAILED: Error in metadata: MetaException(message:Got exception:
java.io.FileNotFoundException File file:/user/hive/warehouse/test does
not exist.)
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask
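
The FileNotFoundException names a concrete missing directory: with
`fs.default.name` still at the default `file:///`, Hive looks for its
warehouse on the local filesystem. A minimal sketch of pre-creating a
user-writable warehouse, assuming `hive.metastore.warehouse.dir` is pointed
at it in hive-site.xml as suggested earlier in the thread (the path below is
an example, not Hive's default):

```shell
# Create the warehouse directory Hive's metastore expects.
# HIVE_WAREHOUSE_DIR is an assumed, user-writable location; Hive's
# default, /user/hive/warehouse, usually needs root to create locally.
WAREHOUSE="${HIVE_WAREHOUSE_DIR:-$HOME/hive/warehouse}"
mkdir -p "$WAREHOUSE"
chmod g+w "$WAREHOUSE"   # allow group writes for table data
echo "warehouse directory ready: $WAREHOUSE"
```

Once HDFS itself is running and `fs.default.name` points at it, the
equivalent directory would instead be created with `hadoop fs -mkdir`.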

On Wed, Jun 6, 2012 at 10:02 PM, shashwat shriparv <
dwivedishashwat@gmail.com> wrote:

> whats the error babak ???
>
>
> On Thu, Jun 7, 2012 at 1:25 AM, Babak Bastan <ba...@gmail.com> wrote:
>
>> What the hell is that?I see no log folder there
>>
>>
>> On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq <do...@gmail.com>wrote:
>>
>>> go to your HADOOP_HOME i.e your hadoop directory(that includes bin,
>>> conf etc)..you can find logs directory there..
>>>
>>> Regards,
>>>     Mohammad Tariq
>>>
>>>
>>> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <ba...@gmail.com> wrote:
>>> > hoe can I get my log mohammad?
>>> >
>>> >
>>> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <do...@gmail.com>
>>> wrote:
>>> >>
>>> >> could you post your logs???that would help me in understanding the
>>> >> problem properly.
>>> >>
>>> >> Regards,
>>> >>     Mohammad Tariq
>>> >>
>>> >>
>>> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <ba...@gmail.com>
>>> wrote:
>>> >> > Thank you very much mohamad for your attention.I followed the steps
>>> but
>>> >> > the
>>> >> > error is the same as the last time.
>>> >> > and there is my hosts file:
>>> >> >
>>> >> > 127.0.0.1       localhost
>>> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>>> >> >
>>> >> >
>>> >> > # The following lines are desirable for IPv6 capable hosts
>>> >> >
>>> >> > #::1     ip6-localhost ip6-loopback
>>> >> > #fe00::0 ip6-localnet
>>> >> > #ff00::0 ip6-mcastprefix
>>> >> > #ff02::1 ip6-allnodes
>>> >> > #ff02::2 ip6-allrouters
>>> >> >
>>> >> > but no effect :(
>>> >> >
>>> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq <do...@gmail.com>
>>> >> > wrote:
>>> >> >>
>>> >> >> also change the permissions of these directories to 777.
>>> >> >>
>>> >> >> Regards,
>>> >> >>     Mohammad Tariq
>>> >> >>
>>> >> >>
>>> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <
>>> dontariq@gmail.com>
>>> >> >> wrote:
>>> >> >> > create a directory "/home/username/hdfs" (or at some place of
>>> your
>>> >> >> > choice)..inside this hdfs directory create three sub directories
>>> -
>>> >> >> > name, data, and temp, then follow these steps :
>>> >> >> >
>>> >> >> > add following properties in your core-site.xml -
>>> >> >> >
>>> >> >> > <property>
>>> >> >> >          <name>fs.default.name</name>
>>> >> >> >          <value>hdfs://localhost:9000/</value>
>>> >> >> >        </property>
>>> >> >> >
>>> >> >> >        <property>
>>> >> >> >          <name>hadoop.tmp.dir</name>
>>> >> >> >          <value>/home/mohammad/hdfs/temp</value>
>>> >> >> >        </property>
>>> >> >> >
>>> >> >> > then add following two properties in your hdfs-site.xml -
>>> >> >> >
>>> >> >> > <property>
>>> >> >> >                <name>dfs.replication</name>
>>> >> >> >                <value>1</value>
>>> >> >> >        </property>
>>> >> >> >
>>> >> >> >        <property>
>>> >> >> >                <name>dfs.name.dir</name>
>>> >> >> >                <value>/home/mohammad/hdfs/name</value>
>>> >> >> >        </property>
>>> >> >> >
>>> >> >> >        <property>
>>> >> >> >                <name>dfs.data.dir</name>
>>> >> >> >                <value>/home/mohammad/hdfs/data</value>
>>> >> >> >        </property>
>>> >> >> >
>>> >> >> > finally add this property in your mapred-site.xml -
>>> >> >> >
>>> >> >> >       <property>
>>> >> >> >          <name>mapred.job.tracker</name>
>>> >> >> >          <value>hdfs://localhost:9001</value>
>>> >> >> >        </property>
>>> >> >> >
>>> >> >> > NOTE: you can give any name to these directories of your choice,
>>> just
>>> >> >> > keep in mind you have to give same names as values of
>>> >> >> >           above specified properties in your configuration files.
>>> >> >> > (give full path of these directories, not just the name of the
>>> >> >> > directory)
>>> >> >> >
>>> >> >> > After this  follow the steps provided in the previous reply.
>>> >> >> >
>>> >> >> > Regards,
>>> >> >> >     Mohammad Tariq
>>> >> >> >
>>> >> >> >
>>> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <
>>> babakbsn@gmail.com>
>>> >> >> > wrote:
>>> >> >> >> thank's Mohammad
>>> >> >> >>
>>> >> >> >> with this command:
>>> >> >> >>
>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>>> >> >> >>
>>> >> >> >> this is my output:
>>> >> >> >>
>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>>> >> >> >> /************************************************************
>>> >> >> >> STARTUP_MSG: Starting NameNode
>>> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>>> >> >> >> STARTUP_MSG:   args = [-format]
>>> >> >> >> STARTUP_MSG:   version = 0.20.2
>>> >> >> >> STARTUP_MSG:   build =
>>> >> >> >>
>>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>>> >> >> >> -r
>>> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>>> >> >> >> ************************************************************/
>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>> >> >> >>
>>> >> >> >>
>>> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>> supergroup=supergroup
>>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>>> >> >> >> isPermissionEnabled=true
>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95
>>> saved
>>> >> >> >> in 0
>>> >> >> >> seconds.
>>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
>>> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
>>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>>> >> >> >> /************************************************************
>>> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>>> >> >> >> ************************************************************/
>>> >> >> >>
>>> >> >> >> by this command:
>>> >> >> >>
>>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>>> >> >> >>
>>> >> >> >> this is the out put
>>> >> >> >>
>>> >> >> >> mkdir: kann Verzeichnis
>>> „/home/babak/Downloads/hadoop/bin/../logs“
>>> >> >> >> nicht
>>> >> >> >> anlegen: Keine Berechtigung
>>> >> >> >>
>>> >> >> >> this out put(it's in german and it means no right to make this
>>> >> >> >> folder)
>>> >> >> >>
>>> >> >> >>
>>> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq <
>>> dontariq@gmail.com>
>>> >> >> >> wrote:
>>> >> >> >>>
>>> >> >> >>> once we are done with the configuration, we need to format the
>>> file
>>> >> >> >>> system..use this command to do that-
>>> >> >> >>> bin/hadoop namenode -format
>>> >> >> >>>
>>> >> >> >>> after this, hadoop daemon processes should be started using
>>> >> >> >>> following
>>> >> >> >>> commands -
>>> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
>>> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
>>> >> >> >>>
>>> >> >> >>> after this use jps to check if everything is alright or point
>>> your
>>> >> >> >>> browser to localhost:50070..if you further find any problem
>>> provide
>>> >> >> >>> us
>>> >> >> >>> with the error logs..:)
>>> >> >> >>>
>>> >> >> >>> Regards,
>>> >> >> >>>     Mohammad Tariq
>>> >> >> >>>
>>> >> >> >>>
>>> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <
>>> babakbsn@gmail.com>
>>> >> >> >>> wrote:
>>> >> >> >>> > were you able to format hdfs properly???
>>> >> >> >>> > I did'nt get your question,Do you mean HADOOP_HOME? or where
>>> did
>>> >> >> >>> > I
>>> >> >> >>> > install
>>> >> >> >>> > Hadoop?
>>> >> >> >>> >
>>> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
>>> >> >> >>> > <do...@gmail.com>
>>> >> >> >>> > wrote:
>>> >> >> >>> >>
>>> >> >> >>> >> if you are getting only this, it means your hadoop is not
>>> >> >> >>> >> running..were you able to format hdfs properly???
>>> >> >> >>> >>
>>> >> >> >>> >> Regards,
>>> >> >> >>> >>     Mohammad Tariq
>>> >> >> >>> >>
>>> >> >> >>> >>
>>> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
>>> >> >> >>> >> <ba...@gmail.com>
>>> >> >> >>> >> wrote:
>>> >> >> >>> >> > Hi MohammadmI irun jps in my shel I can see this result:
>>> >> >> >>> >> > 2213 Jps
>>> >> >> >>> >> >
>>> >> >> >>> >> >
>>> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
>>> >> >> >>> >> > <do...@gmail.com>
>>> >> >> >>> >> > wrote:
>>> >> >> >>> >> >>
>>> >> >> >>> >> >> you can also use "jps" command at your shell to see
>>> whether
>>> >> >> >>> >> >> Hadoop
>>> >> >> >>> >> >> processes are running or not.
>>> >> >> >>> >> >>
>>> >> >> >>> >> >> Regards,
>>> >> >> >>> >> >>     Mohammad Tariq
>>> >> >> >>> >> >>
>>> >> >> >>> >> >>
>>> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
>>> >> >> >>> >> >> <do...@gmail.com>
>>> >> >> >>> >> >> wrote:
>>> >> >> >>> >> >> > Hi Babak,
>>> >> >> >>> >> >> >
>>> >> >> >>> >> >> >  You have to type it in you web browser..Hadoop
>>> provides us
>>> >> >> >>> >> >> > a
>>> >> >> >>> >> >> > web
>>> >> >> >>> >> >> > GUI
>>> >> >> >>> >> >> > that not only allows us to browse through the file
>>> system,
>>> >> >> >>> >> >> > but
>>> >> >> >>> >> >> > to
>>> >> >> >>> >> >> > download the files as well..Apart from that it also
>>> >> >> >>> >> >> > provides a
>>> >> >> >>> >> >> > web
>>> >> >> >>> >> >> > GUI
>>> >> >> >>> >> >> > that can be used to see the status of Jobtracker and
>>> >> >> >>> >> >> > Tasktracker..When
>>> >> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can
>>> point
>>> >> >> >>> >> >> > your
>>> >> >> >>> >> >> > browser to http://localhost:50030 to see the status
>>> and
>>> >> >> >>> >> >> > logs
>>> >> >> >>> >> >> > of
>>> >> >> >>> >> >> > your
>>> >> >> >>> >> >> > job.
>>> >> >> >>> >> >> >
>>> >> >> >>> >> >> > Regards,
>>> >> >> >>> >> >> >     Mohammad Tariq
>>> >> >> >>> >> >> >
>>> >> >> >>> >> >> >
>>> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
>>> >> >> >>> >> >> > <ba...@gmail.com>
>>> >> >> >>> >> >> > wrote:
>>> >> >> >>> >> >> >> Thank you shashwat for the answer,
>>> >> >> >>> >> >> >> where should I type http://localhost:50070?
>>> >> >> >>> >> >> >> I typed here: hive>http://localhost:50070 but
>>> nothing as
>>> >> >> >>> >> >> >> result
>>> >> >> >>> >> >> >>
>>> >> >> >>> >> >> >>
>>> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>>> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>> first type http://localhost:50070 whether this is
>>> opening
>>> >> >> >>> >> >> >>> or
>>> >> >> >>> >> >> >>> not
>>> >> >> >>> >> >> >>> and
>>> >> >> >>> >> >> >>> check
>>> >> >> >>> >> >> >>> how many nodes are available, check some of the
>>> hadoop
>>> >> >> >>> >> >> >>> shell
>>> >> >> >>> >> >> >>> commands
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>> from
>>> http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>>> >> >> >>> >> >> >>> run
>>> >> >> >>> >> >> >>> example mapreduce task on hadoop take example from
>>> here
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>> :
>>> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>> if all the above you can do sucessfully means hadoop
>>> is
>>> >> >> >>> >> >> >>> configured
>>> >> >> >>> >> >> >>> correctly
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>> Regards
>>> >> >> >>> >> >> >>> Shashwat
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>>> >> >> >>> >> >> >>> <ba...@gmail.com>
>>> >> >> >>> >> >> >>> wrote:
>>> >> >> >>> >> >> >>>>
>>> >> >> >>> >> >> >>>> no I'm not working on CDH.Is there a way to test if
>>> my
>>> >> >> >>> >> >> >>>> Hadoop
>>> >> >> >>> >> >> >>>> works
>>> >> >> >>> >> >> >>>> fine
>>> >> >> >>> >> >> >>>> or not?
>>> >> >> >>> >> >> >>>>
>>> >> >> >>> >> >> >>>>
>>> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
>>> >> >> >>> >> >> >>>> <be...@yahoo.com>
>>> >> >> >>> >> >> >>>> wrote:
>>> >> >> >>> >> >> >>>>>
>>> >> >> >>> >> >> >>>>> Hi Babak
>>> >> >> >>> >> >> >>>>>
>>> >> >> >>> >> >> >>>>> You gotta follow those instructions in the apace
>>> site
>>> >> >> >>> >> >> >>>>> to
>>> >> >> >>> >> >> >>>>> set
>>> >> >> >>> >> >> >>>>> up
>>> >> >> >>> >> >> >>>>> hadoop
>>> >> >> >>> >> >> >>>>> from scratch and ensure that hdfs is working
>>> first. You
>>> >> >> >>> >> >> >>>>> should
>>> >> >> >>> >> >> >>>>> be
>>> >> >> >>> >> >> >>>>> able to
>>> >> >> >>> >> >> >>>>> read and write files to hdfs before you do your
>>> next
>>> >> >> >>> >> >> >>>>> steps.
>>> >> >> >>> >> >> >>>>>
>>> >> >> >>> >> >> >>>>> Are you on CDH or apache distribution of hadoop?
>>> If it
>>> >> >> >>> >> >> >>>>> is
>>> >> >> >>> >> >> >>>>> CDH
>>> >> >> >>> >> >> >>>>> there
>>> >> >> >>> >> >> >>>>> are
>>> >> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
>>> >> >> >>> >> >> >>>>>
>>> >> >> >>> >> >> >>>>> Regards
>>> >> >> >>> >> >> >>>>> Bejoy KS
>>> >> >> >>> >> >> >>>>>
>>> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
>>> >> >> >>> >> >> >>>>> ________________________________
>>> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>>> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>>> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
>>> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
>>> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
>>> >> >> >>> >> >> >>>>>
>>> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
>>> core-site.xml
>>> >> >> >>> >> >> >>>>> and
>>> >> >> >>> >> >> >>>>> I
>>> >> >> >>> >> >> >>>>> did
>>> >> >> >>> >> >> >>>>> all
>>> >> >> >>> >> >> >>>>> of
>>> >> >> >>> >> >> >>>>> thing that was mentioned in the reference but no
>>> effect
>>> >> >> >>> >> >> >>>>>
>>> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>>> >> >> >>> >> >> >>>>> <ba...@gmail.com>
>>> >> >> >>> >> >> >>>>> wrote:
>>> >> >> >>> >> >> >>>>>>
>>> >> >> >>> >> >> >>>>>> Ok sorry but that was my Mistake .I thought it
>>> works
>>> >> >> >>> >> >> >>>>>> but
>>> >> >> >>> >> >> >>>>>> no.
>>> >> >> >>> >> >> >>>>>> I wrote the command without ; and then I think It
>>> >> >> >>> >> >> >>>>>> works
>>> >> >> >>> >> >> >>>>>> but
>>> >> >> >>> >> >> >>>>>> with
>>> >> >> >>> >> >> >>>>>> ;
>>> >> >> >>> >> >> >>>>>> at
>>> >> >> >>> >> >> >>>>>> the end of command
>>> >> >> >>> >> >> >>>>>>
>>> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>>> >> >> >>> >> >> >>>>>>
>>> >> >> >>> >> >> >>>>>> does'nt work
>>> >> >> >>> >> >> >>>>>>
>>> >> >> >>> >> >> >>>>>>
>>> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
>>> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
>>> >> >> >>> >> >> >>>>>>>
>>> >> >> >>> >> >> >>>>>>> inside configuration. all properties will be
>>> inside
>>> >> >> >>> >> >> >>>>>>> the
>>> >> >> >>> >> >> >>>>>>> configuration
>>> >> >> >>> >> >> >>>>>>> tags
>>> >> >> >>> >> >> >>>>>>>
>>> >> >> >>> >> >> >>>>>>>
>>> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>>> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
>>> >> >> >>> >> >> >>>>>>> wrote:
>>> >> >> >>> >> >> >>>>>>>>
>>> >> >> >>> >> >> >>>>>>>> Thank you so much my friend your idee works
>>> fine(no
>>> >> >> >>> >> >> >>>>>>>> error)
>>> >> >> >>> >> >> >>>>>>>> you
>>> >> >> >>> >> >> >>>>>>>> are
>>> >> >> >>> >> >> >>>>>>>> the best :)
>>> >> >> >>> >> >> >>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>
>>> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>>> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
>>> >> >> >>> >> >> >>>>>>>> wrote:
>>> >> >> >>> >> >> >>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>> It must be inside the
>>> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
>>> >> >> >>> >> >> >>>>>>>>> or
>>> >> >> >>> >> >> >>>>>>>>> outside
>>> >> >> >>> >> >> >>>>>>>>> this?
>>> >> >> >>> >> >> >>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat
>>> shriparv
>>> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>>> >> >> >>> >> >> >>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
>>> >> >> >>> >> >> >>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
>>> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
>>> >> >> >>> >> >> >>>>>>>>>> wrote:
>>> >> >> >>> >> >> >>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>> Thanks sShashwat, and where is this
>>> hive-site.xml
>>> >> >> >>> >> >> >>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat
>>> shriparv
>>> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>> set
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in
>>> hive-site.xml
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>> <property>
>>> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>>> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
>>> >> >> >>> >> >> >>>>>>>>>>>> </property>
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>>> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
>>> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>>> >> >> >>> >> >> >>>>>>>>>>>>                <description>location of
>>> default
>>> >> >> >>> >> >> >>>>>>>>>>>> database
>>> >> >> >>> >> >> >>>>>>>>>>>> for
>>> >> >> >>> >> >> >>>>>>>>>>>> the
>>> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
>>> >> >> >>> >> >> >>>>>>>>>>>>        </property>
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak
>>> Bastan
>>> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>>> >> >> >>> >> >> >>>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
>>> >> >> >>> >> >> >>>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a test
>>> >> >> >>> >> >> >>>>>>>>>>>>> Table
>>> >> >> >>> >> >> >>>>>>>>>>>>> in
>>> >> >> >>> >> >> >>>>>>>>>>>>> Hive
>>> >> >> >>> >> >> >>>>>>>>>>>>> I
>>> >> >> >>> >> >> >>>>>>>>>>>>> get
>>> >> >> >>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
>>> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url
>>> STRING,
>>> >> >> >>> >> >> >>>>>>>>>>>>> Content
>>> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
>>> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
>>> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
>>> >> >> >>> >> >> >>>>>>>>>>>>> exception:
>>> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>>> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>>> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
>>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
>>> >> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>>> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>> --
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>> ∞
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>> --
>>> >> >> >>> >> >> >>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>> ∞
>>> >> >> >>> >> >> >>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
>>> >> >> >>> >> >> >>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>>
>>> >> >> >>> >> >> >>>>>>>>
>>> >> >> >>> >> >> >>>>>>>
>>> >> >> >>> >> >> >>>>>>>
>>> >> >> >>> >> >> >>>>>>>
>>> >> >> >>> >> >> >>>>>>> --
>>> >> >> >>> >> >> >>>>>>>
>>> >> >> >>> >> >> >>>>>>>
>>> >> >> >>> >> >> >>>>>>> ∞
>>> >> >> >>> >> >> >>>>>>>
>>> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
>>> >> >> >>> >> >> >>>>>>>
>>> >> >> >>> >> >> >>>>>>>
>>> >> >> >>> >> >> >>>>>>
>>> >> >> >>> >> >> >>>>>
>>> >> >> >>> >> >> >>>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>> --
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>> ∞
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>> Shashwat Shriparv
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>>
>>> >> >> >>> >> >> >>
>>> >> >> >>> >> >
>>> >> >> >>> >> >
>>> >> >> >>> >
>>> >> >> >>> >
>>> >> >> >>
>>> >> >> >>
>>> >> >
>>> >> >
>>> >
>>> >
>>>
>>
>>
>
>
> --
>
>
> ∞
> Shashwat Shriparv
>
>
>

Re: Error while Creating Table in Hive

Posted by shashwat shriparv <dw...@gmail.com>.
What's the error, Babak?

On Thu, Jun 7, 2012 at 1:25 AM, Babak Bastan <ba...@gmail.com> wrote:

> What the hell is that? I see no log folder there
>
>
> On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq <do...@gmail.com> wrote:
>
>> go to your HADOOP_HOME i.e your hadoop directory(that includes bin,
>> conf etc)..you can find logs directory there..
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <ba...@gmail.com> wrote:
>> > how can I get my logs, Mohammad?
>> >
>> >
>> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:
>> >>
>> >> could you post your logs???that would help me in understanding the
>> >> problem properly.
>> >>
>> >> Regards,
>> >>     Mohammad Tariq
>> >>
>> >>
>> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <ba...@gmail.com>
>> wrote:
>> >> > Thank you very much, Mohammad, for your attention. I followed the steps
>> but
>> >> > the
>> >> > error is the same as the last time.
>> >> > and there is my hosts file:
>> >> >
>> >> > 127.0.0.1       localhost
>> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>> >> >
>> >> >
>> >> > # The following lines are desirable for IPv6 capable hosts
>> >> >
>> >> > #::1     ip6-localhost ip6-loopback
>> >> > #fe00::0 ip6-localnet
>> >> > #ff00::0 ip6-mcastprefix
>> >> > #ff02::1 ip6-allnodes
>> >> > #ff02::2 ip6-allrouters
>> >> >
>> >> > but no effect :(
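A side note on the hosts file above: Debian/Ubuntu systems normally map the machine's hostname to 127.0.1.1, which is why the NameNode banner later in this thread reports `ubuntu/127.0.1.1`; commenting that entry out can leave the hostname unresolvable. A small stdlib sketch to see what the daemons will resolve (the hostnames checked here are assumptions, not part of the original advice):

```python
import socket

# "localhost" should resolve to the loopback address; the
# fs.default.name setting (hdfs://localhost:9000) depends on this.
print(socket.gethostbyname("localhost"))  # → 127.0.0.1

# The machine's own hostname (e.g. "ubuntu") may map to 127.0.1.1 via
# /etc/hosts, or may not resolve at all if that line was commented
# out -- both situations are worth knowing about when daemons fail.
name = socket.gethostname()
try:
    print(name, "->", socket.gethostbyname(name))
except socket.gaierror:
    print(name, "-> does not resolve; check /etc/hosts")
```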
>> >> >
>> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq <do...@gmail.com>
>> >> > wrote:
>> >> >>
>> >> >> also change the permissions of these directories to 777.
>> >> >>
>> >> >> Regards,
>> >> >>     Mohammad Tariq
>> >> >>
>> >> >>
>> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <dontariq@gmail.com
>> >
>> >> >> wrote:
>> >> >> > create a directory "/home/username/hdfs" (or at some place of your
>> >> >> > choice)..inside this hdfs directory create three sub directories -
>> >> >> > name, data, and temp, then follow these steps :
>> >> >> >
>> >> >> > add following properties in your core-site.xml -
>> >> >> >
>> >> >> > <property>
>> >> >> >          <name>fs.default.name</name>
>> >> >> >          <value>hdfs://localhost:9000/</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> >        <property>
>> >> >> >          <name>hadoop.tmp.dir</name>
>> >> >> >          <value>/home/mohammad/hdfs/temp</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> > then add following two properties in your hdfs-site.xml -
>> >> >> >
>> >> >> > <property>
>> >> >> >                <name>dfs.replication</name>
>> >> >> >                <value>1</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> >        <property>
>> >> >> >                <name>dfs.name.dir</name>
>> >> >> >                <value>/home/mohammad/hdfs/name</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> >        <property>
>> >> >> >                <name>dfs.data.dir</name>
>> >> >> >                <value>/home/mohammad/hdfs/data</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> > finally add this property in your mapred-site.xml -
>> >> >> >
>> >> >> >       <property>
>> >> >> >          <name>mapred.job.tracker</name>
>> >> >> >          <value>hdfs://localhost:9001</value>
>> >> >> >        </property>
>> >> >> >
>> >> >> > NOTE: you can give any name to these directories of your choice,
>> just
>> >> >> > keep in mind you have to give same names as values of
>> >> >> >           above specified properties in your configuration files.
>> >> >> > (give full path of these directories, not just the name of the
>> >> >> > directory)
>> >> >> >
>> >> >> > After this  follow the steps provided in the previous reply.
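The property names in the snippets above are easy to mistype; they can be checked mechanically before restarting anything. A minimal sketch (the file contents are inlined here for illustration, with the values assumed in this thread):

```python
import xml.etree.ElementTree as ET

# Inlined copy of the suggested core-site.xml (values are the ones
# assumed in this thread, not universal defaults).
CORE_SITE = """\
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000/</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/mohammad/hdfs/temp</value>
  </property>
</configuration>
"""

def get_property(xml_text, prop_name):
    """Return the <value> of the named <property>, or None if absent."""
    root = ET.fromstring(xml_text)
    for prop in root.findall("property"):
        if prop.findtext("name") == prop_name:
            return prop.findtext("value")
    return None

fs_default = get_property(CORE_SITE, "fs.default.name")
# A file:/// (or missing) value here is exactly what produces the
# "file:/user/hive/warehouse/test does not exist" error from the
# original post.
print(fs_default)  # → hdfs://localhost:9000/
```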
>> >> >> >
>> >> >> > Regards,
>> >> >> >     Mohammad Tariq
>> >> >> >
>> >> >> >
>> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <babakbsn@gmail.com
>> >
>> >> >> > wrote:
>> >> >> >> thank's Mohammad
>> >> >> >>
>> >> >> >> with this command:
>> >> >> >>
>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>> >> >> >>
>> >> >> >> this is my output:
>> >> >> >>
>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>> >> >> >> /************************************************************
>> >> >> >> STARTUP_MSG: Starting NameNode
>> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>> >> >> >> STARTUP_MSG:   args = [-format]
>> >> >> >> STARTUP_MSG:   version = 0.20.2
>> >> >> >> STARTUP_MSG:   build =
>> >> >> >>
>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>> >> >> >> -r
>> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>> >> >> >> ************************************************************/
>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >> >> >>
>> >> >> >>
>> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> supergroup=supergroup
>> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >> >> >> isPermissionEnabled=true
>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95
>> saved
>> >> >> >> in 0
>> >> >> >> seconds.
>> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
>> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
>> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>> >> >> >> /************************************************************
>> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>> >> >> >> ************************************************************/
>> >> >> >>
>> >> >> >> by this command:
>> >> >> >>
>> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>> >> >> >>
>> >> >> >> this is the out put
>> >> >> >>
>> >> >> >> mkdir: kann Verzeichnis
>> „/home/babak/Downloads/hadoop/bin/../logs“
>> >> >> >> nicht
>> >> >> >> anlegen: Keine Berechtigung
>> >> >> >>
>> >> >> >> this is the output (it's in German; it means "cannot create this
>> >> >> >> folder: permission denied")
>> >> >> >>
>> >> >> >>
>> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq <
>> dontariq@gmail.com>
>> >> >> >> wrote:
>> >> >> >>>
>> >> >> >>> once we are done with the configuration, we need to format the
>> file
>> >> >> >>> system..use this command to do that-
>> >> >> >>> bin/hadoop namenode -format
>> >> >> >>>
>> >> >> >>> after this, hadoop daemon processes should be started using
>> >> >> >>> following
>> >> >> >>> commands -
>> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
>> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
>> >> >> >>>
>> >> >> >>> after this use jps to check if everything is alright or point
>> your
>> >> >> >>> browser to localhost:50070..if you further find any problem
>> provide
>> >> >> >>> us
>> >> >> >>> with the error logs..:)
>> >> >> >>>
>> >> >> >>> Regards,
>> >> >> >>>     Mohammad Tariq
>> >> >> >>>
>> >> >> >>>
>> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <
>> babakbsn@gmail.com>
>> >> >> >>> wrote:
>> >> >> >>> > were you able to format hdfs properly???
>> >> >> >>> > I didn't get your question. Do you mean HADOOP_HOME, or where
>> >> >> >>> > did I install Hadoop?
>> >> >> >>> >
>> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
>> >> >> >>> > <do...@gmail.com>
>> >> >> >>> > wrote:
>> >> >> >>> >>
>> >> >> >>> >> if you are getting only this, it means your hadoop is not
>> >> >> >>> >> running..were you able to format hdfs properly???
>> >> >> >>> >>
>> >> >> >>> >> Regards,
>> >> >> >>> >>     Mohammad Tariq
>> >> >> >>> >>
>> >> >> >>> >>
>> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
>> >> >> >>> >> <ba...@gmail.com>
>> >> >> >>> >> wrote:
>> >> >> >>> >> > Hi Mohammad, if I run jps in my shell I can see this result:
>> >> >> >>> >> > 2213 Jps
>> >> >> >>> >> >
>> >> >> >>> >> >
>> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
>> >> >> >>> >> > <do...@gmail.com>
>> >> >> >>> >> > wrote:
>> >> >> >>> >> >>
>> >> >> >>> >> >> you can also use "jps" command at your shell to see
>> whether
>> >> >> >>> >> >> Hadoop
>> >> >> >>> >> >> processes are running or not.
>> >> >> >>> >> >>
>> >> >> >>> >> >> Regards,
>> >> >> >>> >> >>     Mohammad Tariq
>> >> >> >>> >> >>
>> >> >> >>> >> >>
>> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
>> >> >> >>> >> >> <do...@gmail.com>
>> >> >> >>> >> >> wrote:
>> >> >> >>> >> >> > Hi Babak,
>> >> >> >>> >> >> >
>> >> >> >>> >> >> >  You have to type it in you web browser..Hadoop
>> provides us
>> >> >> >>> >> >> > a
>> >> >> >>> >> >> > web
>> >> >> >>> >> >> > GUI
>> >> >> >>> >> >> > that not only allows us to browse through the file
>> system,
>> >> >> >>> >> >> > but
>> >> >> >>> >> >> > to
>> >> >> >>> >> >> > download the files as well..Apart from that it also
>> >> >> >>> >> >> > provides a
>> >> >> >>> >> >> > web
>> >> >> >>> >> >> > GUI
>> >> >> >>> >> >> > that can be used to see the status of Jobtracker and
>> >> >> >>> >> >> > Tasktracker..When
>> >> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can
>> point
>> >> >> >>> >> >> > your
>> >> >> >>> >> >> > browser to http://localhost:50030 to see the status and
>> >> >> >>> >> >> > logs
>> >> >> >>> >> >> > of
>> >> >> >>> >> >> > your
>> >> >> >>> >> >> > job.
>> >> >> >>> >> >> >
>> >> >> >>> >> >> > Regards,
>> >> >> >>> >> >> >     Mohammad Tariq
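The 50070/50030 checks described above can also be scripted instead of done in a browser. A hedged sketch (the ports are the classic 0.20.x defaults; whether they report "up" depends entirely on the daemons actually running):

```python
import socket

def port_open(host, port, timeout=0.5):
    """True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Web-UI ports mentioned in this thread.
for name, port in [("NameNode UI", 50070), ("JobTracker UI", 50030)]:
    print(name, "up" if port_open("localhost", port) else "down")
```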
>> >> >> >>> >> >> >
>> >> >> >>> >> >> >
>> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
>> >> >> >>> >> >> > <ba...@gmail.com>
>> >> >> >>> >> >> > wrote:
>> >> >> >>> >> >> >> Thank you shashwat for the answer,
>> >> >> >>> >> >> >> where should I type http://localhost:50070?
>> >> >> >>> >> >> >> I typed it here: hive> http://localhost:50070 but got
>> >> >> >>> >> >> >> nothing as a result
>> >> >> >>> >> >> >>
>> >> >> >>> >> >> >>
>> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> first type http://localhost:50070 whether this is
>> opening
>> >> >> >>> >> >> >>> or
>> >> >> >>> >> >> >>> not
>> >> >> >>> >> >> >>> and
>> >> >> >>> >> >> >>> check
>> >> >> >>> >> >> >>> how many nodes are available, check some of the hadoop
>> >> >> >>> >> >> >>> shell
>> >> >> >>> >> >> >>> commands
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> from
>> http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>> >> >> >>> >> >> >>> run
>> >> >> >>> >> >> >>> example mapreduce task on hadoop take example from
>> here
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> :
>> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> if you can do all of the above successfully, it means
>> >> >> >>> >> >> >>> hadoop is configured correctly
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> Regards
>> >> >> >>> >> >> >>> Shashwat
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>> >> >> >>> >> >> >>> <ba...@gmail.com>
>> >> >> >>> >> >> >>> wrote:
>> >> >> >>> >> >> >>>>
>> >> >> >>> >> >> >>>> no I'm not working on CDH.Is there a way to test if
>> my
>> >> >> >>> >> >> >>>> Hadoop
>> >> >> >>> >> >> >>>> works
>> >> >> >>> >> >> >>>> fine
>> >> >> >>> >> >> >>>> or not?
>> >> >> >>> >> >> >>>>
>> >> >> >>> >> >> >>>>
>> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
>> >> >> >>> >> >> >>>> <be...@yahoo.com>
>> >> >> >>> >> >> >>>> wrote:
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> Hi Babak
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> You gotta follow those instructions on the Apache
>> >> >> >>> >> >> >>>>> site to
>> >> >> >>> >> >> >>>>> set
>> >> >> >>> >> >> >>>>> up
>> >> >> >>> >> >> >>>>> hadoop
>> >> >> >>> >> >> >>>>> from scratch and ensure that hdfs is working first.
>> You
>> >> >> >>> >> >> >>>>> should
>> >> >> >>> >> >> >>>>> be
>> >> >> >>> >> >> >>>>> able to
>> >> >> >>> >> >> >>>>> read and write files to hdfs before you do your next
>> >> >> >>> >> >> >>>>> steps.
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> Are you on CDH or apache distribution of hadoop? If
>> it
>> >> >> >>> >> >> >>>>> is
>> >> >> >>> >> >> >>>>> CDH
>> >> >> >>> >> >> >>>>> there
>> >> >> >>> >> >> >>>>> are
>> >> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> Regards
>> >> >> >>> >> >> >>>>> Bejoy KS
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
>> >> >> >>> >> >> >>>>> ________________________________
>> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
>> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
>> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
>> core-site.xml
>> >> >> >>> >> >> >>>>> and
>> >> >> >>> >> >> >>>>> I
>> >> >> >>> >> >> >>>>> did
>> >> >> >>> >> >> >>>>> all
>> >> >> >>> >> >> >>>>> of
>> >> >> >>> >> >> >>>>> thing that was mentioned in the reference but no
>> effect
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>> >> >> >>> >> >> >>>>> <ba...@gmail.com>
>> >> >> >>> >> >> >>>>> wrote:
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>> Ok, sorry, that was my mistake. I thought it worked,
>> >> >> >>> >> >> >>>>>> but no. I wrote the command without ';' and then I
>> >> >> >>> >> >> >>>>>> thought it worked, but with ';' at the end of the command
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>> doesn't work
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
>> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>> inside configuration. all properties will be
>> inside
>> >> >> >>> >> >> >>>>>>> the
>> >> >> >>> >> >> >>>>>>> configuration
>> >> >> >>> >> >> >>>>>>> tags
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
>> >> >> >>> >> >> >>>>>>> wrote:
>> >> >> >>> >> >> >>>>>>>>
>> >> >> >>> >> >> >>>>>>>> Thank you so much my friend, your idea works
>> >> >> >>> >> >> >>>>>>>> fine (no error), you are the best :)
>> >> >> >>> >> >> >>>>>>>>
>> >> >> >>> >> >> >>>>>>>>
>> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
>> >> >> >>> >> >> >>>>>>>> wrote:
>> >> >> >>> >> >> >>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>> It must be inside the
>> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
>> >> >> >>> >> >> >>>>>>>>> or
>> >> >> >>> >> >> >>>>>>>>> outside
>> >> >> >>> >> >> >>>>>>>>> this?
>> >> >> >>> >> >> >>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat
>> shriparv
>> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
>> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
>> >> >> >>> >> >> >>>>>>>>>> wrote:
>> >> >> >>> >> >> >>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>> Thanks sShashwat, and where is this
>> hive-site.xml
>> >> >> >>> >> >> >>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat
>> shriparv
>> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> set
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> <property>
>> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
>> >> >> >>> >> >> >>>>>>>>>>>> </property>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> <property>
>> >> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
>> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>> >> >> >>> >> >> >>>>>>>>>>>>                <description>location of
>> default
>> >> >> >>> >> >> >>>>>>>>>>>> database
>> >> >> >>> >> >> >>>>>>>>>>>> for
>> >> >> >>> >> >> >>>>>>>>>>>> the
>> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
>> >> >> >>> >> >> >>>>>>>>>>>>        </property>
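Putting the two properties together, a complete minimal hive-site.xml would look roughly like this (the warehouse path is only an example; the directory must exist and be writable by the user running Hive):

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>hive.metastore.local</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <!-- example path, not a requirement; pick a writable directory -->
    <value>/home/babak/hivefolder</value>
    <description>location of default database for the warehouse</description>
  </property>
</configuration>
```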
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
>> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>> >> >> >>> >> >> >>>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
>> >> >> >>> >> >> >>>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>> I'm new to Hive. When I try to create a test
>> >> >> >>> >> >> >>>>>>>>>>>>> Table in Hive I get
>> >> >> >>> >> >> >>>>>>>>>>>>> an error. I want to run this command:
>> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING,
>> >> >> >>> >> >> >>>>>>>>>>>>> Content
>> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
>> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
>> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
>> >> >> >>> >> >> >>>>>>>>>>>>> exception:
>> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
>> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
>> >> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
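For what it's worth, the failing statement is valid DDL; the FileNotFoundException is about the warehouse directory, not the syntax. As an illustration only (sqlite3 is not Hive, but it accepts arbitrary type names such as STRING, so the statement can be exercised in isolation):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# The exact statement from the thread; sqlite gives the unknown type
# name STRING its default NUMERIC affinity, so this parses and runs.
con.execute("CREATE TABLE Test (DateT STRING, Url STRING, Content STRING)")
con.execute(
    "INSERT INTO Test VALUES ('2012-06-05', 'http://example.org', 'hello')"
)
rows = con.execute("SELECT Url FROM Test").fetchall()
print(rows)  # → [('http://example.org',)]
```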
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> --
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> ∞
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>> --
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>> ∞
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>>
>> >> >> >>> >> >> >>>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>> --
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>> ∞
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>>
>> >> >> >>> >> >> >>>>>>
>> >> >> >>> >> >> >>>>>
>> >> >> >>> >> >> >>>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> --
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> ∞
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>> Shashwat Shriparv
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>>
>> >> >> >>> >> >> >>
>> >> >> >>> >> >
>> >> >> >>> >> >
>> >> >> >>> >
>> >> >> >>> >
>> >> >> >>
>> >> >> >>
>> >> >
>> >> >
>> >
>> >
>>
>
>


-- 


∞
Shashwat Shriparv

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
What the hell is that? I see no log folder there

On Wed, Jun 6, 2012 at 9:41 PM, Mohammad Tariq <do...@gmail.com> wrote:

> go to your HADOOP_HOME i.e your hadoop directory(that includes bin,
> conf etc)..you can find logs directory there..
>
> Regards,
>     Mohammad Tariq
>
>
> On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <ba...@gmail.com> wrote:
> > how can I get my logs, Mohammad?
> >
> >
> > On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
> >>
> >> could you post your logs???that would help me in understanding the
> >> problem properly.
> >>
> >> Regards,
> >>     Mohammad Tariq
> >>
> >>
> >> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <ba...@gmail.com>
> wrote:
> >> > Thank you very much, Mohammad, for your attention. I followed the steps
> but
> >> > the
> >> > error is the same as the last time.
> >> > and there is my hosts file:
> >> >
> >> > 127.0.0.1       localhost
> >> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
> >> >
> >> >
> >> > # The following lines are desirable for IPv6 capable hosts
> >> >
> >> > #::1     ip6-localhost ip6-loopback
> >> > #fe00::0 ip6-localnet
> >> > #ff00::0 ip6-mcastprefix
> >> > #ff02::1 ip6-allnodes
> >> > #ff02::2 ip6-allrouters
> >> >
> >> > but no effect :(
> >> >
> >> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq <do...@gmail.com>
> >> > wrote:
> >> >>
> >> >> also change the permissions of these directories to 777.
> >> >>
> >> >> Regards,
> >> >>     Mohammad Tariq
> >> >>
> >> >>
> >> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <do...@gmail.com>
> >> >> wrote:
> >> >> > create a directory "/home/username/hdfs" (or at some place of your
> >> >> > choice)..inside this hdfs directory create three sub directories -
> >> >> > name, data, and temp, then follow these steps :
> >> >> >
> >> >> > add following properties in your core-site.xml -
> >> >> >
> >> >> > <property>
> >> >> >          <name>fs.default.name</name>
> >> >> >          <value>hdfs://localhost:9000/</value>
> >> >> >        </property>
> >> >> >
> >> >> >        <property>
> >> >> >          <name>hadoop.tmp.dir</name>
> >> >> >          <value>/home/mohammad/hdfs/temp</value>
> >> >> >        </property>
> >> >> >
> >> >> > then add following two properties in your hdfs-site.xml -
> >> >> >
> >> >> > <property>
> >> >> >                <name>dfs.replication</name>
> >> >> >                <value>1</value>
> >> >> >        </property>
> >> >> >
> >> >> >        <property>
> >> >> >                <name>dfs.name.dir</name>
> >> >> >                <value>/home/mohammad/hdfs/name</value>
> >> >> >        </property>
> >> >> >
> >> >> >        <property>
> >> >> >                <name>dfs.data.dir</name>
> >> >> >                <value>/home/mohammad/hdfs/data</value>
> >> >> >        </property>
> >> >> >
> >> >> > finally add this property in your mapred-site.xml -
> >> >> >
> >> >> >       <property>
> >> >> >          <name>mapred.job.tracker</name>
> >> >> >          <value>hdfs://localhost:9001</value>
> >> >> >        </property>
> >> >> >
> >> >> > NOTE: you can give any name to these directories of your choice,
> just
> >> >> > keep in mind you have to give same names as values of
> >> >> >           above specified properties in your configuration files.
> >> >> > (give full path of these directories, not just the name of the
> >> >> > directory)
> >> >> >
> >> >> > After this  follow the steps provided in the previous reply.
> >> >> >
> >> >> > Regards,
> >> >> >     Mohammad Tariq
> >> >> >
> >> >> >
> >> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <ba...@gmail.com>
> >> >> > wrote:
> >> >> >> thank's Mohammad
> >> >> >>
> >> >> >> with this command:
> >> >> >>
> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
> >> >> >>
> >> >> >> this is my output:
> >> >> >>
> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
> >> >> >> /************************************************************
> >> >> >> STARTUP_MSG: Starting NameNode
> >> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
> >> >> >> STARTUP_MSG:   args = [-format]
> >> >> >> STARTUP_MSG:   version = 0.20.2
> >> >> >> STARTUP_MSG:   build =
> >> >> >>
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
> >> >> >> -r
> >> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
> >> >> >> ************************************************************/
> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> >> >> >>
> >> >> >>
> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> supergroup=supergroup
> >> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> >> >> >> isPermissionEnabled=true
> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95 saved
> >> >> >> in 0
> >> >> >> seconds.
> >> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
> >> >> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
> >> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
> >> >> >> /************************************************************
> >> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
> >> >> >> ************************************************************/
> >> >> >>
> >> >> >> by this command:
> >> >> >>
> >> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
> >> >> >>
> >> >> >> this is the out put
> >> >> >>
> >> >> >> mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“
> >> >> >> nicht
> >> >> >> anlegen: Keine Berechtigung
> >> >> >>
> >> >> >> this is the output (it's in German; it means "cannot create this
> >> >> >> folder: permission denied")
> >> >> >>
> >> >> >>
> >> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq <
> dontariq@gmail.com>
> >> >> >> wrote:
> >> >> >>>
> >> >> >>> once we are done with the configuration, we need to format the
> file
> >> >> >>> system..use this command to do that-
> >> >> >>> bin/hadoop namenode -format
> >> >> >>>
> >> >> >>> after this, hadoop daemon processes should be started using
> >> >> >>> following
> >> >> >>> commands -
> >> >> >>> bin/start-dfs.sh (it'll start NN & DN)
> >> >> >>> bin/start-mapred.sh (it'll start JT & TT)
> >> >> >>>
> >> >> >>> after this use jps to check if everything is alright or point
> your
> >> >> >>> browser to localhost:50070..if you further find any problem
> provide
> >> >> >>> us
> >> >> >>> with the error logs..:)
> >> >> >>>
> >> >> >>> Regards,
> >> >> >>>     Mohammad Tariq
> >> >> >>>
> >> >> >>>
> >> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <
> babakbsn@gmail.com>
> >> >> >>> wrote:
> >> >> >>> > were you able to format hdfs properly???
> >> >> >>> > I didn't get your question. Do you mean HADOOP_HOME, or where
> >> >> >>> > did I install Hadoop?
> >> >> >>> >
> >> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
> >> >> >>> > <do...@gmail.com>
> >> >> >>> > wrote:
> >> >> >>> >>
> >> >> >>> >> if you are getting only this, it means your hadoop is not
> >> >> >>> >> running..were you able to format hdfs properly???
> >> >> >>> >>
> >> >> >>> >> Regards,
> >> >> >>> >>     Mohammad Tariq
> >> >> >>> >>
> >> >> >>> >>
> >> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
> >> >> >>> >> <ba...@gmail.com>
> >> >> >>> >> wrote:
> >> >> >>> >> > Hi Mohammad, if I run jps in my shell I can see this result:
> >> >> >>> >> > 2213 Jps
> >> >> >>> >> >
> >> >> >>> >> >
> >> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
> >> >> >>> >> > <do...@gmail.com>
> >> >> >>> >> > wrote:
> >> >> >>> >> >>
> >> >> >>> >> >> you can also use "jps" command at your shell to see whether
> >> >> >>> >> >> Hadoop
> >> >> >>> >> >> processes are running or not.
> >> >> >>> >> >>
> >> >> >>> >> >> Regards,
> >> >> >>> >> >>     Mohammad Tariq
> >> >> >>> >> >>
> >> >> >>> >> >>
> >> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
> >> >> >>> >> >> <do...@gmail.com>
> >> >> >>> >> >> wrote:
> >> >> >>> >> >> > Hi Babak,
> >> >> >>> >> >> >
> >> >> >>> >> >> >  You have to type it in you web browser..Hadoop provides
> us
> >> >> >>> >> >> > a
> >> >> >>> >> >> > web
> >> >> >>> >> >> > GUI
> >> >> >>> >> >> > that not only allows us to browse through the file
> system,
> >> >> >>> >> >> > but
> >> >> >>> >> >> > to
> >> >> >>> >> >> > download the files as well..Apart from that it also
> >> >> >>> >> >> > provides a
> >> >> >>> >> >> > web
> >> >> >>> >> >> > GUI
> >> >> >>> >> >> > that can be used to see the status of Jobtracker and
> >> >> >>> >> >> > Tasktracker..When
> >> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can
> point
> >> >> >>> >> >> > your
> >> >> >>> >> >> > browser to http://localhost:50030 to see the status and
> >> >> >>> >> >> > logs
> >> >> >>> >> >> > of
> >> >> >>> >> >> > your
> >> >> >>> >> >> > job.
> >> >> >>> >> >> >
> >> >> >>> >> >> > Regards,
> >> >> >>> >> >> >     Mohammad Tariq
> >> >> >>> >> >> >
> >> >> >>> >> >> >
> >> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
> >> >> >>> >> >> > <ba...@gmail.com>
> >> >> >>> >> >> > wrote:
> >> >> >>> >> >> >> Thank you shashwat for the answer,
> >> >> >>> >> >> >> where should I type http://localhost:50070?
> >> >> >>> >> >> >> I typed it here: hive> http://localhost:50070 but got
> >> >> >>> >> >> >> nothing as a result
> >> >> >>> >> >> >>
> >> >> >>> >> >> >>
> >> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
> >> >> >>> >> >> >> <dw...@gmail.com> wrote:
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>> first type http://localhost:50070 whether this is
> opening
> >> >> >>> >> >> >>> or
> >> >> >>> >> >> >>> not
> >> >> >>> >> >> >>> and
> >> >> >>> >> >> >>> check
> >> >> >>> >> >> >>> how many nodes are available, check some of the hadoop
> >> >> >>> >> >> >>> shell
> >> >> >>> >> >> >>> commands
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>> from
> http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
> >> >> >>> >> >> >>> run
> >> >> >>> >> >> >>> example mapreduce task on hadoop take example from here
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>> :
> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>> if you can do all of the above successfully, it means hadoop is
> >> >> >>> >> >> >>> configured correctly
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>> Regards
> >> >> >>> >> >> >>> Shashwat
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
> >> >> >>> >> >> >>> <ba...@gmail.com>
> >> >> >>> >> >> >>> wrote:
> >> >> >>> >> >> >>>>
> >> >> >>> >> >> >>>> no I'm not working on CDH.Is there a way to test if my
> >> >> >>> >> >> >>>> Hadoop
> >> >> >>> >> >> >>>> works
> >> >> >>> >> >> >>>> fine
> >> >> >>> >> >> >>>> or not?
> >> >> >>> >> >> >>>>
> >> >> >>> >> >> >>>>
> >> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
> >> >> >>> >> >> >>>> <be...@yahoo.com>
> >> >> >>> >> >> >>>> wrote:
> >> >> >>> >> >> >>>>>
> >> >> >>> >> >> >>>>> Hi Babak
> >> >> >>> >> >> >>>>>
> >> >> >>> >> >> >>>>> You gotta follow those instructions on the Apache site
> >> >> >>> >> >> >>>>> to
> >> >> >>> >> >> >>>>> set
> >> >> >>> >> >> >>>>> up
> >> >> >>> >> >> >>>>> hadoop
> >> >> >>> >> >> >>>>> from scratch and ensure that hdfs is working first.
> You
> >> >> >>> >> >> >>>>> should
> >> >> >>> >> >> >>>>> be
> >> >> >>> >> >> >>>>> able to
> >> >> >>> >> >> >>>>> read and write files to hdfs before you do your next
> >> >> >>> >> >> >>>>> steps.
> >> >> >>> >> >> >>>>>
> >> >> >>> >> >> >>>>> Are you on CDH or apache distribution of hadoop? If
> it
> >> >> >>> >> >> >>>>> is
> >> >> >>> >> >> >>>>> CDH
> >> >> >>> >> >> >>>>> there
> >> >> >>> >> >> >>>>> are
> >> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
> >> >> >>> >> >> >>>>>
> >> >> >>> >> >> >>>>> Regards
> >> >> >>> >> >> >>>>> Bejoy KS
> >> >> >>> >> >> >>>>>
> >> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
> >> >> >>> >> >> >>>>> ________________________________
> >> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
> >> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
> >> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
> >> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
> >> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
> >> >> >>> >> >> >>>>>
> >> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the
> core-site.xml
> >> >> >>> >> >> >>>>> and
> >> >> >>> >> >> >>>>> I
> >> >> >>> >> >> >>>>> did
> >> >> >>> >> >> >>>>> all
> >> >> >>> >> >> >>>>> of
> >> >> >>> >> >> >>>>> thing that was mentioned in the reference but no
> effect
> >> >> >>> >> >> >>>>>
> >> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
> >> >> >>> >> >> >>>>> <ba...@gmail.com>
> >> >> >>> >> >> >>>>> wrote:
> >> >> >>> >> >> >>>>>>
> >> >> >>> >> >> >>>>>> Ok, sorry, that was my mistake. I thought it works, but
> >> >> >>> >> >> >>>>>> no. I wrote the command without ; and then I think it
> >> >> >>> >> >> >>>>>> works, but with ; at the end of the command
> >> >> >>> >> >> >>>>>>
> >> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
> >> >> >>> >> >> >>>>>>
> >> >> >>> >> >> >>>>>> doesn't work
> >> >> >>> >> >> >>>>>>
> >> >> >>> >> >> >>>>>>
> >> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
> >> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
> >> >> >>> >> >> >>>>>>>
> >> >> >>> >> >> >>>>>>> inside configuration. all properties will be inside
> >> >> >>> >> >> >>>>>>> the
> >> >> >>> >> >> >>>>>>> configuration
> >> >> >>> >> >> >>>>>>> tags
> >> >> >>> >> >> >>>>>>>
> >> >> >>> >> >> >>>>>>>
> >> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
> >> >> >>> >> >> >>>>>>> <ba...@gmail.com>
> >> >> >>> >> >> >>>>>>> wrote:
> >> >> >>> >> >> >>>>>>>>
> >> >> >>> >> >> >>>>>>>> Thank you so much my friend, your idea works fine
> >> >> >>> >> >> >>>>>>>> (no error), you are the best :)
> >> >> >>> >> >> >>>>>>>>
> >> >> >>> >> >> >>>>>>>>
> >> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
> >> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
> >> >> >>> >> >> >>>>>>>> wrote:
> >> >> >>> >> >> >>>>>>>>>
> >> >> >>> >> >> >>>>>>>>> It must be inside the
> >> >> >>> >> >> >>>>>>>>> <configuration></configuration>
> >> >> >>> >> >> >>>>>>>>> or
> >> >> >>> >> >> >>>>>>>>> outside
> >> >> >>> >> >> >>>>>>>>> this?
> >> >> >>> >> >> >>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>
> >> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat shriparv
> >> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
> >> >> >>> >> >> >>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
> >> >> >>> >> >> >>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
> >> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
> >> >> >>> >> >> >>>>>>>>>> wrote:
> >> >> >>> >> >> >>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>> Thanks Shashwat, and where is this
> hive-site.xml
> >> >> >>> >> >> >>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat
> shriparv
> >> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>> set
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>> <property>
> >> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
> >> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
> >> >> >>> >> >> >>>>>>>>>>>> </property>
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
> >> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
> >> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
> >> >> >>> >> >> >>>>>>>>>>>>                <description>location of
> default
> >> >> >>> >> >> >>>>>>>>>>>> database
> >> >> >>> >> >> >>>>>>>>>>>> for
> >> >> >>> >> >> >>>>>>>>>>>> the
> >> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
> >> >> >>> >> >> >>>>>>>>>>>>        </property>
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
> >> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
> >> >> >>> >> >> >>>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
> >> >> >>> >> >> >>>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>>> I'm new to Hive. When I try to create a test
> >> >> >>> >> >> >>>>>>>>>>>>> Table in Hive I get an error. I want to run this
> >> >> >>> >> >> >>>>>>>>>>>>> command:
> >> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING,
> >> >> >>> >> >> >>>>>>>>>>>>> Content
> >> >> >>> >> >> >>>>>>>>>>>>> STRING);
> >> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
> >> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
> >> >> >>> >> >> >>>>>>>>>>>>> exception:
> >> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
> >> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
> >> >> >>> >> >> >>>>>>>>>>>>> exist.)
> >> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
> >> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
> >> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
> >> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>> --
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>> ∞
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>> --
> >> >> >>> >> >> >>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>> ∞
> >> >> >>> >> >> >>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
> >> >> >>> >> >> >>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>>
> >> >> >>> >> >> >>>>>>>>>
> >> >> >>> >> >> >>>>>>>>
> >> >> >>> >> >> >>>>>>>
> >> >> >>> >> >> >>>>>>>
> >> >> >>> >> >> >>>>>>>
> >> >> >>> >> >> >>>>>>> --
> >> >> >>> >> >> >>>>>>>
> >> >> >>> >> >> >>>>>>>
> >> >> >>> >> >> >>>>>>> ∞
> >> >> >>> >> >> >>>>>>>
> >> >> >>> >> >> >>>>>>> Shashwat Shriparv
> >> >> >>> >> >> >>>>>>>
> >> >> >>> >> >> >>>>>>>
> >> >> >>> >> >> >>>>>>
> >> >> >>> >> >> >>>>>
> >> >> >>> >> >> >>>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>> --
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>> ∞
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>> Shashwat Shriparv
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>>
> >> >> >>> >> >> >>
> >> >> >>> >> >
> >> >> >>> >> >
> >> >> >>> >
> >> >> >>> >
> >> >> >>
> >> >> >>
> >> >
> >> >
> >
> >
>

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
go to your HADOOP_HOME, i.e. your hadoop directory (the one that includes
bin, conf etc.). You can find the logs directory there.

Regards,
    Mohammad Tariq
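A concrete sketch of locating the newest NameNode log under that directory. The HADOOP_HOME path and the log-file naming pattern are assumptions based on a default 0.20.x tarball install, and a placeholder log file is created so the snippet runs standalone:

```shell
# Find the most recently modified NameNode log under $HADOOP_HOME/logs.
# A throwaway directory and an empty log stand in for a real installation.
HADOOP_HOME="$(mktemp -d)"           # in practice e.g. /home/babak/Downloads/hadoop
mkdir -p "$HADOOP_HOME/logs"
touch "$HADOOP_HOME/logs/hadoop-babak-namenode-ubuntu.log"
NEWEST_LOG=$(ls -t "$HADOOP_HOME"/logs/hadoop-*-namenode-*.log | head -n 1)
echo "$NEWEST_LOG"
```

In a real setup you would then inspect it with something like `tail -n 50 "$NEWEST_LOG"` and post the errors you find there.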


On Thu, Jun 7, 2012 at 1:09 AM, Babak Bastan <ba...@gmail.com> wrote:
> how can I get my logs, Mohammad?
>
>
> On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>
>> could you post your logs? That would help me in understanding the
>> problem properly.
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <ba...@gmail.com> wrote:
>> > Thank you very much Mohammad for your attention. I followed the steps but
>> > the
>> > error is the same as the last time.
>> > and there is my hosts file:
>> >
>> > 127.0.0.1       localhost
>> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>> >
>> >
>> > # The following lines are desirable for IPv6 capable hosts
>> >
>> > #::1     ip6-localhost ip6-loopback
>> > #fe00::0 ip6-localnet
>> > #ff00::0 ip6-mcastprefix
>> > #ff02::1 ip6-allnodes
>> > #ff02::2 ip6-allrouters
>> >
>> > but no effect :(
>> >
>> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq <do...@gmail.com>
>> > wrote:
>> >>
>> >> also change the permissions of these directories to 777.
>> >>
>> >> Regards,
>> >>     Mohammad Tariq
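A sketch of that permission change, run here against a temporary stand-in for the real hdfs tree so it can execute anywhere (the base path is an assumption; substitute your own):

```shell
# Apply the suggested mode to the name/data/temp directories and verify it.
HDFS_BASE="$(mktemp -d)/hdfs"          # stands in for /home/<user>/hdfs
mkdir -p "$HDFS_BASE"/name "$HDFS_BASE"/data "$HDFS_BASE"/temp
chmod -R 777 "$HDFS_BASE"              # 777 as advised; 755 is often enough
stat -c '%a' "$HDFS_BASE/name"         # prints 777
```

Note that 777 is the blunt fix suggested in the thread; on a shared machine a tighter mode owned by the hadoop user is the safer choice.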
>> >>
>> >>
>> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <do...@gmail.com>
>> >> wrote:
>> >> > create a directory "/home/username/hdfs" (or at some place of your
>> >> > choice)..inside this hdfs directory create three sub directories -
>> >> > name, data, and temp, then follow these steps :
>> >> >
>> >> > add following properties in your core-site.xml -
>> >> >
>> >> > <property>
>> >> >          <name>fs.default.name</name>
>> >> >          <value>hdfs://localhost:9000/</value>
>> >> >        </property>
>> >> >
>> >> >        <property>
>> >> >          <name>hadoop.tmp.dir</name>
>> >> >          <value>/home/mohammad/hdfs/temp</value>
>> >> >        </property>
>> >> >
>> >> > then add following two properties in your hdfs-site.xml -
>> >> >
>> >> > <property>
>> >> >                <name>dfs.replication</name>
>> >> >                <value>1</value>
>> >> >        </property>
>> >> >
>> >> >        <property>
>> >> >                <name>dfs.name.dir</name>
>> >> >                <value>/home/mohammad/hdfs/name</value>
>> >> >        </property>
>> >> >
>> >> >        <property>
>> >> >                <name>dfs.data.dir</name>
>> >> >                <value>/home/mohammad/hdfs/data</value>
>> >> >        </property>
>> >> >
>> >> > finally add this property in your mapred-site.xml -
>> >> >
>> >> >       <property>
>> >> >          <name>mapred.job.tracker</name>
>> >> >          <value>hdfs://localhost:9001</value>
>> >> >        </property>
>> >> >
>> >> > NOTE: you can give any name to these directories of your choice, just
>> >> > keep in mind you have to give same names as values of
>> >> >           above specified properties in your configuration files.
>> >> > (give full path of these directories, not just the name of the
>> >> > directory)
>> >> >
>> >> > After this  follow the steps provided in the previous reply.
>> >> >
>> >> > Regards,
>> >> >     Mohammad Tariq
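The directory layout described above can be sketched like this (the base path is only an example, not a requirement; any writable location works as long as it matches the values in the config files):

```shell
# Create the three local directories the HDFS config points at.
HDFS_BASE="$(mktemp -d)/hdfs"          # in practice e.g. /home/mohammad/hdfs
mkdir -p "$HDFS_BASE"/name "$HDFS_BASE"/data "$HDFS_BASE"/temp
ls "$HDFS_BASE"                        # lists data, name, temp
```

Whatever paths you pick here must be the exact full paths used for `dfs.name.dir`, `dfs.data.dir`, and `hadoop.tmp.dir`.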
>> >> >
>> >> >
>> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <ba...@gmail.com>
>> >> > wrote:
>> >> >> thanks, Mohammad
>> >> >>
>> >> >> with this command:
>> >> >>
>> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>> >> >>
>> >> >> this is my output:
>> >> >>
>> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>> >> >> /************************************************************
>> >> >> STARTUP_MSG: Starting NameNode
>> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>> >> >> STARTUP_MSG:   args = [-format]
>> >> >> STARTUP_MSG:   version = 0.20.2
>> >> >> STARTUP_MSG:   build =
>> >> >> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20
>> >> >> -r
>> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>> >> >> ************************************************************/
>> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >> >>
>> >> >> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem: supergroup=supergroup
>> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >> >> isPermissionEnabled=true
>> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95 saved
>> >> >> in 0
>> >> >> seconds.
>> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
>> >> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
>> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>> >> >> /************************************************************
>> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>> >> >> ************************************************************/
>> >> >>
>> >> >> by this command:
>> >> >>
>> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>> >> >>
>> >> >> this is the output
>> >> >>
>> >> >> mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“
>> >> >> nicht anlegen: Keine Berechtigung
>> >> >>
>> >> >> this output is in German; it means "mkdir: cannot create directory
>> >> >> '/home/babak/Downloads/hadoop/bin/../logs': Permission denied"
>> >> >>
>> >> >>
>> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq <do...@gmail.com>
>> >> >> wrote:
>> >> >>>
>> >> >>> once we are done with the configuration, we need to format the file
>> >> >>> system..use this command to do that-
>> >> >>> bin/hadoop namenode -format
>> >> >>>
>> >> >>> after this, hadoop daemon processes should be started using
>> >> >>> following
>> >> >>> commands -
>> >> >>> bin/start-dfs.sh (it'll start NN & DN)
>> >> >>> bin/start-mapred.sh (it'll start JT & TT)
>> >> >>>
>> >> >>> after this use jps to check if everything is alright or point your
>> >> >>> browser to localhost:50070..if you further find any problem provide
>> >> >>> us
>> >> >>> with the error logs..:)
>> >> >>>
>> >> >>> Regards,
>> >> >>>     Mohammad Tariq
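The bring-up sequence above, collected in order. Since this sketch has no Hadoop installation behind it, the commands are echoed as a dry run rather than executed:

```shell
# Bring-up order: format HDFS once, start the DFS daemons, start the
# MapReduce daemons, then verify with jps.
RUN_LOG=""
for step in "bin/hadoop namenode -format" \
            "bin/start-dfs.sh" \
            "bin/start-mapred.sh" \
            "jps"; do
  echo "would run: $step"
  RUN_LOG="$RUN_LOG$step; "
done
```

Remember that `namenode -format` is a one-time step; re-running it on a working cluster wipes the filesystem metadata.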
>> >> >>>
>> >> >>>
>> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <ba...@gmail.com>
>> >> >>> wrote:
>> >> >>> > were you able to format hdfs properly???
>> >> >>> > I didn't get your question. Do you mean HADOOP_HOME? Or where did
>> >> >>> > I install Hadoop?
>> >> >>> >
>> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq
>> >> >>> > <do...@gmail.com>
>> >> >>> > wrote:
>> >> >>> >>
>> >> >>> >> if you are getting only this, it means your hadoop is not
>> >> >>> >> running..were you able to format hdfs properly???
>> >> >>> >>
>> >> >>> >> Regards,
>> >> >>> >>     Mohammad Tariq
>> >> >>> >>
>> >> >>> >>
>> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan
>> >> >>> >> <ba...@gmail.com>
>> >> >>> >> wrote:
>> >> >>> >> > Hi Mohammad, when I run jps in my shell I can see this result:
>> >> >>> >> > 2213 Jps
>> >> >>> >> >
>> >> >>> >> >
>> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
>> >> >>> >> > <do...@gmail.com>
>> >> >>> >> > wrote:
>> >> >>> >> >>
>> >> >>> >> >> you can also use "jps" command at your shell to see whether
>> >> >>> >> >> Hadoop
>> >> >>> >> >> processes are running or not.
>> >> >>> >> >>
>> >> >>> >> >> Regards,
>> >> >>> >> >>     Mohammad Tariq
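A sketch of checking jps output for the expected daemons. The process list below is simulated, since an output like the bare `2213 Jps` seen in this thread means none of the Hadoop daemons are actually running:

```shell
# On a healthy pseudo-distributed node, jps should list all five daemons.
# JPS_OUT is simulated here; in practice use: JPS_OUT="$(jps)"
JPS_OUT="2456 NameNode
2601 DataNode
2744 SecondaryNameNode
2890 JobTracker
3012 TaskTracker
3100 Jps"
MISSING=""
for d in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
  echo "$JPS_OUT" | grep -qw "$d" || MISSING="$MISSING $d"
done
[ -z "$MISSING" ] && echo "all daemons present" || echo "missing:$MISSING"
```

If any daemon is missing, its log file under `$HADOOP_HOME/logs` usually states why it failed to start.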
>> >> >>> >> >>
>> >> >>> >> >>
>> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
>> >> >>> >> >> <do...@gmail.com>
>> >> >>> >> >> wrote:
>> >> >>> >> >> > Hi Babak,
>> >> >>> >> >> >
>> >> >>> >> >> >  You have to type it in your web browser. Hadoop provides us
>> >> >>> >> >> > a
>> >> >>> >> >> > web
>> >> >>> >> >> > GUI
>> >> >>> >> >> > that not only allows us to browse through the file system,
>> >> >>> >> >> > but
>> >> >>> >> >> > to
>> >> >>> >> >> > download the files as well..Apart from that it also
>> >> >>> >> >> > provides a
>> >> >>> >> >> > web
>> >> >>> >> >> > GUI
>> >> >>> >> >> > that can be used to see the status of Jobtracker and
>> >> >>> >> >> > Tasktracker..When
>> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can point
>> >> >>> >> >> > your
>> >> >>> >> >> > browser to http://localhost:50030 to see the status and
>> >> >>> >> >> > logs
>> >> >>> >> >> > of
>> >> >>> >> >> > your
>> >> >>> >> >> > job.
>> >> >>> >> >> >
>> >> >>> >> >> > Regards,
>> >> >>> >> >> >     Mohammad Tariq
>> >> >>> >> >> >
>> >> >>> >> >> >
>> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
>> >> >>> >> >> > <ba...@gmail.com>
>> >> >>> >> >> > wrote:
>> >> >>> >> >> >> Thank you shashwat for the answer,
>> >> >>> >> >> >> where should I type http://localhost:50070?
>> >> >>> >> >> >> I typed here: hive>http://localhost:50070 but got nothing as a
>> >> >>> >> >> >> result
>> >> >>> >> >> >>
>> >> >>> >> >> >>
>> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>> >> >>> >> >> >> <dw...@gmail.com> wrote:
>> >> >>> >> >> >>>
>> >> >>> >> >> >>> first type http://localhost:50070 whether this is opening
>> >> >>> >> >> >>> or
>> >> >>> >> >> >>> not
>> >> >>> >> >> >>> and
>> >> >>> >> >> >>> check
>> >> >>> >> >> >>> how many nodes are available, check some of the hadoop
>> >> >>> >> >> >>> shell
>> >> >>> >> >> >>> commands
>> >> >>> >> >> >>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>> >> >>> >> >> >>> run
>> >> >>> >> >> >>> example mapreduce task on hadoop take example from here
>> >> >>> >> >> >>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>> : http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>> >> >>> >> >> >>>
>> >> >>> >> >> >>> if you can do all the above successfully, it means hadoop is
>> >> >>> >> >> >>> configured
>> >> >>> >> >> >>> correctly
>> >> >>> >> >> >>>
>> >> >>> >> >> >>> Regards
>> >> >>> >> >> >>> Shashwat
>> >> >>> >> >> >>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>> >> >>> >> >> >>> <ba...@gmail.com>
>> >> >>> >> >> >>> wrote:
>> >> >>> >> >> >>>>
>> >> >>> >> >> >>>> no, I'm not working on CDH. Is there a way to test if my
>> >> >>> >> >> >>>> Hadoop works fine or not?
>> >> >>> >> >> >>>>
>> >> >>> >> >> >>>>
>> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
>> >> >>> >> >> >>>> <be...@yahoo.com>
>> >> >>> >> >> >>>> wrote:
>> >> >>> >> >> >>>>>
>> >> >>> >> >> >>>>> Hi Babak
>> >> >>> >> >> >>>>>
>> >> >>> >> >> >>>>> You gotta follow those instructions in the apache site
>> >> >>> >> >> >>>>> to
>> >> >>> >> >> >>>>> set
>> >> >>> >> >> >>>>> up
>> >> >>> >> >> >>>>> hadoop
>> >> >>> >> >> >>>>> from scratch and ensure that hdfs is working first. You
>> >> >>> >> >> >>>>> should
>> >> >>> >> >> >>>>> be
>> >> >>> >> >> >>>>> able to
>> >> >>> >> >> >>>>> read and write files to hdfs before you do your next
>> >> >>> >> >> >>>>> steps.
>> >> >>> >> >> >>>>>
>> >> >>> >> >> >>>>> Are you on CDH or apache distribution of hadoop? If it
>> >> >>> >> >> >>>>> is
>> >> >>> >> >> >>>>> CDH
>> >> >>> >> >> >>>>> there
>> >> >>> >> >> >>>>> are
>> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
>> >> >>> >> >> >>>>>
>> >> >>> >> >> >>>>> Regards
>> >> >>> >> >> >>>>> Bejoy KS
>> >> >>> >> >> >>>>>
>> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
>> >> >>> >> >> >>>>> ________________________________
>> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
>> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
>> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
>> >> >>> >> >> >>>>>
>> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the core-site.xml
>> >> >>> >> >> >>>>> and
>> >> >>> >> >> >>>>> I
>> >> >>> >> >> >>>>> did
>> >> >>> >> >> >>>>> all
>> >> >>> >> >> >>>>> of
>> >> >>> >> >> >>>>> thing that was mentioned in the reference but no effect
>> >> >>> >> >> >>>>>
>> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>> >> >>> >> >> >>>>> <ba...@gmail.com>
>> >> >>> >> >> >>>>> wrote:
>> >> >>> >> >> >>>>>>
>> >> >>> >> >> >>>>>> Ok, sorry, that was my mistake. I thought it works, but
>> >> >>> >> >> >>>>>> no. I wrote the command without ; and then I think it
>> >> >>> >> >> >>>>>> works, but with ; at the end of the command
>> >> >>> >> >> >>>>>>
>> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>> >> >>> >> >> >>>>>>
>> >> >>> >> >> >>>>>> doesn't work
>> >> >>> >> >> >>>>>>
>> >> >>> >> >> >>>>>>
>> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
>> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
>> >> >>> >> >> >>>>>>>
>> >> >>> >> >> >>>>>>> inside configuration. all properties will be inside
>> >> >>> >> >> >>>>>>> the
>> >> >>> >> >> >>>>>>> configuration
>> >> >>> >> >> >>>>>>> tags
>> >> >>> >> >> >>>>>>>
>> >> >>> >> >> >>>>>>>
>> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>> >> >>> >> >> >>>>>>> <ba...@gmail.com>
>> >> >>> >> >> >>>>>>> wrote:
>> >> >>> >> >> >>>>>>>>
>> >> >>> >> >> >>>>>>>> Thank you so much my friend, your idea works fine
>> >> >>> >> >> >>>>>>>> (no error), you are the best :)
>> >> >>> >> >> >>>>>>>>
>> >> >>> >> >> >>>>>>>>
>> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
>> >> >>> >> >> >>>>>>>> wrote:
>> >> >>> >> >> >>>>>>>>>
>> >> >>> >> >> >>>>>>>>> It must be inside the
>> >> >>> >> >> >>>>>>>>> <configuration></configuration>
>> >> >>> >> >> >>>>>>>>> or
>> >> >>> >> >> >>>>>>>>> outside
>> >> >>> >> >> >>>>>>>>> this?
>> >> >>> >> >> >>>>>>>>>
>> >> >>> >> >> >>>>>>>>>
>> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat shriparv
>> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>> >> >>> >> >> >>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
>> >> >>> >> >> >>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
>> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
>> >> >>> >> >> >>>>>>>>>> wrote:
>> >> >>> >> >> >>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>> Thanks Shashwat, and where is this hive-site.xml?
>> >> >>> >> >> >>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat shriparv
>> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>> set
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>> <property>
>> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
>> >> >>> >> >> >>>>>>>>>>>> </property>
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
>> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>> >> >>> >> >> >>>>>>>>>>>>                <description>location of default
>> >> >>> >> >> >>>>>>>>>>>> database
>> >> >>> >> >> >>>>>>>>>>>> for
>> >> >>> >> >> >>>>>>>>>>>> the
>> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
>> >> >>> >> >> >>>>>>>>>>>>        </property>
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
>> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>> >> >>> >> >> >>>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
>> >> >>> >> >> >>>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>>> I'm new to Hive. When I try to create a test
>> >> >>> >> >> >>>>>>>>>>>>> Table in Hive I get an error. I want to run this
>> >> >>> >> >> >>>>>>>>>>>>> command:
>> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING,
>> >> >>> >> >> >>>>>>>>>>>>> Content
>> >> >>> >> >> >>>>>>>>>>>>> STRING);
>> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
>> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
>> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
>> >> >>> >> >> >>>>>>>>>>>>> exception:
>> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>> >> >>> >> >> >>>>>>>>>>>>> exist.)
>> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
>> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>> --
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>> ∞
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>> --
>> >> >>> >> >> >>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>> ∞
>> >> >>> >> >> >>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
>> >> >>> >> >> >>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>>
>> >> >>> >> >> >>>>>>>>>
>> >> >>> >> >> >>>>>>>>
>> >> >>> >> >> >>>>>>>
>> >> >>> >> >> >>>>>>>
>> >> >>> >> >> >>>>>>>
>> >> >>> >> >> >>>>>>> --
>> >> >>> >> >> >>>>>>>
>> >> >>> >> >> >>>>>>>
>> >> >>> >> >> >>>>>>> ∞
>> >> >>> >> >> >>>>>>>
>> >> >>> >> >> >>>>>>> Shashwat Shriparv
>> >> >>> >> >> >>>>>>>
>> >> >>> >> >> >>>>>>>
>> >> >>> >> >> >>>>>>
>> >> >>> >> >> >>>>>
>> >> >>> >> >> >>>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>> --
>> >> >>> >> >> >>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>> ∞
>> >> >>> >> >> >>>
>> >> >>> >> >> >>> Shashwat Shriparv
>> >> >>> >> >> >>>
>> >> >>> >> >> >>>
>> >> >>> >> >> >>
>> >> >>> >> >
>> >> >>> >> >
>> >> >>> >
>> >> >>> >
>> >> >>
>> >> >>
>> >
>> >
>
>

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
how can I get my logs, Mohammad?

On Wed, Jun 6, 2012 at 9:36 PM, Mohammad Tariq <do...@gmail.com> wrote:

> could you post your logs? That would help me in understanding the
> problem properly.
>
> Regards,
>     Mohammad Tariq
>
>
> On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <ba...@gmail.com> wrote:
> > Thank you very much Mohammad for your attention. I followed the steps but
> the
> > error is the same as the last time.
> > and there is my hosts file:
> >
> > 127.0.0.1       localhost
> > #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
> >
> >
> > # The following lines are desirable for IPv6 capable hosts
> >
> > #::1     ip6-localhost ip6-loopback
> > #fe00::0 ip6-localnet
> > #ff00::0 ip6-mcastprefix
> > #ff02::1 ip6-allnodes
> > #ff02::2 ip6-allrouters
> >
> > but no effect :(
> >
> > On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
> >>
> >> also change the permissions of these directories to 777.
> >>
> >> Regards,
> >>     Mohammad Tariq
> >>
> >>
> >> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <do...@gmail.com>
> >> wrote:
> >> > create a directory "/home/username/hdfs" (or at some place of your
> >> > choice)..inside this hdfs directory create three sub directories -
> >> > name, data, and temp, then follow these steps :
> >> >
> >> > add following properties in your core-site.xml -
> >> >
> >> > <property>
> >> >          <name>fs.default.name</name>
> >> >          <value>hdfs://localhost:9000/</value>
> >> >        </property>
> >> >
> >> >        <property>
> >> >          <name>hadoop.tmp.dir</name>
> >> >          <value>/home/mohammad/hdfs/temp</value>
> >> >        </property>
> >> >
> >> > then add following two properties in your hdfs-site.xml -
> >> >
> >> > <property>
> >> >                <name>dfs.replication</name>
> >> >                <value>1</value>
> >> >        </property>
> >> >
> >> >        <property>
> >> >                <name>dfs.name.dir</name>
> >> >                <value>/home/mohammad/hdfs/name</value>
> >> >        </property>
> >> >
> >> >        <property>
> >> >                <name>dfs.data.dir</name>
> >> >                <value>/home/mohammad/hdfs/data</value>
> >> >        </property>
> >> >
> >> > finally add this property in your mapred-site.xml -
> >> >
> >> >       <property>
> >> >          <name>mapred.job.tracker</name>
> >> >          <value>hdfs://localhost:9001</value>
> >> >        </property>
> >> >
> >> > NOTE: you can give any name to these directories of your choice, just
> >> > keep in mind you have to give same names as values of
> >> >           above specified properties in your configuration files.
> >> > (give full path of these directories, not just the name of the
> >> > directory)
> >> >
> >> > After this  follow the steps provided in the previous reply.
> >> >
> >> > Regards,
> >> >     Mohammad Tariq
> >> >
> >> >
> >> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <ba...@gmail.com>
> >> > wrote:
> >> >> thanks, Mohammad
> >> >>
> >> >> with this command:
> >> >>
> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
> >> >>
> >> >> this is my output:
> >> >>
> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
> >> >> /************************************************************
> >> >> STARTUP_MSG: Starting NameNode
> >> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
> >> >> STARTUP_MSG:   args = [-format]
> >> >> STARTUP_MSG:   version = 0.20.2
> >> >> STARTUP_MSG:   build =
> >> >> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-r
> >> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
> >> >> ************************************************************/
> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> >> >>
> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem: supergroup=supergroup
> >> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> isPermissionEnabled=true
> >> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95 saved
> in 0
> >> >> seconds.
> >> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
> >> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
> >> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
> >> >> /************************************************************
> >> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
> >> >> ************************************************************/
> >> >>
> >> >> by this command:
> >> >>
> >> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
> >> >>
> >> >> this is the output
> >> >>
> >> >> mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“
> >> >> nicht anlegen: Keine Berechtigung
> >> >>
> >> >> this output is in German; it means "mkdir: cannot create directory
> >> >> '/home/babak/Downloads/hadoop/bin/../logs': Permission denied"
> >> >>
> >> >>
> >> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq <do...@gmail.com>
> >> >> wrote:
> >> >>>
> >> >>> once we are done with the configuration, we need to format the file
> >> >>> system..use this command to do that-
> >> >>> bin/hadoop namenode -format
> >> >>>
> >> >>> after this, hadoop daemon processes should be started using
> following
> >> >>> commands -
> >> >>> bin/start-dfs.sh (it'll start NN & DN)
> >> >>> bin/start-mapred.sh (it'll start JT & TT)
> >> >>>
> >> >>> after this use jps to check if everything is alright or point your
> >> >>> browser to localhost:50070..if you further find any problem provide
> us
> >> >>> with the error logs..:)
> >> >>>
> >> >>> Regards,
> >> >>>     Mohammad Tariq
> >> >>>
> >> >>>
> >> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <ba...@gmail.com>
> >> >>> wrote:
> >> >>> > were you able to format hdfs properly???
> >> >>> > I did'nt get your question,Do you mean HADOOP_HOME? or where did I
> >> >>> > install
> >> >>> > Hadoop?
> >> >>> >
> >> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq <
> dontariq@gmail.com>
> >> >>> > wrote:
> >> >>> >>
> >> >>> >> if you are getting only this, it means your hadoop is not
> >> >>> >> running..were you able to format hdfs properly???
> >> >>> >>
> >> >>> >> Regards,
> >> >>> >>     Mohammad Tariq
> >> >>> >>
> >> >>> >>
> >> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan <
> babakbsn@gmail.com>
> >> >>> >> wrote:
> >> >>> >> > Hi MohammadmI irun jps in my shel I can see this result:
> >> >>> >> > 2213 Jps
> >> >>> >> >
> >> >>> >> >
> >> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
> >> >>> >> > <do...@gmail.com>
> >> >>> >> > wrote:
> >> >>> >> >>
> >> >>> >> >> you can also use "jps" command at your shell to see whether
> >> >>> >> >> Hadoop
> >> >>> >> >> processes are running or not.
> >> >>> >> >>
> >> >>> >> >> Regards,
> >> >>> >> >>     Mohammad Tariq
> >> >>> >> >>
> >> >>> >> >>
> >> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
> >> >>> >> >> <do...@gmail.com>
> >> >>> >> >> wrote:
> >> >>> >> >> > Hi Babak,
> >> >>> >> >> >
> >> >>> >> >> >  You have to type it in you web browser..Hadoop provides us
> a
> >> >>> >> >> > web
> >> >>> >> >> > GUI
> >> >>> >> >> > that not only allows us to browse through the file system,
> but
> >> >>> >> >> > to
> >> >>> >> >> > download the files as well..Apart from that it also
> provides a
> >> >>> >> >> > web
> >> >>> >> >> > GUI
> >> >>> >> >> > that can be used to see the status of Jobtracker and
> >> >>> >> >> > Tasktracker..When
> >> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can point
> >> >>> >> >> > your
> >> >>> >> >> > browser to http://localhost:50030 to see the status and
> logs
> >> >>> >> >> > of
> >> >>> >> >> > your
> >> >>> >> >> > job.
> >> >>> >> >> >
> >> >>> >> >> > Regards,
> >> >>> >> >> >     Mohammad Tariq
> >> >>> >> >> >
> >> >>> >> >> >
> >> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
> >> >>> >> >> > <ba...@gmail.com>
> >> >>> >> >> > wrote:
> >> >>> >> >> >> Thank you shashwat for the answer,
> >> >>> >> >> >> where should I type http://localhost:50070?
> >> >>> >> >> >> I typed here: hive>http://localhost:50070 but nothing as
> >> >>> >> >> >> result
> >> >>> >> >> >>
> >> >>> >> >> >>
> >> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
> >> >>> >> >> >> <dw...@gmail.com> wrote:
> >> >>> >> >> >>>
> >> >>> >> >> >>> first type http://localhost:50070 whether this is
> opening or
> >> >>> >> >> >>> not
> >> >>> >> >> >>> and
> >> >>> >> >> >>> check
> >> >>> >> >> >>> how many nodes are available, check some of the hadoop
> shell
> >> >>> >> >> >>> commands
> >> >>> >> >> >>>
> >> >>> >> >> >>>
> >> >>> >> >> >>> from
> http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
> >> >>> >> >> >>> run
> >> >>> >> >> >>> example mapreduce task on hadoop take example from here
> >> >>> >> >> >>>
> >> >>> >> >> >>>
> >> >>> >> >> >>>
> >> >>> >> >> >>>
> >> >>> >> >> >>> :
> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
> >> >>> >> >> >>>
> >> >>> >> >> >>> if all the above you can do sucessfully means hadoop is
> >> >>> >> >> >>> configured
> >> >>> >> >> >>> correctly
> >> >>> >> >> >>>
> >> >>> >> >> >>> Regards
> >> >>> >> >> >>> Shashwat
> >> >>> >> >> >>>
> >> >>> >> >> >>>
> >> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
> >> >>> >> >> >>> <ba...@gmail.com>
> >> >>> >> >> >>> wrote:
> >> >>> >> >> >>>>
> >> >>> >> >> >>>> no I'm not working on CDH.Is there a way to test if my
> >> >>> >> >> >>>> Hadoop
> >> >>> >> >> >>>> works
> >> >>> >> >> >>>> fine
> >> >>> >> >> >>>> or not?
> >> >>> >> >> >>>>
> >> >>> >> >> >>>>
> >> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
> >> >>> >> >> >>>> <be...@yahoo.com>
> >> >>> >> >> >>>> wrote:
> >> >>> >> >> >>>>>
> >> >>> >> >> >>>>> Hi Babak
> >> >>> >> >> >>>>>
> >> >>> >> >> >>>>> You gotta follow those instructions in the apace site to
> >> >>> >> >> >>>>> set
> >> >>> >> >> >>>>> up
> >> >>> >> >> >>>>> hadoop
> >> >>> >> >> >>>>> from scratch and ensure that hdfs is working first. You
> >> >>> >> >> >>>>> should
> >> >>> >> >> >>>>> be
> >> >>> >> >> >>>>> able to
> >> >>> >> >> >>>>> read and write files to hdfs before you do your next
> >> >>> >> >> >>>>> steps.
> >> >>> >> >> >>>>>
> >> >>> >> >> >>>>> Are you on CDH or apache distribution of hadoop? If it
> is
> >> >>> >> >> >>>>> CDH
> >> >>> >> >> >>>>> there
> >> >>> >> >> >>>>> are
> >> >>> >> >> >>>>> detailed instructions on Cloudera web site.
> >> >>> >> >> >>>>>
> >> >>> >> >> >>>>> Regards
> >> >>> >> >> >>>>> Bejoy KS
> >> >>> >> >> >>>>>
> >> >>> >> >> >>>>> Sent from handheld, please excuse typos.
> >> >>> >> >> >>>>> ________________________________
> >> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
> >> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
> >> >>> >> >> >>>>> To: <us...@hive.apache.org>
> >> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
> >> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
> >> >>> >> >> >>>>>
> >> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the core-site.xml
> and
> >> >>> >> >> >>>>> I
> >> >>> >> >> >>>>> did
> >> >>> >> >> >>>>> all
> >> >>> >> >> >>>>> of
> >> >>> >> >> >>>>> thing that was mentioned in the reference but no effect
> >> >>> >> >> >>>>>
> >> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
> >> >>> >> >> >>>>> <ba...@gmail.com>
> >> >>> >> >> >>>>> wrote:
> >> >>> >> >> >>>>>>
> >> >>> >> >> >>>>>> Ok sorry but that was my Mistake .I thought it works
> but
> >> >>> >> >> >>>>>> no.
> >> >>> >> >> >>>>>> I wrote the command without ; and then I think It works
> >> >>> >> >> >>>>>> but
> >> >>> >> >> >>>>>> with
> >> >>> >> >> >>>>>> ;
> >> >>> >> >> >>>>>> at
> >> >>> >> >> >>>>>> the end of command
> >> >>> >> >> >>>>>>
> >> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
> >> >>> >> >> >>>>>>
> >> >>> >> >> >>>>>> does'nt work
> >> >>> >> >> >>>>>>
> >> >>> >> >> >>>>>>
> >> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
> >> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
> >> >>> >> >> >>>>>>>
> >> >>> >> >> >>>>>>> inside configuration. all properties will be inside
> the
> >> >>> >> >> >>>>>>> configuration
> >> >>> >> >> >>>>>>> tags
> >> >>> >> >> >>>>>>>
> >> >>> >> >> >>>>>>>
> >> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
> >> >>> >> >> >>>>>>> <ba...@gmail.com>
> >> >>> >> >> >>>>>>> wrote:
> >> >>> >> >> >>>>>>>>
> >> >>> >> >> >>>>>>>> Thank you so much my friend your idee works fine(no
> >> >>> >> >> >>>>>>>> error)
> >> >>> >> >> >>>>>>>> you
> >> >>> >> >> >>>>>>>> are
> >> >>> >> >> >>>>>>>> the best :)
> >> >>> >> >> >>>>>>>>
> >> >>> >> >> >>>>>>>>
> >> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
> >> >>> >> >> >>>>>>>> <ba...@gmail.com>
> >> >>> >> >> >>>>>>>> wrote:
> >> >>> >> >> >>>>>>>>>
> >> >>> >> >> >>>>>>>>> It must be inside the
> <configuration></configuration>
> >> >>> >> >> >>>>>>>>> or
> >> >>> >> >> >>>>>>>>> outside
> >> >>> >> >> >>>>>>>>> this?
> >> >>> >> >> >>>>>>>>>
> >> >>> >> >> >>>>>>>>>
> >> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat shriparv
> >> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
> >> >>> >> >> >>>>>>>>>>
> >> >>> >> >> >>>>>>>>>> It will be inside hive/conf
> >> >>> >> >> >>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>
> >> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
> >> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
> >> >>> >> >> >>>>>>>>>> wrote:
> >> >>> >> >> >>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>> Thanks sShashwat, and where is this hive-site.xml
> >> >>> >> >> >>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat shriparv
> >> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>> set
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>> <property>
> >> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
> >> >>> >> >> >>>>>>>>>>>>   <value>true</value>
> >> >>> >> >> >>>>>>>>>>>> </property>
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
> >> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
> >> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
> >> >>> >> >> >>>>>>>>>>>>                <description>location of default
> >> >>> >> >> >>>>>>>>>>>> database
> >> >>> >> >> >>>>>>>>>>>> for
> >> >>> >> >> >>>>>>>>>>>> the
> >> >>> >> >> >>>>>>>>>>>> warehouse</description>
> >> >>> >> >> >>>>>>>>>>>>        </property>
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
> >> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
> >> >>> >> >> >>>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
> >> >>> >> >> >>>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a test Table
> >> >>> >> >> >>>>>>>>>>>>> in
> >> >>> >> >> >>>>>>>>>>>>> Hive
> >> >>> >> >> >>>>>>>>>>>>> I
> >> >>> >> >> >>>>>>>>>>>>> get
> >> >>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
> >> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING,
> >> >>> >> >> >>>>>>>>>>>>> Content
> >> >>> >> >> >>>>>>>>>>>>> STRING);
> >> >>> >> >> >>>>>>>>>>>>> but this error occured:
> >> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
> >> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
> >> >>> >> >> >>>>>>>>>>>>> exception:
> >> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
> >> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
> >> >>> >> >> >>>>>>>>>>>>> exist.)
> >> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
> >> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
> >> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
> >> >>> >> >> >>>>>>>>>>>>> Thank you so much
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>> --
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>> ∞
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>
> >> >>> >> >> >>>>>>>>>> --
> >> >>> >> >> >>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>
> >> >>> >> >> >>>>>>>>>> ∞
> >> >>> >> >> >>>>>>>>>>
> >> >>> >> >> >>>>>>>>>> Shashwat Shriparv
> >> >>> >> >> >>>>>>>>>>
> >> >>> >> >> >>>>>>>>>>
> >> >>> >> >> >>>>>>>>>
> >> >>> >> >> >>>>>>>>
> >> >>> >> >> >>>>>>>
> >> >>> >> >> >>>>>>>
> >> >>> >> >> >>>>>>>
> >> >>> >> >> >>>>>>> --
> >> >>> >> >> >>>>>>>
> >> >>> >> >> >>>>>>>
> >> >>> >> >> >>>>>>> ∞
> >> >>> >> >> >>>>>>>
> >> >>> >> >> >>>>>>> Shashwat Shriparv
> >> >>> >> >> >>>>>>>
> >> >>> >> >> >>>>>>>
> >> >>> >> >> >>>>>>
> >> >>> >> >> >>>>>
> >> >>> >> >> >>>>
> >> >>> >> >> >>>
> >> >>> >> >> >>>
> >> >>> >> >> >>>
> >> >>> >> >> >>> --
> >> >>> >> >> >>>
> >> >>> >> >> >>>
> >> >>> >> >> >>> ∞
> >> >>> >> >> >>>
> >> >>> >> >> >>> Shashwat Shriparv
> >> >>> >> >> >>>
> >> >>> >> >> >>>
> >> >>> >> >> >>
> >> >>> >> >
> >> >>> >> >
> >> >>> >
> >> >>> >
> >> >>
> >> >>
> >
> >
>

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
Could you post your logs? That would help me understand the problem properly.

Regards,
    Mohammad Tariq
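
[Editorial note: the logs being requested are the per-daemon files under the
installation's logs/ directory. The sketch below shows where to look; the file
names follow Hadoop 0.20.x conventions, and the temporary directory is purely
illustrative, not part of any real installation.]

```shell
# Hadoop 0.20.x writes one log file per daemon to $HADOOP_HOME/logs,
# named hadoop-<user>-<daemon>-<host>.log. Simulate that layout here:
logdir=$(mktemp -d)
touch "$logdir/hadoop-babak-namenode-ubuntu.log"
touch "$logdir/hadoop-babak-datanode-ubuntu.log"
ls "$logdir" | sort

# On a real installation the equivalent commands would be:
#   ls $HADOOP_HOME/logs
#   tail -n 100 $HADOOP_HOME/logs/hadoop-$USER-namenode-$(hostname).log
```

The NameNode log in particular usually pinpoints why a daemon refused to
start.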


On Thu, Jun 7, 2012 at 1:02 AM, Babak Bastan <ba...@gmail.com> wrote:
> Thank you very much, Mohammad, for your attention. I followed the steps, but
> the error is the same as last time.
> Here is my hosts file:
>
> 127.0.0.1       localhost
> #127.0.0.1      ubuntu.ubuntu-domain    ubuntu
>
>
> # The following lines are desirable for IPv6 capable hosts
>
> #::1     ip6-localhost ip6-loopback
> #fe00::0 ip6-localnet
> #ff00::0 ip6-mcastprefix
> #ff02::1 ip6-allnodes
> #ff02::2 ip6-allrouters
>
> but no effect :(
>
> On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>
>> also change the permissions of these directories to 777.
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <do...@gmail.com>
>> wrote:
>> > create a directory "/home/username/hdfs" (or at some place of your
>> > choice)..inside this hdfs directory create three sub directories -
>> > name, data, and temp, then follow these steps :
>> >
>> > add following properties in your core-site.xml -
>> >
>> > <property>
>> >          <name>fs.default.name</name>
>> >          <value>hdfs://localhost:9000/</value>
>> >        </property>
>> >
>> >        <property>
>> >          <name>hadoop.tmp.dir</name>
>> >          <value>/home/mohammad/hdfs/temp</value>
>> >        </property>
>> >
>> > then add following two properties in your hdfs-site.xml -
>> >
>> > <property>
>> >                <name>dfs.replication</name>
>> >                <value>1</value>
>> >        </property>
>> >
>> >        <property>
>> >                <name>dfs.name.dir</name>
>> >                <value>/home/mohammad/hdfs/name</value>
>> >        </property>
>> >
>> >        <property>
>> >                <name>dfs.data.dir</name>
>> >                <value>/home/mohammad/hdfs/data</value>
>> >        </property>
>> >
>> > finally add this property in your mapred-site.xml -
>> >
>> >       <property>
>> >          <name>mapred.job.tracker</name>
>> >          <value>hdfs://localhost:9001</value>
>> >        </property>
>> >
>> > NOTE: you can give any name to these directories of your choice, just
>> > keep in mind you have to give same names as values of
>> >           above specified properties in your configuration files.
>> > (give full path of these directories, not just the name of the
>> > directory)
>> >
>> > After this  follow the steps provided in the previous reply.
>> >
>> > Regards,
>> >     Mohammad Tariq
>> >
>> >
>> > On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <ba...@gmail.com>
>> > wrote:
> >> Thanks, Mohammad.
>> >>
>> >> with this command:
>> >>
>> >> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>> >>
>> >> this is my output:
>> >>
>> >> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>> >> /************************************************************
>> >> STARTUP_MSG: Starting NameNode
>> >> STARTUP_MSG:   host = ubuntu/127.0.1.1
>> >> STARTUP_MSG:   args = [-format]
>> >> STARTUP_MSG:   version = 0.20.2
>> >> STARTUP_MSG:   build =
>> >> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r
>> >> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>> >> ************************************************************/
>> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> >> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem: supergroup=supergroup
>> >> 12/06/06 20:05:20 INFO namenode.FSNamesystem: isPermissionEnabled=true
>> >> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95 saved in 0
>> >> seconds.
>> >> 12/06/06 20:05:20 INFO common.Storage: Storage directory
>> >> /tmp/hadoop-babak/dfs/name has been successfully formatted.
>> >> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>> >> /************************************************************
>> >> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>> >> ************************************************************/
>> >>
>> >> by this command:
>> >>
>> >> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>> >>
>> >> this is the out put
>> >>
>> >> mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“
>> >> nicht
>> >> anlegen: Keine Berechtigung
>> >>
>> >> This output is in German; it means the logs directory cannot be created:
>> >> permission denied.
>> >>
>> >>
>> >> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq <do...@gmail.com>
>> >> wrote:
>> >>>
>> >>> once we are done with the configuration, we need to format the file
>> >>> system..use this command to do that-
>> >>> bin/hadoop namenode -format
>> >>>
>> >>> after this, hadoop daemon processes should be started using following
>> >>> commands -
>> >>> bin/start-dfs.sh (it'll start NN & DN)
>> >>> bin/start-mapred.sh (it'll start JT & TT)
>> >>>
>> >>> after this use jps to check if everything is alright or point your
>> >>> browser to localhost:50070..if you further find any problem provide us
>> >>> with the error logs..:)
>> >>>
>> >>> Regards,
>> >>>     Mohammad Tariq
>> >>>
>> >>>
>> >>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <ba...@gmail.com>
>> >>> wrote:
>> >>> > were you able to format hdfs properly???
>> >>> > I didn't get your question. Do you mean HADOOP_HOME, or where I
>> >>> > installed
>> >>> > Hadoop?
>> >>> >
>> >>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq <do...@gmail.com>
>> >>> > wrote:
>> >>> >>
>> >>> >> if you are getting only this, it means your hadoop is not
>> >>> >> running..were you able to format hdfs properly???
>> >>> >>
>> >>> >> Regards,
>> >>> >>     Mohammad Tariq
>> >>> >>
>> >>> >>
>> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan <ba...@gmail.com>
>> >>> >> wrote:
>> >>> >> > Hi Mohammad, if I run jps in my shell I can see this result:
>> >>> >> > 2213 Jps
>> >>> >> >
>> >>> >> >
>> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq
>> >>> >> > <do...@gmail.com>
>> >>> >> > wrote:
>> >>> >> >>
>> >>> >> >> you can also use "jps" command at your shell to see whether
>> >>> >> >> Hadoop
>> >>> >> >> processes are running or not.
>> >>> >> >>
>> >>> >> >> Regards,
>> >>> >> >>     Mohammad Tariq
>> >>> >> >>
>> >>> >> >>
>> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq
>> >>> >> >> <do...@gmail.com>
>> >>> >> >> wrote:
>> >>> >> >> > Hi Babak,
>> >>> >> >> >
>> >>> >> >> >  You have to type it in your web browser..Hadoop provides us a
>> >>> >> >> > web
>> >>> >> >> > GUI
>> >>> >> >> > that not only allows us to browse through the file system, but
>> >>> >> >> > to
>> >>> >> >> > download the files as well..Apart from that it also provides a
>> >>> >> >> > web
>> >>> >> >> > GUI
>> >>> >> >> > that can be used to see the status of Jobtracker and
>> >>> >> >> > Tasktracker..When
>> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can point
>> >>> >> >> > your
>> >>> >> >> > browser to http://localhost:50030 to see the status and logs
>> >>> >> >> > of
>> >>> >> >> > your
>> >>> >> >> > job.
>> >>> >> >> >
>> >>> >> >> > Regards,
>> >>> >> >> >     Mohammad Tariq
>> >>> >> >> >
>> >>> >> >> >
>> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan
>> >>> >> >> > <ba...@gmail.com>
>> >>> >> >> > wrote:
>> >>> >> >> >> Thank you shashwat for the answer,
>> >>> >> >> >> where should I type http://localhost:50070?
>> >>> >> >> >> I typed here: hive>http://localhost:50070 but got nothing as a
>> >>> >> >> >> result
>> >>> >> >> >>
>> >>> >> >> >>
>> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>> >>> >> >> >> <dw...@gmail.com> wrote:
>> >>> >> >> >>>
>> >>> >> >> >>> first type http://localhost:50070 whether this is opening or
>> >>> >> >> >>> not
>> >>> >> >> >>> and
>> >>> >> >> >>> check
>> >>> >> >> >>> how many nodes are available, check some of the hadoop shell
>> >>> >> >> >>> commands
>> >>> >> >> >>>
>> >>> >> >> >>>
>> >>> >> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>> >>> >> >> >>> run
>> >>> >> >> >>> example mapreduce task on hadoop take example from here
>> >>> >> >> >>>
>> >>> >> >> >>>
>> >>> >> >> >>>
>> >>> >> >> >>>
>> >>> >> >> >>> : http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>> >>> >> >> >>>
>> >>> >> >> >>> If you can do all of the above successfully, it means Hadoop is
>> >>> >> >> >>> configured
>> >>> >> >> >>> correctly
>> >>> >> >> >>>
>> >>> >> >> >>> Regards
>> >>> >> >> >>> Shashwat
>> >>> >> >> >>>
>> >>> >> >> >>>
>> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>> >>> >> >> >>> <ba...@gmail.com>
>> >>> >> >> >>> wrote:
>> >>> >> >> >>>>
>> >>> >> >> >>>> No, I'm not working on CDH. Is there a way to test if my
>> >>> >> >> >>>> Hadoop
>> >>> >> >> >>>> works
>> >>> >> >> >>>> fine
>> >>> >> >> >>>> or not?
>> >>> >> >> >>>>
>> >>> >> >> >>>>
>> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS
>> >>> >> >> >>>> <be...@yahoo.com>
>> >>> >> >> >>>> wrote:
>> >>> >> >> >>>>>
>> >>> >> >> >>>>> Hi Babak
>> >>> >> >> >>>>>
>> >>> >> >> >>>>> You gotta follow those instructions on the Apache site to
>> >>> >> >> >>>>> set
>> >>> >> >> >>>>> up
>> >>> >> >> >>>>> hadoop
>> >>> >> >> >>>>> from scratch and ensure that hdfs is working first. You
>> >>> >> >> >>>>> should
>> >>> >> >> >>>>> be
>> >>> >> >> >>>>> able to
>> >>> >> >> >>>>> read and write files to hdfs before you do your next
>> >>> >> >> >>>>> steps.
>> >>> >> >> >>>>>
>> >>> >> >> >>>>> Are you on CDH or the Apache distribution of Hadoop? If it is
>> >>> >> >> >>>>> CDH
>> >>> >> >> >>>>> there
>> >>> >> >> >>>>> are
>> >>> >> >> >>>>> detailed instructions on Cloudera web site.
>> >>> >> >> >>>>>
>> >>> >> >> >>>>> Regards
>> >>> >> >> >>>>> Bejoy KS
>> >>> >> >> >>>>>
>> >>> >> >> >>>>> Sent from handheld, please excuse typos.
>> >>> >> >> >>>>> ________________________________
>> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>> >>> >> >> >>>>> To: <us...@hive.apache.org>
>> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
>> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
>> >>> >> >> >>>>>
>> >>> >> >> >>>>> @Bejoy: I set fs.default.name in core-site.xml and I did
>> >>> >> >> >>>>> all of the things that were mentioned in the reference, but to
>> >>> >> >> >>>>> no effect
>> >>> >> >> >>>>>
>> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>> >>> >> >> >>>>> <ba...@gmail.com>
>> >>> >> >> >>>>> wrote:
>> >>> >> >> >>>>>>
>> >>> >> >> >>>>>> OK, sorry, but that was my mistake. I thought it worked, but
>> >>> >> >> >>>>>> no. I wrote the command without the ; and it seemed to work,
>> >>> >> >> >>>>>> but with a ; at the end of the command
>> >>> >> >> >>>>>>
>> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>> >>> >> >> >>>>>>
>> >>> >> >> >>>>>> doesn't work
>> >>> >> >> >>>>>>
>> >>> >> >> >>>>>>
>> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
>> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
>> >>> >> >> >>>>>>>
>> >>> >> >> >>>>>>> inside configuration. all properties will be inside the
>> >>> >> >> >>>>>>> configuration
>> >>> >> >> >>>>>>> tags
>> >>> >> >> >>>>>>>
>> >>> >> >> >>>>>>>
>> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>> >>> >> >> >>>>>>> <ba...@gmail.com>
>> >>> >> >> >>>>>>> wrote:
>> >>> >> >> >>>>>>>>
>> >>> >> >> >>>>>>>> Thank you so much, my friend, your idea works fine (no
>> >>> >> >> >>>>>>>> error)
>> >>> >> >> >>>>>>>> you
>> >>> >> >> >>>>>>>> are
>> >>> >> >> >>>>>>>> the best :)
>> >>> >> >> >>>>>>>>
>> >>> >> >> >>>>>>>>
>> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>> >>> >> >> >>>>>>>> <ba...@gmail.com>
>> >>> >> >> >>>>>>>> wrote:
>> >>> >> >> >>>>>>>>>
>> >>> >> >> >>>>>>>>> It must be inside the <configuration></configuration>
>> >>> >> >> >>>>>>>>> or
>> >>> >> >> >>>>>>>>> outside
>> >>> >> >> >>>>>>>>> this?
>> >>> >> >> >>>>>>>>>
>> >>> >> >> >>>>>>>>>
>> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat shriparv
>> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>> >>> >> >> >>>>>>>>>>
>> >>> >> >> >>>>>>>>>> It will be inside hive/conf
>> >>> >> >> >>>>>>>>>>
>> >>> >> >> >>>>>>>>>>
>> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
>> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
>> >>> >> >> >>>>>>>>>> wrote:
>> >>> >> >> >>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>> Thanks Shashwat, and where is this hive-site.xml?
>> >>> >> >> >>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat shriparv
>> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>> set
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>> <property>
>> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>> >>> >> >> >>>>>>>>>>>>   <value>true</value>
>> >>> >> >> >>>>>>>>>>>> </property>
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
>> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>> >>> >> >> >>>>>>>>>>>>                <description>location of default
>> >>> >> >> >>>>>>>>>>>> database
>> >>> >> >> >>>>>>>>>>>> for
>> >>> >> >> >>>>>>>>>>>> the
>> >>> >> >> >>>>>>>>>>>> warehouse</description>
>> >>> >> >> >>>>>>>>>>>>        </property>
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
>> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>> >>> >> >> >>>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
>> >>> >> >> >>>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>>> I'm new to Hive. When I try to create a test table
>> >>> >> >> >>>>>>>>>>>>> in Hive I get an error.
>> >>> >> >> >>>>>>>>>>>>> I want to run this command:
>> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING,
>> >>> >> >> >>>>>>>>>>>>> Content
>> >>> >> >> >>>>>>>>>>>>> STRING);
>> >>> >> >> >>>>>>>>>>>>> but this error occured:
>> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
>> >>> >> >> >>>>>>>>>>>>> MetaException(message:Got
>> >>> >> >> >>>>>>>>>>>>> exception:
>> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>> >>> >> >> >>>>>>>>>>>>> exist.)
>> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
>> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>> >>> >> >> >>>>>>>>>>>>> Thank you so much
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>> --
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>> ∞
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>>
>> >>> >> >> >>>>>>>>>>
>> >>> >> >> >>>>>>>>>>
>> >>> >> >> >>>>>>>>>>
>> >>> >> >> >>>>>>>>>> --
>> >>> >> >> >>>>>>>>>>
>> >>> >> >> >>>>>>>>>>
>> >>> >> >> >>>>>>>>>> ∞
>> >>> >> >> >>>>>>>>>>
>> >>> >> >> >>>>>>>>>> Shashwat Shriparv
>> >>> >> >> >>>>>>>>>>
>> >>> >> >> >>>>>>>>>>
>> >>> >> >> >>>>>>>>>
>> >>> >> >> >>>>>>>>
>> >>> >> >> >>>>>>>
>> >>> >> >> >>>>>>>
>> >>> >> >> >>>>>>>
>> >>> >> >> >>>>>>> --
>> >>> >> >> >>>>>>>
>> >>> >> >> >>>>>>>
>> >>> >> >> >>>>>>> ∞
>> >>> >> >> >>>>>>>
>> >>> >> >> >>>>>>> Shashwat Shriparv
>> >>> >> >> >>>>>>>
>> >>> >> >> >>>>>>>
>> >>> >> >> >>>>>>
>> >>> >> >> >>>>>
>> >>> >> >> >>>>
>> >>> >> >> >>>
>> >>> >> >> >>>
>> >>> >> >> >>>
>> >>> >> >> >>> --
>> >>> >> >> >>>
>> >>> >> >> >>>
>> >>> >> >> >>> ∞
>> >>> >> >> >>>
>> >>> >> >> >>> Shashwat Shriparv
>> >>> >> >> >>>
>> >>> >> >> >>>
>> >>> >> >> >>
>> >>> >> >
>> >>> >> >
>> >>> >
>> >>> >
>> >>
>> >>
>
>
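
[Editorial note: pulling the advice in this thread together, the single-node
bring-up can be sketched as follows. Only the directory setup is executed
here; the Hadoop commands are the ones quoted in the messages above and
assume the 0.20.x tarball lives at ~/hadoop with the XML files edited as
shown.]

```shell
# 1. Local directories backing the pseudo-distributed HDFS
#    (referenced by dfs.name.dir, dfs.data.dir and hadoop.tmp.dir)
HDFS_ROOT="$HOME/hdfs"
mkdir -p "$HDFS_ROOT/name" "$HDFS_ROOT/data" "$HDFS_ROOT/temp"
chmod -R u+rwX "$HDFS_ROOT"
ls "$HDFS_ROOT" | sort     # lists data, name, temp

# 2. One-time format, then start the daemons (not executed in this sketch):
#      ~/hadoop/bin/hadoop namenode -format
#      ~/hadoop/bin/start-dfs.sh        # NameNode + DataNode
#      ~/hadoop/bin/start-mapred.sh     # JobTracker + TaskTracker

# 3. Verify: `jps` should list NameNode, DataNode, SecondaryNameNode,
#    JobTracker and TaskTracker, and http://localhost:50070 should load.
#    Only once HDFS reads and writes work is Hive's CREATE TABLE likely
#    to succeed.
```

If start-dfs.sh fails with a permission error like the German mkdir message
quoted in this thread, fix ownership of the installation first, for example
with sudo chown -R $USER ~/Downloads/hadoop.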

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
Thank you very much, Mohammad, for your attention. I followed the steps, but
the error is the same as last time.
Here is my hosts file:

127.0.0.1       localhost
#127.0.0.1      ubuntu.ubuntu-domain    ubuntu


# The following lines are desirable for IPv6 capable hosts

#::1     ip6-localhost ip6-loopback
#fe00::0 ip6-localnet
#ff00::0 ip6-mcastprefix
#ff02::1 ip6-allnodes
#ff02::2 ip6-allrouters

but no effect :(
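
[Editorial note: one thing worth checking in the hosts file above is that the
line mapping the machine's hostname ("ubuntu") is commented out, while the
NameNode log earlier in the thread reports "host = ubuntu/127.0.1.1", so the
daemons do resolve that name. The resolves helper below is not a Hadoop tool,
just an illustrative sketch of that lookup against the quoted file.]

```shell
# The hosts file quoted above (comments included)
hosts='127.0.0.1       localhost
#127.0.0.1      ubuntu.ubuntu-domain    ubuntu'

# Report whether a name appears on any uncommented line
resolves() {
  printf '%s\n' "$hosts" \
    | grep -v '^[[:space:]]*#' \
    | grep -qw -- "$1" && echo yes || echo no
}

resolves localhost   # prints: yes
resolves ubuntu      # prints: no  (the mapping is commented out)
```

If the hostname really does not resolve on the machine, either restore a line
such as "127.0.0.1 ubuntu" or make fs.default.name point at an address that
does resolve; otherwise the daemons can fail to bind even though the XML
configuration itself is correct.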

On Wed, Jun 6, 2012 at 8:25 PM, Mohammad Tariq <do...@gmail.com> wrote:

> also change the permissions of these directories to 777.
>
> Regards,
>     Mohammad Tariq
>
>
> On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
> > create a directory "/home/username/hdfs" (or at some place of your
> > choice)..inside this hdfs directory create three sub directories -
> > name, data, and temp, then follow these steps :
> >
> > add following properties in your core-site.xml -
> >
> > <property>
> >          <name>fs.default.name</name>
> >          <value>hdfs://localhost:9000/</value>
> >        </property>
> >
> >        <property>
> >          <name>hadoop.tmp.dir</name>
> >          <value>/home/mohammad/hdfs/temp</value>
> >        </property>
> >
> > then add following two properties in your hdfs-site.xml -
> >
> > <property>
> >                <name>dfs.replication</name>
> >                <value>1</value>
> >        </property>
> >
> >        <property>
> >                <name>dfs.name.dir</name>
> >                <value>/home/mohammad/hdfs/name</value>
> >        </property>
> >
> >        <property>
> >                <name>dfs.data.dir</name>
> >                <value>/home/mohammad/hdfs/data</value>
> >        </property>
> >
> > finally add this property in your mapred-site.xml -
> >
> >       <property>
> >          <name>mapred.job.tracker</name>
> >          <value>hdfs://localhost:9001</value>
> >        </property>
> >
> > NOTE: you can give any name to these directories of your choice, just
> > keep in mind you have to give same names as values of
> >           above specified properties in your configuration files.
> > (give full path of these directories, not just the name of the
> > directory)
> >
> > After this  follow the steps provided in the previous reply.
> >
> > Regards,
> >     Mohammad Tariq
> >
> >>> >>
> >>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan <ba...@gmail.com>
> >>> >> wrote:
> >>> >> > Hi Mohammad, if I run jps in my shell I can see this result:
> >>> >> > 2213 Jps
> >>> >> >
> >>> >> >
> >>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq <
> dontariq@gmail.com>
> >>> >> > wrote:
> >>> >> >>
> >>> >> >> you can also use "jps" command at your shell to see whether
> Hadoop
> >>> >> >> processes are running or not.
> >>> >> >>
> >>> >> >> Regards,
> >>> >> >>     Mohammad Tariq
> >>> >> >>
> >>> >> >>
> >>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq <
> dontariq@gmail.com>
> >>> >> >> wrote:
> >>> >> >> > Hi Babak,
> >>> >> >> >
> >>> >> >> >  You have to type it in you web browser..Hadoop provides us a
> web
> >>> >> >> > GUI
> >>> >> >> > that not only allows us to browse through the file system, but
> to
> >>> >> >> > download the files as well..Apart from that it also provides a
> web
> >>> >> >> > GUI
> >>> >> >> > that can be used to see the status of Jobtracker and
> >>> >> >> > Tasktracker..When
> >>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can point
> your
> >>> >> >> > browser to http://localhost:50030 to see the status and logs
> of
> >>> >> >> > your
> >>> >> >> > job.
> >>> >> >> >
> >>> >> >> > Regards,
> >>> >> >> >     Mohammad Tariq
> >>> >> >> >
> >>> >> >> >
> >>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan <
> babakbsn@gmail.com>
> >>> >> >> > wrote:
> >>> >> >> >> Thank you shashwat for the answer,
> >>> >> >> >> where should I type http://localhost:50070?
> >>> >> >> >> I typed here: hive>http://localhost:50070 but nothing as
> result
> >>> >> >> >>
> >>> >> >> >>
> >>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
> >>> >> >> >> <dw...@gmail.com> wrote:
> >>> >> >> >>>
> >>> >> >> >>> first type http://localhost:50070 whether this is opening
> or not
> >>> >> >> >>> and
> >>> >> >> >>> check
> >>> >> >> >>> how many nodes are available, check some of the hadoop shell
> >>> >> >> >>> commands
> >>> >> >> >>>
> >>> >> >> >>> from
> http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
> >>> >> >> >>> run
> >>> >> >> >>> example mapreduce task on hadoop take example from here
> >>> >> >> >>>
> >>> >> >> >>>
> >>> >> >> >>>
> >>> >> >> >>> :
> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
> >>> >> >> >>>
> >>> >> >> >>> if all the above you can do sucessfully means hadoop is
> >>> >> >> >>> configured
> >>> >> >> >>> correctly
> >>> >> >> >>>
> >>> >> >> >>> Regards
> >>> >> >> >>> Shashwat
> >>> >> >> >>>
> >>> >> >> >>>
> >>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
> >>> >> >> >>> <ba...@gmail.com>
> >>> >> >> >>> wrote:
> >>> >> >> >>>>
> >>> >> >> >>>> no I'm not working on CDH.Is there a way to test if my
> Hadoop
> >>> >> >> >>>> works
> >>> >> >> >>>> fine
> >>> >> >> >>>> or not?
> >>> >> >> >>>>
> >>> >> >> >>>>
> >>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS <
> bejoy_ks@yahoo.com>
> >>> >> >> >>>> wrote:
> >>> >> >> >>>>>
> >>> >> >> >>>>> Hi Babak
> >>> >> >> >>>>>
> >>> >> >> >>>>> You gotta follow those instructions in the apace site to
> set
> >>> >> >> >>>>> up
> >>> >> >> >>>>> hadoop
> >>> >> >> >>>>> from scratch and ensure that hdfs is working first. You
> should
> >>> >> >> >>>>> be
> >>> >> >> >>>>> able to
> >>> >> >> >>>>> read and write files to hdfs before you do your next steps.
> >>> >> >> >>>>>
> >>> >> >> >>>>> Are you on CDH or apache distribution of hadoop? If it is
> CDH
> >>> >> >> >>>>> there
> >>> >> >> >>>>> are
> >>> >> >> >>>>> detailed instructions on Cloudera web site.
> >>> >> >> >>>>>
> >>> >> >> >>>>> Regards
> >>> >> >> >>>>> Bejoy KS
> >>> >> >> >>>>>
> >>> >> >> >>>>> Sent from handheld, please excuse typos.
> >>> >> >> >>>>> ________________________________
> >>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
> >>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
> >>> >> >> >>>>> To: <us...@hive.apache.org>
> >>> >> >> >>>>> ReplyTo: user@hive.apache.org
> >>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
> >>> >> >> >>>>>
> >>> >> >> >>>>> @Bejoy: I set the fs.default.name in the core-site.xml
> and I
> >>> >> >> >>>>> did
> >>> >> >> >>>>> all
> >>> >> >> >>>>> of
> >>> >> >> >>>>> thing that was mentioned in the reference but no effect
> >>> >> >> >>>>>
> >>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
> >>> >> >> >>>>> <ba...@gmail.com>
> >>> >> >> >>>>> wrote:
> >>> >> >> >>>>>>
> >>> >> >> >>>>>> Ok sorry but that was my Mistake .I thought it works but
> no.
> >>> >> >> >>>>>> I wrote the command without ; and then I think It works
> but
> >>> >> >> >>>>>> with
> >>> >> >> >>>>>> ;
> >>> >> >> >>>>>> at
> >>> >> >> >>>>>> the end of command
> >>> >> >> >>>>>>
> >>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
> >>> >> >> >>>>>>
> >>> >> >> >>>>>> does'nt work
> >>> >> >> >>>>>>
> >>> >> >> >>>>>>
> >>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
> >>> >> >> >>>>>> <dw...@gmail.com> wrote:
> >>> >> >> >>>>>>>
> >>> >> >> >>>>>>> inside configuration. all properties will be inside the
> >>> >> >> >>>>>>> configuration
> >>> >> >> >>>>>>> tags
> >>> >> >> >>>>>>>
> >>> >> >> >>>>>>>
> >>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
> >>> >> >> >>>>>>> <ba...@gmail.com>
> >>> >> >> >>>>>>> wrote:
> >>> >> >> >>>>>>>>
> >>> >> >> >>>>>>>> Thank you so much my friend your idee works fine(no
> error)
> >>> >> >> >>>>>>>> you
> >>> >> >> >>>>>>>> are
> >>> >> >> >>>>>>>> the best :)
> >>> >> >> >>>>>>>>
> >>> >> >> >>>>>>>>
> >>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
> >>> >> >> >>>>>>>> <ba...@gmail.com>
> >>> >> >> >>>>>>>> wrote:
> >>> >> >> >>>>>>>>>
> >>> >> >> >>>>>>>>> It must be inside the <configuration></configuration>
> or
> >>> >> >> >>>>>>>>> outside
> >>> >> >> >>>>>>>>> this?
> >>> >> >> >>>>>>>>>
> >>> >> >> >>>>>>>>>
> >>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat shriparv
> >>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
> >>> >> >> >>>>>>>>>>
> >>> >> >> >>>>>>>>>> It will be inside hive/conf
> >>> >> >> >>>>>>>>>>
> >>> >> >> >>>>>>>>>>
> >>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
> >>> >> >> >>>>>>>>>> <ba...@gmail.com>
> >>> >> >> >>>>>>>>>> wrote:
> >>> >> >> >>>>>>>>>>>
> >>> >> >> >>>>>>>>>>> Thanks sShashwat, and where is this hive-site.xml
> >>> >> >> >>>>>>>>>>>
> >>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat shriparv
> >>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>> set
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>> <property>
> >>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
> >>> >> >> >>>>>>>>>>>>   <value>true</value>
> >>> >> >> >>>>>>>>>>>> </property>
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>> <property>
> >>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
> >>> >> >> >>>>>>>>>>>>                <value>/home/<your
> >>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
> >>> >> >> >>>>>>>>>>>>                <description>location of default
> >>> >> >> >>>>>>>>>>>> database
> >>> >> >> >>>>>>>>>>>> for
> >>> >> >> >>>>>>>>>>>> the
> >>> >> >> >>>>>>>>>>>> warehouse</description>
> >>> >> >> >>>>>>>>>>>>        </property>
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
> >>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
> >>> >> >> >>>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>>> Hello Experts ,
> >>> >> >> >>>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a test Table in
> >>> >> >> >>>>>>>>>>>>> Hive
> >>> >> >> >>>>>>>>>>>>> I
> >>> >> >> >>>>>>>>>>>>> get
> >>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
> >>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING,
> Content
> >>> >> >> >>>>>>>>>>>>> STRING);
> >>> >> >> >>>>>>>>>>>>> but this error occured:
> >>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata:
> MetaException(message:Got
> >>> >> >> >>>>>>>>>>>>> exception:
> >>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
> >>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
> >>> >> >> >>>>>>>>>>>>> exist.)
> >>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
> >>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
> >>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
> >>> >> >> >>>>>>>>>>>>> Thank you so much
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>> --
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>> ∞
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>> Shashwat Shriparv
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>>
> >>> >> >> >>>>>>>>>>>
> >>> >> >> >>>>>>>>>>
> >>> >> >> >>>>>>>>>>
> >>> >> >> >>>>>>>>>>
> >>> >> >> >>>>>>>>>> --
> >>> >> >> >>>>>>>>>>
> >>> >> >> >>>>>>>>>>
> >>> >> >> >>>>>>>>>> ∞
> >>> >> >> >>>>>>>>>>
> >>> >> >> >>>>>>>>>> Shashwat Shriparv
> >>> >> >> >>>>>>>>>>
> >>> >> >> >>>>>>>>>>
> >>> >> >> >>>>>>>>>
> >>> >> >> >>>>>>>>
> >>> >> >> >>>>>>>
> >>> >> >> >>>>>>>
> >>> >> >> >>>>>>>
> >>> >> >> >>>>>>> --
> >>> >> >> >>>>>>>
> >>> >> >> >>>>>>>
> >>> >> >> >>>>>>> ∞
> >>> >> >> >>>>>>>
> >>> >> >> >>>>>>> Shashwat Shriparv
> >>> >> >> >>>>>>>
> >>> >> >> >>>>>>>
> >>> >> >> >>>>>>
> >>> >> >> >>>>>
> >>> >> >> >>>>
> >>> >> >> >>>
> >>> >> >> >>>
> >>> >> >> >>>
> >>> >> >> >>> --
> >>> >> >> >>>
> >>> >> >> >>>
> >>> >> >> >>> ∞
> >>> >> >> >>>
> >>> >> >> >>> Shashwat Shriparv
> >>> >> >> >>>
> >>> >> >> >>>
> >>> >> >> >>
> >>> >> >
> >>> >> >
> >>> >
> >>> >
> >>
> >>
>

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
also change the permissions of these directories to 777.
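As a minimal sketch of that change (the base path here is an example; substitute the directories named in your core-site.xml and hdfs-site.xml):

```shell
# Ensure the local directories backing HDFS exist and are writable by
# everyone. $HOME/hdfs is an assumed example path, not a required one.
HDFS_BASE="$HOME/hdfs"
mkdir -p "$HDFS_BASE/name" "$HDFS_BASE/data" "$HDFS_BASE/temp"
chmod -R 777 "$HDFS_BASE"
```

Note that 777 is the quick-and-dirty fix for a single-user test box; on anything shared you would normally give ownership to the user running the Hadoop daemons instead.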

Regards,
    Mohammad Tariq


On Wed, Jun 6, 2012 at 11:54 PM, Mohammad Tariq <do...@gmail.com> wrote:
> create a directory "/home/username/hdfs" (or at some place of your
> choice)..inside this hdfs directory create three sub directories -
> name, data, and temp, then follow these steps :
>
> add following properties in your core-site.xml -
>
> <property>
>          <name>fs.default.name</name>
>          <value>hdfs://localhost:9000/</value>
>        </property>
>
>        <property>
>          <name>hadoop.tmp.dir</name>
>          <value>/home/mohammad/hdfs/temp</value>
>        </property>
>
> then add following two properties in your hdfs-site.xml -
>
> <property>
>                <name>dfs.replication</name>
>                <value>1</value>
>        </property>
>
>        <property>
>                <name>dfs.name.dir</name>
>                <value>/home/mohammad/hdfs/name</value>
>        </property>
>
>        <property>
>                <name>dfs.data.dir</name>
>                <value>/home/mohammad/hdfs/data</value>
>        </property>
>
> finally add this property in your mapred-site.xml -
>
>       <property>
>          <name>mapred.job.tracker</name>
>          <value>hdfs://localhost:9001</value>
>        </property>
>
> NOTE: you can give any name to these directories of your choice, just
> keep in mind you have to give same names as values of
>           above specified properties in your configuration files.
> (give full path of these directories, not just the name of the
> directory)
>
> After this  follow the steps provided in the previous reply.
>
> Regards,
>     Mohammad Tariq
>
>
> On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <ba...@gmail.com> wrote:
>> thank's Mohammad
>>
>> with this command:
>>
>> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>>
>> this is my output:
>>
>> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
>> /************************************************************
>> STARTUP_MSG: Starting NameNode
>> STARTUP_MSG:   host = ubuntu/127.0.1.1
>> STARTUP_MSG:   args = [-format]
>> STARTUP_MSG:   version = 0.20.2
>> STARTUP_MSG:   build =
>> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r
>> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
>> ************************************************************/
>> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
>> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
>> 12/06/06 20:05:20 INFO namenode.FSNamesystem: supergroup=supergroup
>> 12/06/06 20:05:20 INFO namenode.FSNamesystem: isPermissionEnabled=true
>> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95 saved in 0
>> seconds.
>> 12/06/06 20:05:20 INFO common.Storage: Storage directory
>> /tmp/hadoop-babak/dfs/name has been successfully formatted.
>> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
>> ************************************************************/
>>
>> by this command:
>>
>> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>>
>> this is the out put
>>
>> mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“ nicht
>> anlegen: Keine Berechtigung
>>
>> this out put(it's in german and it means no right to make this folder)
>>
>>
>> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>>
>>> once we are done with the configuration, we need to format the file
>>> system..use this command to do that-
>>> bin/hadoop namenode -format
>>>
>>> after this, hadoop daemon processes should be started using following
>>> commands -
>>> bin/start-dfs.sh (it'll start NN & DN)
>>> bin/start-mapred.sh (it'll start JT & TT)
>>>
>>> after this use jps to check if everything is alright or point your
>>> browser to localhost:50070..if you further find any problem provide us
>>> with the error logs..:)
>>>
>>> Regards,
>>>     Mohammad Tariq
>>>
>>>
>>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <ba...@gmail.com> wrote:
>>> > were you able to format hdfs properly???
>>> > I did'nt get your question,Do you mean HADOOP_HOME? or where did I
>>> > install
>>> > Hadoop?
>>> >
>>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq <do...@gmail.com>
>>> > wrote:
>>> >>
>>> >> if you are getting only this, it means your hadoop is not
>>> >> running..were you able to format hdfs properly???
>>> >>
>>> >> Regards,
>>> >>     Mohammad Tariq
>>> >>
>>> >>
>>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan <ba...@gmail.com>
>>> >> wrote:
>>> >> > Hi Mohammad, if I run jps in my shell I can see this result:
>>> >> > 2213 Jps
>>> >> >
>>> >> >
>>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq <do...@gmail.com>
>>> >> > wrote:
>>> >> >>
>>> >> >> you can also use "jps" command at your shell to see whether Hadoop
>>> >> >> processes are running or not.
>>> >> >>
>>> >> >> Regards,
>>> >> >>     Mohammad Tariq
>>> >> >>
>>> >> >>
>>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq <do...@gmail.com>
>>> >> >> wrote:
>>> >> >> > Hi Babak,
>>> >> >> >
>>> >> >> >  You have to type it in you web browser..Hadoop provides us a web
>>> >> >> > GUI
>>> >> >> > that not only allows us to browse through the file system, but to
>>> >> >> > download the files as well..Apart from that it also provides a web
>>> >> >> > GUI
>>> >> >> > that can be used to see the status of Jobtracker and
>>> >> >> > Tasktracker..When
>>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can point your
>>> >> >> > browser to http://localhost:50030 to see the status and logs of
>>> >> >> > your
>>> >> >> > job.
>>> >> >> >
>>> >> >> > Regards,
>>> >> >> >     Mohammad Tariq
>>> >> >> >
>>> >> >> >
>>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan <ba...@gmail.com>
>>> >> >> > wrote:
>>> >> >> >> Thank you shashwat for the answer,
>>> >> >> >> where should I type http://localhost:50070?
>>> >> >> >> I typed here: hive>http://localhost:50070 but nothing as result
>>> >> >> >>
>>> >> >> >>
>>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>>> >> >> >> <dw...@gmail.com> wrote:
>>> >> >> >>>
>>> >> >> >>> first type http://localhost:50070 whether this is opening or not
>>> >> >> >>> and
>>> >> >> >>> check
>>> >> >> >>> how many nodes are available, check some of the hadoop shell
>>> >> >> >>> commands
>>> >> >> >>>
>>> >> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>>> >> >> >>> run
>>> >> >> >>> example mapreduce task on hadoop take example from here
>>> >> >> >>>
>>> >> >> >>>
>>> >> >> >>>
>>> >> >> >>> : http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>>> >> >> >>>
>>> >> >> >>> if all the above you can do sucessfully means hadoop is
>>> >> >> >>> configured
>>> >> >> >>> correctly
>>> >> >> >>>
>>> >> >> >>> Regards
>>> >> >> >>> Shashwat
>>> >> >> >>>
>>> >> >> >>>
>>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>>> >> >> >>> <ba...@gmail.com>
>>> >> >> >>> wrote:
>>> >> >> >>>>
>>> >> >> >>>> no I'm not working on CDH.Is there a way to test if my Hadoop
>>> >> >> >>>> works
>>> >> >> >>>> fine
>>> >> >> >>>> or not?
>>> >> >> >>>>
>>> >> >> >>>>
>>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS <be...@yahoo.com>
>>> >> >> >>>> wrote:
>>> >> >> >>>>>
>>> >> >> >>>>> Hi Babak
>>> >> >> >>>>>
>>> >> >> >>>>> You gotta follow those instructions in the apace site to set
>>> >> >> >>>>> up
>>> >> >> >>>>> hadoop
>>> >> >> >>>>> from scratch and ensure that hdfs is working first. You should
>>> >> >> >>>>> be
>>> >> >> >>>>> able to
>>> >> >> >>>>> read and write files to hdfs before you do your next steps.
>>> >> >> >>>>>
>>> >> >> >>>>> Are you on CDH or apache distribution of hadoop? If it is CDH
>>> >> >> >>>>> there
>>> >> >> >>>>> are
>>> >> >> >>>>> detailed instructions on Cloudera web site.
>>> >> >> >>>>>
>>> >> >> >>>>> Regards
>>> >> >> >>>>> Bejoy KS
>>> >> >> >>>>>
>>> >> >> >>>>> Sent from handheld, please excuse typos.
>>> >> >> >>>>> ________________________________
>>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>>> >> >> >>>>> To: <us...@hive.apache.org>
>>> >> >> >>>>> ReplyTo: user@hive.apache.org
>>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
>>> >> >> >>>>>
>>> >> >> >>>>> @Bejoy: I set the fs.default.name in the core-site.xml and I
>>> >> >> >>>>> did
>>> >> >> >>>>> all
>>> >> >> >>>>> of
>>> >> >> >>>>> thing that was mentioned in the reference but no effect
>>> >> >> >>>>>
>>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>>> >> >> >>>>> <ba...@gmail.com>
>>> >> >> >>>>> wrote:
>>> >> >> >>>>>>
>>> >> >> >>>>>> Ok sorry but that was my Mistake .I thought it works but no.
>>> >> >> >>>>>> I wrote the command without ; and then I think It works but
>>> >> >> >>>>>> with
>>> >> >> >>>>>> ;
>>> >> >> >>>>>> at
>>> >> >> >>>>>> the end of command
>>> >> >> >>>>>>
>>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>>> >> >> >>>>>>
>>> >> >> >>>>>> does'nt work
>>> >> >> >>>>>>
>>> >> >> >>>>>>
>>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
>>> >> >> >>>>>> <dw...@gmail.com> wrote:
>>> >> >> >>>>>>>
>>> >> >> >>>>>>> inside configuration. all properties will be inside the
>>> >> >> >>>>>>> configuration
>>> >> >> >>>>>>> tags
>>> >> >> >>>>>>>
>>> >> >> >>>>>>>
>>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>>> >> >> >>>>>>> <ba...@gmail.com>
>>> >> >> >>>>>>> wrote:
>>> >> >> >>>>>>>>
>>> >> >> >>>>>>>> Thank you so much my friend your idee works fine(no error)
>>> >> >> >>>>>>>> you
>>> >> >> >>>>>>>> are
>>> >> >> >>>>>>>> the best :)
>>> >> >> >>>>>>>>
>>> >> >> >>>>>>>>
>>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>>> >> >> >>>>>>>> <ba...@gmail.com>
>>> >> >> >>>>>>>> wrote:
>>> >> >> >>>>>>>>>
>>> >> >> >>>>>>>>> It must be inside the <configuration></configuration> or
>>> >> >> >>>>>>>>> outside
>>> >> >> >>>>>>>>> this?
>>> >> >> >>>>>>>>>
>>> >> >> >>>>>>>>>
>>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat shriparv
>>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>>> >> >> >>>>>>>>>>
>>> >> >> >>>>>>>>>> It will be inside hive/conf
>>> >> >> >>>>>>>>>>
>>> >> >> >>>>>>>>>>
>>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
>>> >> >> >>>>>>>>>> <ba...@gmail.com>
>>> >> >> >>>>>>>>>> wrote:
>>> >> >> >>>>>>>>>>>
>>> >> >> >>>>>>>>>>> Thanks sShashwat, and where is this hive-site.xml
>>> >> >> >>>>>>>>>>>
>>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat shriparv
>>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>> set
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>> <property>
>>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>>> >> >> >>>>>>>>>>>>   <value>true</value>
>>> >> >> >>>>>>>>>>>> </property>
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>> <property>
>>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>>> >> >> >>>>>>>>>>>>                <value>/home/<your
>>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>>> >> >> >>>>>>>>>>>>                <description>location of default
>>> >> >> >>>>>>>>>>>> database
>>> >> >> >>>>>>>>>>>> for
>>> >> >> >>>>>>>>>>>> the
>>> >> >> >>>>>>>>>>>> warehouse</description>
>>> >> >> >>>>>>>>>>>>        </property>
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
>>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>>> >> >> >>>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>>> Hello Experts ,
>>> >> >> >>>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a test Table in
>>> >> >> >>>>>>>>>>>>> Hive
>>> >> >> >>>>>>>>>>>>> I
>>> >> >> >>>>>>>>>>>>> get
>>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
>>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING, Content
>>> >> >> >>>>>>>>>>>>> STRING);
>>> >> >> >>>>>>>>>>>>> but this error occured:
>>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata: MetaException(message:Got
>>> >> >> >>>>>>>>>>>>> exception:
>>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>>> >> >> >>>>>>>>>>>>> exist.)
>>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
>>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>>> >> >> >>>>>>>>>>>>> Thank you so much
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>> --
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>> ∞
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>>
>>> >> >> >>>>>>>>>>>
>>> >> >> >>>>>>>>>>
>>> >> >> >>>>>>>>>>
>>> >> >> >>>>>>>>>>
>>> >> >> >>>>>>>>>> --
>>> >> >> >>>>>>>>>>
>>> >> >> >>>>>>>>>>
>>> >> >> >>>>>>>>>> ∞
>>> >> >> >>>>>>>>>>
>>> >> >> >>>>>>>>>> Shashwat Shriparv
>>> >> >> >>>>>>>>>>
>>> >> >> >>>>>>>>>>
>>> >> >> >>>>>>>>>
>>> >> >> >>>>>>>>
>>> >> >> >>>>>>>
>>> >> >> >>>>>>>
>>> >> >> >>>>>>>
>>> >> >> >>>>>>> --
>>> >> >> >>>>>>>
>>> >> >> >>>>>>>
>>> >> >> >>>>>>> ∞
>>> >> >> >>>>>>>
>>> >> >> >>>>>>> Shashwat Shriparv
>>> >> >> >>>>>>>
>>> >> >> >>>>>>>
>>> >> >> >>>>>>
>>> >> >> >>>>>
>>> >> >> >>>>
>>> >> >> >>>
>>> >> >> >>>
>>> >> >> >>>
>>> >> >> >>> --
>>> >> >> >>>
>>> >> >> >>>
>>> >> >> >>> ∞
>>> >> >> >>>
>>> >> >> >>> Shashwat Shriparv
>>> >> >> >>>
>>> >> >> >>>
>>> >> >> >>
>>> >> >
>>> >> >
>>> >
>>> >
>>
>>

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
Create a directory "/home/username/hdfs" (or at some place of your
choice). Inside this hdfs directory create three sub-directories -
name, data, and temp - then follow these steps:

add the following properties in your core-site.xml -

<property>
	  <name>fs.default.name</name>
	  <value>hdfs://localhost:9000/</value>
	</property>

	<property>
	  <name>hadoop.tmp.dir</name>
	  <value>/home/mohammad/hdfs/temp</value>
	</property>

then add the following two properties in your hdfs-site.xml -

<property>
		<name>dfs.replication</name>
		<value>1</value>
	</property>

	<property>
		<name>dfs.name.dir</name>
		<value>/home/mohammad/hdfs/name</value>
	</property>

	<property>
		<name>dfs.data.dir</name>
		<value>/home/mohammad/hdfs/data</value>
	</property>

finally add this property in your mapred-site.xml -

       <property>
	  <name>mapred.job.tracker</name>
	  <value>hdfs://localhost:9001</value>
	</property>

NOTE: you can give any name of your choice to these directories; just
keep in mind that you have to use the same names as the values of the
above properties in your configuration files (give the full path of
these directories, not just the directory name).

After this, follow the steps provided in the previous reply.
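Before starting the daemons it can also help to sanity-check that every path named in the config actually exists. A hedged sketch (the file and paths below are illustrative samples, not your real config - point the grep at your actual hdfs-site.xml):

```shell
# Confirm every directory named in a <value> element of a config file
# exists. A small sample file is written first so the snippet is
# self-contained.
mkdir -p /tmp/hdfs-demo/name /tmp/hdfs-demo/data
cat > /tmp/sample-hdfs-site.xml <<'EOF'
<configuration>
  <property><name>dfs.name.dir</name><value>/tmp/hdfs-demo/name</value></property>
  <property><name>dfs.data.dir</name><value>/tmp/hdfs-demo/data</value></property>
</configuration>
EOF
grep -o '<value>[^<]*</value>' /tmp/sample-hdfs-site.xml \
  | sed -e 's|<value>||' -e 's|</value>||' \
  | while read -r dir; do
      [ -d "$dir" ] && echo "ok: $dir" || echo "missing: $dir"
    done
```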

Regards,
    Mohammad Tariq


On Wed, Jun 6, 2012 at 11:42 PM, Babak Bastan <ba...@gmail.com> wrote:
> thanks, Mohammad
>
> with this command:
>
> babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format
>
> this is my output:
>
> 12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting NameNode
> STARTUP_MSG:   host = ubuntu/127.0.1.1
> STARTUP_MSG:   args = [-format]
> STARTUP_MSG:   version = 0.20.2
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r
> 911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
> ************************************************************/
> 12/06/06 20:05:20 INFO namenode.FSNamesystem:
> fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
> 12/06/06 20:05:20 INFO namenode.FSNamesystem: supergroup=supergroup
> 12/06/06 20:05:20 INFO namenode.FSNamesystem: isPermissionEnabled=true
> 12/06/06 20:05:20 INFO common.Storage: Image file of size 95 saved in 0
> seconds.
> 12/06/06 20:05:20 INFO common.Storage: Storage directory
> /tmp/hadoop-babak/dfs/name has been successfully formatted.
> 12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
> ************************************************************/
>
> by this command:
>
> babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh
>
> this is the output:
>
> mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“ nicht
> anlegen: Keine Berechtigung
>
> this is the output (it's in German and it means no permission to create this folder)
>
>
> On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>
>> once we are done with the configuration, we need to format the file
>> system..use this command to do that-
>> bin/hadoop namenode -format
>>
>> after this, hadoop daemon processes should be started using following
>> commands -
>> bin/start-dfs.sh (it'll start NN & DN)
>> bin/start-mapred.sh (it'll start JT & TT)
>>
>> after this use jps to check if everything is alright or point your
>> browser to localhost:50070..if you further find any problem provide us
>> with the error logs..:)
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <ba...@gmail.com> wrote:
>> > were you able to format hdfs properly???
>> > I did'nt get your question,Do you mean HADOOP_HOME? or where did I
>> > install
>> > Hadoop?
>> >
>> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq <do...@gmail.com>
>> > wrote:
>> >>
>> >> if you are getting only this, it means your hadoop is not
>> >> running..were you able to format hdfs properly???
>> >>
>> >> Regards,
>> >>     Mohammad Tariq
>> >>
>> >>
>> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan <ba...@gmail.com>
>> >> wrote:
>> >> > Hi Mohammad, if I run jps in my shell I can see this result:
>> >> > 2213 Jps
>> >> >
>> >> >
>> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq <do...@gmail.com>
>> >> > wrote:
>> >> >>
>> >> >> you can also use "jps" command at your shell to see whether Hadoop
>> >> >> processes are running or not.
>> >> >>
>> >> >> Regards,
>> >> >>     Mohammad Tariq
>> >> >>
>> >> >>
>> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq <do...@gmail.com>
>> >> >> wrote:
>> >> >> > Hi Babak,
>> >> >> >
>> >> >> >  You have to type it in you web browser..Hadoop provides us a web
>> >> >> > GUI
>> >> >> > that not only allows us to browse through the file system, but to
>> >> >> > download the files as well..Apart from that it also provides a web
>> >> >> > GUI
>> >> >> > that can be used to see the status of Jobtracker and
>> >> >> > Tasktracker..When
>> >> >> > you run a Hive or Pig job or a Mapreduce job, you can point your
>> >> >> > browser to http://localhost:50030 to see the status and logs of
>> >> >> > your
>> >> >> > job.
>> >> >> >
>> >> >> > Regards,
>> >> >> >     Mohammad Tariq
>> >> >> >
>> >> >> >
>> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan <ba...@gmail.com>
>> >> >> > wrote:
>> >> >> >> Thank you shashwat for the answer,
>> >> >> >> where should I type http://localhost:50070?
>> >> >> >> I typed here: hive>http://localhost:50070 but nothing as result
>> >> >> >>
>> >> >> >>
>> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>> >> >> >> <dw...@gmail.com> wrote:
>> >> >> >>>
>> >> >> >>> first type http://localhost:50070 whether this is opening or not
>> >> >> >>> and
>> >> >> >>> check
>> >> >> >>> how many nodes are available, check some of the hadoop shell
>> >> >> >>> commands
>> >> >> >>>
>> >> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>> >> >> >>> run
>> >> >> >>> example mapreduce task on hadoop take example from here
>> >> >> >>>
>> >> >> >>>
>> >> >> >>>
>> >> >> >>> : http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>> >> >> >>>
>> >> >> >>> if all the above you can do sucessfully means hadoop is
>> >> >> >>> configured
>> >> >> >>> correctly
>> >> >> >>>
>> >> >> >>> Regards
>> >> >> >>> Shashwat
>> >> >> >>>
>> >> >> >>>
>> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan
>> >> >> >>> <ba...@gmail.com>
>> >> >> >>> wrote:
>> >> >> >>>>
>> >> >> >>>> no I'm not working on CDH.Is there a way to test if my Hadoop
>> >> >> >>>> works
>> >> >> >>>> fine
>> >> >> >>>> or not?
>> >> >> >>>>
>> >> >> >>>>
>> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS <be...@yahoo.com>
>> >> >> >>>> wrote:
>> >> >> >>>>>
>> >> >> >>>>> Hi Babak
>> >> >> >>>>>
>> >> >> >>>>> You gotta follow those instructions in the apace site to set
>> >> >> >>>>> up
>> >> >> >>>>> hadoop
>> >> >> >>>>> from scratch and ensure that hdfs is working first. You should
>> >> >> >>>>> be
>> >> >> >>>>> able to
>> >> >> >>>>> read and write files to hdfs before you do your next steps.
>> >> >> >>>>>
>> >> >> >>>>> Are you on CDH or apache distribution of hadoop? If it is CDH
>> >> >> >>>>> there
>> >> >> >>>>> are
>> >> >> >>>>> detailed instructions on Cloudera web site.
>> >> >> >>>>>
>> >> >> >>>>> Regards
>> >> >> >>>>> Bejoy KS
>> >> >> >>>>>
>> >> >> >>>>> Sent from handheld, please excuse typos.
>> >> >> >>>>> ________________________________
>> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>> >> >> >>>>> To: <us...@hive.apache.org>
>> >> >> >>>>> ReplyTo: user@hive.apache.org
>> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
>> >> >> >>>>>
>> >> >> >>>>> @Bejoy: I set the fs.default.name in the core-site.xml and I
>> >> >> >>>>> did
>> >> >> >>>>> all
>> >> >> >>>>> of
>> >> >> >>>>> thing that was mentioned in the reference but no effect
>> >> >> >>>>>
>> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan
>> >> >> >>>>> <ba...@gmail.com>
>> >> >> >>>>> wrote:
>> >> >> >>>>>>
>> >> >> >>>>>> Ok sorry but that was my Mistake .I thought it works but no.
>> >> >> >>>>>> I wrote the command without ; and then I think It works but
>> >> >> >>>>>> with
>> >> >> >>>>>> ;
>> >> >> >>>>>> at
>> >> >> >>>>>> the end of command
>> >> >> >>>>>>
>> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>> >> >> >>>>>>
>> >> >> >>>>>> does'nt work
>> >> >> >>>>>>
>> >> >> >>>>>>
>> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
>> >> >> >>>>>> <dw...@gmail.com> wrote:
>> >> >> >>>>>>>
>> >> >> >>>>>>> inside configuration. all properties will be inside the
>> >> >> >>>>>>> configuration
>> >> >> >>>>>>> tags
>> >> >> >>>>>>>
>> >> >> >>>>>>>
>> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>> >> >> >>>>>>> <ba...@gmail.com>
>> >> >> >>>>>>> wrote:
>> >> >> >>>>>>>>
>> >> >> >>>>>>>> Thank you so much my friend your idee works fine(no error)
>> >> >> >>>>>>>> you
>> >> >> >>>>>>>> are
>> >> >> >>>>>>>> the best :)
>> >> >> >>>>>>>>
>> >> >> >>>>>>>>
>> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>> >> >> >>>>>>>> <ba...@gmail.com>
>> >> >> >>>>>>>> wrote:
>> >> >> >>>>>>>>>
>> >> >> >>>>>>>>> It must be inside the <configuration></configuration> or
>> >> >> >>>>>>>>> outside
>> >> >> >>>>>>>>> this?
>> >> >> >>>>>>>>>
>> >> >> >>>>>>>>>
>> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat shriparv
>> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
>> >> >> >>>>>>>>>>
>> >> >> >>>>>>>>>> It will be inside hive/conf
>> >> >> >>>>>>>>>>
>> >> >> >>>>>>>>>>
>> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
>> >> >> >>>>>>>>>> <ba...@gmail.com>
>> >> >> >>>>>>>>>> wrote:
>> >> >> >>>>>>>>>>>
>> >> >> >>>>>>>>>>> Thanks sShashwat, and where is this hive-site.xml
>> >> >> >>>>>>>>>>>
>> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat shriparv
>> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>> set
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>> <property>
>> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>> >> >> >>>>>>>>>>>>   <value>true</value>
>> >> >> >>>>>>>>>>>> </property>
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>> >> >> >>>>>>>>>>>>                <value>/home/<your
>> >> >> >>>>>>>>>>>> username>/hivefolder</value>
>> >> >> >>>>>>>>>>>>                <description>location of default
>> >> >> >>>>>>>>>>>> database
>> >> >> >>>>>>>>>>>> for
>> >> >> >>>>>>>>>>>> the
>> >> >> >>>>>>>>>>>> warehouse</description>
>> >> >> >>>>>>>>>>>>        </property>
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
>> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>> >> >> >>>>>>>>>>>>>
>> >> >> >>>>>>>>>>>>> Hello Experts ,
>> >> >> >>>>>>>>>>>>>
>> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a test Table in
>> >> >> >>>>>>>>>>>>> Hive
>> >> >> >>>>>>>>>>>>> I
>> >> >> >>>>>>>>>>>>> get
>> >> >> >>>>>>>>>>>>> an error.I want to run this command:
>> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING, Content
>> >> >> >>>>>>>>>>>>> STRING);
>> >> >> >>>>>>>>>>>>> but this error occured:
>> >> >> >>>>>>>>>>>>> FAILED: Error in metadata: MetaException(message:Got
>> >> >> >>>>>>>>>>>>> exception:
>> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>> >> >> >>>>>>>>>>>>> exist.)
>> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
>> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>> >> >> >>>>>>>>>>>>> How can I solve this Problem?
>> >> >> >>>>>>>>>>>>> Thank you so much
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>> --
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>> ∞
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>> Shashwat Shriparv
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>>
>> >> >> >>>>>>>>>>>
>> >> >> >>>>>>>>>>
>> >> >> >>>>>>>>>>
>> >> >> >>>>>>>>>>
>> >> >> >>>>>>>>>> --
>> >> >> >>>>>>>>>>
>> >> >> >>>>>>>>>>
>> >> >> >>>>>>>>>> ∞
>> >> >> >>>>>>>>>>
>> >> >> >>>>>>>>>> Shashwat Shriparv
>> >> >> >>>>>>>>>>
>> >> >> >>>>>>>>>>
>> >> >> >>>>>>>>>
>> >> >> >>>>>>>>
>> >> >> >>>>>>>
>> >> >> >>>>>>>
>> >> >> >>>>>>>
>> >> >> >>>>>>> --
>> >> >> >>>>>>>
>> >> >> >>>>>>>
>> >> >> >>>>>>> ∞
>> >> >> >>>>>>>
>> >> >> >>>>>>> Shashwat Shriparv
>> >> >> >>>>>>>
>> >> >> >>>>>>>
>> >> >> >>>>>>
>> >> >> >>>>>
>> >> >> >>>>
>> >> >> >>>
>> >> >> >>>
>> >> >> >>>
>> >> >> >>> --
>> >> >> >>>
>> >> >> >>>
>> >> >> >>> ∞
>> >> >> >>>
>> >> >> >>> Shashwat Shriparv
>> >> >> >>>
>> >> >> >>>
>> >> >> >>
>> >> >
>> >> >
>> >
>> >
>
>

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
Thanks, Mohammad.

with this command:

babak@ubuntu:~/Downloads/hadoop/bin$ hadoop namenode -format

this is my output:

12/06/06 20:05:20 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = ubuntu/127.0.1.1
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 0.20.2
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20 -r
911707; compiled by 'chrisdo' on Fri Feb 19 08:07:34 UTC 2010
************************************************************/
12/06/06 20:05:20 INFO namenode.FSNamesystem:
fsOwner=babak,babak,adm,dialout,cdrom,plugdev,lpadmin,admin,sambashare
12/06/06 20:05:20 INFO namenode.FSNamesystem: supergroup=supergroup
12/06/06 20:05:20 INFO namenode.FSNamesystem: isPermissionEnabled=true
12/06/06 20:05:20 INFO common.Storage: Image file of size 95 saved in 0 seconds.
12/06/06 20:05:20 INFO common.Storage: Storage directory
/tmp/hadoop-babak/dfs/name has been successfully formatted.
12/06/06 20:05:20 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at ubuntu/127.0.1.1
************************************************************/

by this command:

babak@ubuntu:~/Downloads/hadoop/bin$ start-dfs.sh

this is the output:

mkdir: kann Verzeichnis „/home/babak/Downloads/hadoop/bin/../logs“
nicht anlegen: Keine Berechtigung

This output is in German; it means: cannot create directory
"/home/babak/Downloads/hadoop/bin/../logs": permission denied.
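That message means the user running start-dfs.sh cannot write under the Hadoop install directory, so the script fails before any daemon starts. A hedged sketch of the likely fix is below; the chown path is taken from the error output above and may need adjusting, and the runnable part uses a temporary directory standing in for the real Hadoop home:

```shell
# Likely fix on the real machine (path assumed from the error message):
#   sudo chown -R "$USER" /home/babak/Downloads/hadoop
# start-dfs.sh then has permission to create its logs directory.
#
# Demonstrated here with a temp dir so the steps are runnable anywhere:
HADOOP_HOME_DEMO=$(mktemp -d)          # stand-in for the Hadoop install dir
mkdir -p "$HADOOP_HOME_DEMO/logs"      # this is what start-dfs.sh attempts
if [ -w "$HADOOP_HOME_DEMO/logs" ]; then
  echo "logs dir writable"
fi
rm -rf "$HADOOP_HOME_DEMO"
```

Once the install directory is owned by the user that runs the start scripts, the "Keine Berechtigung" error should go away.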

On Wed, Jun 6, 2012 at 7:59 PM, Mohammad Tariq <do...@gmail.com> wrote:

> once we are done with the configuration, we need to format the file
> system..use this command to do that-
> bin/hadoop namenode -format
>
> after this, hadoop daemon processes should be started using following
> commands -
> bin/start-dfs.sh (it'll start NN & DN)
> bin/start-mapred.sh (it'll start JT & TT)
>
> after this use jps to check if everything is alright or point your
> browser to localhost:50070..if you further find any problem provide us
> with the error logs..:)
>
> Regards,
>     Mohammad Tariq
>
>
> On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <ba...@gmail.com> wrote:
> > were you able to format hdfs properly???
> > I did'nt get your question,Do you mean HADOOP_HOME? or where did I
> install
> > Hadoop?
> >
> > On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
> >>
> >> if you are getting only this, it means your hadoop is not
> >> running..were you able to format hdfs properly???
> >>
> >> Regards,
> >>     Mohammad Tariq
> >>
> >>
> >> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan <ba...@gmail.com>
> wrote:
> >> > Hi MohammadmI irun jps in my shel I can see this result:
> >> > 2213 Jps
> >> >
> >> >
> >> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq <do...@gmail.com>
> >> > wrote:
> >> >>
> >> >> you can also use "jps" command at your shell to see whether Hadoop
> >> >> processes are running or not.
> >> >>
> >> >> Regards,
> >> >>     Mohammad Tariq
> >> >>
> >> >>
> >> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq <do...@gmail.com>
> >> >> wrote:
> >> >> > Hi Babak,
> >> >> >
> >> >> >  You have to type it in you web browser..Hadoop provides us a web
> GUI
> >> >> > that not only allows us to browse through the file system, but to
> >> >> > download the files as well..Apart from that it also provides a web
> >> >> > GUI
> >> >> > that can be used to see the status of Jobtracker and
> >> >> > Tasktracker..When
> >> >> > you run a Hive or Pig job or a Mapreduce job, you can point your
> >> >> > browser to http://localhost:50030 to see the status and logs of
> your
> >> >> > job.
> >> >> >
> >> >> > Regards,
> >> >> >     Mohammad Tariq
> >> >> >
> >> >> >
> >> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan <ba...@gmail.com>
> >> >> > wrote:
> >> >> >> Thank you shashwat for the answer,
> >> >> >> where should I type http://localhost:50070?
> >> >> >> I typed here: hive>http://localhost:50070 but nothing as result
> >> >> >>
> >> >> >>
> >> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
> >> >> >> <dw...@gmail.com> wrote:
> >> >> >>>
> >> >> >>> first type http://localhost:50070 whether this is opening or not
> >> >> >>> and
> >> >> >>> check
> >> >> >>> how many nodes are available, check some of the hadoop shell
> >> >> >>> commands
> >> >> >>> from
> http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
> >> >> >>> run
> >> >> >>> example mapreduce task on hadoop take example from here
> >> >> >>>
> >> >> >>>
> >> >> >>> :
> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
> >> >> >>>
> >> >> >>> if all the above you can do sucessfully means hadoop is
> configured
> >> >> >>> correctly
> >> >> >>>
> >> >> >>> Regards
> >> >> >>> Shashwat
> >> >> >>>
> >> >> >>>
> >> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan <babakbsn@gmail.com
> >
> >> >> >>> wrote:
> >> >> >>>>
> >> >> >>>> no I'm not working on CDH.Is there a way to test if my Hadoop
> >> >> >>>> works
> >> >> >>>> fine
> >> >> >>>> or not?
> >> >> >>>>
> >> >> >>>>
> >> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS <be...@yahoo.com>
> >> >> >>>> wrote:
> >> >> >>>>>
> >> >> >>>>> Hi Babak
> >> >> >>>>>
> >> >> >>>>> You gotta follow those instructions in the apace site to set up
> >> >> >>>>> hadoop
> >> >> >>>>> from scratch and ensure that hdfs is working first. You should
> be
> >> >> >>>>> able to
> >> >> >>>>> read and write files to hdfs before you do your next steps.
> >> >> >>>>>
> >> >> >>>>> Are you on CDH or apache distribution of hadoop? If it is CDH
> >> >> >>>>> there
> >> >> >>>>> are
> >> >> >>>>> detailed instructions on Cloudera web site.
> >> >> >>>>>
> >> >> >>>>> Regards
> >> >> >>>>> Bejoy KS
> >> >> >>>>>
> >> >> >>>>> Sent from handheld, please excuse typos.
> >> >> >>>>> ________________________________
> >> >> >>>>> From: Babak Bastan <ba...@gmail.com>
> >> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
> >> >> >>>>> To: <us...@hive.apache.org>
> >> >> >>>>> ReplyTo: user@hive.apache.org
> >> >> >>>>> Subject: Re: Error while Creating Table in Hive
> >> >> >>>>>
> >> >> >>>>> @Bejoy: I set the fs.default.name in the core-site.xml and I
> did
> >> >> >>>>> all
> >> >> >>>>> of
> >> >> >>>>> thing that was mentioned in the reference but no effect
> >> >> >>>>>
> >> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan <
> babakbsn@gmail.com>
> >> >> >>>>> wrote:
> >> >> >>>>>>
> >> >> >>>>>> Ok sorry but that was my Mistake .I thought it works but no.
> >> >> >>>>>> I wrote the command without ; and then I think It works but
> with
> >> >> >>>>>> ;
> >> >> >>>>>> at
> >> >> >>>>>> the end of command
> >> >> >>>>>>
> >> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
> >> >> >>>>>>
> >> >> >>>>>> does'nt work
> >> >> >>>>>>
> >> >> >>>>>>
> >> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
> >> >> >>>>>> <dw...@gmail.com> wrote:
> >> >> >>>>>>>
> >> >> >>>>>>> inside configuration. all properties will be inside the
> >> >> >>>>>>> configuration
> >> >> >>>>>>> tags
> >> >> >>>>>>>
> >> >> >>>>>>>
> >> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
> >> >> >>>>>>> <ba...@gmail.com>
> >> >> >>>>>>> wrote:
> >> >> >>>>>>>>
> >> >> >>>>>>>> Thank you so much my friend your idee works fine(no error)
> you
> >> >> >>>>>>>> are
> >> >> >>>>>>>> the best :)
> >> >> >>>>>>>>
> >> >> >>>>>>>>
> >> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
> >> >> >>>>>>>> <ba...@gmail.com>
> >> >> >>>>>>>> wrote:
> >> >> >>>>>>>>>
> >> >> >>>>>>>>> It must be inside the <configuration></configuration> or
> >> >> >>>>>>>>> outside
> >> >> >>>>>>>>> this?
> >> >> >>>>>>>>>
> >> >> >>>>>>>>>
> >> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat shriparv
> >> >> >>>>>>>>> <dw...@gmail.com> wrote:
> >> >> >>>>>>>>>>
> >> >> >>>>>>>>>> It will be inside hive/conf
> >> >> >>>>>>>>>>
> >> >> >>>>>>>>>>
> >> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
> >> >> >>>>>>>>>> <ba...@gmail.com>
> >> >> >>>>>>>>>> wrote:
> >> >> >>>>>>>>>>>
> >> >> >>>>>>>>>>> Thanks sShashwat, and where is this hive-site.xml
> >> >> >>>>>>>>>>>
> >> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat shriparv
> >> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>> set
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>> <property>
> >> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
> >> >> >>>>>>>>>>>>   <value>true</value>
> >> >> >>>>>>>>>>>> </property>
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
> >> >> >>>>>>>>>>>>                <value>/home/<your
> >> >> >>>>>>>>>>>> username>/hivefolder</value>
> >> >> >>>>>>>>>>>>                <description>location of default database
> >> >> >>>>>>>>>>>> for
> >> >> >>>>>>>>>>>> the
> >> >> >>>>>>>>>>>> warehouse</description>
> >> >> >>>>>>>>>>>>        </property>
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
> >> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
> >> >> >>>>>>>>>>>>>
> >> >> >>>>>>>>>>>>> Hello Experts ,
> >> >> >>>>>>>>>>>>>
> >> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a test Table in
> Hive
> >> >> >>>>>>>>>>>>> I
> >> >> >>>>>>>>>>>>> get
> >> >> >>>>>>>>>>>>> an error.I want to run this command:
> >> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING, Content
> >> >> >>>>>>>>>>>>> STRING);
> >> >> >>>>>>>>>>>>> but this error occured:
> >> >> >>>>>>>>>>>>> FAILED: Error in metadata: MetaException(message:Got
> >> >> >>>>>>>>>>>>> exception:
> >> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
> >> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
> >> >> >>>>>>>>>>>>> exist.)
> >> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
> >> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
> >> >> >>>>>>>>>>>>> How can I solve this Problem?
> >> >> >>>>>>>>>>>>> Thank you so much
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>> --
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>> ∞
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>> Shashwat Shriparv
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>>
> >> >> >>>>>>>>>>>
> >> >> >>>>>>>>>>
> >> >> >>>>>>>>>>
> >> >> >>>>>>>>>>
> >> >> >>>>>>>>>> --
> >> >> >>>>>>>>>>
> >> >> >>>>>>>>>>
> >> >> >>>>>>>>>> ∞
> >> >> >>>>>>>>>>
> >> >> >>>>>>>>>> Shashwat Shriparv
> >> >> >>>>>>>>>>
> >> >> >>>>>>>>>>
> >> >> >>>>>>>>>
> >> >> >>>>>>>>
> >> >> >>>>>>>
> >> >> >>>>>>>
> >> >> >>>>>>>
> >> >> >>>>>>> --
> >> >> >>>>>>>
> >> >> >>>>>>>
> >> >> >>>>>>> ∞
> >> >> >>>>>>>
> >> >> >>>>>>> Shashwat Shriparv
> >> >> >>>>>>>
> >> >> >>>>>>>
> >> >> >>>>>>
> >> >> >>>>>
> >> >> >>>>
> >> >> >>>
> >> >> >>>
> >> >> >>>
> >> >> >>> --
> >> >> >>>
> >> >> >>>
> >> >> >>> ∞
> >> >> >>>
> >> >> >>> Shashwat Shriparv
> >> >> >>>
> >> >> >>>
> >> >> >>
> >> >
> >> >
> >
> >
>

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
Once we are done with the configuration, we need to format the file
system. Use this command to do that:
bin/hadoop namenode -format

After this, the Hadoop daemon processes should be started using the
following commands:
bin/start-dfs.sh (it'll start NN & DN)
bin/start-mapred.sh (it'll start JT & TT)

After this, use jps to check if everything is alright, or point your
browser to localhost:50070. If you run into further problems, provide
us with the error logs..:)
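The jps check above can be sketched as a small script. The sample text below is hypothetical `jps` output for a pseudo-distributed setup (real PIDs will differ); on a live node you would pipe actual `jps` output through the same loop:

```shell
# Verify that the four Hadoop daemons of a pseudo-distributed 0.20.x
# setup show up in jps output. sample_jps stands in for `jps` output.
sample_jps="2213 Jps
2301 NameNode
2450 DataNode
2600 JobTracker
2750 TaskTracker"

missing=""
for daemon in NameNode DataNode JobTracker TaskTracker; do
  echo "$sample_jps" | grep -q "$daemon" || missing="$missing $daemon"
done

if [ -z "$missing" ]; then
  echo "all daemons running"
else
  echo "missing:$missing"
fi
```

If jps prints only its own PID (as in "2213 Jps"), none of the daemons started and the startup logs are the place to look.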

Regards,
    Mohammad Tariq


On Wed, Jun 6, 2012 at 11:22 PM, Babak Bastan <ba...@gmail.com> wrote:
> were you able to format hdfs properly???
> I did'nt get your question,Do you mean HADOOP_HOME? or where did I install
> Hadoop?
>
> On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq <do...@gmail.com> wrote:
>>
>> if you are getting only this, it means your hadoop is not
>> running..were you able to format hdfs properly???
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan <ba...@gmail.com> wrote:
>> > Hi MohammadmI irun jps in my shel I can see this result:
>> > 2213 Jps
>> >
>> >
>> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq <do...@gmail.com>
>> > wrote:
>> >>
>> >> you can also use "jps" command at your shell to see whether Hadoop
>> >> processes are running or not.
>> >>
>> >> Regards,
>> >>     Mohammad Tariq
>> >>
>> >>
>> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq <do...@gmail.com>
>> >> wrote:
>> >> > Hi Babak,
>> >> >
>> >> >  You have to type it in you web browser..Hadoop provides us a web GUI
>> >> > that not only allows us to browse through the file system, but to
>> >> > download the files as well..Apart from that it also provides a web
>> >> > GUI
>> >> > that can be used to see the status of Jobtracker and
>> >> > Tasktracker..When
>> >> > you run a Hive or Pig job or a Mapreduce job, you can point your
>> >> > browser to http://localhost:50030 to see the status and logs of your
>> >> > job.
>> >> >
>> >> > Regards,
>> >> >     Mohammad Tariq
>> >> >
>> >> >
>> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan <ba...@gmail.com>
>> >> > wrote:
>> >> >> Thank you shashwat for the answer,
>> >> >> where should I type http://localhost:50070?
>> >> >> I typed here: hive>http://localhost:50070 but nothing as result
>> >> >>
>> >> >>
>> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
>> >> >> <dw...@gmail.com> wrote:
>> >> >>>
>> >> >>> first type http://localhost:50070 whether this is opening or not
>> >> >>> and
>> >> >>> check
>> >> >>> how many nodes are available, check some of the hadoop shell
>> >> >>> commands
>> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
>> >> >>> run
>> >> >>> example mapreduce task on hadoop take example from here
>> >> >>>
>> >> >>>
>> >> >>> : http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>> >> >>>
>> >> >>> if all the above you can do sucessfully means hadoop is configured
>> >> >>> correctly
>> >> >>>
>> >> >>> Regards
>> >> >>> Shashwat
>> >> >>>
>> >> >>>
>> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan <ba...@gmail.com>
>> >> >>> wrote:
>> >> >>>>
>> >> >>>> no I'm not working on CDH.Is there a way to test if my Hadoop
>> >> >>>> works
>> >> >>>> fine
>> >> >>>> or not?
>> >> >>>>
>> >> >>>>
>> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS <be...@yahoo.com>
>> >> >>>> wrote:
>> >> >>>>>
>> >> >>>>> Hi Babak
>> >> >>>>>
>> >> >>>>> You gotta follow those instructions in the apace site to set up
>> >> >>>>> hadoop
>> >> >>>>> from scratch and ensure that hdfs is working first. You should be
>> >> >>>>> able to
>> >> >>>>> read and write files to hdfs before you do your next steps.
>> >> >>>>>
>> >> >>>>> Are you on CDH or apache distribution of hadoop? If it is CDH
>> >> >>>>> there
>> >> >>>>> are
>> >> >>>>> detailed instructions on Cloudera web site.
>> >> >>>>>
>> >> >>>>> Regards
>> >> >>>>> Bejoy KS
>> >> >>>>>
>> >> >>>>> Sent from handheld, please excuse typos.
>> >> >>>>> ________________________________
>> >> >>>>> From: Babak Bastan <ba...@gmail.com>
>> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>> >> >>>>> To: <us...@hive.apache.org>
>> >> >>>>> ReplyTo: user@hive.apache.org
>> >> >>>>> Subject: Re: Error while Creating Table in Hive
>> >> >>>>>
>> >> >>>>> @Bejoy: I set the fs.default.name in the core-site.xml and I did
>> >> >>>>> all
>> >> >>>>> of
>> >> >>>>> thing that was mentioned in the reference but no effect
>> >> >>>>>
>> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan <ba...@gmail.com>
>> >> >>>>> wrote:
>> >> >>>>>>
>> >> >>>>>> Ok sorry but that was my Mistake .I thought it works but no.
>> >> >>>>>> I wrote the command without ; and then I think It works but with
>> >> >>>>>> ;
>> >> >>>>>> at
>> >> >>>>>> the end of command
>> >> >>>>>>
>> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>> >> >>>>>>
>> >> >>>>>> does'nt work
>> >> >>>>>>
>> >> >>>>>>
>> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
>> >> >>>>>> <dw...@gmail.com> wrote:
>> >> >>>>>>>
>> >> >>>>>>> inside configuration. all properties will be inside the
>> >> >>>>>>> configuration
>> >> >>>>>>> tags
>> >> >>>>>>>
>> >> >>>>>>>
>> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan
>> >> >>>>>>> <ba...@gmail.com>
>> >> >>>>>>> wrote:
>> >> >>>>>>>>
>> >> >>>>>>>> Thank you so much my friend your idee works fine(no error) you
>> >> >>>>>>>> are
>> >> >>>>>>>> the best :)
>> >> >>>>>>>>
>> >> >>>>>>>>
>> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan
>> >> >>>>>>>> <ba...@gmail.com>
>> >> >>>>>>>> wrote:
>> >> >>>>>>>>>
>> >> >>>>>>>>> It must be inside the <configuration></configuration> or
>> >> >>>>>>>>> outside
>> >> >>>>>>>>> this?
>> >> >>>>>>>>>
>> >> >>>>>>>>>
>> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat shriparv
>> >> >>>>>>>>> <dw...@gmail.com> wrote:
>> >> >>>>>>>>>>
>> >> >>>>>>>>>> It will be inside hive/conf
>> >> >>>>>>>>>>
>> >> >>>>>>>>>>
>> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
>> >> >>>>>>>>>> <ba...@gmail.com>
>> >> >>>>>>>>>> wrote:
>> >> >>>>>>>>>>>
>> >> >>>>>>>>>>> Thanks sShashwat, and where is this hive-site.xml
>> >> >>>>>>>>>>>
>> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat shriparv
>> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>> set
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>> <property>
>> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
>> >> >>>>>>>>>>>>   <value>true</value>
>> >> >>>>>>>>>>>> </property>
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>> >> >>>>>>>>>>>>                <value>/home/<your
>> >> >>>>>>>>>>>> username>/hivefolder</value>
>> >> >>>>>>>>>>>>                <description>location of default database
>> >> >>>>>>>>>>>> for
>> >> >>>>>>>>>>>> the
>> >> >>>>>>>>>>>> warehouse</description>
>> >> >>>>>>>>>>>>        </property>
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
>> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
>> >> >>>>>>>>>>>>>
>> >> >>>>>>>>>>>>> Hello Experts ,
>> >> >>>>>>>>>>>>>
>> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a test Table in Hive
>> >> >>>>>>>>>>>>> I
>> >> >>>>>>>>>>>>> get
>> >> >>>>>>>>>>>>> an error.I want to run this command:
>> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING, Content
>> >> >>>>>>>>>>>>> STRING);
>> >> >>>>>>>>>>>>> but this error occured:
>> >> >>>>>>>>>>>>> FAILED: Error in metadata: MetaException(message:Got
>> >> >>>>>>>>>>>>> exception:
>> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
>> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
>> >> >>>>>>>>>>>>> exist.)
>> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
>> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>> >> >>>>>>>>>>>>> How can I solve this Problem?
>> >> >>>>>>>>>>>>> Thank you so much
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>> --
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>> ∞
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>> Shashwat Shriparv
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>>
>> >> >>>>>>>>>>>
>> >> >>>>>>>>>>
>> >> >>>>>>>>>>
>> >> >>>>>>>>>>
>> >> >>>>>>>>>> --
>> >> >>>>>>>>>>
>> >> >>>>>>>>>>
>> >> >>>>>>>>>> ∞
>> >> >>>>>>>>>>
>> >> >>>>>>>>>> Shashwat Shriparv
>> >> >>>>>>>>>>
>> >> >>>>>>>>>>
>> >> >>>>>>>>>
>> >> >>>>>>>>
>> >> >>>>>>>
>> >> >>>>>>>
>> >> >>>>>>>
>> >> >>>>>>> --
>> >> >>>>>>>
>> >> >>>>>>>
>> >> >>>>>>> ∞
>> >> >>>>>>>
>> >> >>>>>>> Shashwat Shriparv
>> >> >>>>>>>
>> >> >>>>>>>
>> >> >>>>>>
>> >> >>>>>
>> >> >>>>
>> >> >>>
>> >> >>>
>> >> >>>
>> >> >>> --
>> >> >>>
>> >> >>>
>> >> >>> ∞
>> >> >>>
>> >> >>> Shashwat Shriparv
>> >> >>>
>> >> >>>
>> >> >>
>> >
>> >
>
>

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
*were you able to format hdfs properly???*
I didn't get your question. Do you mean HADOOP_HOME, or where I installed
Hadoop?

On Wed, Jun 6, 2012 at 7:49 PM, Mohammad Tariq <do...@gmail.com> wrote:

> if you are getting only this, it means your hadoop is not
> running..were you able to format hdfs properly???
>
> Regards,
>     Mohammad Tariq
>
>
> On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan <ba...@gmail.com> wrote:
> > Hi MohammadmI irun jps in my shel I can see this result:
> > 2213 Jps
> >
> >
> > On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq <do...@gmail.com>
> wrote:
> >>
> >> you can also use "jps" command at your shell to see whether Hadoop
> >> processes are running or not.
> >>
> >> Regards,
> >>     Mohammad Tariq
> >>
> >>
> >> On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq <do...@gmail.com>
> >> wrote:
> >> > Hi Babak,
> >> >
> >> >  You have to type it in you web browser..Hadoop provides us a web GUI
> >> > that not only allows us to browse through the file system, but to
> >> > download the files as well..Apart from that it also provides a web GUI
> >> > that can be used to see the status of Jobtracker and Tasktracker..When
> >> > you run a Hive or Pig job or a Mapreduce job, you can point your
> >> > browser to http://localhost:50030 to see the status and logs of your
> >> > job.
> >> >
> >> > Regards,
> >> >     Mohammad Tariq
> >> >
> >> >
> >> > On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan <ba...@gmail.com>
> wrote:
> >> >> Thank you shashwat for the answer,
> >> >> where should I type http://localhost:50070?
> >> >> I typed here: hive>http://localhost:50070 but nothing as result
> >> >>
> >> >>
> >> >> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
> >> >> <dw...@gmail.com> wrote:
> >> >>>
> >> >>> first type http://localhost:50070 whether this is opening or not
> and
> >> >>> check
> >> >>> how many nodes are available, check some of the hadoop shell
> commands
> >> >>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html
> run
> >> >>> example mapreduce task on hadoop take example from here
> >> >>>
> >> >>> :
> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
> >> >>>
> >> >>> if all the above you can do sucessfully means hadoop is configured
> >> >>> correctly
> >> >>>
> >> >>> Regards
> >> >>> Shashwat
> >> >>>
> >> >>>
> >> >>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan <ba...@gmail.com>
> >> >>> wrote:
> >> >>>>
> >> >>>> no I'm not working on CDH.Is there a way to test if my Hadoop works
> >> >>>> fine
> >> >>>> or not?
> >> >>>>
> >> >>>>
> >> >>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS <be...@yahoo.com>
> wrote:
> >> >>>>>
> >> >>>>> Hi Babak
> >> >>>>>
> >> >>>>> You gotta follow those instructions in the apace site to set up
> >> >>>>> hadoop
> >> >>>>> from scratch and ensure that hdfs is working first. You should be
> >> >>>>> able to
> >> >>>>> read and write files to hdfs before you do your next steps.
> >> >>>>>
> >> >>>>> Are you on CDH or apache distribution of hadoop? If it is CDH
> there
> >> >>>>> are
> >> >>>>> detailed instructions on Cloudera web site.
> >> >>>>>
> >> >>>>> Regards
> >> >>>>> Bejoy KS
> >> >>>>>
> >> >>>>> Sent from handheld, please excuse typos.
> >> >>>>> ________________________________
> >> >>>>> From: Babak Bastan <ba...@gmail.com>
> >> >>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
> >> >>>>> To: <us...@hive.apache.org>
> >> >>>>> ReplyTo: user@hive.apache.org
> >> >>>>> Subject: Re: Error while Creating Table in Hive
> >> >>>>>
> >> >>>>> @Bejoy: I set the fs.default.name in the core-site.xml and I did
> all
> >> >>>>> of
> >> >>>>> thing that was mentioned in the reference but no effect
> >> >>>>>
> >> >>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan <ba...@gmail.com>
> >> >>>>> wrote:
> >> >>>>>>
> >> >>>>>> Ok sorry but that was my Mistake .I thought it works but no.
> >> >>>>>> I wrote the command without ; and then I think It works but with
> ;
> >> >>>>>> at
> >> >>>>>> the end of command
> >> >>>>>>
> >> >>>>>> CREATE TABLE pokes (foo INT, bar STRING);
> >> >>>>>>
> >> >>>>>> does'nt work
> >> >>>>>>
> >> >>>>>>
> >> >>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
> >> >>>>>> <dw...@gmail.com> wrote:
> >> >>>>>>>
> >> >>>>>>> inside configuration. all properties will be inside the
> >> >>>>>>> configuration
> >> >>>>>>> tags
> >> >>>>>>>
> >> >>>>>>>
> >> >>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan <
> babakbsn@gmail.com>
> >> >>>>>>> wrote:
> >> >>>>>>>>
> >> >>>>>>>> Thank you so much my friend your idee works fine(no error) you
> >> >>>>>>>> are
> >> >>>>>>>> the best :)
> >> >>>>>>>>
> >> >>>>>>>>
> >> >>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan <
> babakbsn@gmail.com>
> >> >>>>>>>> wrote:
> >> >>>>>>>>>
> >> >>>>>>>>> It must be inside the <configuration></configuration> or
> outside
> >> >>>>>>>>> this?
> >> >>>>>>>>>
> >> >>>>>>>>>
> >> >>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat shriparv
> >> >>>>>>>>> <dw...@gmail.com> wrote:
> >> >>>>>>>>>>
> >> >>>>>>>>>> It will be inside hive/conf
> >> >>>>>>>>>>
> >> >>>>>>>>>>
> >> >>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan
> >> >>>>>>>>>> <ba...@gmail.com>
> >> >>>>>>>>>> wrote:
> >> >>>>>>>>>>>
> >> >>>>>>>>>>> Thanks Shashwat, and where is this hive-site.xml?
> >> >>>>>>>>>>>
> >> >>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat shriparv
> >> >>>>>>>>>>> <dw...@gmail.com> wrote:
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>> set
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>> <property>
> >> >>>>>>>>>>>>   <name>hive.metastore.local</name>
> >> >>>>>>>>>>>>   <value>true</value>
> >> >>>>>>>>>>>> </property>
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>> <property>
> >> >>>>>>>>>>>>        <name>hive.metastore.warehouse.dir</name>
> >> >>>>>>>>>>>>        <value>/home/<your username>/hivefolder</value>
> >> >>>>>>>>>>>>        <description>location of default database for the
> >> >>>>>>>>>>>> warehouse</description>
> >> >>>>>>>>>>>> </property>
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
> >> >>>>>>>>>>>> <ba...@gmail.com> wrote:
> >> >>>>>>>>>>>>>
> >> >>>>>>>>>>>>> Hello Experts ,
> >> >>>>>>>>>>>>>
> >> >>>>>>>>>>>>> I'm new in Hive .When try to create a test Table in Hive I
> >> >>>>>>>>>>>>> get
> >> >>>>>>>>>>>>> an error.I want to run this command:
> >> >>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING, Content
> >> >>>>>>>>>>>>> STRING);
> >> >>>>>>>>>>>>> but this error occured:
> >> >>>>>>>>>>>>> FAILED: Error in metadata: MetaException(message:Got
> >> >>>>>>>>>>>>> exception:
> >> >>>>>>>>>>>>> java.io.FileNotFoundException File
> >> >>>>>>>>>>>>> file:/user/hive/warehouse/test does not
> >> >>>>>>>>>>>>> exist.)
> >> >>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
> >> >>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
> >> >>>>>>>>>>>>> How can I solve this Problem?
> >> >>>>>>>>>>>>> Thank you so much
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>> --
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>> ∞
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>> Shashwat Shriparv
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>>
> >> >>>>>>>>>>>
> >> >>>>>>>>>>
> >> >>>>>>>>>>
> >> >>>>>>>>>>
> >> >>>>>>>>>> --
> >> >>>>>>>>>>
> >> >>>>>>>>>>
> >> >>>>>>>>>> ∞
> >> >>>>>>>>>>
> >> >>>>>>>>>> Shashwat Shriparv
> >> >>>>>>>>>>
> >> >>>>>>>>>>
> >> >>>>>>>>>
> >> >>>>>>>>
> >> >>>>>>>
> >> >>>>>>>
> >> >>>>>>>
> >> >>>>>>> --
> >> >>>>>>>
> >> >>>>>>>
> >> >>>>>>> ∞
> >> >>>>>>>
> >> >>>>>>> Shashwat Shriparv
> >> >>>>>>>
> >> >>>>>>>
> >> >>>>>>
> >> >>>>>
> >> >>>>
> >> >>>
> >> >>>
> >> >>>
> >> >>> --
> >> >>>
> >> >>>
> >> >>> ∞
> >> >>>
> >> >>> Shashwat Shriparv
> >> >>>
> >> >>>
> >> >>
> >
> >
>

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
If you are getting only this, it means your Hadoop daemons are not
running. Were you able to format HDFS properly?

Regards,
    Mohammad Tariq
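
A quick way to act on this advice is to capture the jps output and count the Hadoop daemons in it. The sketch below assumes a pseudo-distributed Hadoop 0.20/1.x node (NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker); the PIDs and the count_daemons helper are made up for illustration:

```shell
# On a healthy pseudo-distributed 0.20/1.x node, `jps` lists five daemons
# alongside Jps itself; the PIDs below are invented for illustration.
healthy='2398 NameNode
2511 DataNode
2630 SecondaryNameNode
2755 JobTracker
2874 TaskTracker
3001 Jps'

# Output like Babak's, where only Jps itself shows up:
broken='2213 Jps'

# Hypothetical helper: count Hadoop daemons in captured `jps` output,
# ignoring the Jps process itself.
count_daemons() {
  printf '%s\n' "$1" | grep -v ' Jps$' | grep -cE 'Node|Tracker'
}

count_daemons "$healthy"   # prints 5
count_daemons "$broken"    # prints 0
```

If the count is 0, start the daemons (or fix the configuration) before retrying Hive.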


On Wed, Jun 6, 2012 at 11:17 PM, Babak Bastan <ba...@gmail.com> wrote:
> Hi Mohammad, when I run jps in my shell I can see this result:
> 2213 Jps
>
>

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
Hi Mohammad, when I run jps in my shell I can see this result:
2213 Jps


On Wed, Jun 6, 2012 at 7:44 PM, Mohammad Tariq <do...@gmail.com> wrote:

> you can also use "jps" command at your shell to see whether Hadoop
> processes are running or not.
>
> Regards,
>     Mohammad Tariq
>
>

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
You can also use the "jps" command at your shell to see whether the
Hadoop processes are running or not.

Regards,
    Mohammad Tariq


On Wed, Jun 6, 2012 at 11:12 PM, Mohammad Tariq <do...@gmail.com> wrote:
> Hi Babak,
>
>  You have to type it in you web browser..Hadoop provides us a web GUI
> that not only allows us to browse through the file system, but to
> download the files as well..Apart from that it also provides a web GUI
> that can be used to see the status of Jobtracker and Tasktracker..When
> you run a Hive or Pig job or a Mapreduce job, you can point your
> browser to http://localhost:50030 to see the status and logs of your
> job.
>
> Regards,
>     Mohammad Tariq
>
>

Re: Error while Creating Table in Hive

Posted by Mohammad Tariq <do...@gmail.com>.
Hi Babak,

  You have to type it in your web browser. Hadoop provides a web GUI
that not only allows us to browse through the file system, but to
download the files as well. Apart from that, it also provides a web GUI
that can be used to see the status of the JobTracker and TaskTracker. When
you run a Hive, Pig, or MapReduce job, you can point your
browser to http://localhost:50030 to see the status and logs of your
job.

Regards,
    Mohammad Tariq
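
For a quick check from the shell that those pages are actually being served, something like the sketch below works. It assumes curl is installed; check_ui is just an illustrative name, and whether each probe reports UP or DOWN depends on which daemons are running on your machine:

```shell
# Probe a Hadoop web UI port and report whether anything answers there.
# Assumes `curl` is on the PATH; gives up after 2 seconds.
check_ui() {
  if curl -s -o /dev/null --max-time 2 "http://$1:$2/"; then
    echo "UP"
  else
    echo "DOWN"
  fi
}

check_ui localhost 50070   # NameNode web UI
check_ui localhost 50030   # JobTracker web UI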


On Wed, Jun 6, 2012 at 8:28 PM, Babak Bastan <ba...@gmail.com> wrote:
> Thank you shashwat for the answer,
> where should I type http://localhost:50070?
> I typed here: hive>http://localhost:50070 but nothing as result
>
>
> On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv
> <dw...@gmail.com> wrote:
>>
>> first type http://localhost:50070 whether this is opening or not and check
>> how many nodes are available, check some of the hadoop shell commands
>> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html  run
>> example mapreduce task on hadoop take example from here
>> : http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>>
>> if all the above you can do sucessfully means hadoop is configured
>> correctly
>>
>> Regards
>> Shashwat
>>
>>
>> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan <ba...@gmail.com> wrote:
>>>
>>> no I'm not working on CDH.Is there a way to test if my Hadoop works fine
>>> or not?
>>>
>>>
>>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS <be...@yahoo.com> wrote:
>>>>
>>>> Hi Babak
>>>>
>>>> You gotta follow those instructions in the apace site to set up hadoop
>>>> from scratch and ensure that hdfs is working first. You should be able to
>>>> read and write files to hdfs before you do your next steps.
>>>>
>>>> Are you on CDH or apache distribution of hadoop? If it is CDH there are
>>>> detailed instructions on Cloudera web site.
>>>>
>>>> Regards
>>>> Bejoy KS
>>>>
>>>> Sent from handheld, please excuse typos.
>>>> ________________________________
>>>> From: Babak Bastan <ba...@gmail.com>
>>>> Date: Tue, 5 Jun 2012 21:30:22 +0200
>>>> To: <us...@hive.apache.org>
>>>> ReplyTo: user@hive.apache.org
>>>> Subject: Re: Error while Creating Table in Hive
>>>>
>>>> @Bejoy: I set the fs.default.name in the core-site.xml and I did all of
>>>> thing that was mentioned in the reference but no effect
>>>>
>>>> On Tue, Jun 5, 2012 at 8:43 PM, Babak Bastan <ba...@gmail.com> wrote:
>>>>>
>>>>> Ok sorry but that was my Mistake .I thought it works but no.
>>>>> I wrote the command without ; and then I think It works but with ; at
>>>>> the end of command
>>>>>
>>>>> CREATE TABLE pokes (foo INT, bar STRING);
>>>>>
>>>>> does'nt work
>>>>>
>>>>>
>>>>> On Tue, Jun 5, 2012 at 8:34 PM, shashwat shriparv
>>>>> <dw...@gmail.com> wrote:
>>>>>>
>>>>>> inside configuration. all properties will be inside the configuration
>>>>>> tags
>>>>>>
>>>>>>
>>>>>> On Tue, Jun 5, 2012 at 11:53 PM, Babak Bastan <ba...@gmail.com>
>>>>>> wrote:
>>>>>>>
>>>>>>> Thank you so much my friend your idee works fine(no error) you are
>>>>>>> the best :)
>>>>>>>
>>>>>>>
>>>>>>> On Tue, Jun 5, 2012 at 8:20 PM, Babak Bastan <ba...@gmail.com>
>>>>>>> wrote:
>>>>>>>>
>>>>>>>> It must be inside the <configuration></configuration> or outside
>>>>>>>> this?
>>>>>>>>
>>>>>>>>
>>>>>>>> On Tue, Jun 5, 2012 at 8:15 PM, shashwat shriparv
>>>>>>>> <dw...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>> It will be inside hive/conf
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On Tue, Jun 5, 2012 at 11:43 PM, Babak Bastan <ba...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>> Thanks sShashwat, and where is this hive-site.xml
>>>>>>>>>>
>>>>>>>>>> On Tue, Jun 5, 2012 at 8:02 PM, shashwat shriparv
>>>>>>>>>> <dw...@gmail.com> wrote:
>>>>>>>>>>>
>>>>>>>>>>> set
>>>>>>>>>>>
>>>>>>>>>>> hive.metastore.warehouse.dir in hive-site.xml
>>>>>>>>>>>
>>>>>>>>>>> <property>
>>>>>>>>>>>   <name>hive.metastore.local</name>
>>>>>>>>>>>   <value>true</value>
>>>>>>>>>>> </property>
>>>>>>>>>>>
>>>>>>>>>>> <name>hive.metastore.warehouse.dir</name>
>>>>>>>>>>>                <value>/home/<your username>/hivefolder</value>
>>>>>>>>>>>                <description>location of default database for the
>>>>>>>>>>> warehouse</description>
>>>>>>>>>>>        </property>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> On Tue, Jun 5, 2012 at 10:43 PM, Babak Bastan
>>>>>>>>>>> <ba...@gmail.com> wrote:
>>>>>>>>>>>>
>>>>>>>>>>>> Hello Experts ,
>>>>>>>>>>>>
>>>>>>>>>>>> I'm new in Hive .When try to create a test Table in Hive I get
>>>>>>>>>>>> an error.I want to run this command:
>>>>>>>>>>>> CREATE TABLE Test (DateT STRING, Url STRING, Content STRING);
>>>>>>>>>>>> but this error occured:
>>>>>>>>>>>> FAILED: Error in metadata: MetaException(message:Got exception:
>>>>>>>>>>>> java.io.FileNotFoundException File file:/user/hive/warehouse/test does not
>>>>>>>>>>>> exist.)
>>>>>>>>>>>> FAILED: Execution Error, return code 1 from
>>>>>>>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>>>>>>>>> How can I solve this Problem?
>>>>>>>>>>>> Thank you so much
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> --
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>> ∞
>>>>>>>>>>>
>>>>>>>>>>> Shashwat Shriparv
>>>>>>>>>>>
>>>>>>>>>>>
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> ∞
>>>>>>>>>
>>>>>>>>> Shashwat Shriparv
>>>>>>>>>
>>>>>>>>>
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>>
>>>>>>
>>>>>> ∞
>>>>>>
>>>>>> Shashwat Shriparv
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>>
>>
>> --
>>
>>
>> ∞
>>
>> Shashwat Shriparv
>>
>>
>

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
Thank you shashwat for the answer,
where should I type http://localhost:50070?
I typed here: hive>http://localhost:50070 but nothing as result

On Wed, Jun 6, 2012 at 3:32 PM, shashwat shriparv <dwivedishashwat@gmail.com
> wrote:

> first type http://localhost:50070 whether this is opening or not and
> check how many nodes are available, check some of the hadoop shell commands
> from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html  run
> example mapreduce task on hadoop take example from here :
> http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
>
> if all the above you can do sucessfully means hadoop is configured
> correctly
>
> Regards
> Shashwat
>
>
> On Wed, Jun 6, 2012 at 1:30 AM, Babak Bastan <ba...@gmail.com> wrote:
>
>> no I'm not working on CDH.Is there a way to test if my Hadoop works fine
>> or not?
>>
>>
>> On Tue, Jun 5, 2012 at 9:55 PM, Bejoy KS <be...@yahoo.com> wrote:
>>
>>> **
>>> Hi Babak
>>>
>>> You gotta follow those instructions in the apace site to set up hadoop
>>> from scratch and ensure that hdfs is working first. You should be able to
>>> read and write files to hdfs before you do your next steps.
>>>
>>> Are you on CDH or apache distribution of hadoop? If it is CDH there are

Re: Error while Creating Table in Hive

Posted by shashwat shriparv <dw...@gmail.com>.
First, open http://localhost:50070 and check whether the NameNode web UI loads
and how many nodes it reports as available. Then try some of the Hadoop shell
commands from http://hadoop.apache.org/common/docs/r0.18.3/hdfs_shell.html and
run an example MapReduce job on Hadoop, taking an example from here:
http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/

If you can do all of the above successfully, it means Hadoop is configured correctly.

Regards
Shashwat
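
The first check above (whether http://localhost:50070 opens) can also be scripted. Below is a minimal sketch in Python; it assumes only the classic NameNode web UI port and nothing else about your cluster:

```python
import urllib.request


def web_ui_up(url, timeout=2.0):
    """Return True if the given web UI answers HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, DNS failure, timeout, etc.
        return False


# NameNode web UI on its classic default port
if web_ui_up("http://localhost:50070"):
    print("NameNode web UI is reachable")
else:
    print("NameNode web UI is NOT reachable - check the Hadoop setup first")
```

If this reports the UI as unreachable, fix the Hadoop installation before retrying anything in Hive.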



-- 


∞
Shashwat Shriparv

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
No, I'm not working on CDH. Is there a way to test whether my Hadoop
installation works or not?


Re: Error while Creating Table in Hive

Posted by Bejoy KS <be...@yahoo.com>.
Hi Babak

You gotta follow those instructions on the Apache site to set up Hadoop from scratch and ensure that HDFS is working first. You should be able to read and write files to HDFS before you move on to the next steps.

Are you on CDH or the Apache distribution of Hadoop? If it is CDH, there are detailed instructions on the Cloudera web site.


Regards
Bejoy KS

Sent from handheld, please excuse typos.
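
Getting HDFS working as described usually starts with the default filesystem entry in core-site.xml. A minimal sketch (the host and port below are only examples and must match your NameNode's address):

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

After changing this, restart the Hadoop daemons and confirm that `hadoop fs -ls /` works before going back to Hive.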



Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
@Bejoy: I set fs.default.name in core-site.xml and did everything mentioned
in the reference, but it had no effect.


Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
Sorry, that was my mistake; I thought it worked, but it doesn't.
I ran the command without the ';' and then I think it works, but with the ';'
at the end of the command:

CREATE TABLE pokes (foo INT, bar STRING);

doesn't work.



Re: Error while Creating Table in Hive

Posted by shashwat shriparv <dw...@gmail.com>.
Inside configuration. All properties go inside the <configuration> tags.
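
As an illustration of that shape (the values below are only examples, not the actual settings from this thread):

```xml
<?xml version="1.0"?>
<configuration>
  <property>
    <name>hive.metastore.local</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/home/youruser/hivefolder</value>
    <description>location of the default database for the warehouse</description>
  </property>
</configuration>
```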



-- 


∞
Shashwat Shriparv

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
Thank you so much, my friend. Your idea works fine (no error). You are the best
:)


Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
Must it go inside <configuration></configuration>, or outside it?


Re: Error while Creating Table in Hive

Posted by shashwat shriparv <dw...@gmail.com>.
/etc/hive/conf/hive-site.xml

Check the folder above; that is where you make the changes.

For help, I am sending you a link:

https://ccp.cloudera.com/display/CDHDOC/Hive+Installation



-- 


∞
Shashwat Shriparv

Re: Error while Creating Table in Hive

Posted by shashwat shriparv <dw...@gmail.com>.
It will be inside hive/conf



-- 


∞
Shashwat Shriparv

Re: Error while Creating Table in Hive

Posted by Babak Bastan <ba...@gmail.com>.
Thanks Shashwat, and where is this hive-site.xml?


Re: Error while Creating Table in Hive

Posted by shashwat shriparv <dw...@gmail.com>.
Set hive.metastore.warehouse.dir in hive-site.xml:

<property>
  <name>hive.metastore.local</name>
  <value>true</value>
</property>

<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/home/<your username>/hivefolder</value>
  <description>location of default database for the warehouse</description>
</property>





-- 


∞
Shashwat Shriparv