Posted to user@hive.apache.org by yo...@wipro.com on 2012/07/06 11:17:46 UTC
HIVE table showing NULL
Hi all
I have created a table tich in MySQL; its structure is:
Num Name
-------------------------
01 Yogesh
and imported it into HDFS using Sqoop's command:
sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 --table tich --target-dir /HADOOP/hadoop-0.20.2/tichh7
It imported successfully.
I created a table in Hive called tichhi as:
create table tichhi ( num INT , Name STRING)
Now I am trying to load those records from HDFS into the Hive table; I used this command:
LOAD DATA INPATH 'hdfs://localhost:9000/HADOOP/hadoop-0.20.2/tichh7/part-m-00000' OVERWRITE INTO TABLE tichhi ;
The output is:
Loading data to table default.tichhi
Deleted hdfs://localhost:9000/user/hive/warehouse/tichhi
OK
Time taken: 0.14 seconds
Then I looked at the records in the Hive table and found NULL values:
hive> select * from tichhi;
OK
NULL NULL
Time taken: 0.051 seconds
Please help me and suggest why this is so.
Thanks & Regards
Yogesh Kumar
Please do not print this email unless it is absolutely necessary.
The information contained in this electronic message and any attachments to this message are intended for the exclusive use of the addressee(s) and may contain proprietary, confidential or privileged information. If you are not the intended recipient, you should not disseminate, distribute or copy this e-mail. Please notify the sender immediately and destroy all copies of this message and any attachments.
WARNING: Computer viruses can be transmitted via email. The recipient should check this email and any attachments for the presence of viruses. The company accepts no liability for any damage caused by any virus transmitted by this email.
www.wipro.com
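[Editor's aside: the symptom in the message above is classically a delimiter mismatch, not a broken import. A minimal local sketch of that failure mode, under the assumption (not confirmed in the thread) that Sqoop wrote its default comma-separated text while the Hive table, created with no ROW FORMAT clause, splits fields on Ctrl-A:]

```shell
# Assumption, not confirmed in the thread: Sqoop's default text output is
# comma-delimited, while a Hive table created without ROW FORMAT splits
# fields on Ctrl-A (\001).
line='01,Yogesh'

# Splitting on Hive's default delimiter leaves the whole line in one
# field, which cannot be parsed as (num INT, Name STRING) -> NULL, NULL.
printf '%s\n' "$line" | awk -F"$(printf '\001')" '{print NF}'   # prints 1

# Splitting on the comma Sqoop actually wrote gives the two expected fields.
printf '%s\n' "$line" | awk -F',' '{print NF, $1, $2}'          # prints 2 01 Yogesh
```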
RE: HIVE table showing NULL
Posted by yo...@wipro.com.
Hi Nitin,
I did:
hadoop dfs -ls /HADOOP/hadoop-0.20.2/tichh7/
It results in:
part-m-00000
If I do
hadoop dfs -cat /HADOOP/hadoop-0.20.2/tichh7/part-m-00000
it shows:
01 Yogesh
The correct records were imported from MySQL.
Please help
Regards
Yogesh Kumar
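[Editor's aside: the records printing correctly with `cat` yet loading as NULL is consistent with a field delimiter that Hive is not splitting on. A hedged sketch of one common fix, assuming the imported file really is comma-delimited (inspecting the raw bytes of the file would confirm this): declare the delimiter explicitly when creating the Hive table.]

```sql
-- Sketch, not the thread's confirmed fix: tell Hive which delimiter
-- Sqoop actually wrote, so the row splits into two columns.
DROP TABLE IF EXISTS tichhi;

CREATE TABLE tichhi (num INT, Name STRING)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
STORED AS TEXTFILE;

LOAD DATA INPATH 'hdfs://localhost:9000/HADOOP/hadoop-0.20.2/tichh7/part-m-00000'
OVERWRITE INTO TABLE tichhi;
```

[Alternatively, Sqoop can be told to emit Hive's default delimiter with `--fields-terminated-by '\001'`, or to create and populate the Hive table itself via `--hive-import`.]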
________________________________
From: Nitin Pawar [nitinpawar432@gmail.com]
Sent: Friday, July 06, 2012 2:56 PM
To: user@hive.apache.org
Subject: Re: HIVE table showing NULL
Can you do "hadoop dfs -cat <file_path>"?
This will tell us what the content of the file is.
--
Nitin Pawar
Re: HIVE table showing NULL
Posted by Nitin Pawar <ni...@gmail.com>.
Can you do "hadoop dfs -cat <file_path>"?
This will tell us what the content of the file is.
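[Editor's aside: one refinement to the suggestion above. Plain `cat` renders the delimiter invisibly, so a tab, a space, and Ctrl-A all look alike in its output. Piping through `od -c` shows the exact bytes; the sketch below simulates the two possibilities locally, while the real check would be `hadoop dfs -cat <file_path> | od -c | head`.]

```shell
# cat output like "01 Yogesh" hides which byte separates the fields;
# od -c prints every byte, so the delimiter becomes visible.
printf '01,Yogesh\n' | od -c      # the comma appears literally as ','
printf '01\001Yogesh\n' | od -c   # Hive's default delimiter appears as 001
```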
--
Nitin Pawar