Posted to user@hive.apache.org by yo...@wipro.com on 2012/07/05 11:28:29 UTC

Hive uploading

Hi

I have created a table in MySQL named Dummy; it has 2 columns and 1 row of data.

I want to upload that table into Hive using the Sqoop tool.
I used this command:


sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive


The table has been successfully uploaded into HDFS at /user/hive/warehouse,
but when I run this command in Hive:

SHOW TABLES;

I don't find the dummyhive table in it.
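For reference, this is roughly how I am checking it (paths as in the command above):

$ hadoop fs -ls /user/hive/warehouse/dummyhive    # the imported part-m-00000 file is there
$ hive
hive> SHOW TABLES;                                # dummyhive does not show up in the list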

Please suggest and Help


Details of the command and output

mediaadmins-iMac-2:hive mediaadmin$ sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive
12/07/05 11:09:15 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
12/07/05 11:09:15 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/07/05 11:09:15 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/07/05 11:09:15 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/05 11:09:15 INFO tool.CodeGenTool: Beginning code generation
12/07/05 11:09:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `Dummy` AS t LIMIT 1
12/07/05 11:09:16 INFO orm.CompilationManager: HADOOP_HOME is /HADOOP/hadoop-0.20.2/bin/..
12/07/05 11:09:16 INFO orm.CompilationManager: Found hadoop core jar at: /HADOOP/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar
Note: /tmp/sqoop-mediaadmin/compile/382d1c58323cea76efd197632bebbfcd/Dummy.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/05 11:09:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-mediaadmin/compile/382d1c58323cea76efd197632bebbfcd/Dummy.jar
12/07/05 11:09:17 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/07/05 11:09:17 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/07/05 11:09:17 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/07/05 11:09:17 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/07/05 11:09:17 INFO mapreduce.ImportJobBase: Beginning import of Dummy
12/07/05 11:09:18 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`Sno`), MAX(`Sno`) FROM `Dummy`
12/07/05 11:09:18 INFO mapred.JobClient: Running job: job_201207051104_0001
12/07/05 11:09:19 INFO mapred.JobClient:  map 0% reduce 0%
12/07/05 11:09:33 INFO mapred.JobClient:  map 100% reduce 0%
12/07/05 11:09:35 INFO mapred.JobClient: Job complete: job_201207051104_0001
12/07/05 11:09:35 INFO mapred.JobClient: Counters: 5
12/07/05 11:09:35 INFO mapred.JobClient:   Job Counters
12/07/05 11:09:35 INFO mapred.JobClient:     Launched map tasks=1
12/07/05 11:09:35 INFO mapred.JobClient:   FileSystemCounters
12/07/05 11:09:35 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=8
12/07/05 11:09:35 INFO mapred.JobClient:   Map-Reduce Framework
12/07/05 11:09:35 INFO mapred.JobClient:     Map input records=1
12/07/05 11:09:35 INFO mapred.JobClient:     Spilled Records=0
12/07/05 11:09:35 INFO mapred.JobClient:     Map output records=1
12/07/05 11:09:35 INFO mapreduce.ImportJobBase: Transferred 8 bytes in 17.945 seconds (0.4458 bytes/sec)
12/07/05 11:09:35 INFO mapreduce.ImportJobBase: Retrieved 1 records.
12/07/05 11:09:35 INFO hive.HiveImport: Removing temporary files from import process: Dummy/_logs
12/07/05 11:09:35 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/05 11:09:35 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `Dummy` AS t LIMIT 1
12/07/05 11:09:37 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/HADOOP/hive/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/05 11:09:37 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051109_1901926452.txt
12/07/05 11:09:41 INFO hive.HiveImport: OK
12/07/05 11:09:41 INFO hive.HiveImport: Time taken: 3.934 seconds
12/07/05 11:09:41 INFO hive.HiveImport: Loading data to table default.dummyhive
12/07/05 11:09:41 INFO hive.HiveImport: OK
12/07/05 11:09:41 INFO hive.HiveImport: Time taken: 0.262 seconds
12/07/05 11:09:41 INFO hive.HiveImport: Hive import complete.



Why is it so? Please help me out

Thanks & Regards
Yogesh Kumar



RE: Hive uploading

Posted by yo...@wipro.com.
There was no hive-site.xml when I extracted it.
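For reference, the file would normally sit under $HIVE_HOME/conf; a quick check (assuming the standard Hive 0.8.1 layout):

ls $HIVE_HOME/conf/
# if only hive-default.xml.template is present, a hive-site.xml can be created from it:
cp $HIVE_HOME/conf/hive-default.xml.template $HIVE_HOME/conf/hive-site.xml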

Although I am able to create tables in Hive and to load data from HDFS into Hive using

LOAD DATA INPATH 'hdfs://localhost:9000/user/hive/warehouse/dummyhive/part-m-00000' OVERWRITE INTO TABLE demoo PARTITION (ROLL = 001);

Loading data to table default.demoo partition (roll=001)
OK
Time taken: 0.466 seconds

and I am able to see the records.
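The check is just a plain SELECT from the Hive shell, e.g.:

hive> SELECT * FROM demoo;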

/tmp/mediaadmin/hive_job_log_mediaadmin_201207051849_2011545354.txt contains:

SessionStart SESSION_ID="mediaadmin_201207051849" TIME="1341494353320"
QueryStart QUERY_STRING="CREATE TABLE `yeshivee` ( `no` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:49:11' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE" QUERY_ID="mediaadmin_20120705184949_24437ec0-3332-49ea-851c-3ba8a3d5c872" TIME="1341494353751"
Counters plan="{"queryId":"mediaadmin_20120705184949_24437ec0-3332-49ea-851c-3ba8a3d5c872","queryType":null,"queryAttributes":{"queryString":"CREATE TABLE `yeshivee` ( `no` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:49:11' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"false","started":"false"}],"done":"false","started":"false"}],"done":"false","started":"true"}" TIME="1341494353762"
TaskStart TASK_NAME="org.apache.hadoop.hive.ql.exec.DDLTask" TASK_ID="Stage-0" QUERY_ID="mediaadmin_20120705184949_24437ec0-3332-49ea-851c-3ba8a3d5c872" TIME="1341494353766"
Counters plan="{"queryId":"mediaadmin_20120705184949_24437ec0-3332-49ea-851c-3ba8a3d5c872","queryType":null,"queryAttributes":{"queryString":"CREATE TABLE `yeshivee` ( `no` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:49:11' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"false","started":"true"}],"done":"false","started":"true"}],"done":"false","started":"true"}" TIME="1341494353768"
Counters plan="{"queryId":"mediaadmin_20120705184949_24437ec0-3332-49ea-851c-3ba8a3d5c872","queryType":null,"queryAttributes":{"queryString":"CREATE TABLE `yeshivee` ( `no` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:49:11' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"true","started":"true"}],"done":"true","started":"true"}],"done":"false","started":"true"}" TIME="1341494357307"
TaskEnd TASK_RET_CODE="0" TASK_NAME="org.apache.hadoop.hive.ql.exec.DDLTask" TASK_ID="Stage-0" QUERY_ID="mediaadmin_20120705184949_24437ec0-3332-49ea-851c-3ba8a3d5c872" TIME="1341494357307"
QueryEnd QUERY_STRING="CREATE TABLE `yeshivee` ( `no` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:49:11' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE" QUERY_ID="mediaadmin_20120705184949_24437ec0-3332-49ea-851c-3ba8a3d5c872" QUERY_RET_CODE="0" QUERY_NUM_TASKS="0" TIME="1341494357363"
Counters plan="{"queryId":"mediaadmin_20120705184949_24437ec0-3332-49ea-851c-3ba8a3d5c872","queryType":null,"queryAttributes":{"queryString":"CREATE TABLE `yeshivee` ( `no` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:49:11' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"true","started":"true"}],"done":"true","started":"true"}],"done":"true","started":"true"}" TIME="1341494357364"
QueryStart QUERY_STRING=" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/yesrdbms1' INTO TABLE `yeshivee`" QUERY_ID="mediaadmin_20120705184949_72d256c3-1698-40db-a8d6-6536512dbd63" TIME="1341494357502"
Counters plan="{"queryId":"mediaadmin_20120705184949_72d256c3-1698-40db-a8d6-6536512dbd63","queryType":null,"queryAttributes":{"queryString":" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/yesrdbms1' INTO TABLE `yeshivee`"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"MOVE","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"false","started":"false"}],"done":"false","started":"false"}],"done":"false","started":"true"}" TIME="1341494357502"
TaskStart TASK_NAME="org.apache.hadoop.hive.ql.exec.MoveTask" TASK_ID="Stage-0" QUERY_ID="mediaadmin_20120705184949_72d256c3-1698-40db-a8d6-6536512dbd63" TIME="1341494357502"
Counters plan="{"queryId":"mediaadmin_20120705184949_72d256c3-1698-40db-a8d6-6536512dbd63","queryType":null,"queryAttributes":{"queryString":" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/yesrdbms1' INTO TABLE `yeshivee`"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"MOVE","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"false","started":"true"}],"done":"false","started":"true"}],"done":"false","started":"true"}" TIME="1341494357503"
Counters plan="{"queryId":"mediaadmin_20120705184949_72d256c3-1698-40db-a8d6-6536512dbd63","queryType":null,"queryAttributes":{"queryString":" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/yesrdbms1' INTO TABLE `yeshivee`"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"MOVE","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"true","started":"true"}],"done":"true","started":"true"}],"done":"false","started":"true"}" TIME="1341494357611"
TaskEnd TASK_RET_CODE="0" TASK_NAME="org.apache.hadoop.hive.ql.exec.MoveTask" TASK_ID="Stage-0" QUERY_ID="mediaadmin_20120705184949_72d256c3-1698-40db-a8d6-6536512dbd63" TIME="1341494357611"
QueryEnd QUERY_STRING=" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/yesrdbms1' INTO TABLE `yeshivee`" QUERY_ID="mediaadmin_20120705184949_72d256c3-1698-40db-a8d6-6536512dbd63" QUERY_RET_CODE="0" QUERY_NUM_TASKS="0" TIME="1341494357611"
Counters plan="{"queryId":"mediaadmin_20120705184949_72d256c3-1698-40db-a8d6-6536512dbd63","queryType":null,"queryAttributes":{"queryString":" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/yesrdbms1' INTO TABLE `yeshivee`"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"MOVE","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"true","started":"true"}],"done":"true","started":"true"}],"done":"true","started":"true"}" TIME="1341494357611"



/tmp/mediaadmin/hive_job_log_mediaadmin_201207051854_1851664234.txt contains:

SessionStart SESSION_ID="mediaadmin_201207051854" TIME="1341494660061"
QueryStart QUERY_STRING="CREATE TABLE `yeshive2` ( `num` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:54:18' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE" QUERY_ID="mediaadmin_20120705185454_d40a27b9-5434-4764-880d-902e91bd3c28" TIME="1341494660513"
Counters plan="{"queryId":"mediaadmin_20120705185454_d40a27b9-5434-4764-880d-902e91bd3c28","queryType":null,"queryAttributes":{"queryString":"CREATE TABLE `yeshive2` ( `num` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:54:18' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"false","started":"false"}],"done":"false","started":"false"}],"done":"false","started":"true"}" TIME="1341494660523"
TaskStart TASK_NAME="org.apache.hadoop.hive.ql.exec.DDLTask" TASK_ID="Stage-0" QUERY_ID="mediaadmin_20120705185454_d40a27b9-5434-4764-880d-902e91bd3c28" TIME="1341494660527"
Counters plan="{"queryId":"mediaadmin_20120705185454_d40a27b9-5434-4764-880d-902e91bd3c28","queryType":null,"queryAttributes":{"queryString":"CREATE TABLE `yeshive2` ( `num` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:54:18' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"false","started":"true"}],"done":"false","started":"true"}],"done":"false","started":"true"}" TIME="1341494660529"
Counters plan="{"queryId":"mediaadmin_20120705185454_d40a27b9-5434-4764-880d-902e91bd3c28","queryType":null,"queryAttributes":{"queryString":"CREATE TABLE `yeshive2` ( `num` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:54:18' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"true","started":"true"}],"done":"true","started":"true"}],"done":"false","started":"true"}" TIME="1341494664453"
TaskEnd TASK_RET_CODE="0" TASK_NAME="org.apache.hadoop.hive.ql.exec.DDLTask" TASK_ID="Stage-0" QUERY_ID="mediaadmin_20120705185454_d40a27b9-5434-4764-880d-902e91bd3c28" TIME="1341494664453"
QueryEnd QUERY_STRING="CREATE TABLE `yeshive2` ( `num` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:54:18' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE" QUERY_ID="mediaadmin_20120705185454_d40a27b9-5434-4764-880d-902e91bd3c28" QUERY_RET_CODE="0" QUERY_NUM_TASKS="0" TIME="1341494664466"
Counters plan="{"queryId":"mediaadmin_20120705185454_d40a27b9-5434-4764-880d-902e91bd3c28","queryType":null,"queryAttributes":{"queryString":"CREATE TABLE `yeshive2` ( `num` INT, `name` STRING) COMMENT 'Imported by sqoop on 2012/07/05 18:54:18' ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LINES TERMINATED BY '\012' STORED AS TEXTFILE"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"DDL","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"true","started":"true"}],"done":"true","started":"true"}],"done":"true","started":"true"}" TIME="1341494664467"
QueryStart QUERY_STRING=" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/oldtoy' INTO TABLE `yeshive2`" QUERY_ID="mediaadmin_20120705185454_c1afa8b5-6f97-4be7-9bc6-9a30426a3cbd" TIME="1341494664599"
Counters plan="{"queryId":"mediaadmin_20120705185454_c1afa8b5-6f97-4be7-9bc6-9a30426a3cbd","queryType":null,"queryAttributes":{"queryString":" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/oldtoy' INTO TABLE `yeshive2`"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"MOVE","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"false","started":"false"}],"done":"false","started":"false"}],"done":"false","started":"true"}" TIME="1341494664599"
TaskStart TASK_NAME="org.apache.hadoop.hive.ql.exec.MoveTask" TASK_ID="Stage-0" QUERY_ID="mediaadmin_20120705185454_c1afa8b5-6f97-4be7-9bc6-9a30426a3cbd" TIME="1341494664599"
Counters plan="{"queryId":"mediaadmin_20120705185454_c1afa8b5-6f97-4be7-9bc6-9a30426a3cbd","queryType":null,"queryAttributes":{"queryString":" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/oldtoy' INTO TABLE `yeshive2`"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"MOVE","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"false","started":"true"}],"done":"false","started":"true"}],"done":"false","started":"true"}" TIME="1341494664599"
Counters plan="{"queryId":"mediaadmin_20120705185454_c1afa8b5-6f97-4be7-9bc6-9a30426a3cbd","queryType":null,"queryAttributes":{"queryString":" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/oldtoy' INTO TABLE `yeshive2`"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"MOVE","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"true","started":"true"}],"done":"true","started":"true"}],"done":"false","started":"true"}" TIME="1341494664744"
TaskEnd TASK_RET_CODE="0" TASK_NAME="org.apache.hadoop.hive.ql.exec.MoveTask" TASK_ID="Stage-0" QUERY_ID="mediaadmin_20120705185454_c1afa8b5-6f97-4be7-9bc6-9a30426a3cbd" TIME="1341494664744"
QueryEnd QUERY_STRING=" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/oldtoy' INTO TABLE `yeshive2`" QUERY_ID="mediaadmin_20120705185454_c1afa8b5-6f97-4be7-9bc6-9a30426a3cbd" QUERY_RET_CODE="0" QUERY_NUM_TASKS="0" TIME="1341494664744"
Counters plan="{"queryId":"mediaadmin_20120705185454_c1afa8b5-6f97-4be7-9bc6-9a30426a3cbd","queryType":null,"queryAttributes":{"queryString":" LOAD DATA INPATH 'hdfs://localhost:9000/userdata/yogesh/sqoop/imports/oldtoy' INTO TABLE `yeshive2`"},"queryCounters":"null","stageGraph":{"nodeType":"STAGE","roots":"null","adjacencyList":"]"},"stageList":[{"stageId":"Stage-0","stageType":"MOVE","stageAttributes":"null","stageCounters":"}","taskList":[{"taskId":"Stage-0_OTHER","taskType":"OTHER","taskAttributes":"null","taskCounters":"null","operatorGraph":"null","operatorList":"]","done":"true","started":"true"}],"done":"true","started":"true"}],"done":"true","started":"true"}" TIME="1341494664744"


I don't think it's showing any error, or is it?

Yeah sure, I will also raise this issue on the Sqoop user list.


Regards
Yogesh Kumar
________________________________
From: Bejoy Ks [bejoy_ks@yahoo.com]
Sent: Thursday, July 05, 2012 7:14 PM
To: user@hive.apache.org
Subject: Re: Hive uploading

Hi Yogesh

The verbose logging is still not getting enabled; some issue with the Sqoop installation, I guess.

The console log shows the Hive table creation and data load were a success. Are you still not seeing the tables 'yeshivee' and 'yeshive2' in Hive? Log in to your default Hive CLI and check ($HIVE_HOME/bin/hive).
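For example, something along these lines (table names as per your import commands):

$HIVE_HOME/bin/hive
hive> SHOW TABLES;
hive> SELECT * FROM yeshivee;
hive> SELECT * FROM yeshive2;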

You can look at the Hive history files and see if there are any errors reported by Hive:
12/07/05 18:49:13 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051849_2011545354.txt
12/07/05 18:54:20 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051854_1851664234.txt

This is purely a Sqoop issue, and you should get better help from the Sqoop user group. Please take this conversation to the Sqoop user list; we can continue it there.

Regards
Bejoy KS

________________________________
From: "yogesh.kumar13@wipro.com" <yo...@wipro.com>
To: user@hive.apache.org; bejoy_ks@yahoo.com
Sent: Thursday, July 5, 2012 6:56 PM
Subject: RE: Hive uploading

Hi Bejoy,

I used it, after creating the userdata/... dir in HDFS:

sqoop import --verbose --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 --table yesrdbms1 -num-mappers 1 --warehouse-dir /userdata/yogesh/sqoop/imports --hive-import  --hive-table yeshivee

The outcome is:

12/07/05 18:48:56 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
12/07/05 18:48:56 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/07/05 18:48:56 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/07/05 18:48:56 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/05 18:48:56 INFO tool.CodeGenTool: Beginning code generation
12/07/05 18:48:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `yesrdbms1` AS t LIMIT 1
12/07/05 18:48:56 INFO orm.CompilationManager: HADOOP_HOME is /HADOOP/hadoop-0.20.2/bin/..
12/07/05 18:48:56 INFO orm.CompilationManager: Found hadoop core jar at: /HADOOP/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar
Note: /tmp/sqoop-mediaadmin/compile/603b6e26b2fd1160693ba0a66786d12e/yesrdbms1.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/05 18:48:57 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-mediaadmin/compile/603b6e26b2fd1160693ba0a66786d12e/yesrdbms1.jar
12/07/05 18:48:57 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/07/05 18:48:57 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/07/05 18:48:57 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/07/05 18:48:57 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/07/05 18:48:57 INFO mapreduce.ImportJobBase: Beginning import of yesrdbms1
12/07/05 18:48:58 INFO mapred.JobClient: Running job: job_201207051104_0011
12/07/05 18:48:59 INFO mapred.JobClient:  map 0% reduce 0%
12/07/05 18:49:09 INFO mapred.JobClient:  map 100% reduce 0%
12/07/05 18:49:11 INFO mapred.JobClient: Job complete: job_201207051104_0011
12/07/05 18:49:11 INFO mapred.JobClient: Counters: 5
12/07/05 18:49:11 INFO mapred.JobClient:   Job Counters
12/07/05 18:49:11 INFO mapred.JobClient:     Launched map tasks=1
12/07/05 18:49:11 INFO mapred.JobClient:   FileSystemCounters
12/07/05 18:49:11 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=9
12/07/05 18:49:11 INFO mapred.JobClient:   Map-Reduce Framework
12/07/05 18:49:11 INFO mapred.JobClient:     Map input records=1
12/07/05 18:49:11 INFO mapred.JobClient:     Spilled Records=0
12/07/05 18:49:11 INFO mapred.JobClient:     Map output records=1
12/07/05 18:49:11 INFO mapreduce.ImportJobBase: Transferred 9 bytes in 13.6291 seconds (0.6604 bytes/sec)
12/07/05 18:49:11 INFO mapreduce.ImportJobBase: Retrieved 1 records.
12/07/05 18:49:11 INFO hive.HiveImport: Removing temporary files from import process: /userdata/yogesh/sqoop/imports/yesrdbms1/_logs
12/07/05 18:49:11 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/05 18:49:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `yesrdbms1` AS t LIMIT 1
12/07/05 18:49:13 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/HADOOP/hive/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/05 18:49:13 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051849_2011545354.txt
12/07/05 18:49:17 INFO hive.HiveImport: OK
12/07/05 18:49:17 INFO hive.HiveImport: Time taken: 3.864 seconds
12/07/05 18:49:17 INFO hive.HiveImport: Loading data to table default.yeshivee
12/07/05 18:49:17 INFO hive.HiveImport: OK
12/07/05 18:49:17 INFO hive.HiveImport: Time taken: 0.245 seconds
12/07/05 18:49:17 INFO hive.HiveImport: Hive import complete.

------------------------------------*****************************************------------------------------------
and also this one

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 --table oldtoy --verbose  -num-mappers 1 --warehouse-dir /userdata/yogesh/sqoop/imports --hive-import  --hive-table yeshive2 --create-hive-table

The outcome is:

Setting your password on the command-line is insecure. Consider using -P instead.
12/07/05 18:53:58 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/07/05 18:53:58 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/07/05 18:53:59 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/05 18:53:59 INFO tool.CodeGenTool: Beginning code generation
12/07/05 18:53:59 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `oldtoy` AS t LIMIT 1
12/07/05 18:53:59 INFO orm.CompilationManager: HADOOP_HOME is /HADOOP/hadoop-0.20.2/bin/..
12/07/05 18:53:59 INFO orm.CompilationManager: Found hadoop core jar at: /HADOOP/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar
Note: /tmp/sqoop-mediaadmin/compile/c5fab2a1adf9725e1c5d556d0cceefd6/oldtoy.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/05 18:54:00 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-mediaadmin/compile/c5fab2a1adf9725e1c5d556d0cceefd6/oldtoy.jar
12/07/05 18:54:00 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/07/05 18:54:00 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/07/05 18:54:00 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/07/05 18:54:00 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/07/05 18:54:00 INFO mapreduce.ImportJobBase: Beginning import of oldtoy
12/07/05 18:54:01 INFO mapred.JobClient: Running job: job_201207051104_0013
12/07/05 18:54:02 INFO mapred.JobClient:  map 0% reduce 0%
12/07/05 18:54:16 INFO mapred.JobClient:  map 100% reduce 0%
12/07/05 18:54:18 INFO mapred.JobClient: Job complete: job_201207051104_0013
12/07/05 18:54:18 INFO mapred.JobClient: Counters: 5
12/07/05 18:54:18 INFO mapred.JobClient:   Job Counters
12/07/05 18:54:18 INFO mapred.JobClient:     Launched map tasks=1
12/07/05 18:54:18 INFO mapred.JobClient:   FileSystemCounters
12/07/05 18:54:18 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=8
12/07/05 18:54:18 INFO mapred.JobClient:   Map-Reduce Framework
12/07/05 18:54:18 INFO mapred.JobClient:     Map input records=1
12/07/05 18:54:18 INFO mapred.JobClient:     Spilled Records=0
12/07/05 18:54:18 INFO mapred.JobClient:     Map output records=1
12/07/05 18:54:18 INFO mapreduce.ImportJobBase: Transferred 8 bytes in 17.8107 seconds (0.4492 bytes/sec)
12/07/05 18:54:18 INFO mapreduce.ImportJobBase: Retrieved 1 records.
12/07/05 18:54:18 INFO hive.HiveImport: Removing temporary files from import process: /userdata/yogesh/sqoop/imports/oldtoy/_logs
12/07/05 18:54:18 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/05 18:54:18 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `oldtoy` AS t LIMIT 1
12/07/05 18:54:20 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/HADOOP/hive/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/05 18:54:20 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051854_1851664234.txt
12/07/05 18:54:24 INFO hive.HiveImport: OK
12/07/05 18:54:24 INFO hive.HiveImport: Time taken: 4.222 seconds
12/07/05 18:54:24 INFO hive.HiveImport: Loading data to table default.yeshive2
12/07/05 18:54:24 INFO hive.HiveImport: OK
12/07/05 18:54:24 INFO hive.HiveImport: Time taken: 0.278 seconds
12/07/05 18:54:24 INFO hive.HiveImport: Hive import complete.



Please suggest

Greetings
Yogesh kumar


________________________________
From: Bejoy Ks [bejoy_ks@yahoo.com]
Sent: Thursday, July 05, 2012 6:28 PM
To: user@hive.apache.org
Subject: Re: Hive uploading

Hi Yogesh

The verbose option didn't take effect there, as there is no DEBUG logging in the output. Can you please add --verbose at the beginning of your sqoop command?

Let me frame a small sqoop import sample for you. Please run this command and post the console log:

sqoop import --verbose --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 -P --table troy -num-mappers 1 --warehouse-dir /userdata/yogesh/sqoop/imports --hive-import --hive-table troyhive

I haven't tried this on my end; if the --verbose placement poses any issues, please change its position to after the --table argument.
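That is, something like this (same options as above, only the --verbose position changed):

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 -P --table troy --verbose -num-mappers 1 --warehouse-dir /userdata/yogesh/sqoop/imports --hive-import --hive-table troyhive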

Also, you need to create the --warehouse-dir path in HDFS before running the sqoop import.
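For example (path as in the command above; on newer Hadoop releases you may need the -p flag):

hadoop fs -mkdir /userdata/yogesh/sqoop/imports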


Regards
Bejoy KS

________________________________
From: "yogesh.kumar13@wipro.com" <yo...@wipro.com>
To: user@hive.apache.org; bejoy_ks@yahoo.com
Sent: Thursday, July 5, 2012 6:07 PM
Subject: RE: Hive uploading

Hi Bejoy,

I have created a new table called troy, and for Hive it's troyhive, as it was showing that the output directory already exists.


sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table troy --hive-table troyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive --verbose
12/07/05 17:57:16 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
12/07/05 17:57:16 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/07/05 17:57:16 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/07/05 17:57:16 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/05 17:57:16 INFO tool.CodeGenTool: Beginning code generation
12/07/05 17:57:17 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `troy` AS t LIMIT 1
12/07/05 17:57:17 INFO orm.CompilationManager: HADOOP_HOME is /HADOOP/hadoop-0.20.2/bin/..
12/07/05 17:57:17 INFO orm.CompilationManager: Found hadoop core jar at: /HADOOP/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar
Note: /tmp/sqoop-mediaadmin/compile/26f5861253b910681eade0bd0e84efb5/troy.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/05 17:57:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-mediaadmin/compile/26f5861253b910681eade0bd0e84efb5/troy.jar
12/07/05 17:57:17 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/07/05 17:57:17 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/07/05 17:57:17 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/07/05 17:57:17 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/07/05 17:57:17 INFO mapreduce.ImportJobBase: Beginning import of troy
12/07/05 17:57:18 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`num`), MAX(`num`) FROM `troy`
12/07/05 17:57:18 INFO mapred.JobClient: Running job: job_201207051104_0005
12/07/05 17:57:19 INFO mapred.JobClient:  map 0% reduce 0%
12/07/05 17:57:30 INFO mapred.JobClient:  map 100% reduce 0%
12/07/05 17:57:32 INFO mapred.JobClient: Job complete: job_201207051104_0005
12/07/05 17:57:32 INFO mapred.JobClient: Counters: 5
12/07/05 17:57:32 INFO mapred.JobClient:   Job Counters
12/07/05 17:57:32 INFO mapred.JobClient:     Launched map tasks=1
12/07/05 17:57:32 INFO mapred.JobClient:   FileSystemCounters
12/07/05 17:57:32 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=8
12/07/05 17:57:32 INFO mapred.JobClient:   Map-Reduce Framework
12/07/05 17:57:32 INFO mapred.JobClient:     Map input records=1
12/07/05 17:57:32 INFO mapred.JobClient:     Spilled Records=0
12/07/05 17:57:32 INFO mapred.JobClient:     Map output records=1
12/07/05 17:57:32 INFO mapreduce.ImportJobBase: Transferred 8 bytes in 14.6895 seconds (0.5446 bytes/sec)
12/07/05 17:57:32 INFO mapreduce.ImportJobBase: Retrieved 1 records.
12/07/05 17:57:32 INFO hive.HiveImport: Removing temporary files from import process: troy/_logs
12/07/05 17:57:32 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/05 17:57:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `troy` AS t LIMIT 1
12/07/05 17:57:34 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/HADOOP/hive/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/05 17:57:34 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051757_1184599996.txt
12/07/05 17:57:39 INFO hive.HiveImport: OK
12/07/05 17:57:39 INFO hive.HiveImport: Time taken: 4.249 seconds
12/07/05 17:57:39 INFO hive.HiveImport: Loading data to table default.troyhive
12/07/05 17:57:39 INFO hive.HiveImport: OK
12/07/05 17:57:39 INFO hive.HiveImport: Time taken: 0.257 seconds
12/07/05 17:57:39 INFO hive.HiveImport: Hive import complete.

Regards
Yogesh Kumar

________________________________
From: Bejoy Ks [bejoy_ks@yahoo.com]
Sent: Thursday, July 05, 2012 6:03 PM
To: user@hive.apache.org
Subject: Re: Hive uploading

Hi Yogesh

The verbose option won't make any difference to the operation, but it gives more logging information on the console, which could be helpful when looking for hints.

So please post your console dump/log along with the sqoop import command, with verbose enabled.

Regards
Bejoy KS

________________________________
From: "yogesh.kumar13@wipro.com" <yo...@wipro.com>
To: user@hive.apache.org; bejoy_ks@yahoo.com
Sent: Thursday, July 5, 2012 6:00 PM
Subject: RE: Hive uploading

Hello Bejoy,

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive --verbose

Still the same; no table has been created. I am not able to see the dummyhive table in Hive using the command
SHOW TABLES;

although the table dummyhive was created in HDFS under the directory:
/user/hive/warehouse/dummyhive


Please suggest
Yogesh Kumar

________________________________
From: Bejoy Ks [bejoy_ks@yahoo.com]
Sent: Thursday, July 05, 2012 5:29 PM
To: user@hive.apache.org
Subject: Re: Hive uploading

Hi Yogesh

Please try out this command

 sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive --verbose


Regards
Bejoy KS

________________________________
From: "yogesh.kumar13@wipro.com" <yo...@wipro.com>
To: user@hive.apache.org; bejoy_ks@yahoo.com
Sent: Thursday, July 5, 2012 5:03 PM
Subject: RE: Hive uploading

Hi Bejoy

I have confirmed the Hive installation; it is the same for both.
I used the command echo $HIVE_HOME on both the Sqoop terminal and the Hive terminal;
both return the same path:
HADOOP/hive

I am new to Hive and Sqoop; would you please give an example using the --verbose option with this command?


 sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive



Please help


________________________________
From: Bejoy Ks [bejoy_ks@yahoo.com]
Sent: Thursday, July 05, 2012 3:14 PM
To: user@hive.apache.org
Subject: Re: Hive uploading

Hi Yogesh

No issues seen on first look. Can you run the sqoop import with the --verbose option and post the console dump?

Do you have multiple Hive installations? If so, please verify that you are using the same Hive both for the Sqoop import and for verifying the data with the Hive CLI (the Hive installation at HADOOP/hive).
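One quick way to verify (a sketch): run these in both the terminal where you invoke sqoop and the terminal where you open the Hive shell, and compare the output:

echo $HIVE_HOME
which hive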

Regards
Bejoy KS

________________________________
From: "yogesh.kumar13@wipro.com" <yo...@wipro.com>
To: user@hive.apache.org
Sent: Thursday, July 5, 2012 2:58 PM
Subject: Hive uploading


Re: Hive uploading

Posted by Bejoy Ks <be...@yahoo.com>.
Hi Yogesh

The verbose logging is still not getting enabled; some issue with the Sqoop installation, I guess.

The console log shows the Hive table creation and data load were a success. Are you still not seeing the tables 'yeshivee' and 'yeshive2' in Hive? Log in to your default Hive CLI and check ($HIVE_HOME/bin/hive).


You can look at the Hive history files and see if there are any errors reported by Hive:
12/07/05 18:49:13 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051849_2011545354.txt
12/07/05 18:54:20 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051854_1851664234.txt

This is purely a Sqoop issue, and you should get better help from the Sqoop user group. Please take this conversation to the Sqoop user list; we can continue it there.


Regards
Bejoy KS



________________________________
 From: "yogesh.kumar13@wipro.com" <yo...@wipro.com>
To: user@hive.apache.org; bejoy_ks@yahoo.com 
Sent: Thursday, July 5, 2012 6:56 PM
Subject: RE: Hive uploading
 

 
Hi Bejoy,

I used it. after creating userdata/....   dir in hdfs
 
sqoop import --verbose --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 --table yesrdbms1 -num-mappers 1 --warehouse-dir /userdata/yogesh/sqoop/imports --hive-import  --hive-table yeshivee

outcome is 

12/07/05 18:48:56 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
12/07/05 18:48:56 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/07/05 18:48:56 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/07/05 18:48:56 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/05 18:48:56 INFO tool.CodeGenTool: Beginning code generation
12/07/05 18:48:56 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `yesrdbms1` AS t LIMIT 1
12/07/05 18:48:56 INFO orm.CompilationManager: HADOOP_HOME is /HADOOP/hadoop-0.20.2/bin/..
12/07/05 18:48:56 INFO orm.CompilationManager: Found hadoop core jar at: /HADOOP/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar
Note: /tmp/sqoop-mediaadmin/compile/603b6e26b2fd1160693ba0a66786d12e/yesrdbms1.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/05 18:48:57 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-mediaadmin/compile/603b6e26b2fd1160693ba0a66786d12e/yesrdbms1.jar
12/07/05 18:48:57 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/07/05 18:48:57 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/07/05 18:48:57 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/07/05 18:48:57 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/07/05 18:48:57 INFO mapreduce.ImportJobBase: Beginning import of yesrdbms1
12/07/05 18:48:58 INFO mapred.JobClient: Running job: job_201207051104_0011
12/07/05 18:48:59 INFO mapred.JobClient:  map 0% reduce 0%
12/07/05 18:49:09 INFO mapred.JobClient:  map 100% reduce 0%
12/07/05 18:49:11 INFO mapred.JobClient: Job complete: job_201207051104_0011
12/07/05 18:49:11 INFO mapred.JobClient: Counters: 5
12/07/05 18:49:11 INFO mapred.JobClient:   Job Counters 
12/07/05 18:49:11 INFO mapred.JobClient:     Launched map tasks=1
12/07/05 18:49:11 INFO mapred.JobClient:   FileSystemCounters
12/07/05 18:49:11 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=9
12/07/05 18:49:11 INFO mapred.JobClient:   Map-Reduce Framework
12/07/05 18:49:11 INFO mapred.JobClient:     Map input records=1
12/07/05 18:49:11 INFO mapred.JobClient:     Spilled Records=0
12/07/05 18:49:11 INFO mapred.JobClient:     Map output records=1
12/07/05 18:49:11 INFO mapreduce.ImportJobBase: Transferred 9 bytes in 13.6291 seconds (0.6604 bytes/sec)
12/07/05 18:49:11 INFO mapreduce.ImportJobBase: Retrieved 1 records.
12/07/05 18:49:11 INFO hive.HiveImport: Removing temporary files from import process: /userdata/yogesh/sqoop/imports/yesrdbms1/_logs
12/07/05 18:49:11 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/05 18:49:11 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `yesrdbms1` AS t LIMIT 1
12/07/05 18:49:13 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/HADOOP/hive/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/05 18:49:13 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051849_2011545354.txt
12/07/05 18:49:17 INFO hive.HiveImport: OK
12/07/05 18:49:17 INFO hive.HiveImport: Time taken: 3.864 seconds
12/07/05 18:49:17 INFO hive.HiveImport: Loading data to table default.yeshivee
12/07/05 18:49:17 INFO hive.HiveImport: OK
12/07/05 18:49:17 INFO hive.HiveImport: Time taken: 0.245 seconds
12/07/05 18:49:17 INFO hive.HiveImport: Hive import complete.

------------------------------------*****************************************------------------------------------
and also this one 

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 --table oldtoy --verbose  -num-mappers 1 --warehouse-dir /userdata/yogesh/sqoop/imports --hive-import  --hive-table yeshive2 --create-hive-table

outcome is 

Setting your password on the command-line is insecure. Consider using -P instead.
12/07/05 18:53:58 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/07/05 18:53:58 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/07/05 18:53:59 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/05 18:53:59 INFO tool.CodeGenTool: Beginning code generation
12/07/05 18:53:59 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `oldtoy` AS t LIMIT 1
12/07/05 18:53:59 INFO orm.CompilationManager: HADOOP_HOME is /HADOOP/hadoop-0.20.2/bin/..
12/07/05 18:53:59 INFO orm.CompilationManager: Found hadoop core jar at: /HADOOP/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar
Note: /tmp/sqoop-mediaadmin/compile/c5fab2a1adf9725e1c5d556d0cceefd6/oldtoy.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/05 18:54:00 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-mediaadmin/compile/c5fab2a1adf9725e1c5d556d0cceefd6/oldtoy.jar
12/07/05 18:54:00 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/07/05 18:54:00 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/07/05 18:54:00 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/07/05 18:54:00 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/07/05 18:54:00 INFO mapreduce.ImportJobBase: Beginning import of oldtoy
12/07/05 18:54:01 INFO mapred.JobClient: Running job: job_201207051104_0013
12/07/05 18:54:02 INFO mapred.JobClient:  map 0% reduce 0%
12/07/05 18:54:16 INFO mapred.JobClient:  map 100% reduce 0%
12/07/05 18:54:18 INFO mapred.JobClient: Job complete: job_201207051104_0013
12/07/05 18:54:18 INFO mapred.JobClient: Counters: 5
12/07/05 18:54:18 INFO mapred.JobClient:   Job Counters 
12/07/05 18:54:18 INFO mapred.JobClient:     Launched map tasks=1
12/07/05 18:54:18 INFO mapred.JobClient:   FileSystemCounters
12/07/05 18:54:18 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=8
12/07/05 18:54:18 INFO mapred.JobClient:   Map-Reduce Framework
12/07/05 18:54:18 INFO mapred.JobClient:     Map input records=1
12/07/05 18:54:18 INFO mapred.JobClient:     Spilled Records=0
12/07/05 18:54:18 INFO mapred.JobClient:     Map output records=1
12/07/05 18:54:18 INFO mapreduce.ImportJobBase: Transferred 8 bytes in 17.8107 seconds (0.4492 bytes/sec)
12/07/05 18:54:18 INFO mapreduce.ImportJobBase: Retrieved 1 records.
12/07/05 18:54:18 INFO hive.HiveImport: Removing temporary files from import process: /userdata/yogesh/sqoop/imports/oldtoy/_logs
12/07/05 18:54:18 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/05 18:54:18 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `oldtoy` AS t LIMIT 1
12/07/05 18:54:20 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/HADOOP/hive/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/05 18:54:20 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051854_1851664234.txt
12/07/05 18:54:24 INFO hive.HiveImport: OK
12/07/05 18:54:24 INFO hive.HiveImport: Time taken: 4.222 seconds
12/07/05 18:54:24 INFO hive.HiveImport: Loading data to table default.yeshive2
12/07/05 18:54:24 INFO hive.HiveImport: OK
12/07/05 18:54:24 INFO hive.HiveImport: Time taken: 0.278 seconds
12/07/05 18:54:24 INFO hive.HiveImport: Hive import complete.



Please suggest

Greetings 
Yogesh kumar




________________________________
 
From: Bejoy Ks [bejoy_ks@yahoo.com]
Sent: Thursday, July 05, 2012 6:28 PM
To: user@hive.apache.org
Subject: Re: Hive uploading


Hi Yogesh

The verbose option didn't work there as there is no DEBUG logging, can you please add the verbose to the beginning of your sqoop command?

Lemme frame a small sqoop import sample or you, Please run this command and post in the console log

sqoop import --verbose --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 -P --table troy -num-mappers 1 --warehouse-dir /userdata/yogesh/sqoop/imports --hive-import --hive-table troyhive  


I haven't tried this on my end; if --verbose poses any issues in that position, please change it to come after the --table argument.

Also, you need to create the --warehouse-dir path in HDFS before running the sqoop import.
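For example (assuming the same path as in the command above; newer Hadoop releases may need hadoop fs -mkdir -p to create the parent directories):

hadoop fs -mkdir /userdata/yogesh/sqoop/imports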



Regards
Bejoy KS



________________________________
 From: "yogesh.kumar13@wipro.com" <yo...@wipro.com>
To: user@hive.apache.org; bejoy_ks@yahoo.com 
Sent: Thursday, July 5, 2012 6:07 PM
Subject: RE: Hive uploading


 
Hi Bejoy,

I have created a new table called troy, and for Hive it is troyhive, as it was showing that the output directory already exists
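(For reference, the existing output directory could instead be removed before re-running the original import, e.g. hadoop fs -rmr Dummy, assuming the default target directory named after the table.)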


sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table troy --hive-table troyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive --verbose
12/07/05 17:57:16 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
12/07/05 17:57:16 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/07/05 17:57:16 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/07/05 17:57:16 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/05 17:57:16 INFO tool.CodeGenTool: Beginning code generation
12/07/05 17:57:17 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `troy` AS t LIMIT 1
12/07/05 17:57:17 INFO orm.CompilationManager: HADOOP_HOME is /HADOOP/hadoop-0.20.2/bin/..
12/07/05 17:57:17 INFO orm.CompilationManager: Found hadoop core jar at: /HADOOP/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar
Note: /tmp/sqoop-mediaadmin/compile/26f5861253b910681eade0bd0e84efb5/troy.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/05 17:57:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-mediaadmin/compile/26f5861253b910681eade0bd0e84efb5/troy.jar
12/07/05 17:57:17 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/07/05 17:57:17 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/07/05 17:57:17 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/07/05 17:57:17 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/07/05 17:57:17 INFO mapreduce.ImportJobBase: Beginning import of troy
12/07/05 17:57:18 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`num`), MAX(`num`) FROM `troy`
12/07/05 17:57:18 INFO mapred.JobClient: Running job: job_201207051104_0005
12/07/05 17:57:19 INFO mapred.JobClient:  map 0% reduce 0%
12/07/05 17:57:30 INFO mapred.JobClient:  map 100% reduce 0%
12/07/05 17:57:32 INFO mapred.JobClient: Job complete: job_201207051104_0005
12/07/05 17:57:32 INFO mapred.JobClient: Counters: 5
12/07/05 17:57:32 INFO mapred.JobClient:   Job Counters 
12/07/05 17:57:32 INFO mapred.JobClient:     Launched map tasks=1
12/07/05 17:57:32 INFO mapred.JobClient:   FileSystemCounters
12/07/05 17:57:32 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=8
12/07/05 17:57:32 INFO mapred.JobClient:   Map-Reduce Framework
12/07/05 17:57:32 INFO mapred.JobClient:     Map input records=1
12/07/05 17:57:32 INFO mapred.JobClient:     Spilled Records=0
12/07/05 17:57:32 INFO mapred.JobClient:     Map output records=1
12/07/05 17:57:32 INFO mapreduce.ImportJobBase: Transferred 8 bytes in 14.6895 seconds (0.5446 bytes/sec)
12/07/05 17:57:32 INFO mapreduce.ImportJobBase: Retrieved 1 records.
12/07/05 17:57:32 INFO hive.HiveImport: Removing temporary files from import process: troy/_logs
12/07/05 17:57:32 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/05 17:57:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `troy` AS t LIMIT 1
12/07/05 17:57:34 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/HADOOP/hive/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/05 17:57:34 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051757_1184599996.txt
12/07/05 17:57:39 INFO hive.HiveImport: OK
12/07/05 17:57:39 INFO hive.HiveImport: Time taken: 4.249 seconds
12/07/05 17:57:39 INFO hive.HiveImport: Loading data to table default.troyhive
12/07/05 17:57:39 INFO hive.HiveImport: OK
12/07/05 17:57:39 INFO hive.HiveImport: Time taken: 0.257 seconds
12/07/05 17:57:39 INFO hive.HiveImport: Hive import complete.

Regards
Yogesh Kumar



________________________________
 
From: Bejoy Ks [bejoy_ks@yahoo.com]
Sent: Thursday, July 05, 2012 6:03 PM
To: user@hive.apache.org
Subject: Re: Hive uploading


Hi Yogesh

The --verbose option won't make any difference to the operation, but it gives more logging information on the console, which could be helpful in searching for hints.

So please post your console dump/log along with the sqoop import command with --verbose enabled.

Regards
Bejoy KS


________________________________
 From: "yogesh.kumar13@wipro.com" <yo...@wipro.com>
To: user@hive.apache.org; bejoy_ks@yahoo.com 
Sent: Thursday, July 5, 2012 6:00 PM
Subject: RE: Hive uploading


 
Hello Bejoy,

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive --verbose

Still the same; no table has been created. I am not able to see the dummyhive table in Hive using the command
Show Tables;

although the table directory dummyhive was created in HDFS under:
/user/hive/warehouse/dummyhive
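For reference, the contents of that directory can be listed with something like:

hadoop fs -ls /user/hive/warehouse/dummyhive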


Please suggest
Yogesh Kumar



________________________________
 
From: Bejoy Ks [bejoy_ks@yahoo.com]
Sent: Thursday, July 05, 2012 5:29 PM
To: user@hive.apache.org
Subject: Re: Hive uploading


Hi Yogesh

Please try out this command


 sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive --verbose


Regards
Bejoy KS



________________________________
 From: "yogesh.kumar13@wipro.com" <yo...@wipro.com>
To: user@hive.apache.org; bejoy_ks@yahoo.com 
Sent: Thursday, July 5, 2012 5:03 PM
Subject: RE: Hive uploading


 
Hi Bejoy

I have confirmed the Hive installation is the same for both.
I used the command echo $HIVE_HOME on both the Sqoop terminal and the Hive terminal;
both return the same path:
HADOOP/hive

I am new to Hive and Sqoop; would you please give an example using the --verbose option with this command:


 sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive 



Please help




________________________________
 
From: Bejoy Ks [bejoy_ks@yahoo.com]
Sent: Thursday, July 05, 2012 3:14 PM
To: user@hive.apache.org
Subject: Re: Hive uploading


Hi Yogesh

No issues seen on the first look. Can you run the sqoop import with the --verbose option and post the console dump?

Do you have multiple Hive installations? If so, please verify that you are using the same Hive both for the Sqoop import and for verifying the data with the Hive CLI (the Hive installation at HADOOP/hive).
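For example (just a suggestion, assuming the stock configuration), running the following from both terminals and comparing the output is one way to check:

which hive
echo $HIVE_HOME
hive -e 'set hive.metastore.warehouse.dir; set javax.jdo.option.ConnectionURL;'

Note that if Hive is using the default embedded Derby metastore, the metastore_db directory is created relative to the current working directory, so launching the Hive CLI from a different directory than the one the Sqoop import ran from can show a different (empty) list of tables.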

Regards
Bejoy KS





RE: Hive uploading

Posted by yo...@wipro.com.
Hi Bejoy,

I have created a new table called troy, and for Hive it is troyhive, as it was showing that the output directory already exists


sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table troy --hive-table troyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive --verbose
12/07/05 17:57:16 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
12/07/05 17:57:16 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/07/05 17:57:16 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/07/05 17:57:16 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
12/07/05 17:57:16 INFO tool.CodeGenTool: Beginning code generation
12/07/05 17:57:17 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `troy` AS t LIMIT 1
12/07/05 17:57:17 INFO orm.CompilationManager: HADOOP_HOME is /HADOOP/hadoop-0.20.2/bin/..
12/07/05 17:57:17 INFO orm.CompilationManager: Found hadoop core jar at: /HADOOP/hadoop-0.20.2/bin/../hadoop-0.20.2-core.jar
Note: /tmp/sqoop-mediaadmin/compile/26f5861253b910681eade0bd0e84efb5/troy.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/07/05 17:57:17 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-mediaadmin/compile/26f5861253b910681eade0bd0e84efb5/troy.jar
12/07/05 17:57:17 WARN manager.MySQLManager: It looks like you are importing from mysql.
12/07/05 17:57:17 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
12/07/05 17:57:17 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
12/07/05 17:57:17 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
12/07/05 17:57:17 INFO mapreduce.ImportJobBase: Beginning import of troy
12/07/05 17:57:18 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(`num`), MAX(`num`) FROM `troy`
12/07/05 17:57:18 INFO mapred.JobClient: Running job: job_201207051104_0005
12/07/05 17:57:19 INFO mapred.JobClient:  map 0% reduce 0%
12/07/05 17:57:30 INFO mapred.JobClient:  map 100% reduce 0%
12/07/05 17:57:32 INFO mapred.JobClient: Job complete: job_201207051104_0005
12/07/05 17:57:32 INFO mapred.JobClient: Counters: 5
12/07/05 17:57:32 INFO mapred.JobClient:   Job Counters
12/07/05 17:57:32 INFO mapred.JobClient:     Launched map tasks=1
12/07/05 17:57:32 INFO mapred.JobClient:   FileSystemCounters
12/07/05 17:57:32 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=8
12/07/05 17:57:32 INFO mapred.JobClient:   Map-Reduce Framework
12/07/05 17:57:32 INFO mapred.JobClient:     Map input records=1
12/07/05 17:57:32 INFO mapred.JobClient:     Spilled Records=0
12/07/05 17:57:32 INFO mapred.JobClient:     Map output records=1
12/07/05 17:57:32 INFO mapreduce.ImportJobBase: Transferred 8 bytes in 14.6895 seconds (0.5446 bytes/sec)
12/07/05 17:57:32 INFO mapreduce.ImportJobBase: Retrieved 1 records.
12/07/05 17:57:32 INFO hive.HiveImport: Removing temporary files from import process: troy/_logs
12/07/05 17:57:32 INFO hive.HiveImport: Loading uploaded data into Hive
12/07/05 17:57:32 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `troy` AS t LIMIT 1
12/07/05 17:57:34 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/HADOOP/hive/lib/hive-common-0.8.1.jar!/hive-log4j.properties
12/07/05 17:57:34 INFO hive.HiveImport: Hive history file=/tmp/mediaadmin/hive_job_log_mediaadmin_201207051757_1184599996.txt
12/07/05 17:57:39 INFO hive.HiveImport: OK
12/07/05 17:57:39 INFO hive.HiveImport: Time taken: 4.249 seconds
12/07/05 17:57:39 INFO hive.HiveImport: Loading data to table default.troyhive
12/07/05 17:57:39 INFO hive.HiveImport: OK
12/07/05 17:57:39 INFO hive.HiveImport: Time taken: 0.257 seconds
12/07/05 17:57:39 INFO hive.HiveImport: Hive import complete.

Regards
Yogesh Kumar
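
A note on the symptom above: the import log ends with "Hive import complete", yet SHOW TABLES from the interactive CLI comes back empty. One possible explanation (an assumption, not something confirmed in this thread) is that Hive is running with the default embedded Derby metastore, which creates a metastore_db directory under whatever working directory Hive is started from; in that case the Hive session launched by Sqoop and the interactive Hive CLI can end up reading different metastores. A minimal check, with the working directory left as a placeholder:

    # Start the Hive CLI from the same directory where the sqoop import was run
    cd /path/where/sqoop/was/run        # placeholder - substitute the real directory
    hive -e 'SHOW TABLES;'

    # Look for stray embedded-metastore directories left behind by earlier runs
    find ~ -maxdepth 3 -type d -name metastore_db 2>/dev/null

If the table shows up from that directory, pointing both tools at one shared metastore (via javax.jdo.option.ConnectionURL in hive-site.xml) avoids the mismatch.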


Re: Hive uploading

Posted by Bejoy Ks <be...@yahoo.com>.
Hi Yogesh

The --verbose option won't make any difference to the operation itself, but it gives more logging information on the console, which can be helpful when searching for hints.

So please post your console dump/log along with the sqoop import command, with verbose enabled.

Regards
Bejoy KS
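
For capturing the console dump being asked for here, one option (a sketch; the log file name is arbitrary) is to rerun the import with --verbose and tee the output, since Sqoop writes its log lines to stderr:

    sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 \
        --password SQOOP1 --table Dummy --hive-table dummyhive --create-hive-table \
        --hive-import --hive-home HADOOP/hive --verbose 2>&1 | tee sqoop-import.log
    # As the warning in the log notes, -P (prompt for the password) is safer
    # than passing --password on the command line.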




RE: Hive uploading

Posted by yo...@wipro.com.
Hello Bejoy,

sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive --verbose

Still the same; no table has been created. I am not able to see the dummyhive table in Hive when I run the command
Show Tables ;

although the dummyhive table directory was created in HDFS at:
/user/hive/warehouse/dummyhive


Please suggest
Yogesh Kumar
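
Two quick checks that may help separate "the files exist in HDFS" from "the table is registered in the metastore the CLI is using" (a sketch; the path is the one quoted above, and the troy log earlier in the thread shows the load going to the default database):

    # Does the warehouse directory actually hold the imported file?
    hadoop fs -ls /user/hive/warehouse/dummyhive

    # Is the table visible in the default database from the CLI?
    hive -e 'USE default; SHOW TABLES;'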


Re: Hive uploading

Posted by Bejoy Ks <be...@yahoo.com>.
Hi Yogesh

Please try out this command


 sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive --create-hive-table --hive-import --hive-home HADOOP/hive --verbose


Regards
Bejoy KS





RE: Hive uploading

Posted by yo...@wipro.com.
Hi Bejoy

I have confirmed the Hive installation is the same for both.
I ran echo $HIVE_HOME in both the Sqoop terminal and the Hive terminal;
both return the same path:
HADOOP/hive

I am new to Hive and Sqoop; would you please give an example using the --verbose option with this command?


 sqoop import --connect jdbc:mysql://localhost:3306/Demo --username sqoop1 --password SQOOP1 -table Dummy --hive-table dummyhive  --create-hive-table  --hive-import --hive-home HADOOP/hive



Please help
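
Since $HIVE_HOME matches in both terminals, the next thing worth comparing (a suggestion, not something established in this thread) is the metastore configuration itself; if $HIVE_HOME/conf/hive-site.xml is absent, Hive falls back to an embedded Derby metastore relative to the current working directory:

    echo $HIVE_HOME
    ls $HIVE_HOME/conf/hive-site.xml
    grep -A 1 'javax.jdo.option.ConnectionURL' $HIVE_HOME/conf/hive-site.xml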



Re: Hive uploading

Posted by Bejoy Ks <be...@yahoo.com>.
Hi Yogesh

No issues seen at first glance. Can you run the sqoop import with the --verbose option and post the console dump?

Do you have multiple Hive installations? If so, please verify that you are using the same Hive both for the Sqoop import and for verifying the data with the Hive CLI (the Hive installation at HADOOP/hive).

Regards
Bejoy KS
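
A quick way to check whether more than one installation is being picked up from the PATH (a minimal sketch):

    which hive
    which sqoop
    echo $HIVE_HOME
    echo $HADOOP_HOME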


