Posted to common-user@hadoop.apache.org by Arun Pandian <ar...@gmail.com> on 2016/02/02 14:28:24 UTC

Sqoop import Mysql to hive import Partitions

How do I do a Sqoop import from MySQL into a Hive partition? This is my command:



sqoop import --connect jdbc:mysql://localhost/arun  --table account
--username root --password hadoop -m 1  --hive-partition-key "name"
--hive-partition-value "arun" --hive-database company  --create-hive-table
--hive-table account5  --target-dir /user/sqooptest21

The MySQL database is arun and the table is account:

select * from account;
+----+-------+------+------------+---------+
| id | name  | age  | joindate   | namess  |
+----+-------+------+------------+---------+
|  1 | Arun  |   23 | 29-01-2016 | super   |
|  2 | Mani  |   22 | 30-01-2016 | superb  |
|  3 | Sana  |   25 | 20-01-2016 | superbb |
|  4 | Vicky |   24 | 02-02-2016 | supervb |
|  5 | Vis   |   23 | 30-05-2016 | super   |
+----+-------+------+------------+---------+


I need a clear explanation of how to do a Sqoop import from MySQL to Hive.
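
For reference, a minimal sketch of the same command with the --hive-import flag added
(the --hive-* options only take effect when --hive-import is present); the connection
string, credentials and paths are just the ones from the question above:

sqoop import \
  --connect jdbc:mysql://localhost/arun \
  --username root --password hadoop \
  --table account -m 1 \
  --hive-import \
  --hive-database company \
  --create-hive-table \
  --hive-table account5 \
  --hive-partition-key "name" \
  --hive-partition-value "arun" \
  --target-dir /user/sqooptest21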




*Thanks & Regards*

*Arunpandian.l*

*Associate Developer*

Re: Sqoop import Mysql to hive import Partitions

Posted by sivanesan sumtwo <si...@gmail.com>.
sqoop import \
  --connect jdbc:mysql://servername/dbname \
  --username hiveuser -P \
  --query 'select id,name,salary,dept,sol from sample5 WHERE status="Shutting" AND $CONDITIONS' \
  --hive-import \
  --hive-partition-key status \
  --hive-partition-value 'Shutting' \
  --fields-terminated-by "," --lines-terminated-by "\n" \
  --target-dir ts1234 -m 1



--hive-partition-key is the name of the Hive column that the partitions are sharded on.

--hive-partition-value is the string value used as the partition value for the rows imported in this job.
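
Once the import completes, the partition can be checked from the Hive shell. A quick
verification sketch, assuming the Hive table ends up named sample5 (the --hive-table
name is not shown above, so substitute whatever was actually used):

hive> SHOW PARTITIONS sample5;
hive> SELECT * FROM sample5 WHERE status = 'Shutting' LIMIT 5;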



*Thanks & Regards*
L.Sivanesan.





RE: Sqoop import Mysql to hive import Partitions

Posted by Chinnappan Chandrasekaran <ch...@jos.com.sg>.
sqoop import --table tablename \
  --connect jdbc:mysql://servername/dbname \
  --username dbuser --password pw \
  --fields-terminated-by "\t" \
  --hive-partition-key keyname \
  --hive-partition-value value \
  --hive-import


--hive-partition-key is the name of the Hive column that the partitions are sharded on.
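
As a rough way to see where the imported rows land: a Hive import with a partition
key/value writes into a key=value subdirectory under the table's warehouse directory.
Assuming the default warehouse location /user/hive/warehouse and a table in database
dbname, something like:

hdfs dfs -ls /user/hive/warehouse/dbname.db/tablename/keyname=value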


Thanks & Regards
Chandrasekaran
Technical Consultant
Business Solutions Group

Jardine OneSolution (2001) Pte Ltd
Tel +65 6551 9608 | Mobile +65 8138 4761 | Email chiranchandra@jos.com.sg
55, Ubi Avenue 1 #03-15,Singapore 408935



Re: Sqoop import Mysql to hive import Partitions

Posted by Ted Yu <yu...@gmail.com>.
The question is better suited for the Sqoop mailing list.

Meanwhile, you can search for past threads / JIRAs on this subject. 
e.g.
http://search-hadoop.com/m/C63T91ojE6B14UByZ&subj=sqoop+1+4+4+hive+import+error

