Posted to user@hive.apache.org by Hamza Asad <ha...@gmail.com> on 2013/06/18 13:01:39 UTC

Export hive table format issue

I want to export my table to MySQL, and for that I'm using the sqoop export
command, but in HDFS the data apparently has no field separator. It must
contain one, though; the data is saved in the format shown below:
8119844144724992013-01-29 00:00:00.0141\N\N\N\N\N\N\N\N\N\N8\N\N\N\N\N1\N\N32\N1
How can I export this kind of data to MySQL, and what field separator should I
specify there? Please help.
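
(The invisible separator here turns out, later in the thread, to be Hive's
default field delimiter ^A, octal \001, a non-printing character. One way to
confirm that, sketched with an illustrative file name under the warehouse
directory mentioned later in the thread, is to dump one line with non-printing
characters made visible:

hdfs dfs -cat hive/warehouse/xxxx.db/events_details/000000_0 | head -n 1 | cat -v

cat -v renders the control character visibly as ^A, so the field boundaries
appear.)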

-- 
*Muhammad Hamza Asad*

Re: Export hive table format issue

Posted by Hamza Asad <ha...@gmail.com>.
Thanks a lot, Nitin and all; that's the root cause. The field separator was
the default, i.e. ^A, together with the NULL issue you mentioned. Thanks
again :) Stay blessed
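
For reference, a working version of the export command from this thread would
look roughly like this (a sketch: \001 is the octal escape for ^A, and the
connection string, table, directory, and credentials are the placeholders
used elsewhere in the thread):

sqoop export --connect jdbc:mysql://localhost/xxxx --table dump_hive_events_details \
    --export-dir hive/warehouse/xxxx.db/events_details \
    --input-fields-terminated-by '\001' \
    --input-null-string '\\N' --input-null-non-string '\\N' \
    --username xxxxxxxx --password xxxxxxxxx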


-- 
*Muhammad Hamza Asad*

Re: Export hive table format issue

Posted by Nitin Pawar <ni...@gmail.com>.
Jarek,

Any chance that Hamza is hitting this one: SQOOP-188, Problem with NULL
values in MySQL export <https://issues.cloudera.org/browse/SQOOP-188>?

In that case I would recommend that he use
--input-null-string "\\\\N"  --input-null-non-string "\\\\N"

Hamza, can you try the above options?
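
(The quadrupled backslash is needed because the value is unescaped twice,
once by the shell and once by Sqoop, before it has to match the literal two
characters \N that Hive writes for NULL. A quick illustration of the shell's
half of that:

printf '%s\n' "\\\\N"    # the shell hands Sqoop \\N; Sqoop unescapes that to \N

Single quotes with one doubled backslash, '\\N', end up at the same place.)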



-- 
Nitin Pawar

Re: Export hive table format issue

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Would you mind upgrading Sqoop to version 1.4.3?

We've significantly improved error logging for cases where the input data can't be parsed during export. You should get a state dump (exception, input file, position in the file, entire input line) in the associated map task log.

Jarcec


Re: Export hive table format issue

Posted by "Arafat, Moiz" <mo...@teamaol.com>.
Can you try using a default value, e.g. 0 or 9999999, instead of storing NULL in the numeric columns on the Hive side?
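
(A sketch of that idea on the Hive side, replacing NULLs with 0 while loading
into a staging table; the table and column names here are made up for
illustration:

hive -e "INSERT OVERWRITE TABLE events_details_export SELECT event_id, COALESCE(some_numeric_col, 0) FROM events_details;"

COALESCE substitutes the default for NULL before the data ever reaches Sqoop.)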

Thanks,
Moiz Arafat


Re: Export hive table format issue

Posted by Hamza Asad <ha...@gmail.com>.
Nitin,
       The issue is not with INT or BIGINT (I have verified both); the
exception is the same. The issue is something else. Please help sort out a
solution. The following exception is still being raised (the separator
character in the input string is not visible in the terminal and is
translated to # when copied into the office writer, which is what I pasted
below):
java.lang.NumberFormatException: For input string: " 433649#1#534782#2"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Long.parseLong(Long.java:441)
    at java.lang.Long.valueOf(Long.java:540)
    at dump_hive_events_details.__loadFromFields(dump_hive_events_details.java:949)
    at dump_hive_events_details.parse(dump_hive_events_details.java:901)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
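
(A useful sanity check at this point is whether each stored line splits into
the expected number of columns on ^A; a sketch, with an illustrative file
name under the export directory:

hdfs dfs -cat hive/warehouse/xxxx.db/events_details/000000_0 | awk -F'\001' '{print NF}' | sort | uniq -c

Every line should report the same field count as the table has columns.)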


-- 
*Muhammad Hamza Asad*

Re: Export hive table format issue

Posted by Nitin Pawar <ni...@gmail.com>.
Can you change your MySQL schema to use BIGINT instead of just INT?
For more you can refer to this:
http://stackoverflow.com/questions/16886668/why-sqoop-fails-on-numberformatexception-for-numeric-column-during-the-export-fr
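
(A sketch of that change through the mysql client, with a hypothetical column
name since the real schema was only shared as an attachment:

mysql -u xxxxxxxx -p xxxx -e "ALTER TABLE dump_hive_events_details MODIFY event_id BIGINT;"

MODIFY widens the type in place while keeping the column's data and position.)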


-- 
Nitin Pawar

Re: Export hive table format issue

Posted by Hamza Asad <ha...@gmail.com>.
Attached are the schema files for both the Hive and MySQL tables.


-- 
*Muhammad Hamza Asad*

Re: Export hive table format issue

Posted by Nitin Pawar <ni...@gmail.com>.
For the number format exception, can you share your MySQL schema (as an
attachment, not inline in the mail)? If you created the table with INT,
try switching the column to BIGINT.



-- 
Nitin Pawar

Re: Export hive table format issue

Posted by Hamza Asad <ha...@gmail.com>.
I have copied and pasted the row into the office writer, where I saw it is
# separated.
Yeah, the \N values represent NULL.
The version of sqoop is:
Sqoop 1.4.2
git commit id
Compiled by ag on Tue Aug 14 17:37:19 IST 2012


-- 
*Muhammad Hamza Asad*

Re: Export hive table format issue

Posted by Nitin Pawar <ni...@gmail.com>.
is "#" your field separator?
also the separator is normally an octal representation so you can give it a
try.

why does your columns have \N as values? is it for NULL ?

what version of sqoop are you using?
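
(For example, Hive's default ^A delimiter can be passed in its octal form; a
sketch of the relevant option:

--input-fields-terminated-by '\001'

rather than a printable stand-in like '#' that may not match what is actually
in the files.)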


-- 
Nitin Pawar

Re: Export hive table format issue

Posted by Hamza Asad <ha...@gmail.com>.
I'm executing the following command:
sqoop export --connect jdbc:mysql://localhost/xxxx --table dump_hive_events_details --export-dir hive/warehouse/xxxx.db/events_details --input-null-non-string \N --input-fields-terminated-by '#' --username xxxxxxxx --password xxxxxxxxx

13/06/18 16:26:44 INFO mapred.JobClient: Task Id : attempt_201306170658_0106_m_000001_0, Status : FAILED
java.lang.NumberFormatException: For input string: "8119844144724992013-01-29 00:00:00.0141\N\N\N\N\N\N\N\N\N\N8\N\N\N\N\N1\N\N32\N1"
    at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
    at java.lang.Integer.parseInt(Integer.java:492)
    at java.lang.Integer.valueOf(Integer.java:582)
    at dump_hive_events_details.__loadFromFields(dump_hive_events_details.java:949)
    at dump_hive_events_details.parse(dump_hive_events_details.java:901)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
    at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)


-- 
*Muhammad Hamza Asad*

Re: Export hive table format issue

Posted by Nitin Pawar <ni...@gmail.com>.
Check the option --input-fields-terminated-by in sqoop export.


-- 
Nitin Pawar