Posted to dev@sqoop.apache.org by Szabolcs Vasas <va...@gmail.com> on 2017/06/01 09:43:48 UTC

Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/#review176592
-----------------------------------------------------------


Ship it!




Hi Eric,

Thank you for fixing the problems and adding all the test cases!

Regards,
Szabolcs

- Szabolcs Vasas


On May 30, 2017, 9:36 a.m., Eric Lin wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/58466/
> -----------------------------------------------------------
> 
> (Updated May 30, 2017, 9:36 a.m.)
> 
> 
> Review request for Sqoop, Attila Szabo and Szabolcs Vasas.
> 
> 
> Bugs: SQOOP-3158
>     https://issues.apache.org/jira/browse/SQOOP-3158
> 
> 
> Repository: sqoop-trunk
> 
> 
> Description
> -------
> 
> Until yesterday, I had a table in MySQL with 2 columns: id and name.
> 1,Raj
> 2,Jack
> 
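> For reference, a minimal sketch of such a table (the host, database,
> credentials and paths used throughout these sketches are hypothetical,
> not taken from the original report):
> 
>   # create and populate the original two-column table
>   mysql testdb -e "CREATE TABLE employee (id INT, name VARCHAR(64));
>                    INSERT INTO employee VALUES (1, 'Raj'), (2, 'Jack');"
> 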
> I imported this data into HDFS yesterday as a file. Today we added a new column called salary to the table in MySQL. The table now looks like below (salary is NULL for the pre-existing rows):
> 
> 1,Raj
> 2,Jack
> 3,Jill,2000
> 4,Nick,3000
> 
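> Using the same hypothetical names, the initial import and the schema
> change could be reproduced along these lines:
> 
>   # yesterday: initial import of the two-column table into HDFS
>   sqoop import \
>     --connect jdbc:mysql://mysql-host/testdb \
>     --username sqoop -P \
>     --table employee \
>     --target-dir /user/sqoop/employee \
>     --num-mappers 1
> 
>   # today: add the salary column and insert the new rows
>   mysql testdb -e "ALTER TABLE employee ADD COLUMN salary INT;
>                    INSERT INTO employee VALUES (3, 'Jill', 2000), (4, 'Nick', 3000);"
> 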
> Now I have done an incremental import of this table, again as a file.
> 
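> That incremental import could look like this (same hypothetical names;
> --last-value is the highest id already imported):
> 
>   sqoop import \
>     --connect jdbc:mysql://mysql-host/testdb \
>     --username sqoop -P \
>     --table employee \
>     --target-dir /user/sqoop/employee \
>     --incremental append \
>     --check-column id \
>     --last-value 2 \
>     --num-mappers 1
> 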
> The part-m-00000 file contains
> 1,Raj
> 2,Jack
> 
> The part-m-00001 file contains
> 3,Jill,2000
> 4,Nick,3000
> 
> Now I created a new table in MySQL with the same schema as the original table: columns id, name, and salary.
> 
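> Sketched with the same hypothetical names, the new table and the export:
> 
>   mysql testdb -e "CREATE TABLE employee_copy (id INT, name VARCHAR(64), salary INT)"
> 
>   sqoop export \
>     --connect jdbc:mysql://mysql-host/testdb \
>     --username sqoop -P \
>     --table employee_copy \
>     --export-dir /user/sqoop/employee
> 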
> Sqoop export will fail with the error below, because the generated record class expects three fields per line while the rows from the initial import contain only two:
> 
> java.lang.RuntimeException: Can't parse input data: 'Raj'
>         at SQOOP_3158.__loadFromFields(SQOOP_3158.java:316)
>         at SQOOP_3158.parse(SQOOP_3158.java:254)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:89)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: java.util.NoSuchElementException
>         at java.util.ArrayList$Itr.next(ArrayList.java:854)
>         at SQOOP_3158.__loadFromFields(SQOOP_3158.java:311)
>         ... 12 more
> 
> 
> Diffs
> -----
> 
>   src/java/org/apache/sqoop/orm/ClassWriter.java eaa9123 
>   src/test/com/cloudera/sqoop/TestExport.java b2edc53 
>   src/test/org/apache/sqoop/TestExportUsingProcedure.java 519305c 
> 
> 
> Diff: https://reviews.apache.org/r/58466/diff/5/
> 
> 
> Testing
> -------
> 
> There is no existing test class covering this path and I am not sure of the best way to add a test case for it. If you have any suggestions, please let me know.
> 
> I have done manual testing to replicate the issue and confirmed that the patch fixes it. I have also tried different data types; all work.
> 
> However, if the new column in MySQL is defined as NOT NULL, the export will still fail with an error; this is expected, since the rows from the initial import have no value to supply for that column.
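> 
> For completeness, that NOT NULL case could be sketched as (same
> hypothetical names as above):
> 
>   mysql testdb -e "CREATE TABLE employee_nn (id INT, name VARCHAR(64), salary INT NOT NULL)"
> 
>   # exporting /user/sqoop/employee into employee_nn is still expected to
>   # fail, because the two-field rows provide no salary value
>   sqoop export \
>     --connect jdbc:mysql://mysql-host/testdb \
>     --username sqoop -P \
>     --table employee_nn \
>     --export-dir /user/sqoop/employee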
> 
> 
> Thanks,
> 
> Eric Lin
> 
>