Posted to dev@sqoop.apache.org by Eric Lin <er...@cloudera.com> on 2017/04/15 03:23:44 UTC

Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/
-----------------------------------------------------------

Review request for Sqoop, Attila Szabo and Szabolcs Vasas.


Bugs: SQOOP-3158
    https://issues.apache.org/jira/browse/SQOOP-3158


Repository: sqoop-trunk


Description
-------

Until yesterday I had a table in MySQL with two columns, id and name:
1,Raj
2,Jack

I imported this data into HDFS as a file yesterday. Today a new column called salary was added to the MySQL table, so the table now looks like this:

1,Raj
2,Jack
3,Jill,2000
4,Nick,3000

I then ran an incremental import on this table as a file.

The part-m-00000 file contains:
1,Raj
2,Jack

The part-m-00001 file contains:
3,Jill,2000
4,Nick,3000

I then created a new table in MySQL with the same schema as the original table (columns id, name and salary).

Sqoop export fails with the following error:

java.lang.RuntimeException: Can't parse input data: 'Raj'
        at SQOOP_3158.__loadFromFields(SQOOP_3158.java:316)
        at SQOOP_3158.parse(SQOOP_3158.java:254)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:89)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.util.NoSuchElementException
        at java.util.ArrayList$Itr.next(ArrayList.java:854)
        at SQOOP_3158.__loadFromFields(SQOOP_3158.java:311)
        ... 12 more
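
The stack trace comes down to the generated record class calling the field iterator once per database column, with no guard for records that are shorter than the table schema. Below is a minimal sketch of that pre-patch behavior; the class and method names are invented for illustration and are not Sqoop's actual generated code.

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;

// Minimal model of the unpatched generated __loadFromFields logic:
// one unguarded it.next() call per target column.
public class ShortRecordSketch {

    // Returns true if reading 'columns' fields from 'record' throws,
    // i.e. the record has fewer fields than the target table has columns.
    public static boolean throwsOnShortRecord(List<String> record, int columns) {
        Iterator<String> it = record.iterator();
        try {
            for (int i = 0; i < columns; i++) {
                it.next(); // no hasNext() guard, as in the unpatched generated code
            }
            return false;
        } catch (NoSuchElementException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        // "1,Raj" was imported before the salary column existed: 2 fields, 3 columns.
        System.out.println(throwsOnShortRecord(Arrays.asList("1", "Raj"), 3));
    }
}
```

This is why only the rows imported before the schema change fail: records that carry all three fields parse cleanly.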


Diffs
-----

  src/java/org/apache/sqoop/orm/ClassWriter.java eaa9123 


Diff: https://reviews.apache.org/r/58466/diff/1/


Testing
-------

There is no existing test class covering this path, and I am not sure of the best way to add a test case for it. If you have any suggestions, please let me know.

I have manually replicated the issue and confirmed that the patch fixes it. I have also tried different data types, all of which work.

However, if a column in MySQL is defined as NOT NULL, the export will still fail with an error; this is expected.


Thanks,

Eric Lin


Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Eric Lin <er...@cloudera.com>.

> On May 26, 2017, 9:54 a.m., Szabolcs Vasas wrote:
> > Hi Eric!
> > 
> > Thank you for improving the patch; however, it turned out that there is one more edge case that needs to be covered.
> > I have realized that there are two options for specifying the representation of null: org.apache.sqoop.SqoopOptions#getInNullStringValue (used for columns with a String data type) and org.apache.sqoop.SqoopOptions#getInNullNonStringValue (used for all other data types). At this point your patch uses only the first option, so the first lines of the __loadFromFields0 method of the generated class look like this:
> > 
> > if (__it.hasNext()) {
> >     __cur_str = __it.next();
> > } else {
> >     __cur_str = "NNUULL";
> > }
> > if (__cur_str.equals("null") || __cur_str.length() == 0) {
> >     this.ID = null;
> > } else {
> >     this.ID = Integer.valueOf(__cur_str);
> > }
> > 
> > Since the ID column is of type Integer, Sqoop should use org.apache.sqoop.SqoopOptions#getInNullNonStringValue to generate the first if statement.
> > 
> > Sorry for the late notice but I have just realized that there is this other option too.
> > 
> > Apart from this, I have run all the unit tests and it seems that a couple of test cases are failing in org.apache.sqoop.TestExportUsingProcedure. TestExportUsingProcedure is a subclass of TestExport and thus inherits the new test cases you added, but they fail for some reason in that class. Can you please take a look at that too?
> > 
> > 
> > Thank you for your efforts!
> > 
> > Regards,
> > Szabolcs

Hi Szabolcs, 

Thanks for your feedback and suggestions. I have made the modifications accordingly and fixed the failing test cases.

Thanks again.


- Eric


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/#review176188
-----------------------------------------------------------




Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Szabolcs Vasas <va...@gmail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/#review176188
-----------------------------------------------------------



Hi Eric!

Thank you for improving the patch; however, it turned out that there is one more edge case that needs to be covered.
I have realized that there are two options for specifying the representation of null: org.apache.sqoop.SqoopOptions#getInNullStringValue (used for columns with a String data type) and org.apache.sqoop.SqoopOptions#getInNullNonStringValue (used for all other data types). At this point your patch uses only the first option, so the first lines of the __loadFromFields0 method of the generated class look like this:

if (__it.hasNext()) {
    __cur_str = __it.next();
} else {
    __cur_str = "NNUULL";
}
if (__cur_str.equals("null") || __cur_str.length() == 0) {
    this.ID = null;
} else {
    this.ID = Integer.valueOf(__cur_str);
}

Since the ID column is of type Integer, Sqoop should use org.apache.sqoop.SqoopOptions#getInNullNonStringValue to generate the first if statement.
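
The fallback described above can be sketched as follows. This is a hypothetical illustration, not the actual patch: the helper names are invented, the real logic lives in the code Sqoop's ClassWriter generates, and "null" is assumed here as the default non-string null token.

```java
import java.util.Collections;
import java.util.Iterator;

// Hypothetical sketch of the corrected fallback: when the record runs out of
// fields, substitute the configured null representation, choosing the
// non-string variant for non-string columns (mirrors the
// getInNullStringValue / getInNullNonStringValue split in SqoopOptions).
public class NullFallbackSketch {

    public static String nextFieldOrNullToken(Iterator<String> it,
                                              boolean isStringColumn,
                                              String inNullString,
                                              String inNullNonString) {
        if (it.hasNext()) {
            return it.next();
        }
        // Missing trailing field: fall back to the matching null token.
        return isStringColumn ? inNullString : inNullNonString;
    }

    public static Integer parseIntColumn(String cur, String inNullNonString) {
        // A field equal to the non-string null token (or empty) maps to SQL NULL.
        if (cur.equals(inNullNonString) || cur.isEmpty()) {
            return null;
        }
        return Integer.valueOf(cur);
    }

    public static void main(String[] args) {
        Iterator<String> exhausted = Collections.<String>emptyList().iterator();
        String cur = nextFieldOrNullToken(exhausted, false, "NNUULL", "null");
        // The missing salary/ID value exports as NULL instead of throwing.
        System.out.println(parseIntColumn(cur, "null"));
    }
}
```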

Sorry for the late notice but I have just realized that there is this other option too.

Apart from this, I have run all the unit tests and it seems that a couple of test cases are failing in org.apache.sqoop.TestExportUsingProcedure. TestExportUsingProcedure is a subclass of TestExport and thus inherits the new test cases you added, but they fail for some reason in that class. Can you please take a look at that too?


Thank you for your efforts!

Regards,
Szabolcs

- Szabolcs Vasas




Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Szabolcs Vasas <va...@gmail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/#review176592
-----------------------------------------------------------


Ship it!




Hi Eric,

Thank you for fixing the problems and adding all the test cases!

Regards,
Szabolcs

- Szabolcs Vasas




Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Anna Szonyi <sz...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/#review177351
-----------------------------------------------------------


Ship it!




Ship It!

- Anna Szonyi




Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Eric Lin <er...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/
-----------------------------------------------------------

(Updated May 30, 2017, 9:36 a.m.)


Review request for Sqoop, Attila Szabo and Szabolcs Vasas.


Changes
-------

Latest update based on Szabolcs' feedback.


Bugs: SQOOP-3158
    https://issues.apache.org/jira/browse/SQOOP-3158


Repository: sqoop-trunk




Diffs (updated)
-----

  src/java/org/apache/sqoop/orm/ClassWriter.java eaa9123 
  src/test/com/cloudera/sqoop/TestExport.java b2edc53 
  src/test/org/apache/sqoop/TestExportUsingProcedure.java 519305c 


Diff: https://reviews.apache.org/r/58466/diff/5/

Changes: https://reviews.apache.org/r/58466/diff/4-5/




Thanks,

Eric Lin


Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Eric Lin <er...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/
-----------------------------------------------------------

(Updated May 21, 2017, 11:12 a.m.)


Review request for Sqoop, Attila Szabo and Szabolcs Vasas.


Changes
-------

Updated the hard-coded "null" value to a this.options.getInNullStringValue() call, and added a new test case.


Bugs: SQOOP-3158
    https://issues.apache.org/jira/browse/SQOOP-3158


Repository: sqoop-trunk




Diffs (updated)
-----

  src/java/org/apache/sqoop/orm/ClassWriter.java eaa9123 
  src/test/com/cloudera/sqoop/TestExport.java b2edc53 


Diff: https://reviews.apache.org/r/58466/diff/4/

Changes: https://reviews.apache.org/r/58466/diff/3-4/




Thanks,

Eric Lin


Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Eric Lin <er...@cloudera.com>.

> On May 9, 2017, 3:09 p.m., Szabolcs Vasas wrote:
> > src/java/org/apache/sqoop/orm/ClassWriter.java
> > Lines 1416 (patched)
> > <https://reviews.apache.org/r/58466/diff/3/?file=1709911#file1709911line1416>
> >
> >     I think the hard-coded "null" value will not work if the user
> >     overrides the null string with the --input-null-string Sqoop parameter. You could use this.options.getInNullStringValue() instead of "null", as in the org.apache.sqoop.orm.ClassWriter#parseNullVal method.

Hi Szabolcs,

Thanks for the review and the good catch. I have updated the patch and added a new test case to cover it.


- Eric


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/#review174317
-----------------------------------------------------------




Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Szabolcs Vasas <va...@gmail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/#review174317
-----------------------------------------------------------



Hi Eric!

Thank you for your patch! I have found an edge case you should check and it would be good if you could write a test case for it too.

Regards,
Szabolcs


src/java/org/apache/sqoop/orm/ClassWriter.java
Lines 1416 (patched)
<https://reviews.apache.org/r/58466/#comment247448>

    I think the hard-coded "null" value will not work if the user overrides
    the null string with the --input-null-string Sqoop parameter. You could use this.options.getInNullStringValue() instead of "null", as in the org.apache.sqoop.orm.ClassWriter#parseNullVal method.
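
The comparison this comment points at can be sketched as below; the class and method names are invented for illustration, and only the idea of checking against the user-configurable token (rather than a hard-coded "null") is the point.

```java
// Hypothetical sketch of the comparison performed along the lines of
// org.apache.sqoop.orm.ClassWriter#parseNullVal: a string field is treated
// as SQL NULL when it equals the value supplied via --input-null-string.
public class ParseNullValSketch {
    public static String parseStringColumn(String raw, String inNullString) {
        // Matching the configured null token yields NULL; anything else is data.
        return raw.equals(inNullString) ? null : raw;
    }
}
```

With `--input-null-string '\N'`, for example, a field containing `\N` would export as NULL while the literal string "null" would be kept as data.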


- Szabolcs Vasas


On May 6, 2017, 11:29 a.m., Eric Lin wrote:


Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Eric Lin <er...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/
-----------------------------------------------------------

(Updated May 6, 2017, 11:29 a.m.)


Review request for Sqoop, Attila Szabo and Szabolcs Vasas.


Changes
-------

Updated the test case to use the existing logic for checking NULL values after export.


Bugs: SQOOP-3158
    https://issues.apache.org/jira/browse/SQOOP-3158


Repository: sqoop-trunk


Description
-------

Until yesterday I had a table in MySQL with two columns, id and name:
1,Raj
2,Jack

I imported this data into HDFS yesterday as a file. Today we added a new column called salary to the MySQL table, so it now contains:

1,Raj
2,Jack
3,Jill,2000
4,Nick,3000

I then ran an incremental import on this table into files.

Part-m-00000 contains:
1,Raj
2,Jack

Part-m-00001 contains:
3,Jill,2000
4,Nick,3000

Then I created a new table in MySQL with the same schema as the original table (columns id, name and salary).

Sqoop export fails with the error below:

java.lang.RuntimeException: Can't parse input data: 'Raj'
        at SQOOP_3158.__loadFromFields(SQOOP_3158.java:316)
        at SQOOP_3158.parse(SQOOP_3158.java:254)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:89)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.util.NoSuchElementException
        at java.util.ArrayList$Itr.next(ArrayList.java:854)
        at SQOOP_3158.__loadFromFields(SQOOP_3158.java:311)
        ... 12 more


Diffs (updated)
-----

  src/java/org/apache/sqoop/orm/ClassWriter.java eaa9123 
  src/test/com/cloudera/sqoop/TestExport.java b2edc53 


Diff: https://reviews.apache.org/r/58466/diff/3/

Changes: https://reviews.apache.org/r/58466/diff/2-3/


Testing
-------

There is no existing test class that covers this path, and I am not sure of the best way to add a test case for it. If you have any suggestions, please let me know.

I have done manual testing to replicate the issue and confirmed that the patch fixes it. I have also tried different data types, all of which work.

However, if a column in MySQL is defined as NOT NULL, the export will still fail with an error; this is expected.


Thanks,

Eric Lin


Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Eric Lin <er...@cloudera.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/
-----------------------------------------------------------

(Updated May 6, 2017, 11:01 a.m.)


Review request for Sqoop, Attila Szabo and Szabolcs Vasas.


Changes
-------

Added a test case that fails before the change and passes after it.


Bugs: SQOOP-3158
    https://issues.apache.org/jira/browse/SQOOP-3158


Repository: sqoop-trunk


Description
-------

I have table in MySQL with 2 columns until yesterday. The columns are id and name.
1,Raj
2,Jack

I have imported this data into HDFS yesterday itself as a file. Today we added a new column to the table in MySQL called salary. The table looks like below.

1,Raj
2,Jack
3,Jill,2000
4,Nick,3000

Now I have done Incremental import on this table as a file.

Part-m-00000 file contains
1,Raj
2,Jack

Part-m-00001 file contains
3,Jill,2000
4,Nick,3000

Now I created a new table in MySQL with same schema as Original MySQL table with columns id name and salary.

Sqoop export will fail with below error:

java.lang.RuntimeException: Can't parse input data: 'Raj'
        at SQOOP_3158.__loadFromFields(SQOOP_3158.java:316)
        at SQOOP_3158.parse(SQOOP_3158.java:254)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:89)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.util.NoSuchElementException
        at java.util.ArrayList$Itr.next(ArrayList.java:854)
        at SQOOP_3158.__loadFromFields(SQOOP_3158.java:311)
        ... 12 more


Diffs (updated)
-----

  src/java/org/apache/sqoop/orm/ClassWriter.java eaa9123 
  src/test/com/cloudera/sqoop/TestExport.java b2edc53 


Diff: https://reviews.apache.org/r/58466/diff/2/

Changes: https://reviews.apache.org/r/58466/diff/1-2/




Thanks,

Eric Lin


Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Eric Lin <er...@cloudera.com>.

> On April 20, 2017, 1:21 p.m., Liz Szilagyi wrote:
> > Hi Eric,
> > I understand you couldn't find a related test class to expand, but based on your description the problem could be translated into a new test. I'd suggest placing it with the rest of the MySQL tests (org.apache.sqoop.manager.mysql).
> > Could you please look into writing a test case for the previously failing (now passing) scenario, and another for the expected failure cases?
> > Thanks,
> > Liz

Hi Liz,

Test case added to the latest patch, please review and let me know what you think.

Thanks


- Eric


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/#review172481
-----------------------------------------------------------


On April 15, 2017, 3:23 a.m., Eric Lin wrote:


Re: Review Request 58466: SQOOP-3158 - Columns added to Mysql after initial sqoop import, export back to table with same schema fails

Posted by Liz Szilagyi <er...@gmail.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/58466/#review172481
-----------------------------------------------------------



Hi Eric,
I understand you couldn't find a related test class to expand, but based on your description the problem could be translated into a new test. I'd suggest placing it with the rest of the MySQL tests (org.apache.sqoop.manager.mysql).
Could you please look into writing a test case for the previously failing (now passing) scenario, and another for the expected failure cases?
Thanks,
Liz

- Liz Szilagyi
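One possible shape for the test logic being asked for here, written as plain Java with assertions rather than Sqoop's actual JUnit harness. The record layout and null token come from the bug description; the padding helper is illustrative, not Sqoop API:

```java
import java.util.Arrays;
import java.util.List;

public class ShortRecordExportTest {
    // Pads a CSV record out to the expected column count with the null
    // token, which is the behavior the patch is meant to provide.
    static List<String> padRecord(String line, int columnCount, String nullVal) {
        String[] parts = line.split(",");
        String[] padded = new String[columnCount];
        for (int i = 0; i < columnCount; i++) {
            padded[i] = (i < parts.length) ? parts[i] : nullVal;
        }
        return Arrays.asList(padded);
    }

    public static void main(String[] args) {
        // Previously failing case: record has fewer fields than columns.
        assert padRecord("1,Raj", 3, "null")
            .equals(Arrays.asList("1", "Raj", "null"));
        // Already-working case: record carries every column.
        assert padRecord("3,Jill,2000", 3, "null")
            .equals(Arrays.asList("3", "Jill", "2000"));
        System.out.println("ok");
    }
}
```

The expected-failure case (exporting the padded NULL into a NOT NULL column) would still need a database-backed test, which is why placing it with the MySQL tests makes sense.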


On April 15, 2017, 5:23 a.m., Eric Lin wrote: