Posted to dev@sqoop.apache.org by "Zoltan Fedor (JIRA)" <ji...@apache.org> on 2015/10/02 23:44:26 UTC

[jira] [Comment Edited] (SQOOP-2594) Sqoop export from Avro to Oracle fails with Integer cannot be cast to BigDecimal

    [ https://issues.apache.org/jira/browse/SQOOP-2594?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14941818#comment-14941818 ] 

Zoltan Fedor edited comment on SQOOP-2594 at 10/2/15 9:43 PM:
--------------------------------------------------------------

I would not call it a resolution, but I have a workaround for this issue: reading all numbers from Avro as doubles.

1. In Hadoop, store all numerical data in Avro as "double" rather than "integer" (i.e. store integer values as doubles)
2. In Sqoop, change /src/java/org/apache/sqoop/manager/oracle/OraOopOutputFormatBase.java at line 434:
FROM:
      if (column.getOracleType()
          == OraOopOracleQueries.getOracleType("NUMBER")) {
        OraOopOracleQueries.setBigDecimalAtName(statement, bindValueName,
            (BigDecimal) bindValue);

TO:
      if (column.getOracleType()
          == OraOopOracleQueries.getOracleType("NUMBER")) {
        OraOopOracleQueries.setBinaryDoubleAtName(statement, bindValueName,
            (Double) bindValue);

3. When running the export, force those columns to be recognized as DOUBLE by adding the following to your sqoop export command (where COUNT is the name of the column):
--map-column-java COUNT=Double
4. In Oracle you can declare these number columns (like COUNT in the above example) with type "INTEGER", as that is really just NUMBER(38,0)
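
Step 1 amounts to declaring the numeric fields as "double" in the Avro schema at write time. A minimal sketch (the record and field names here are illustrative, not from the original report):

    {
      "type": "record",
      "name": "WeatherRecord",
      "fields": [
        {"name": "COUNT", "type": "double"}
      ]
    }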

Now you can export numerical data stored as Avro in Hadoop into Oracle.
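
The root cause can be reproduced outside Sqoop. This is not the actual Sqoop code path, just a minimal sketch of the failing cast and of the Number-widening that the Double mapping relies on:

    import java.math.BigDecimal;

    public class CastDemo {
        public static void main(String[] args) {
            // An Avro "int" field arrives in the record as java.lang.Integer.
            Object bindValue = Integer.valueOf(42);

            // The original code path casts straight to BigDecimal, which
            // fails at runtime because Integer is not a BigDecimal subclass.
            try {
                BigDecimal bd = (BigDecimal) bindValue;
                System.out.println("cast succeeded: " + bd);
            } catch (ClassCastException e) {
                System.out.println("cast failed, as in SQOOP-2594");
            }

            // With --map-column-java COUNT=Double the value is handled as a
            // Double instead; any Number widens to double without a cast error.
            double asDouble = ((Number) bindValue).doubleValue();
            System.out.println("as double: " + asDouble);
        }
    }

Running this prints the "cast failed" branch followed by "as double: 42.0", mirroring the ClassCastException in the stack trace below.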





> Sqoop export from Avro to Oracle fails with Integer cannot be cast to BigDecimal
> --------------------------------------------------------------------------------
>
>                 Key: SQOOP-2594
>                 URL: https://issues.apache.org/jira/browse/SQOOP-2594
>             Project: Sqoop
>          Issue Type: Bug
>    Affects Versions: 1.4.5, 1.4.7
>            Reporter: Zoltan Fedor
>
> Trying to move an avro-backed Hive table to Oracle using sqoop export.
> The table has some int and bigint columns.
> Exporting into Oracle fails with:
> 15/09/29 19:54:02 INFO mapreduce.Job: Task Id : attempt_1436064325770_499944_m_000000_0, Status : FAILED
> Error: java.lang.ClassCastException: java.lang.Integer cannot be cast to java.math.BigDecimal
>         at WEATHER.setField(WEATHER.java:344)
>         at org.apache.sqoop.mapreduce.AvroExportMapper.toSqoopRecord(AvroExportMapper.java:120)
>         at org.apache.sqoop.mapreduce.AvroExportMapper.map(AvroExportMapper.java:104)
>         at org.apache.sqoop.mapreduce.AvroExportMapper.map(AvroExportMapper.java:49)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
> This seems to be a similar issue to https://issues.apache.org/jira/browse/SQOOP-2564



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)