Posted to dev@sqoop.apache.org by "Suresh (JIRA)" <ji...@apache.org> on 2015/09/09 21:39:45 UTC
[jira] [Commented] (SQOOP-2561) Special character removal from column name as Avro data results in duplicate column and fails the import
[ https://issues.apache.org/jira/browse/SQOOP-2561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14737458#comment-14737458 ]
Suresh commented on SQOOP-2561:
-------------------------------
This is the error I am getting:
15/09/07 10:52:47 ERROR sqoop.Sqoop: Got exception running Sqoop: org.apache.avro.AvroRuntimeException: Duplicate field CODIGO in record QueryResult: CODIGO type:UNION pos:15 and CODIGO type:UNION pos:13.
org.apache.avro.AvroRuntimeException: Duplicate field CODIGO in record QueryResult: CODIGO type:UNION pos:15 and CODIGO type:UNION pos:13.
at org.apache.avro.Schema$RecordSchema.setFields(Schema.java:636)
at org.apache.sqoop.orm.AvroSchemaGenerator.generate(AvroSchemaGenerator.java:91)
at org.apache.sqoop.mapreduce.DataDrivenImportJob.generateAvroSchema(DataDrivenImportJob.java:132)
at org.apache.sqoop.mapreduce.DataDrivenImportJob.configureMapper(DataDrivenImportJob.java:90)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:262)
at org.apache.sqoop.manager.SqlManager.importQuery(SqlManager.java:721)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:499)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:207)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:175)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:227)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
This happens where I have columns named CODIGO$ and COD$IGO.
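The collision can be sketched as follows. `sanitize` here is a hypothetical stand-in for Sqoop's Avro identifier cleanup (it strips any character outside [A-Za-z0-9_]), not Sqoop's actual code; it just shows why CODIGO$ and COD$IGO end up as the same Avro field name:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class AvroNameSanitizer {

    // Hypothetical sketch of special-character removal: keep only
    // characters that are valid in an Avro field name.
    static String sanitize(String column) {
        return column.replaceAll("[^A-Za-z0-9_]", "");
    }

    public static void main(String[] args) {
        List<String> columns = Arrays.asList("CODIGO$", "COD$IGO");
        // Map from sanitized name back to the original column that claimed it.
        Map<String, String> seen = new LinkedHashMap<>();
        for (String col : columns) {
            String clean = sanitize(col);
            if (seen.containsKey(clean)) {
                // This is the collision that makes Avro's
                // Schema$RecordSchema.setFields throw
                // "Duplicate field ... in record".
                System.out.println("Duplicate field " + clean
                        + " (from " + seen.get(clean) + " and " + col + ")");
            } else {
                seen.put(clean, col);
            }
        }
    }
}
```

Running this prints a duplicate-field warning for CODIGO, mirroring the AvroRuntimeException above; a fix would need to disambiguate the sanitized names (e.g. by appending a suffix) instead of silently merging them.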
> Special Character removal from Column name as avro data results in duplicate column and fails the import
> --------------------------------------------------------------------------------------------------------
>
> Key: SQOOP-2561
> URL: https://issues.apache.org/jira/browse/SQOOP-2561
> Project: Sqoop
> Issue Type: Bug
> Components: connectors
> Affects Versions: 1.4.5
> Environment: cdh5.3.2
> Reporter: Suresh
> Labels: AVRO, SQOOP
> Fix For: 1.4.5
>
>
> When a special character such as '$' or '#' is present in a column name, sqoop/avro removes that character. In some cases this leads to duplicate column names.
> e.g. If the schema has COL$1 and COL1$, the special character is stripped from both, producing the duplicate column name COL1, and the Sqoop import job fails when importing as Avro data. The same table can be loaded without the --as-avrodatafile flag.
> A similar issue was raised in https://issues.apache.org/jira/browse/SQOOP-1361, which I believe is fixed; that fix appears to be creating this new issue.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)