Posted to dev@sqoop.apache.org by "Markus Kemper (JIRA)" <ji...@apache.org> on 2016/12/27 18:50:58 UTC

[jira] [Commented] (SQOOP-3094) Add (import + --as-avrodatafile) with Oracle BINARY_DOUBLE

    [ https://issues.apache.org/jira/browse/SQOOP-3094?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15781030#comment-15781030 ] 

Markus Kemper commented on SQOOP-3094:
--------------------------------------

Linking SQOOP-3095 and SQOOP-3089

> Add (import + --as-avrodatafile) with Oracle BINARY_DOUBLE
> ----------------------------------------------------------
>
>                 Key: SQOOP-3094
>                 URL: https://issues.apache.org/jira/browse/SQOOP-3094
>             Project: Sqoop
>          Issue Type: Improvement
>            Reporter: Markus Kemper
>
> Some users find it difficult to apply the Sqoop --map-column-java option (Example 1 below).
> The ask here is for Sqoop to support the Oracle BINARY_DOUBLE type natively, without requiring --map-column-java (see Example 2 below).
> *Example 1*
> {noformat}
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1 --target-dir /user/user1/t1 --delete-target-dir --verbose --num-mappers 1 --as-avrodatafile --map-column-java C2=String
> hdfs dfs -ls /user/user1/t1/*.avro
> avro-tools tojson --pretty 'hdfs://namenode.cloudera.com/user/user1/t1/part-m-00000.avro'
> Output:
> 16/12/27 10:28:03 INFO mapreduce.ImportJobBase: Transferred 320 bytes in 39.1563 seconds (8.1724 bytes/sec)
> 16/12/27 10:28:03 INFO mapreduce.ImportJobBase: Retrieved 1 records.
> ---
> -rw-r--r--   3 user1 user1        320 2016-12-27 10:28 /user/user1/t1/part-m-00000.avro
> ---
> {
>   "C1" : {
>     "string" : "1"
>   },
>   "C2" : {
>     "string" : "1.1"
>   }
> }
> {noformat}
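For context on what the workaround above trades away: Oracle's BINARY_DOUBLE is a 64-bit IEEE 754 double, but --map-column-java C2=String makes Sqoop emit the value as the Avro string "1.1" (as the avro-tools output shows), so consumers must re-parse text. A native mapping would emit an Avro "double", i.e. the raw 8-byte IEEE 754 encoding. A minimal sketch of the difference (illustrative only, not Sqoop code):

```python
import struct

# What the --map-column-java C2=String workaround stores: UTF-8 text,
# numeric type information lost.
as_string = "1.1".encode("utf-8")

# What a native BINARY_DOUBLE -> Avro "double" mapping would store:
# the 8-byte big-endian IEEE 754 encoding of the value.
as_double = struct.pack(">d", 1.1)

print(len(as_string))                      # 3 bytes of text
print(len(as_double))                      # 8 bytes, exact double
print(struct.unpack(">d", as_double)[0])   # round-trips to 1.1
```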
> *Example 2*
> {noformat}
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1 --target-dir /user/user1/t1 --delete-target-dir --verbose --num-mappers 1 --as-parquetfile 
> Output:
> 16/12/27 10:05:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM T1 t WHERE 1=0
> 16/12/27 10:05:43 DEBUG manager.SqlManager: Found column C1 of type [2, 38, 0]
> 16/12/27 10:05:43 DEBUG manager.SqlManager: Found column C2 of type [101, 0, 0]
> 16/12/27 10:05:43 DEBUG util.ClassLoaderStack: Restoring classloader: java.net.FactoryURLClassLoader@55465b1f
> 16/12/27 10:05:43 ERROR tool.ImportTool: Imported Failed: Cannot convert SQL type 101
> sqoop import --connect $MYCONN --username $MYUSER --password $MYPSWD --table T1 --target-dir /user/user1/t1 --delete-target-dir --verbose --num-mappers 2 --as-parquetfile  --direct
> Output:
> 16/12/27 10:08:50 DEBUG oracle.OraOopUtilities: The Oracle table context has been derived from:
> 	oracleConnectionUserName = sqoop
> 	tableStr = T1
> 	as:
> 	owner : SQOOP
> 	table : T1
> 16/12/27 10:08:50 INFO oracle.OraOopManagerFactory: 
> **************************************************
> *** Using Data Connector for Oracle and Hadoop ***
> **************************************************
> <SNIP>
> 16/12/27 10:08:53 INFO manager.SqlManager: Executing SQL statement: SELECT C1,C2 FROM T1 WHERE 1=0
> 16/12/27 10:08:53 DEBUG manager.SqlManager: Found column C1
> 16/12/27 10:08:53 DEBUG manager.SqlManager: Found column C2
> 16/12/27 10:08:53 DEBUG util.ClassLoaderStack: Restoring classloader: java.net.FactoryURLClassLoader@7e087bf5
> 16/12/27 10:08:53 ERROR tool.ImportTool: Imported Failed: Cannot convert SQL type 101
> {noformat}
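Why both runs fail with "Cannot convert SQL type 101": Sqoop resolves each column's JDBC type code against a table keyed on the standard java.sql.Types constants, and 101 is Oracle's vendor-specific code for BINARY_DOUBLE (oracle.jdbc.OracleTypes.BINARY_DOUBLE), which has no entry. A hypothetical sketch of that failure mode (the mapping below is illustrative, not Sqoop's actual source):

```python
# Illustrative subset of a JDBC-type-code -> Avro-type lookup, using
# real java.sql.Types constant values (NUMERIC=2, DOUBLE=8, VARCHAR=12).
JDBC_TO_AVRO = {
    2: "bytes",    # NUMERIC
    8: "double",   # DOUBLE
    12: "string",  # VARCHAR
}

ORACLE_BINARY_DOUBLE = 101  # vendor-specific code, absent from java.sql.Types

def to_avro(jdbc_code):
    # Unknown vendor codes fall through the lookup, mirroring the
    # "Cannot convert SQL type 101" error in the logs above.
    try:
        return JDBC_TO_AVRO[jdbc_code]
    except KeyError:
        raise ValueError(f"Cannot convert SQL type {jdbc_code}")

print(to_avro(8))   # the standard DOUBLE code maps cleanly
# to_avro(ORACLE_BINARY_DOUBLE) would raise:
#   ValueError: Cannot convert SQL type 101
```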



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)