Posted to user@sqoop.apache.org by Rajasekhar <ra...@gmail.com> on 2014/09/17 00:13:17 UTC
Importing BLOB
Hi,
Has anybody tried importing BLOB data from Oracle to Hive using Sqoop?
I have tried the bulkload and map-column-java (String) options, which didn't help.
I am able to import CLOB data successfully.
Please let me know if there is any way to load BLOB data.
Appreciate your help.
Thanks
Raja
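One commonly suggested approach is mapping the BLOB column to a Java String so Sqoop serializes it as text. The command below is only a sketch: the connection string, credentials, table, and column names (dbhost, scott, SCOTT.BLOB_TABLE, BLOB_COL) are hypothetical placeholders, not taken from this thread, and actually running it assumes a reachable Oracle database and Hadoop cluster.

```shell
# Hypothetical sketch: import an Oracle table whose BLOB_COL column is a BLOB,
# mapping it to a Java String (and a Hive STRING) so it lands as text.
# All host/user/table/column names here are placeholders.
SQOOP_IMPORT_CMD="sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table SCOTT.BLOB_TABLE \
  --hive-import --create-hive-table \
  --map-column-java BLOB_COL=String \
  --map-column-hive BLOB_COL=STRING \
  -m 1"
# Printed rather than executed, since running it needs a live cluster.
echo "$SQOOP_IMPORT_CMD"
```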
Re: Importing BLOB
Posted by Abraham Elmahrek <ab...@cloudera.com>.
What Jira are you referring to?
Also, is there a larger stack trace for the TextExportMapper exception?
-Abe
On Tue, Sep 16, 2014 at 5:00 PM, Raja Sekhar <ra...@gmail.com>
wrote:
> Hi,
>
> I was able to proceed with the --map-column-hive option, but now I am
> seeing other issues.
>
> my import command:
> sqoop import --connect jdbc:oracle:thin:@<DB> --username <username> -P
> --table C0.BLOB1 --hive-import --create-hive-table
> --hive-drop-import-delims --map-column-hive ENGINE=String,
> --fields-terminated-by '|' --verbose -m 1 --hive-overwrite
> --lines-terminated-by '\n' --inline-lob-limit 0 (20)
>
> Error:
> 14/09/16 19:43:25 INFO hive.HiveImport: Time taken: 2.504 seconds
> 14/09/16 19:43:25 INFO hive.HiveImport: FAILED: SemanticException Line 2:17
> Invalid path ''hdfs://tbldann01adv-hdp.tdc.vzwcorp.com:8020/user/ambari-qa/C0DUDRA.BLOB1'':
> source contains directory: hdfs://tbldann01adv-hdp.tdc.vzwcorp.com:8020/user/ambari-qa/C0DUDRA.BLOB1/_lob
> 14/09/16 19:43:25 ERROR tool.ImportTool: Encountered IOException running
> import job: java.io.IOException: Hive exited with status 64
> at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:381)
> at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
> at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:235)
> at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:415)
> at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
>
> When I excluded --inline-lob-limit, the import went fine, but it is failing
> while exporting back. (I found a JIRA filed for this issue long ago, but
> no fix is available.)
>
> 2014-09-16 19:24:58,579 ERROR org.apache.sqoop.mapreduce.TextExportMapper:
> Exception: java.io.IOException: Could not buffer record
> at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:218)
> at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write
>
> Thanks
> Raja
>
> --
> Thanks & Regards
> Rajasekhar D
>
Re: Importing BLOB
Posted by Raja Sekhar <ra...@gmail.com>.
Hi,
I was able to proceed with the --map-column-hive option, but now I am seeing
other issues.
my import command:
sqoop import --connect jdbc:oracle:thin:@<DB> --username <username> -P
--table C0.BLOB1 --hive-import --create-hive-table
--hive-drop-import-delims --map-column-hive ENGINE=String,
--fields-terminated-by '|' --verbose -m 1 --hive-overwrite
--lines-terminated-by '\n' --inline-lob-limit 0 (20)
Error:
14/09/16 19:43:25 INFO hive.HiveImport: Time taken: 2.504 seconds
14/09/16 19:43:25 INFO hive.HiveImport: FAILED: SemanticException Line 2:17
Invalid path ''hdfs://tbldann01adv-hdp.tdc.vzwcorp.com:8020/user/ambari-qa/C0DUDRA.BLOB1'':
source contains directory: hdfs://tbldann01adv-hdp.tdc.vzwcorp.com:8020/user/ambari-qa/C0DUDRA.BLOB1/_lob
14/09/16 19:43:25 ERROR tool.ImportTool: Encountered IOException running
import job: java.io.IOException: Hive exited with status 64
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:381)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:235)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:415)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
When I excluded --inline-lob-limit, the import went fine, but it is failing
while exporting back. (I found a JIRA filed for this issue long ago, but no
fix is available.)
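That is consistent with how Sqoop handles large objects: LOB values bigger than --inline-lob-limit appear to be written to external files under a _lob subdirectory of the import directory, and a limit of 0 pushes every LOB there, which is what Hive's LOAD DATA then rejects. A possible workaround is a large enough inline limit so values stay inside the records; the sketch below is only illustrative, and the 16 MB figure plus all host/table names are hypothetical.

```shell
# Sketch: keep LOB values inline (up to ~16 MB here) so Sqoop does not create
# an external _lob subdirectory that Hive's LOAD DATA cannot move.
# Connection details and table name are placeholders.
INLINE_LOB_BYTES=16777216   # 16 MB cap; a limit of 0 forces external LOB storage
SQOOP_INLINE_CMD="sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table SCOTT.BLOB_TABLE \
  --hive-import \
  --inline-lob-limit $INLINE_LOB_BYTES \
  -m 1"
echo "$SQOOP_INLINE_CMD"
```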
2014-09-16 19:24:58,579 ERROR org.apache.sqoop.mapreduce.TextExportMapper:
Exception: java.io.IOException: Could not buffer record
at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:218)
at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write
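On the export side, "Could not buffer record" tends to show up with very large rows. One thing worth trying, under the assumption that the batching properties sqoop.export.records.per.statement and sqoop.export.statements.per.transaction apply to this connector, is shrinking the batch to one record per statement. All names and paths below are hypothetical placeholders.

```shell
# Sketch: export one record per INSERT statement to reduce buffering pressure
# for very large (e.g. LOB-backed) rows. The -D properties must precede the
# tool-specific arguments. Names and paths are placeholders.
SQOOP_EXPORT_CMD="sqoop export \
  -Dsqoop.export.records.per.statement=1 \
  -Dsqoop.export.statements.per.transaction=1 \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table SCOTT.BLOB_TABLE \
  --export-dir /user/hive/warehouse/blob_table \
  -m 1"
echo "$SQOOP_EXPORT_CMD"
```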
Thanks
Raja
--
Thanks & Regards
Rajasekhar D