Posted to user@sqoop.apache.org by sudeep mishra <su...@gmail.com> on 2016/01/20 06:01:47 UTC
sqoop import --hive-import failing for row_version (datatype: timestamp) column for MS SQL Server
Hi,
I am trying to import MS SQL Server data into Hive using the Sqoop import
--hive-import option, but it fails for a column of datatype 'timestamp'.
16/01/20 04:50:53 ERROR tool.ImportTool: Encountered IOException running
import job: java.io.IOException: Hive does not support the SQL type for
column row_version
at
org.apache.sqoop.hive.TableDefWriter.getCreateTableStmt(TableDefWriter.java:181)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:188)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
Is there a way to suppress this, or what is a good approach to handling such
cases?
Thanks & Regards,
Sudeep Shekhar Mishra
Re: sqoop import --hive-import failing for row_version (datatype: timestamp) column for MS SQL Server
Posted by sudeep mishra <su...@gmail.com>.
Thanks Jarcec.
On Thu, Jan 21, 2016 at 1:44 PM, Jarek Jarcec Cecho <ja...@apache.org>
wrote:
> Thanks for the details!
>
> From the logs it seems to me that the column “row_version” is defined as
> binary:
>
> 16/01/20 16:50:32 DEBUG manager.SqlManager: Found column row_version of
> type [-2, 8, 0]
>
> (-2 is BINARY [1])
>
> Such an exception is actually expected, because our Hive integration indeed
> does not work with binary types. You can possibly use --map-column-hive and
> --map-column-java to map the column to something else. Another option is to
> use --query and convert the column value and type to something else.
>
> Jarcec
>
> Links:
> 1:
> https://docs.oracle.com/javase/7/docs/api/constant-values.html#java.sql.Types.BINARY
>
> > On Jan 20, 2016, at 5:53 PM, sudeep mishra <su...@gmail.com>
> wrote:
> >
> > Hi Jarec,
> >
> > I am using below versions.
> >
> > Sqoop - 1.4.6.2.3
> > Hive - 1.2.1.2.3
> >
> > PFA the logs.
> >
> >
> >
> > On Wed, Jan 20, 2016 at 7:34 PM, Jarek Jarcec Cecho <ja...@apache.org>
> wrote:
> > What versions of Hive and Sqoop are you using?
> >
> > Can you share the whole log generated with parameter --verbose?
> >
> > Jarcec
> >
> > > On Jan 20, 2016, at 6:01 AM, sudeep mishra <su...@gmail.com>
> wrote:
> > >
> > > Hi,
> > >
> > > I am trying to import MS SQL Server data into Hive using Sqoop import
> --hive-import option but it is failing for a column of datatype 'timestamp'.
> > >
> > > 16/01/20 04:50:53 ERROR tool.ImportTool: Encountered IOException
> running import job: java.io.IOException: Hive does not support the SQL type
> for column row_version
> > > at
> org.apache.sqoop.hive.TableDefWriter.getCreateTableStmt(TableDefWriter.java:181)
> > > at
> org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:188)
> > > at
> org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
> > > at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
> > > at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
> > > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> > > at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
> > > at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
> > > at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
> > > at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
> > >
> > > Is there a way to suppress this or what is a good approach to handle
> such cases?
> > >
> > >
> > > Thanks & Regards,
> > >
> > > Sudeep Shekhar Mishra
> > >
> >
> >
> >
> >
> >
> > <currency_code.log>
>
>
Thanks & Regards,
Sudeep Shekhar Mishra
Re: sqoop import --hive-import failing for row_version (datatype: timestamp) column for MS SQL Server
Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Thanks for the details!
From the logs it seems to me that the column “row_version” is defined as binary:
16/01/20 16:50:32 DEBUG manager.SqlManager: Found column row_version of type [-2, 8, 0]
(-2 is BINARY [1])
Such an exception is actually expected, because our Hive integration indeed does not work with binary types. You can possibly use --map-column-hive and --map-column-java to map the column to something else. Another option is to use --query and convert the column value and type to something else.
Jarcec
Links:
1: https://docs.oracle.com/javase/7/docs/api/constant-values.html#java.sql.Types.BINARY
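As a sketch of the two workarounds suggested above (the connection string, table, user, and column names here are hypothetical, not from the thread; SQL Server's 'timestamp'/rowversion is an 8-byte binary value, so a STRING or BIGINT representation is assumed to be acceptable for the use case):

```shell
# Option 1: map the binary rowversion column explicitly so the Hive
# table definition no longer trips over the unsupported binary type.
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;database=mydb" \
  --username myuser --password-file /user/me/.pw \
  --table my_table \
  --hive-import \
  --map-column-java row_version=String \
  --map-column-hive row_version=STRING

# Option 2: use a free-form query and cast the column on the SQL Server
# side, so Sqoop never sees the binary type at all. A rowversion fits in
# a BIGINT. Note the mandatory $CONDITIONS token and --split-by/--target-dir
# that --query imports require.
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;database=mydb" \
  --username myuser --password-file /user/me/.pw \
  --query "SELECT id, name, CAST(row_version AS BIGINT) AS row_version FROM my_table WHERE \$CONDITIONS" \
  --split-by id \
  --target-dir /user/me/my_table \
  --hive-import
```

Both commands need a reachable SQL Server instance, so they are invocation sketches rather than something runnable as-is.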
> On Jan 20, 2016, at 5:53 PM, sudeep mishra <su...@gmail.com> wrote:
>
> Hi Jarec,
>
> I am using below versions.
>
> Sqoop - 1.4.6.2.3
> Hive - 1.2.1.2.3
>
> PFA the logs.
>
>
>
> On Wed, Jan 20, 2016 at 7:34 PM, Jarek Jarcec Cecho <ja...@apache.org> wrote:
> What versions of Hive and Sqoop are you using?
>
> Can you share the whole log generated with parameter --verbose?
>
> Jarcec
>
> > On Jan 20, 2016, at 6:01 AM, sudeep mishra <su...@gmail.com> wrote:
> >
> > Hi,
> >
> > I am trying to import MS SQL Server data into Hive using Sqoop import --hive-import option but it is failing for a column of datatype 'timestamp'.
> >
> > 16/01/20 04:50:53 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive does not support the SQL type for column row_version
> > at org.apache.sqoop.hive.TableDefWriter.getCreateTableStmt(TableDefWriter.java:181)
> > at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:188)
> > at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
> > at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
> > at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> > at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
> > at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
> > at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
> > at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
> >
> > Is there a way to suppress this or what is a good approach to handle such cases?
> >
> >
> > Thanks & Regards,
> >
> > Sudeep Shekhar Mishra
> >
>
>
>
>
>
> <currency_code.log>
Re: sqoop import --hive-import failing for row_version (datatype: timestamp) column for MS SQL Server
Posted by sudeep mishra <su...@gmail.com>.
Hi Jarec,
I am using the versions below.
Sqoop - 1.4.6.2.3
Hive - 1.2.1.2.3
PFA the logs.
On Wed, Jan 20, 2016 at 7:34 PM, Jarek Jarcec Cecho <ja...@apache.org>
wrote:
> What versions of Hive and Sqoop are you using?
>
> Can you share the whole log generated with parameter --verbose?
>
> Jarcec
>
> > On Jan 20, 2016, at 6:01 AM, sudeep mishra <su...@gmail.com>
> wrote:
> >
> > Hi,
> >
> > I am trying to import MS SQL Server data into Hive using Sqoop import
> --hive-import option but it is failing for a column of datatype 'timestamp'.
> >
> > 16/01/20 04:50:53 ERROR tool.ImportTool: Encountered IOException running
> import job: java.io.IOException: Hive does not support the SQL type for
> column row_version
> > at
> org.apache.sqoop.hive.TableDefWriter.getCreateTableStmt(TableDefWriter.java:181)
> > at
> org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:188)
> > at
> org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
> > at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
> > at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
> > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> > at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
> > at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
> > at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
> > at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
> >
> > Is there a way to suppress this or what is a good approach to handle
> such cases?
> >
> >
> > Thanks & Regards,
> >
> > Sudeep Shekhar Mishra
> >
>
>
Re: sqoop import --hive-import failing for row_version (datatype: timestamp) column for MS SQL Server
Posted by Jarek Jarcec Cecho <ja...@apache.org>.
What versions of Hive and Sqoop are you using?
Can you share the whole log generated with parameter --verbose?
Jarcec
> On Jan 20, 2016, at 6:01 AM, sudeep mishra <su...@gmail.com> wrote:
>
> Hi,
>
> I am trying to import MS SQL Server data into Hive using Sqoop import --hive-import option but it is failing for a column of datatype 'timestamp'.
>
> 16/01/20 04:50:53 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive does not support the SQL type for column row_version
> at org.apache.sqoop.hive.TableDefWriter.getCreateTableStmt(TableDefWriter.java:181)
> at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:188)
> at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
> at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
> at org.apache.sqoop.Sqoop.run(Sqoop.java:148)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:184)
> at org.apache.sqoop.Sqoop.runTool(Sqoop.java:226)
> at org.apache.sqoop.Sqoop.runTool(Sqoop.java:235)
> at org.apache.sqoop.Sqoop.main(Sqoop.java:244)
>
> Is there a way to suppress this or what is a good approach to handle such cases?
>
>
> Thanks & Regards,
>
> Sudeep Shekhar Mishra
>