Posted to user@sqoop.apache.org by Ruslan Al-Fakikh <me...@gmail.com> on 2013/04/09 18:40:54 UTC

Support of arrays in fields

Hey guys,

Sorry for raising this old question, but is there a workaround for
uploading data with arrays in fields?
I have a table like this:
CREATE TABLE tablename
(
image_urls character varying(300)[]
);
and I am uploading from a file on HDFS. Basically I can change the format of
the file, but what form should it take for Sqoop to upload it into this
unsupported data type? Maybe there is a workaround.

Also I saw this issue, but it is still unresolved:
https://issues.cloudera.org/browse/SQOOP-160

Any help would be appreciated
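
A possible direction for the export side (a sketch only, untested; the
staging table name, column name, and delimiter here are hypothetical):
export the arrays as delimited strings into a plain text staging column,
then convert them on the PostgreSQL side with string_to_array:

    -- hypothetical staging table with a plain text column
    CREATE TABLE tablename_staging (image_urls_csv text);

    -- after "sqoop export ... --table tablename_staging" has loaded the file:
    INSERT INTO tablename (image_urls)
    SELECT string_to_array(image_urls_csv, '|')::character varying(300)[]
    FROM tablename_staging;

The delimiter ('|' here) has to be a character that never occurs inside
the URLs themselves.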


On Fri, Aug 24, 2012 at 10:07 AM, Jarek Jarcec Cecho <ja...@apache.org> wrote:

> You might consider utilizing Postgresql's array_to_string function to
> "join" the array into one string. You would have to change the import from
> --table to --query then, though.
>
> Jarcec
>
> On Fri, Aug 24, 2012 at 10:40:46AM +0530, Adarsh Sharma wrote:
> > Thanks Jarcec for the update. So Sqoop is not suitable for shifting data
> > from DB to HDFS if some columns have integer[] or bigint[] datatypes.
> >
> > Is there any way I can shift data having bigint[] datatypes from
> > postgresql DB to HDFS using Sqoop, or do I need to test another tool
> > like Talend etc.?
> >
> >
> > Thanks
> >
> >
> > On Thu, Aug 23, 2012 at 11:45 PM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
> >
> > > Hi Adarsh,
> > > as far as I know, Sqoop should not have any issues with the bigint data
> > > type.
> > >
> > > Based on the provided log fragment, it seems that you're having issues
> > > with SQL type 2003, which should be ARRAY (see 1). I'm afraid that
> > > array is really not supported in Sqoop at the moment.
> > >
> > > Jarcec
> > >
> > > 1:
> > > http://docs.oracle.com/javase/1.5.0/docs/api/constant-values.html#java.sql.Types.ARRAY
> > >
> > > On Thu, Aug 23, 2012 at 08:47:10PM +0530, Adarsh Sharma wrote:
> > > > Hi all,
> > > >
> > > > Please forgive me if I violate any rule by posting to this mailing list.
> > > > I am using Sqoop for some testing in my standalone Hadoop setup.
> > > >
> > > > Hadoop Version: 0.20.2-cdh3u5, 580d1d26c7ad6a7c6ba72950d8605e2c6fbc96cc
> > > > Sqoop Version: Sqoop 1.4.1-incubating
> > > > Also tried: Sqoop 1.4.0-incubating
> > > > Postgresql Version: edb-psql (9.0.4.14)
> > > >
> > > >
> > > > I am able to export data from HDFS to postgresql, but when I try to
> > > > import data from the DB to HDFS, the problem below arises:
> > > > hadoop@test123:~/project/sqoop-1.4.1-incubating__hadoop-0.20$ bin/sqoop import --connect jdbc:postgresql://localhost/hadooppipeline --table test_table --username postgres --password postgres
> > > > Warning: /usr/lib/hbase does not exist! HBase imports will fail.
> > > > Please set $HBASE_HOME to the root of your HBase installation.
> > > > 12/08/23 19:25:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> > > > 12/08/23 19:25:19 INFO manager.SqlManager: Using default fetchSize of 1000
> > > > 12/08/23 19:25:19 INFO tool.CodeGenTool: Beginning code generation
> > > > 12/08/23 19:25:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "test_table" AS t LIMIT 1
> > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: Cannot resolve SQL type 2003
> > > > 12/08/23 19:25:19 ERROR orm.ClassWriter: No Java type for SQL type 2003 for column rc_list
> > > >
> > > > There are 4 bigint columns in the table. Please guide me on whether
> > > > Sqoop supports bigint columns or not.
> > > >
> > > > I did some R&D and found only one link, but was not able to solve the
> > > > issue:
> > > > https://issues.cloudera.org/browse/SQOOP-48?page=com.atlassian.jira.plugin.system.issuetabpanels%3Aall-tabpanel
> > > >
> > > >
> > > > Thanks
> > >
>
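
For reference, this is what the array_to_string suggestion quoted above
does to a value (made-up data, shown only to illustrate the flattening):

    SELECT array_to_string(
             ARRAY['http://img/1.jpg','http://img/2.jpg']::character varying(300)[],
             '|');
    -- result: http://img/1.jpg|http://img/2.jpg

Each array column then arrives in HDFS as one ordinary delimited string
instead of an unsupported ARRAY value.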

Re: Support of arrays in fields

Posted by Ruslan Al-Fakikh <me...@gmail.com>.
Ok, I see.
I appreciate the clarification. Here is the newly created issue:
https://issues.apache.org/jira/browse/SQOOP-1000

Ruslan



Re: Support of arrays in fields

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Ruslan,
we tried to do a bulk import from the Cloudera JIRA instance into the ASF instance when we were incubating Sqoop. Unfortunately we ran into issues, so we decided to keep the Cloudera instance read-only and create corresponding ASF JIRAs on an as-needed basis. Please check out INFRA-3763 [1] for more background.

Jarcec

Links:
1: https://issues.apache.org/jira/browse/INFRA-3763


Re: Support of arrays in fields

Posted by Ruslan Al-Fakikh <me...@gmail.com>.
Jarek,

This is the second time :-) you are asking me to open a JIRA on Apache that
is already present on Cloudera. If you are saying that the Cloudera site is
no longer used:
https://issues.apache.org/jira/browse/SQOOP-390?focusedCommentId=13631487&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-13631487
then why haven't all its issues been migrated in a batch?

Thanks



Re: Support of arrays in fields

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Ruslan,
I'm afraid that Sqoop currently does not support arrays natively. The import case can be worked around by using the array_to_string function, but I'm not sure how to easily work around the export. Would you mind opening a new JIRA on Apache JIRA [1] for that?

Jarcec

Links:
1: https://issues.apache.org/jira/browse/SQOOP
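
A minimal sketch of that import workaround (untested; the connection string
and credentials are the ones from this thread, while the id column, the
delimiter, and the target directory are hypothetical):

    sqoop import \
      --connect jdbc:postgresql://localhost/hadooppipeline \
      --username postgres -P \
      --query "SELECT id, array_to_string(image_urls, '|') AS image_urls FROM tablename WHERE \$CONDITIONS" \
      --split-by id \
      --target-dir /user/hadoop/tablename

With --query, Sqoop requires the literal $CONDITIONS token in the WHERE
clause plus either --split-by or -m 1; array_to_string flattens each array
into a single delimited string that Sqoop can handle as a plain varchar
column.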
