Posted to user@sqoop.apache.org by Artem Ervits <ar...@nyp.org> on 2012/10/12 19:29:41 UTC

SQOOP 1.3.0 questions using SQL Server

Hello all,

I'm testing a few switches for Sqoop import and I'm having the following problems.

When I use the -incremental switch, the command fails and shows me the help page.
When I use the -compress switch, the command works, but when I try to uncompress the results, gzip reports an undefined compression code. I also tried explicitly specifying the codec; Sqoop still did not append the compression extension to the files, nor was I able to uncompress the data.
When I use the -as-avrodatafile I get ERROR tool.ImportTool: Imported Failed: Cannot convert SQL type -9
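(Editor's note: the Sqoop 1.x documentation writes these switches with two dashes, and --incremental additionally needs a mode plus a check column. A hedged sketch of the three invocations being attempted; the connection string, credentials, and table name are placeholders:)

```shell
# 1) Incremental import: --incremental needs a mode (append|lastmodified)
#    plus --check-column and --last-value.
sqoop import --connect "jdbc:sqlserver://dbserver;database=mydb" \
  --username myuser --password mypass --table MYTABLE \
  --incremental append --check-column id --last-value 0

# 2) Compressed import: gzip by default, or name a codec explicitly.
sqoop import --connect "jdbc:sqlserver://dbserver;database=mydb" \
  --username myuser --password mypass --table MYTABLE \
  --compress --compression-codec org.apache.hadoop.io.compress.GzipCodec

# 3) Avro data files.
sqoop import --connect "jdbc:sqlserver://dbserver;database=mydb" \
  --username myuser --password mypass --table MYTABLE \
  --as-avrodatafile
```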

Any ideas? I am not sure whether upgrading Sqoop will fix it, because the SQL Server connector specifically requires the 1.3.0 release.

Thanks.

Artem Ervits
Data Analyst
New York Presbyterian Hospital



--------------------

This electronic message is intended to be for the use only of the named recipient, and may contain information that is confidential or privileged.  If you are not the intended recipient, you are hereby notified that any disclosure, copying, distribution or use of the contents of this message is strictly prohibited.  If you have received this message in error or are not the named recipient, please notify us immediately by contacting the sender at the electronic mail address noted above, and delete and destroy all copies of this message.  Thank you.








Re: SQOOP 1.3.0 questions using SQL Server

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
We actually tried very hard to remain compatible with this particular connector, so please feel free to create a new JIRA for each issue you find.

Jarcec

On Fri, Oct 12, 2012 at 06:01:19PM +0000, Artem Ervits wrote:
> Thank you Jarek,
> 
> I will do what you indicated; however, since we're using the Microsoft connector for SQL Server, I am not sure it supports any version but 1.3.0. I will test, though.
> 
> Thanks again.
> 
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:jarcec@apache.org] 
> Sent: Friday, October 12, 2012 1:51 PM
> To: user@sqoop.apache.org
> Subject: Re: SQOOP 1.3.0 questions using SQL Server
> 
> Hi Artem,
> Sqoop 1.3.0 is a very old release, so I would definitely recommend upgrading to the latest, 1.4.2. Please note that this release includes SQOOP-480 [1], which fixes some issues for connectors compiled against 1.3 when run on 1.4. Sqoop is a simple command-line utility, so you might download the release into a temporary folder and verify that everything works before upgrading Sqoop system-wide.
> 
> Specifically, to your questions:
> 
> 1) --incremental
> Usually Sqoop will print out an error message first, followed by the help page, so I would advise checking the first few lines of output to see what exactly is wrong.
> 
> 2) --compress
> Please try to reproduce this issue on 1.4.2 and file a JIRA if you are still able to reproduce it.
> 
> 3) --as-avrodatafile
> Based on [2], SQL type -9 corresponds to the data type NVARCHAR. We added support for NVARCHAR in SQOOP-323 [3], which has been part of Sqoop since version 1.4.0-incubating; that is, the 1.3.0 version does not support the NVARCHAR type out of the box.
> 
> Jarcec
> 
> Links:
> 1: https://issues.apache.org/jira/browse/SQOOP-480
> 2: http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.sql.Types.DECIMAL
> 3: https://issues.apache.org/jira/browse/SQOOP-323
> 
> On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
> > 
> > Hello all,
> > 
> > I'm testing a few switches for Sqoop import and I'm having the following problems.
> > 
> > When I use the -incremental switch, the command fails and shows me the help page.
> > When I use the -compress switch, the command works but when I try to uncompress the results, it says gzip undefined compression code. I also tried to explicitly state the codec and it would still not append the compression extension to the files nor am I able to uncompress the data.
> > When I use the -as-avrodatafile I get ERROR tool.ImportTool: Imported Failed: Cannot convert SQL type -9
> > 
> > Any ideas? I am not sure if upgrading Sqoop will fix it because SQL Server connector specifically required a 1.3.0 release.
> > 
> > Thanks.
> > 
> > Artem Ervits
> > Data Analyst
> > New York Presbyterian Hospital
> > 
> > 
> > 
> > 
> > 
> > 
> > 
> > 
> > 
> > 
> 
> 
> 
> 
> 
> 
> 
> 
> 

RE: SQOOP 1.3.0 questions using SQL Server

Posted by Artem Ervits <ar...@nyp.org>.
Thank you Jarek,

I will do what you indicated; however, since we're using the Microsoft connector for SQL Server, I am not sure it supports any version but 1.3.0. I will test, though.

Thanks again.

-----Original Message-----
From: Jarek Jarcec Cecho [mailto:jarcec@apache.org] 
Sent: Friday, October 12, 2012 1:51 PM
To: user@sqoop.apache.org
Subject: Re: SQOOP 1.3.0 questions using SQL Server

Hi Artem,
Sqoop 1.3.0 is a very old release, so I would definitely recommend upgrading to the latest, 1.4.2. Please note that this release includes SQOOP-480 [1], which fixes some issues for connectors compiled against 1.3 when run on 1.4. Sqoop is a simple command-line utility, so you might download the release into a temporary folder and verify that everything works before upgrading Sqoop system-wide.
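(A throwaway test install along those lines might look like this; the mirror URL and tarball name are assumptions, so check the Sqoop download page for the right artifact:)

```shell
# Fetch a 1.4.2 tarball into a scratch directory and run it in place,
# without touching the system-wide Sqoop install.
mkdir -p /tmp/sqoop-test && cd /tmp/sqoop-test
wget http://archive.apache.org/dist/sqoop/1.4.2/sqoop-1.4.2.bin__hadoop-1.0.0.tar.gz
tar -xzf sqoop-1.4.2.bin__hadoop-1.0.0.tar.gz
/tmp/sqoop-test/sqoop-1.4.2.bin__hadoop-1.0.0/bin/sqoop version
```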

Specifically, to your questions:

1) --incremental
Usually Sqoop will print out an error message first, followed by the help page, so I would advise checking the first few lines of output to see what exactly is wrong.

2) --compress
Please try to reproduce this issue on 1.4.2 and file a JIRA if you are still able to reproduce it.
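(A quick way to verify whether an output file is really gzip is to check its two magic bytes, 1f 8b, and run gzip's integrity test. A minimal local sketch; for HDFS output one would first `hadoop fs -get` a part file:)

```shell
# Create a small gzip file and verify its magic bytes, the same check
# one can run on Sqoop's part-* output files.
printf 'hello sqoop\n' > /tmp/sample.txt
gzip -f /tmp/sample.txt                      # produces /tmp/sample.txt.gz

# First two bytes of a real gzip file are 1f 8b.
magic=$(od -An -tx1 -N2 /tmp/sample.txt.gz | tr -d ' ')
echo "$magic"

# gzip -t decompresses to /dev/null and reports corruption if any.
gzip -t /tmp/sample.txt.gz && echo "gzip integrity OK"
```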

3) --as-avrodatafile
Based on [2], SQL type -9 corresponds to the data type NVARCHAR. We added support for NVARCHAR in SQOOP-323 [3], which has been part of Sqoop since version 1.4.0-incubating; that is, the 1.3.0 version does not support the NVARCHAR type out of the box.
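(One workaround reported elsewhere in this thread, casting the nvarchar as char, can be expressed as a free-form query import; a hedged sketch with hypothetical table, column, and connection values:)

```shell
# CAST the NVARCHAR column to VARCHAR inside --query so the driver reports
# a type Sqoop 1.3.0 understands; $CONDITIONS is required by Sqoop for
# free-form query imports, as are --split-by and --target-dir.
sqoop import --connect "jdbc:sqlserver://dbserver;database=mydb" \
  --username myuser --password mypass \
  --query 'SELECT Id, CAST(Name AS VARCHAR(255)) AS Name FROM MYTABLE WHERE $CONDITIONS' \
  --split-by Id --target-dir /user/me/mytable --as-avrodatafile
```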

Jarcec

Links:
1: https://issues.apache.org/jira/browse/SQOOP-480
2: http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.sql.Types.DECIMAL
3: https://issues.apache.org/jira/browse/SQOOP-323

On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
> 
> Hello all,
> 
> I'm testing a few switches for Sqoop import and I'm having the following problems.
> 
> When I use the -incremental switch, the command fails and shows me the help page.
> When I use the -compress switch, the command works but when I try to uncompress the results, it says gzip undefined compression code. I also tried to explicitly state the codec and it would still not append the compression extension to the files nor am I able to uncompress the data.
> When I use the -as-avrodatafile I get ERROR tool.ImportTool: Imported Failed: Cannot convert SQL type -9
> 
> Any ideas? I am not sure if upgrading Sqoop will fix it because SQL Server connector specifically required a 1.3.0 release.
> 
> Thanks.
> 
> Artem Ervits
> Data Analyst
> New York Presbyterian Hospital
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 










Re: SQOOP 1.3.0 questions using SQL Server

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Artem,
this is actually a very helpful log fragment. Sqoop 1.4.2 does support the NVARCHAR data type for import/export (which was not the case for Sqoop 1.3.0); however, Sqoop does not have a special splitter for the NVARCHAR type, which means that such a column can't be used for the --split-by parameter, either explicitly or implicitly. One possible workaround is the parameter "--num-mappers 1", which limits the number of active mappers to 1 so that no splitter is needed. Such a solution will obviously have performance implications, as there won't be any parallel data movement. Are you able, by any chance, to add some number-based column or something similar?

I believe that this exception is not very helpful, so I've filed SQOOP-652 [1] on your behalf to improve it with a more reasonable error message. I've also looked around, and I believe we could reuse the TextSplitter that is already there to support splitting by an N(LONG)(VAR)CHAR column; I've filed SQOOP-653 [2] to keep track of that. I've marked both issues with the "newbie" label, as I believe they should be rather easy to resolve.
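(The two workarounds above might be written as follows; connection details and column names are hypothetical:)

```shell
# Option A: a single mapper means no split column is needed (serial import).
sqoop import --connect "jdbc:sqlserver://dbserver;database=mydb" \
  --username myuser --password mypass --table TABLENAME \
  --num-mappers 1

# Option B: point --split-by at a numeric column, if one can be added,
# so Sqoop can compute split ranges and import in parallel.
sqoop import --connect "jdbc:sqlserver://dbserver;database=mydb" \
  --username myuser --password mypass --table TABLENAME \
  --split-by SomeNumericId
```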

Jarcec

Links:
1: https://issues.apache.org/jira/browse/SQOOP-652
2: https://issues.apache.org/jira/browse/SQOOP-653
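(Incidentally, the ConnFactory error in the log below means the connector registration file under managers.d is malformed: each line must be key=value, mapping a ManagerFactory class to the directory or jar that holds it. A hedged example; the class name follows the Microsoft connector's usual install docs and the path is hypothetical:)

```
com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory=/usr/local/sqoop-sqlserver-1.0/lib
```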

On Fri, Oct 26, 2012 at 09:48:45PM +0000, Artem Ervits wrote:
> Don't pay attention to the error that this table has two primary keys; I just want to know whether nvarchar still works.
> 
> [xxxxx@SERVERNAME ~]$ clear
> [xxxxx@SERVERNAME ~]$ $SQOOP_HOME/bin/sqoop import-all-tables --connect "jdbc:sqlserver://dbserver;username=xxxx;password=xxxxx;database=xxxxx" --hive-import --create-hive-table -compress  -verbose
> 12/10/26 17:39:52 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> 12/10/26 17:39:52 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
> 12/10/26 17:39:52 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
> 12/10/26 17:39:52 ERROR sqoop.ConnFactory: Error loading ManagerFactory information from file /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver: java.io.IOException: the content of connector file must be in form of key=value
>         at org.apache.sqoop.ConnFactory.addManagersFromFile(ConnFactory.java:219)
>         at org.apache.sqoop.ConnFactory.loadManagersFromConfDir(ConnFactory.java:294)
>         at org.apache.sqoop.ConnFactory.instantiateFactories(ConnFactory.java:85)
>         at org.apache.sqoop.ConnFactory.<init>(ConnFactory.java:62)
>         at com.cloudera.sqoop.ConnFactory.<init>(ConnFactory.java:36)
>         at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:200)
>         at org.apache.sqoop.tool.ImportTool.init(ImportTool.java:83)
>         at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:48)
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> 
> 12/10/26 17:39:52 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
> 12/10/26 17:39:52 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
> 12/10/26 17:39:52 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:sqlserver:
> 12/10/26 17:39:52 INFO manager.SqlManager: Using default fetchSize of 1000
> 12/10/26 17:39:52 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.SQLServerManager@380e28b9
> 12/10/26 17:39:52 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
> 12/10/26 17:39:53 INFO tool.CodeGenTool: Beginning code generation
> 12/10/26 17:39:53 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> 12/10/26 17:39:53 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [TABLENAME] AS t WHERE 1=0
> 12/10/26 17:39:53 DEBUG orm.ClassWriter: selected columns:
> 12/10/26 17:39:53 DEBUG orm.ClassWriter:   Keyword
> 12/10/26 17:39:53 DEBUG orm.ClassWriter:   Item
> 12/10/26 17:39:53 DEBUG orm.ClassWriter:   Description
> 12/10/26 17:39:53 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.java
> 12/10/26 17:39:53 DEBUG orm.ClassWriter: Table name: TABLENAME
> 12/10/26 17:39:53 DEBUG orm.ClassWriter: Columns: Keyword:-9, Item:-9, Description:-9,
> 12/10/26 17:39:53 DEBUG orm.ClassWriter: sourceFilename is TABLENAME.java
> 12/10/26 17:39:53 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/
> 12/10/26 17:39:53 INFO orm.CompilationManager: HADOOP_HOME is /usr/local/hadoop/libexec/..
> 12/10/26 17:39:53 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.java
> 12/10/26 17:39:53 DEBUG orm.CompilationManager: Invoking javac with args:
> 12/10/26 17:39:53 DEBUG orm.CompilationManager:   -sourcepath
> 12/10/26 17:39:53 DEBUG orm.CompilationManager:   /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/
> 12/10/26 17:39:53 DEBUG orm.CompilationManager:   -d
> 12/10/26 17:39:53 DEBUG orm.CompilationManager:   /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/
> 12/10/26 17:39:53 DEBUG orm.CompilationManager:   -classpath
> 12/10/26 17:39:53 DEBUG orm.CompilationManager:   /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-sun/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/avro-1.5.3.jar:/usr/local/hadoop/libexec/../lib/avro-ipc-1.5.3.jar:/usr/local/hadoop/libexec/../lib/avro-mapred-1.5.3.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/sqljdbc4.jar:/usr/local/hadoop/libexec/../lib/sqoop-1.4.2.jar:/usr/local/hadoop/libexec/../lib/sqoop-sqlserver-1.0.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hive/lib/hive_contrib.jar:/usr/local/hive/lib/commons-collections-3.2.1.jar:/usr/local/hive/lib/commons-codec-1.4.jar:/usr/local/hive/lib/xz-1.0.jar:/usr/local/hive/lib/stringtemplate-3.1-b1.jar:/usr/local/hive/lib/hbase-0.92.0.jar:/usr/local/hive/lib/commons-dbcp-1.4.jar:/usr/local/hive/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hive/lib/hive-contrib-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/hive-metastore-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/servlet-api-2.5-20081211.jar:/usr/local/hive/lib/antlr-runtime-3.0.1.jar:/usr/local/hive/lib/avro-mapred-1.7.1.jar:/usr/local/hive/lib/datanucleus-core-2.0.3.jar:/usr/local/hive/lib/datanucleus-enhancer-2.0.3.jar:/usr/local/hive/lib/hive-jdbc-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/commons-logging-1.0.4.jar:/usr/local/hive/lib/hive-serde-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/log4j-1.2.16.jar:/usr/local/hive/lib/avro-1.7.1.jar:/usr/local/hive/lib/commons-pool-1.5.4.jar:/usr/local/hive/lib/hive-hwi-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/datanucleus-rdbms-2.0.3.jar:/usr/local/hive/lib/hive-builtins-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/sqlline-1_0_2.jar:/usr/local/hive/lib/hive-pdk-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/libthrift-0.7.0.jar:/usr/local/hive/lib/jdo2-api-2.3-ec.jar:/usr/local/hive/lib/jline-0.9.94.jar:/usr/local/hive/lib/hive-service-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/json-20090211.jar:/usr/local/hive/lib/jetty-util-6.1.26.jar:/usr/local/hive/lib/derby-10.4.2.0.jar:/usr/local/hive/lib/commons-configuration-1.6.jar:/usr/local/hive/lib/jetty-6.1.26.jar:/usr/local/hive/lib/hive-exec-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/slf4j-log4j12-1.6.1.jar:/usr/local/hive/lib/commons-compress-1.4.1.jar:/usr/local/hive/lib/commons-logging-api-1.0.4.jar:/usr/local/hive/lib/hive-cli-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/hive-shims-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/jackson-xc-1.8.8.jar:/usr/local/hive/lib/hive-common-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/commons-lang-2.4.jar:/usr/local/hive/lib/datanucleus-connectionpool-2.0.3.jar:/usr/local/hive/lib/hive-hbase-handler-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/JavaEWAH-0.3.2.jar:/usr/local/hive/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hive/lib/antlr-2.7.7.jar:/usr/local/hive/lib/libfb303-0.7.0.jar:/usr/local/hive/lib/guava-11.0.2.jar:/usr/local/hive/lib/hbase-0.92.0-tests.jar:/usr/local/hive/lib/slf4j-api-1.6.1.jar:/usr/local/hive/lib/zookeeper-3.4.3.jar:/usr/local/hive/lib/commons-cli-1.2.jar:/usr/local/hive/lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/hadoop/lib/sqoop-1.4.2.jar
> Note: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 12/10/26 17:39:58 DEBUG orm.CompilationManager: Could not rename /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.java to /home/xxxxx/./TABLENAME.java
> org.apache.commons.io.FileExistsException: Destination '/home/xxxxx/./TABLENAME.java' already exists
>         at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
>         at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
>         at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:390)
>         at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:64)
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> 12/10/26 17:39:58 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.jar
> 12/10/26 17:39:58 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946
> 12/10/26 17:39:58 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.class -> TABLENAME.class
> 12/10/26 17:39:58 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.jar
> 12/10/26 17:39:58 DEBUG manager.CatalogQueryManager: Retrieving primary key for table 'TABLENAME' with query SELECT kcu.COLUMN_NAME FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc,   INFORMATION_SCHEMA.KEY_COLUMN_USAGE kcu WHERE tc.TABLE_SCHEMA = kcu.TABLE_SCHEMA   AND tc.TABLE_NAME = kcu.TABLE_NAME   AND tc.CONSTRAINT_SCHEMA = kcu.CONSTRAINT_SCHEMA   AND tc.CONSTRAINT_NAME = kcu.CONSTRAINT_NAME   AND tc.TABLE_SCHEMA = (SELECT SCHEMA_NAME())   AND tc.TABLE_NAME = 'TABLENAME'   AND tc.CONSTRAINT_TYPE = 'PRIMARY KEY'
> 
> 12/10/26 17:39:58 WARN manager.CatalogQueryManager: The table TABLENAME contains a multi-column primary key. Sqoop will default to the column Item only for this job.
> 12/10/26 17:39:58 DEBUG manager.CatalogQueryManager: Retrieving primary key for table 'TABLENAME' with query SELECT kcu.COLUMN_NAME FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc,   INFORMATION_SCHEMA.KEY_COLUMN_USAGE kcu WHERE tc.TABLE_SCHEMA = kcu.TABLE_SCHEMA   AND tc.TABLE_NAME = kcu.TABLE_NAME   AND tc.CONSTRAINT_SCHEMA = kcu.CONSTRAINT_SCHEMA   AND tc.CONSTRAINT_NAME = kcu.CONSTRAINT_NAME   AND tc.TABLE_SCHEMA = (SELECT SCHEMA_NAME())   AND tc.TABLE_NAME = 'TABLENAME'   AND tc.CONSTRAINT_TYPE = 'PRIMARY KEY'
> 
> 12/10/26 17:39:58 WARN manager.CatalogQueryManager: The table TABLENAME contains a multi-column primary key. Sqoop will default to the column Item only for this job.
> 12/10/26 17:39:58 INFO mapreduce.ImportJobBase: Beginning import of TABLENAME
> 12/10/26 17:39:58 DEBUG mapreduce.DataDrivenImportJob: Using table class: TABLENAME
> 12/10/26 17:39:58 DEBUG mapreduce.DataDrivenImportJob: Using InputFormat: class com.cloudera.sqoop.mapreduce.db.DataDrivenDBInputFormat
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/hadoop/lib/sqoop-1.4.2.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/hadoop/lib/sqljdbc4.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/hadoop/lib/sqoop-1.4.2.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/hadoop/lib/sqoop-1.4.2.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-1.5.3.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-ipc-1.5.3.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/commons-io-1.4.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/hsqldb-1.8.0.10.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqljdbc4.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-mapred-1.5.3.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/paranamer-2.3.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> 12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> 12/10/26 17:40:02 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN([Item]), MAX([Item]) FROM [TABLENAME]
> 12/10/26 17:40:02 INFO mapred.JobClient: Cleaning up the staging area hdfs://SERVERNAME1:54310/user/xxxxx/.staging/job_201210261704_0005
> 12/10/26 17:40:02 ERROR security.UserGroupInformation: PriviledgedActionException as:xxxxx cause:java.io.IOException: Unknown SQL data type: -9
> 12/10/26 17:40:02 ERROR tool.ImportAllTablesTool: Encountered IOException running import job: java.io.IOException: Unknown SQL data type: -9
> 
> 
> 
> [xxxxx@servername ~]$ java -version
> java version "1.6.0_33"
> Java(TM) SE Runtime Environment (build 1.6.0_33-b03)
> Java HotSpot(TM) 64-Bit Server VM (build 20.8-b03, mixed mode)
> 
> /usr/lib/jvm/jre-1.6.0-sun.x86_64/bin/java
> 
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:jarcec@apache.org] 
> Sent: Friday, October 26, 2012 5:17 PM
> To: user@sqoop.apache.org
> Subject: Re: SQOOP 1.3.0 questions using SQL Server
> 
> Hi Artem,
> yes, nvarchar (SQL type -9) should be supported on 1.4.2. Would you mind sharing with us the entire Sqoop command line and the corresponding log generated with the --verbose argument? Also, what version of Java are you using?
> 
> Jarcec
> 
> On Fri, Oct 26, 2012 at 09:09:56PM +0000, Artem Ervits wrote:
> > I have Sqoop 1.4.2 installed
> > 
> > Running sqoop version returns
> > 
> > Sqoop 1.4.2
> > git commit id
> > Compiled by ag on Tue Aug 14 17:37:19 IST 2012
> > 
> > When I tried to run a sqoop import-all-tables, I got "Unknown SQL data type: -9". From the discussion below, nvarchar should be supported in this version of Sqoop; please advise.
> > 
> > Thank you.
> > 
> > -----Original Message-----
> > From: Jarek Jarcec Cecho [mailto:jarcec@apache.org]
> > Sent: Monday, October 15, 2012 4:34 PM
> > To: user@sqoop.apache.org
> > Cc: Marc Sturm
> > Subject: Re: SQOOP 1.3.0 questions using SQL Server
> > 
> > Hi Marc,
> > please find my answers inline:
> > 
> > On Mon, Oct 15, 2012 at 07:25:28PM +0000, Artem Ervits wrote:
> > > I installed SQOOP 1.4.2 and was able to compress and then uncompress the data. I used the bin.tar.gz installer.
> > 
> > Great!
> > 
> > > 
> > > 1.       I did try to use the git repo to compile the sqoop jar and was able to compile it with no problem, which resulted in a 1.4.3 snapshot jar using the command ant jar-all. I noticed that all other jars, like avro, are not present in the build or lib directory. How may I go about downloading the source and compiling it to include avro, etc.?
> > 
> > If you need to build the entire distribution (like the one offered on the download page), including (almost) all dependency jars, you need to run the command "ant -Dhadoopversion=$hadoopVersion tar", where $hadoopVersion is your Hadoop version. Supported values are 20, 23, 100 and 200.
> > 
> > > 2.       I also tried to use the incremental switch. Is this a correct command: "sqoop import -incremental 'lastmodified'", or does the command look some other way?
> > 
> > Yes, that is the intended usage. You can find more information about incremental imports in the Sqoop documentation:
> > 
> > http://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_incremental_imports
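
To make the incremental form concrete, a full lastmodified import also needs --check-column and usually --last-value; the following sketch uses placeholder connection, table, and column names:

```
sqoop import \
  --connect "jdbc:sqlserver://dbserver;username=user;password=secret;database=mydb" \
  --table SOMETABLE \
  --incremental lastmodified \
  --check-column LastUpdated \
  --last-value "2012-10-01 00:00:00"
```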
> > 
> > > Thank you.
> > 
> > Jarcec
> > 
> > > 
> > > 
> > > From: Artem Ervits [mailto:are9004@nyp.org]
> > > Sent: Friday, October 12, 2012 5:17 PM
> > > To: user@sqoop.apache.org
> > > Cc: Marc Sturm
> > > Subject: RE: SQOOP 1.3.0 questions using SQL Server
> > > 
> > > Even better!
> > > 
> > > Thank you.
> > > 
> > > From: Chalcy
> > > [mailto:chalcy@gmail.com]
> > > Sent: Friday, October 12, 2012 4:18 PM
> > > To: user@sqoop.apache.org
> > > Subject: Re: SQOOP 1.3.0 questions using SQL Server
> > > 
> > > A freshly built Sqoop 1.4.3 even works very well with SQL Server. Just 
> > > get the sqoop jar and replace it, and it should work just fine.  So 
> > > far I have never had any issues changing Sqoop versions :)
> > > 
> > > Good luck!
> > > 
> > > Thanks,
> > > Chalcy
> > > On Fri, Oct 12, 2012 at 2:16 PM, Artem Ervits <ar...@nyp.org> wrote:
> > > Actually, I was just able to test 3) below and it succeeded; I cast the nvarchar as char.
> > > 
> > > Thank you again, I will do 1) and 2) and report back the results.
> > > 
> > > -----Original Message-----
> > > From: Jarek Jarcec Cecho
> > > [mailto:jarcec@apache.org]
> > > Sent: Friday, October 12, 2012 1:51 PM
> > > To: user@sqoop.apache.org
> > > Subject: Re: SQOOP 1.3.0 questions using SQL Server
> > > 
> > > Hi Artem,
> > > Sqoop 1.3.0 is a very old release, so I would definitely recommend upgrading to the latest 1.4.2. Please note that this release includes SQOOP-480 [1], which fixes some issues for connectors compiled against 1.3 when running on 1.4. Sqoop is a simple command-line utility, so you might try downloading the release into a temporary folder and verifying that everything works before upgrading Sqoop system-wide.
> > > 
> > > Specifically, to your questions:
> > > 
> > > 1) --incremental
> > > Usually Sqoop will print out the error message first, followed by the help page, so I would advise checking the first few lines to see what exactly is wrong.
> > > 
> > > 2) --compress
> > > Please try to reproduce this issue on 1.4.2 and file a JIRA if you are able to reproduce it.
> > > 
> > > 3) --as-avrodatafile
> > > Based on [2], SQL type -9 corresponds to the data type NVARCHAR. We added support for NVARCHAR in SQOOP-323 [3], which has been part of Sqoop since version 1.4.0-incubating, i.e. the 1.3.0 version does not support the NVARCHAR type out of the box.
> > > 
> > > Jarcec
> > > 
> > > Links:
> > > 1: https://issues.apache.org/jira/browse/SQOOP-480
> > > 2: 
> > > http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.s
> > > ql
> > > .Types.DECIMAL
> > > 3: https://issues.apache.org/jira/browse/SQOOP-323
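
As an aside, the negative type codes in these error messages are java.sql.Types constants; the small illustrative lookup below uses a subset of codes taken from the java.sql.Types javadoc (with -9 being NVARCHAR, the type Sqoop 1.3.0 cannot convert):

```python
# Subset of java.sql.Types integer codes, per the JDBC javadoc.
# Sqoop errors such as "Unknown SQL data type: -9" report these codes.
JDBC_TYPE_NAMES = {
    -16: "LONGNVARCHAR",
    -15: "NCHAR",
    -9: "NVARCHAR",
    1: "CHAR",
    12: "VARCHAR",
    2011: "NCLOB",
}

def jdbc_type_name(code):
    """Return the java.sql.Types name for a code, or 'UNKNOWN(<code>)'."""
    return JDBC_TYPE_NAMES.get(code, "UNKNOWN(%d)" % code)

print(jdbc_type_name(-9))  # NVARCHAR
```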
> > > 
> > > On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
> > > >
> > > > Hello all,
> > > >
> > > > I'm testing a few switches for Sqoop import and I'm having the following problems.
> > > >
> > > > When I use the -incremental switch, the command fails and shows me the help page.
> > > > When I use the -compress switch, the command works but when I try to uncompress the results, it says gzip undefined compression code. I also tried to explicitly state the codec and it would still not append the compression extension to the files nor am I able to uncompress the data.
> > > > When I use the -as-avrodatafile I get ERROR tool.ImportTool: 
> > > > Imported Failed: Cannot convert SQL type -9
> > > >
> > > > Any ideas? I am not sure if upgrading Sqoop will fix it because SQL Server connector specifically required a 1.3.0 release.
> > > >
> > > > Thanks.
> > > >
> > > > Artem Ervits
> > > > Data Analyst
> > > > New York Presbyterian Hospital
> > > >
> > > >
> > > >
> > > > --------------------
> > > >
> > > > This electronic message is intended to be for the use only of the named recipient, and may contain information that is confidential or privileged.  If you are not the intended recipient, you are hereby notified that any disclosure, copying, distribution or use of the contents of this message is strictly prohibited.  If you have received this message in error or are not the named recipient, please notify us immediately by contacting the sender at the electronic mail address noted above, and delete and destroy all copies of this message.  Thank you.
> > > --------------------
> > > 
> > > Confidential Information subject to NYP's (and its affiliates') information management and security policies (http://infonet.nyp.org/QA/HospManual/).
RE: SQOOP 1.3.0 questions using SQL Server

Posted by Artem Ervits <ar...@nyp.org>.
Don't pay attention to the error that this table has two primary keys; I just want to know whether nvarchar still works.

[xxxxx@SERVERNAME ~]$ $SQOOP_HOME/bin/sqoop import-all-tables --connect "jdbc:sqlserver://dbserver;username=xxxx;password=xxxxx;database=xxxxx" --hive-import --create-hive-table -compress  -verbose
12/10/26 17:39:52 DEBUG tool.BaseSqoopTool: Enabled debug logging.
12/10/26 17:39:52 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
12/10/26 17:39:52 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
12/10/26 17:39:52 ERROR sqoop.ConnFactory: Error loading ManagerFactory information from file /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver: java.io.IOException: the content of connector file must be in form of key=value
        at org.apache.sqoop.ConnFactory.addManagersFromFile(ConnFactory.java:219)
        at org.apache.sqoop.ConnFactory.loadManagersFromConfDir(ConnFactory.java:294)
        at org.apache.sqoop.ConnFactory.instantiateFactories(ConnFactory.java:85)
        at org.apache.sqoop.ConnFactory.<init>(ConnFactory.java:62)
        at com.cloudera.sqoop.ConnFactory.<init>(ConnFactory.java:36)
        at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:200)
        at org.apache.sqoop.tool.ImportTool.init(ImportTool.java:83)
        at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:48)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
        at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
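
Regarding the ConnFactory error above ("the content of connector file must be in form of key=value"): files under conf/managers.d are expected to contain one key=value line each, mapping a ManagerFactory class to the jar that provides it. A sketch of what the mssqoop-sqlserver file should roughly look like follows; the class name and jar path are illustrative of the expected form, not verified against this installation:

```
com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory=/path/to/sqoop-sqlserver-1.0.jar
```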

12/10/26 17:39:52 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
12/10/26 17:39:52 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
12/10/26 17:39:52 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:sqlserver:
12/10/26 17:39:52 INFO manager.SqlManager: Using default fetchSize of 1000
12/10/26 17:39:52 DEBUG sqoop.ConnFactory: Instantiated ConnManager org.apache.sqoop.manager.SQLServerManager@380e28b9
12/10/26 17:39:52 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
12/10/26 17:39:53 INFO tool.CodeGenTool: Beginning code generation
12/10/26 17:39:53 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
12/10/26 17:39:53 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [TABLENAME] AS t WHERE 1=0
12/10/26 17:39:53 DEBUG orm.ClassWriter: selected columns:
12/10/26 17:39:53 DEBUG orm.ClassWriter:   Keyword
12/10/26 17:39:53 DEBUG orm.ClassWriter:   Item
12/10/26 17:39:53 DEBUG orm.ClassWriter:   Description
12/10/26 17:39:53 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.java
12/10/26 17:39:53 DEBUG orm.ClassWriter: Table name: TABLENAME
12/10/26 17:39:53 DEBUG orm.ClassWriter: Columns: Keyword:-9, Item:-9, Description:-9,
12/10/26 17:39:53 DEBUG orm.ClassWriter: sourceFilename is TABLENAME.java
12/10/26 17:39:53 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/
12/10/26 17:39:53 INFO orm.CompilationManager: HADOOP_HOME is /usr/local/hadoop/libexec/..
12/10/26 17:39:53 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.java
12/10/26 17:39:53 DEBUG orm.CompilationManager: Invoking javac with args:
12/10/26 17:39:53 DEBUG orm.CompilationManager:   -sourcepath
12/10/26 17:39:53 DEBUG orm.CompilationManager:   /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/
12/10/26 17:39:53 DEBUG orm.CompilationManager:   -d
12/10/26 17:39:53 DEBUG orm.CompilationManager:   /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/
12/10/26 17:39:53 DEBUG orm.CompilationManager:   -classpath
12/10/26 17:39:53 DEBUG orm.CompilationManager:   /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-sun/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/avro-1.5.3.jar:/usr/local/hadoop/libexec/../lib/avro-ipc-1.5.3.jar:/usr/local/hadoop/libexec/../lib/avro-mapred-1.5.3.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../
lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/sqljdbc4.jar:/usr/local/hadoop/libexec/../lib/sqoop-1.4.2.jar:/usr/local/hadoop/libexec/../lib/sqoop-sqlserver-1.0.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hive/lib/hive_contrib.jar:/usr/local/hive/lib/commons-collections-3.2.1.jar:/usr/local/hive/lib/commons-codec-1.4.jar:/usr/local/hive/lib/xz-1.0.jar:/usr/local/hive/lib/stringtemplate-3.1-b1.jar:/usr/local/hive/lib/hbase-0.92.0.jar:/usr/local/hive/lib/commons-dbcp-1.4.jar:/usr/local/hive/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hive/lib/hive-contrib-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/hive-metastore-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/servlet-api-2.5-20081211.jar:/usr/local/hive/lib/antlr-runtime-3.0.1.jar:/usr/local/hive/lib/avro-mapred-1.7.1.jar:/usr/local/hive/lib/datanucleus-core-2.0.3.jar:/usr/local/hive/lib/datanucleus-enhancer-2.0.3.jar:/usr/local/hive/lib/hive-jdbc-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/commons-logging-1.0.4.jar:/usr/local/hive/lib/hive-serde-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/log4j-1.2.16.jar:/usr/local/hive/lib/avro-1.7.1.jar:/
usr/local/hive/lib/commons-pool-1.5.4.jar:/usr/local/hive/lib/hive-hwi-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/datanucleus-rdbms-2.0.3.jar:/usr/local/hive/lib/hive-builtins-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/sqlline-1_0_2.jar:/usr/local/hive/lib/hive-pdk-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/libthrift-0.7.0.jar:/usr/local/hive/lib/jdo2-api-2.3-ec.jar:/usr/local/hive/lib/jline-0.9.94.jar:/usr/local/hive/lib/hive-service-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/json-20090211.jar:/usr/local/hive/lib/jetty-util-6.1.26.jar:/usr/local/hive/lib/derby-10.4.2.0.jar:/usr/local/hive/lib/commons-configuration-1.6.jar:/usr/local/hive/lib/jetty-6.1.26.jar:/usr/local/hive/lib/hive-exec-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/slf4j-log4j12-1.6.1.jar:/usr/local/hive/lib/commons-compress-1.4.1.jar:/usr/local/hive/lib/commons-logging-api-1.0.4.jar:/usr/local/hive/lib/hive-cli-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/hive-shims-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/jackson-xc-1.8.8.jar:/usr/local/hive/lib/hive-common-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/commons-lang-2.4.jar:/usr/local/hive/lib/datanucleus-connectionpool-2.0.3.jar:/usr/local/hive/lib/hive-hbase-handler-0.10.0-SNAPSHOT.jar:/usr/local/hive/lib/JavaEWAH-0.3.2.jar:/usr/local/hive/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hive/lib/antlr-2.7.7.jar:/usr/local/hive/lib/libfb303-0.7.0.jar:/usr/local/hive/lib/guava-11.0.2.jar:/usr/local/hive/lib/hbase-0.92.0-tests.jar:/usr/local/hive/lib/slf4j-api-1.6.1.jar:/usr/local/hive/lib/zookeeper-3.4.3.jar:/usr/local/hive/lib/commons-cli-1.2.jar:/usr/local/hive/lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/hadoop/lib/sqoop-1.4.2.jar
Note: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
12/10/26 17:39:58 DEBUG orm.CompilationManager: Could not rename /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.java to /home/xxxxx/./TABLENAME.java
org.apache.commons.io.FileExistsException: Destination '/home/xxxxx/./TABLENAME.java' already exists
        at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
        at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:390)
        at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:64)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
        at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
12/10/26 17:39:58 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.jar
12/10/26 17:39:58 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946
12/10/26 17:39:58 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.class -> TABLENAME.class
12/10/26 17:39:58 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-xxxxx/compile/3a8443bba92a6cc65a6eef5b0999e946/TABLENAME.jar
12/10/26 17:39:58 DEBUG manager.CatalogQueryManager: Retrieving primary key for table 'TABLENAME' with query SELECT kcu.COLUMN_NAME FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc,   INFORMATION_SCHEMA.KEY_COLUMN_USAGE kcu WHERE tc.TABLE_SCHEMA = kcu.TABLE_SCHEMA   AND tc.TABLE_NAME = kcu.TABLE_NAME   AND tc.CONSTRAINT_SCHEMA = kcu.CONSTRAINT_SCHEMA   AND tc.CONSTRAINT_NAME = kcu.CONSTRAINT_NAME   AND tc.TABLE_SCHEMA = (SELECT SCHEMA_NAME())   AND tc.TABLE_NAME = 'TABLENAME'   AND tc.CONSTRAINT_TYPE = 'PRIMARY KEY'

12/10/26 17:39:58 WARN manager.CatalogQueryManager: The table TABLENAME contains a multi-column primary key. Sqoop will default to the column Item only for this job.
12/10/26 17:39:58 DEBUG manager.CatalogQueryManager: Retrieving primary key for table 'TABLENAME' with query SELECT kcu.COLUMN_NAME FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc,   INFORMATION_SCHEMA.KEY_COLUMN_USAGE kcu WHERE tc.TABLE_SCHEMA = kcu.TABLE_SCHEMA   AND tc.TABLE_NAME = kcu.TABLE_NAME   AND tc.CONSTRAINT_SCHEMA = kcu.CONSTRAINT_SCHEMA   AND tc.CONSTRAINT_NAME = kcu.CONSTRAINT_NAME   AND tc.TABLE_SCHEMA = (SELECT SCHEMA_NAME())   AND tc.TABLE_NAME = 'TABLENAME'   AND tc.CONSTRAINT_TYPE = 'PRIMARY KEY'

12/10/26 17:39:58 WARN manager.CatalogQueryManager: The table TABLENAME contains a multi-column primary key. Sqoop will default to the column Item only for this job.
12/10/26 17:39:58 INFO mapreduce.ImportJobBase: Beginning import of TABLENAME
12/10/26 17:39:58 DEBUG mapreduce.DataDrivenImportJob: Using table class: TABLENAME
12/10/26 17:39:58 DEBUG mapreduce.DataDrivenImportJob: Using InputFormat: class com.cloudera.sqoop.mapreduce.db.DataDrivenDBInputFormat
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/hadoop/lib/sqoop-1.4.2.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/hadoop/lib/sqljdbc4.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/hadoop/lib/sqoop-1.4.2.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/hadoop/lib/sqoop-1.4.2.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-1.5.3.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-ipc-1.5.3.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/commons-io-1.4.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/hsqldb-1.8.0.10.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqljdbc4.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-mapred-1.5.3.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/paranamer-2.3.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
12/10/26 17:39:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
12/10/26 17:40:02 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN([Item]), MAX([Item]) FROM [TABLENAME]
12/10/26 17:40:02 INFO mapred.JobClient: Cleaning up the staging area hdfs://SERVERNAME1:54310/user/xxxxx/.staging/job_201210261704_0005
12/10/26 17:40:02 ERROR security.UserGroupInformation: PriviledgedActionException as:xxxxx cause:java.io.IOException: Unknown SQL data type: -9
12/10/26 17:40:02 ERROR tool.ImportAllTablesTool: Encountered IOException running import job: java.io.IOException: Unknown SQL data type: -9



[xxxxx@servername ~]$ java -version
java version "1.6.0_33"
Java(TM) SE Runtime Environment (build 1.6.0_33-b03)
Java HotSpot(TM) 64-Bit Server VM (build 20.8-b03, mixed mode)

/usr/lib/jvm/jre-1.6.0-sun.x86_64/bin/java

-----Original Message-----
From: Jarek Jarcec Cecho [mailto:jarcec@apache.org] 
Sent: Friday, October 26, 2012 5:17 PM
To: user@sqoop.apache.org
Subject: Re: SQOOP 1.3.0 questions using SQL Server

Hi Artem,
yes, nvarchar (SQL type -9) should be supported on 1.4.2. Would you mind sharing with us the entire Sqoop command line and the corresponding log generated with the --verbose argument? Also, what version of Java are you using?

Jarcec

On Fri, Oct 26, 2012 at 09:09:56PM +0000, Artem Ervits wrote:
> I have Sqoop 1.4.2 installed
> 
> Running sqoop version returns
> 
> Sqoop 1.4.2
> git commit id
> Compiled by ag on Tue Aug 14 17:37:19 IST 2012
> 
> When I tried to run a sqoop import-all-tables I got "Unknown SQL Data type: -9". From the discussion below, nvarchar should be supported in this version of sqoop, please advise?
> 
> Thank you.
> 
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:jarcec@apache.org]
> Sent: Monday, October 15, 2012 4:34 PM
> To: user@sqoop.apache.org
> Cc: Marc Sturm
> Subject: Re: SQOOP 1.3.0 questions using SQL Server
> 
> Hi Marc,
> please find my answers inline:
> 
> On Mon, Oct 15, 2012 at 07:25:28PM +0000, Artem Ervits wrote:
> > I installed SQOOP 1.4.2 and was able to compress and then uncompress the data. I used the bin.tar.gz installer.
> 
> Great!
> 
> > 
> > 1.       I did try to use the git repo to compile the sqoop jar and was able to compile it no problem which resulted in 1.4.3 snapshot jar using command ant jar-all. I noticed that  all other jars like avro are not present in the build or lib directory. How may I go about downloading source and compiling it to include avro, etc?
> 
> If you need to build entire distribution (like the one that is offered in download page) including (almost) all dependency jars you need to run command "ant -Dhadoopversion=$hadoopVersion tar", where $hadoopVersion is your hadoop version. Supported values are 20, 23, 100 and 200.
> 
> > 2.       I also tried to use the incremental switch, is this a correct command: "sqoop import -incremental 'lastmodified'" or does the command look some other way?
> 
> Yes that is the intended usage. You might find more information about incremental imports in Sqoop documentation:
> 
> http://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_incremental_im
> ports
> 
> > Thank you.
> 
> Jarcec
> 
> > 
> > 
> > From: Artem Ervits [mailto:are9004@nyp.org]
> > Sent: Friday, October 12, 2012 5:17 PM
> > To: user@sqoop.apache.org
> > Cc: Marc Sturm
> > Subject: RE: SQOOP 1.3.0 questions using SQL Server
> > 
> > Even better!
> > 
> > Thank you.
> > 
> > From: Chalcy
> > [mailto:chalcy@gmail.com]<mailto:[mailto:chalcy@gmail.com]>
> > Sent: Friday, October 12, 2012 4:18 PM
> > To: user@sqoop.apache.org<ma...@sqoop.apache.org>
> > Subject: Re: SQOOP 1.3.0 questions using SQL Server
> > 
> > Sqoop 1.4.3 freshly built even works very well with sql server. Just 
> > get the sqoop jar and replace it and should work just fine.  I so 
> > far never had any issues changing sqoop version :)
> > 
> > Good luck!
> > 
> > Thanks,
> > Chalcy
> > On Fri, Oct 12, 2012 at 2:16 PM, Artem Ervits <ar...@nyp.org>> wrote:
> > Actually I was just able to test 3) below and it succeeded, I cast the nvarchar as char.
> > 
> > Thank you again, I will do 1) and 2) and report back the results.
> > 
> > -----Original Message-----
> > From: Jarek Jarcec Cecho
> > [mailto:jarcec@apache.org<ma...@apache.org>]
> > Sent: Friday, October 12, 2012 1:51 PM
> > To: user@sqoop.apache.org<ma...@sqoop.apache.org>
> > Subject: Re: SQOOP 1.3.0 questions using SQL Server Hi Artem, Sqoop
> > 1.3.0 is very old release, so I would definitely recommend to upgrade to latest 1.4.2. Please note that this release include SQOOP-480[1] that fixes some issues for connectors compiled against 1.3 in 1.4. Sqoop is simple command line utility so you might try to download release into some temporary folder and verify that everything is working prior upgrading Sqoop system wise.
> > 
> > Specifically to your questions
> > 
> > 1) --incremental
> > Usually Sqoop will print out error message first followed by help page, so I would advise checking few initial lines to see what exactly is wrong.
> > 
> > 2) --compress
> > Please try to reproduce this issue on 1.4.2 and file jira if you'll be able to reproduce it.
> > 
> > 3) --as-avrodatafile
> > Based on [2] SQL type -9 corresponds to data type NVARCHAR. We've added support for NVARCHAR in SQOOP-323 [3] that is part of Sqoop since version 1.4.0-incubating - e.g. 1.3.0 version do not support NVARCHAR type out of the box.
> > 
> > Jarcec
> > 
> > Links:
> > 1: https://issues.apache.org/jira/browse/SQOOP-480
> > 2: 
> > http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.s
> > ql
> > .Types.DECIMAL
> > 3: https://issues.apache.org/jira/browse/SQOOP-323
> > 
> > On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
> > >
> > > Hello all,
> > >
> > > I'm testing a few switches for Sqoop import and I'm having the following problems.
> > >
> > > When I use the -incremental switch, the command fails and shows me the help page.
> > > When I use the -compress switch, the command works but when I try to uncompress the results, it says gzip undefined compression code. I also tried to explicitly state the codec and it would still not append the compression extension to the files nor am I able to uncompress the data.
> > > When I use the -as-avrodatafile I get ERROR tool.ImportTool: 
> > > Imported Failed: Cannot convert SQL type -9
> > >
> > > Any ideas? I am not sure if upgrading Sqoop will fix it because SQL Server connector specifically required a 1.3.0 release.
> > >
> > > Thanks.
> > >
> > > Artem Ervits
> > > Data Analyst
> > > New York Presbyterian Hospital
> > >
> > >
> > >
> > > --------------------
> > >
> > > This electronic message is intended to be for the use only of the named recipient, and may contain information that is confidential or privileged.  If you are not the intended recipient, you are hereby notified that any disclosure, copying, distribution or use of the contents of this message is strictly prohibited.  If you have received this message in error or are not the named recipient, please notify us immediately by contacting the sender at the electronic mail address noted above, and delete and destroy all copies of this message.  Thank you.
> > --------------------
> > 
> > Confidential Information subject to NYP's (and its affiliates') information management and security policies (http://infonet.nyp.org/QA/HospManual/).

Re: SQOOP 1.3.0 questions using SQL Server

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Artem,
Yes, NVARCHAR (SQL type -9) should be supported on 1.4.2. Would you mind sharing with us the entire Sqoop command line and the corresponding log generated with the --verbose argument? Also, what version of Java are you using?
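For reference, a minimal sketch of the kind of invocation and log capture being asked for here; the connection string, credentials, and table name below are placeholders, not details from this thread:

```shell
# Hypothetical Sqoop 1.4.2 import; every identifier below is a placeholder.
SQOOP_CMD="sqoop import \
  --connect jdbc:sqlserver://dbhost:1433;databaseName=mydb \
  --username sqoop_user -P \
  --table SOME_TABLE \
  --verbose"

# Run it and capture stdout and stderr to a log that can be shared:
# eval "$SQOOP_CMD" 2>&1 | tee sqoop-import.log
echo "$SQOOP_CMD"
```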

Jarcec

On Fri, Oct 26, 2012 at 09:09:56PM +0000, Artem Ervits wrote:
> I have Sqoop 1.4.2 installed 
> 
> Running sqoop version returns
> 
> Sqoop 1.4.2
> git commit id
> Compiled by ag on Tue Aug 14 17:37:19 IST 2012
> 
> When I tried to run sqoop import-all-tables, I got "Unknown SQL Data type: -9". From the discussion below, NVARCHAR should be supported in this version of Sqoop; please advise.
> 
> Thank you.
> 
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:jarcec@apache.org] 
> Sent: Monday, October 15, 2012 4:34 PM
> To: user@sqoop.apache.org
> Cc: Marc Sturm
> Subject: Re: SQOOP 1.3.0 questions using SQL Server
> 
> Hi Marc,
> please find my answers inline:
> 
> On Mon, Oct 15, 2012 at 07:25:28PM +0000, Artem Ervits wrote:
> > I installed SQOOP 1.4.2 and was able to compress and then uncompress the data. I used the bin.tar.gz installer.
> 
> Great!
> 
> > 
> > 1. I did try to compile the Sqoop jar from the git repo and was able to compile it with no problem, which resulted in a 1.4.3-SNAPSHOT jar, using the command "ant jar-all". I noticed that all the other jars, such as Avro, are not present in the build or lib directory. How can I download the source and compile it so that Avro etc. are included?
> 
> > If you need to build the entire distribution (like the one offered on the download page), including (almost) all dependency jars, run the command "ant -Dhadoopversion=$hadoopVersion tar", where $hadoopVersion is your Hadoop version. Supported values are 20, 23, 100 and 200.
> 
> > 2. I also tried to use the incremental switch. Is this a correct command: "sqoop import -incremental 'lastmodified'", or does the command look some other way?
> 
> Yes that is the intended usage. You might find more information about incremental imports in Sqoop documentation:
> 
> http://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_incremental_imports
> 
> > Thank you.
> 
> Jarcec
> 
> > 
> > 
> > From: Artem Ervits [mailto:are9004@nyp.org]
> > Sent: Friday, October 12, 2012 5:17 PM
> > To: user@sqoop.apache.org
> > Cc: Marc Sturm
> > Subject: RE: SQOOP 1.3.0 questions using SQL Server
> > 
> > Even better!
> > 
> > Thank you.
> > 
> > From: Chalcy [mailto:chalcy@gmail.com]
> > Sent: Friday, October 12, 2012 4:18 PM
> > To: user@sqoop.apache.org<ma...@sqoop.apache.org>
> > Subject: Re: SQOOP 1.3.0 questions using SQL Server
> > 
> > A freshly built Sqoop 1.4.3 even works very well with SQL Server. Just get the Sqoop jar and replace it, and it should work just fine. So far I have never had any issues changing Sqoop versions :)
> > 
> > Good luck!
> > 
> > Thanks,
> > Chalcy
> > On Fri, Oct 12, 2012 at 2:16 PM, Artem Ervits <ar...@nyp.org>> wrote:
> > Actually I was just able to test 3) below and it succeeded, I cast the nvarchar as char.
> > 
> > Thank you again, I will do 1) and 2) and report back the results.
> > 
> > -----Original Message-----
> > From: Jarek Jarcec Cecho 
> > [mailto:jarcec@apache.org<ma...@apache.org>]
> > Sent: Friday, October 12, 2012 1:51 PM
> > To: user@sqoop.apache.org<ma...@sqoop.apache.org>
> > Subject: Re: SQOOP 1.3.0 questions using SQL Server
> > 
> > Hi Artem,
> > Sqoop 1.3.0 is a very old release, so I would definitely recommend upgrading to the latest 1.4.2. Please note that this release includes SQOOP-480 [1], which fixes some issues for connectors compiled against 1.3 running on 1.4. Sqoop is a simple command-line utility, so you might try downloading the release into a temporary folder and verifying that everything works before upgrading Sqoop system-wide.
> > 
> > Specifically, to your questions:
> > 
> > 1) --incremental
> > Usually Sqoop will print out the error message first, followed by the help page, so I would advise checking the first few lines of output to see what exactly is wrong.
> > 
> > 2) --compress
> > Please try to reproduce this issue on 1.4.2 and file a JIRA if you are able to reproduce it.
> > 
> > 3) --as-avrodatafile
> > Based on [2], SQL type -9 corresponds to the data type NVARCHAR. We added support for NVARCHAR in SQOOP-323 [3], which has been part of Sqoop since version 1.4.0-incubating; i.e., version 1.3.0 does not support the NVARCHAR type out of the box.
> > 
> > Jarcec
> > 
> > Links:
> > 1: https://issues.apache.org/jira/browse/SQOOP-480
> > 2: http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.sql.Types.DECIMAL
> > 3: https://issues.apache.org/jira/browse/SQOOP-323
> > 
> > On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
> > >
> > > Hello all,
> > >
> > > I'm testing a few switches for Sqoop import and I'm having the following problems.
> > >
> > > When I use the -incremental switch, the command fails and shows me the help page.
> > > When I use the -compress switch, the command works but when I try to uncompress the results, it says gzip undefined compression code. I also tried to explicitly state the codec and it would still not append the compression extension to the files nor am I able to uncompress the data.
> > > When I use the -as-avrodatafile I get ERROR tool.ImportTool: 
> > > Imported Failed: Cannot convert SQL type -9
> > >
> > > Any ideas? I am not sure if upgrading Sqoop will fix it because SQL Server connector specifically required a 1.3.0 release.
> > >
> > > Thanks.
> > >
> > > Artem Ervits
> > > Data Analyst
> > > New York Presbyterian Hospital
> > >
> > >
> > >
> 

RE: SQOOP 1.3.0 questions using SQL Server

Posted by Artem Ervits <ar...@nyp.org>.
I have Sqoop 1.4.2 installed 

Running sqoop version returns

Sqoop 1.4.2
git commit id
Compiled by ag on Tue Aug 14 17:37:19 IST 2012

When I tried to run sqoop import-all-tables, I got "Unknown SQL Data type: -9". From the discussion below, NVARCHAR should be supported in this version of Sqoop; please advise.

Thank you.
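As a stopgap while this is investigated (following the earlier suggestion in this thread of casting nvarchar to char), a per-table --query import can cast the offending columns. Everything below is an invented sketch, not a tested command; the table and column names are placeholders:

```shell
# Hypothetical workaround: cast NVARCHAR columns so Sqoop sees a type it
# can already map. All identifiers here are invented for illustration.
QUERY='SELECT id, CAST(name AS VARCHAR(255)) AS name FROM dbo.people WHERE $CONDITIONS'
SQOOP_CMD="sqoop import \
  --connect jdbc:sqlserver://dbhost:1433;databaseName=mydb \
  --username sqoop_user -P \
  --query \"$QUERY\" \
  --split-by id \
  --target-dir /user/import/people"
echo "$SQOOP_CMD"
```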

-----Original Message-----
From: Jarek Jarcec Cecho [mailto:jarcec@apache.org] 
Sent: Monday, October 15, 2012 4:34 PM
To: user@sqoop.apache.org
Cc: Marc Sturm
Subject: Re: SQOOP 1.3.0 questions using SQL Server

Hi Marc,
please find my answers inline:

On Mon, Oct 15, 2012 at 07:25:28PM +0000, Artem Ervits wrote:
> I installed SQOOP 1.4.2 and was able to compress and then uncompress the data. I used the bin.tar.gz installer.

Great!

> 
> 1. I did try to compile the Sqoop jar from the git repo and was able to compile it with no problem, which resulted in a 1.4.3-SNAPSHOT jar, using the command "ant jar-all". I noticed that all the other jars, such as Avro, are not present in the build or lib directory. How can I download the source and compile it so that Avro etc. are included?

> If you need to build the entire distribution (like the one offered on the download page), including (almost) all dependency jars, run the command "ant -Dhadoopversion=$hadoopVersion tar", where $hadoopVersion is your Hadoop version. Supported values are 20, 23, 100 and 200.

> 2. I also tried to use the incremental switch. Is this a correct command: "sqoop import -incremental 'lastmodified'", or does the command look some other way?

Yes that is the intended usage. You might find more information about incremental imports in Sqoop documentation:

http://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_incremental_imports

> Thank you.

Jarcec

> 
> 
> From: Artem Ervits [mailto:are9004@nyp.org]
> Sent: Friday, October 12, 2012 5:17 PM
> To: user@sqoop.apache.org
> Cc: Marc Sturm
> Subject: RE: SQOOP 1.3.0 questions using SQL Server
> 
> Even better!
> 
> Thank you.
> 
> From: Chalcy [mailto:chalcy@gmail.com]
> Sent: Friday, October 12, 2012 4:18 PM
> To: user@sqoop.apache.org<ma...@sqoop.apache.org>
> Subject: Re: SQOOP 1.3.0 questions using SQL Server
> 
> A freshly built Sqoop 1.4.3 even works very well with SQL Server. Just get the Sqoop jar and replace it, and it should work just fine. So far I have never had any issues changing Sqoop versions :)
> 
> Good luck!
> 
> Thanks,
> Chalcy
> On Fri, Oct 12, 2012 at 2:16 PM, Artem Ervits <ar...@nyp.org>> wrote:
> Actually I was just able to test 3) below and it succeeded, I cast the nvarchar as char.
> 
> Thank you again, I will do 1) and 2) and report back the results.
> 
> -----Original Message-----
> From: Jarek Jarcec Cecho 
> [mailto:jarcec@apache.org<ma...@apache.org>]
> Sent: Friday, October 12, 2012 1:51 PM
> To: user@sqoop.apache.org<ma...@sqoop.apache.org>
> Subject: Re: SQOOP 1.3.0 questions using SQL Server
> 
> Hi Artem,
> Sqoop 1.3.0 is a very old release, so I would definitely recommend upgrading to the latest 1.4.2. Please note that this release includes SQOOP-480 [1], which fixes some issues for connectors compiled against 1.3 running on 1.4. Sqoop is a simple command-line utility, so you might try downloading the release into a temporary folder and verifying that everything works before upgrading Sqoop system-wide.
> 
> Specifically, to your questions:
> 
> 1) --incremental
> Usually Sqoop will print out the error message first, followed by the help page, so I would advise checking the first few lines of output to see what exactly is wrong.
> 
> 2) --compress
> Please try to reproduce this issue on 1.4.2 and file a JIRA if you are able to reproduce it.
> 
> 3) --as-avrodatafile
> Based on [2], SQL type -9 corresponds to the data type NVARCHAR. We added support for NVARCHAR in SQOOP-323 [3], which has been part of Sqoop since version 1.4.0-incubating; i.e., version 1.3.0 does not support the NVARCHAR type out of the box.
> 
> Jarcec
> 
> Links:
> 1: https://issues.apache.org/jira/browse/SQOOP-480
> 2: http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.sql.Types.DECIMAL
> 3: https://issues.apache.org/jira/browse/SQOOP-323
> 
> On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
> >
> > Hello all,
> >
> > I'm testing a few switches for Sqoop import and I'm having the following problems.
> >
> > When I use the -incremental switch, the command fails and shows me the help page.
> > When I use the -compress switch, the command works but when I try to uncompress the results, it says gzip undefined compression code. I also tried to explicitly state the codec and it would still not append the compression extension to the files nor am I able to uncompress the data.
> > When I use the -as-avrodatafile I get ERROR tool.ImportTool: 
> > Imported Failed: Cannot convert SQL type -9
> >
> > Any ideas? I am not sure if upgrading Sqoop will fix it because SQL Server connector specifically required a 1.3.0 release.
> >
> > Thanks.
> >
> > Artem Ervits
> > Data Analyst
> > New York Presbyterian Hospital
> >
> >
> >


Re: SQOOP 1.3.0 questions using SQL Server

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Marc,
please find my answers inline:

On Mon, Oct 15, 2012 at 07:25:28PM +0000, Artem Ervits wrote:
> I installed SQOOP 1.4.2 and was able to compress and then uncompress the data. I used the bin.tar.gz installer.

Great!
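A quick sanity check one can apply to compressed import output, simulated here with a local file since the original complaint was gzip refusing to decompress the part files (an actual Sqoop part file would first be fetched from HDFS, e.g. with hdfs dfs -get):

```shell
# Simulate a compressed part file locally and verify it is valid gzip.
printf 'id,name\n1,alice\n' | gzip > part-m-00000.gz
gzip -t part-m-00000.gz && echo "valid gzip"
gunzip -c part-m-00000.gz | head -n 1
# rm -f part-m-00000.gz when done
```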

> 
> 1. I did try to compile the Sqoop jar from the git repo and was able to compile it with no problem, which resulted in a 1.4.3-SNAPSHOT jar, using the command "ant jar-all". I noticed that all the other jars, such as Avro, are not present in the build or lib directory. How can I download the source and compile it so that Avro etc. are included?

If you need to build the entire distribution (like the one offered on the download page), including (almost) all dependency jars, run the command "ant -Dhadoopversion=$hadoopVersion tar", where $hadoopVersion is your Hadoop version. Supported values are 20, 23, 100 and 200.
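Spelled out, the build step might look like the sketch below; the Hadoop profile chosen (100, i.e. Hadoop 1.0.x) and the output path are assumptions, and the heavy command itself is left commented:

```shell
# Assumed workflow, run from a checked-out Sqoop 1.4.x source tree:
# ant -Dhadoopversion=100 tar
# The distribution tarball (with dependency jars under lib/) should then
# appear under the build directory:
# ls build/sqoop-*.tar.gz
BUILD_CMD="ant -Dhadoopversion=100 tar"
echo "$BUILD_CMD"
```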

> 2. I also tried to use the incremental switch. Is this a correct command: "sqoop import -incremental 'lastmodified'", or does the command look some other way?

Yes that is the intended usage. You might find more information about incremental imports in Sqoop documentation:

http://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_incremental_imports
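Fleshed out with the companion options described in that documentation, a lastmodified import might be sketched as follows; the connection details, check column, and starting timestamp are invented for illustration:

```shell
# Hypothetical incremental import: --incremental requires --check-column,
# and lastmodified mode resumes from the timestamp given in --last-value.
SQOOP_INC="sqoop import \
  --connect jdbc:sqlserver://dbhost:1433;databaseName=mydb \
  --username sqoop_user -P \
  --table SOME_TABLE \
  --incremental lastmodified \
  --check-column last_update \
  --last-value '2012-10-01 00:00:00'"
echo "$SQOOP_INC"
```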

> Thank you.

Jarcec

> 
> 
> From: Artem Ervits [mailto:are9004@nyp.org]
> Sent: Friday, October 12, 2012 5:17 PM
> To: user@sqoop.apache.org
> Cc: Marc Sturm
> Subject: RE: SQOOP 1.3.0 questions using SQL Server
> 
> Even better!
> 
> Thank you.
> 
> From: Chalcy [mailto:chalcy@gmail.com]
> Sent: Friday, October 12, 2012 4:18 PM
> To: user@sqoop.apache.org<ma...@sqoop.apache.org>
> Subject: Re: SQOOP 1.3.0 questions using SQL Server
> 
> A freshly built Sqoop 1.4.3 even works very well with SQL Server. Just get the Sqoop jar and replace it, and it should work just fine. So far I have never had any issues changing Sqoop versions :)
> 
> Good luck!
> 
> Thanks,
> Chalcy
> On Fri, Oct 12, 2012 at 2:16 PM, Artem Ervits <ar...@nyp.org>> wrote:
> Actually I was just able to test 3) below and it succeeded, I cast the nvarchar as char.
> 
> Thank you again, I will do 1) and 2) and report back the results.
> 
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:jarcec@apache.org<ma...@apache.org>]
> Sent: Friday, October 12, 2012 1:51 PM
> To: user@sqoop.apache.org<ma...@sqoop.apache.org>
> Subject: Re: SQOOP 1.3.0 questions using SQL Server
> Hi Artem,
> Sqoop 1.3.0 is a very old release, so I would definitely recommend upgrading to the latest 1.4.2. Please note that this release includes SQOOP-480 [1], which fixes some issues for connectors compiled against 1.3 running on 1.4. Sqoop is a simple command-line utility, so you might try downloading the release into a temporary folder and verifying that everything works before upgrading Sqoop system-wide.
> 
> Specifically, to your questions:
> 
> 1) --incremental
> Usually Sqoop will print out the error message first, followed by the help page, so I would advise checking the first few lines of output to see what exactly is wrong.
> 
> 2) --compress
> Please try to reproduce this issue on 1.4.2 and file a JIRA if you are able to reproduce it.
> 
> 3) --as-avrodatafile
> Based on [2], SQL type -9 corresponds to the data type NVARCHAR. We added support for NVARCHAR in SQOOP-323 [3], which has been part of Sqoop since version 1.4.0-incubating; i.e., version 1.3.0 does not support the NVARCHAR type out of the box.
> 
> Jarcec
> 
> Links:
> 1: https://issues.apache.org/jira/browse/SQOOP-480
> 2: http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.sql.Types.DECIMAL
> 3: https://issues.apache.org/jira/browse/SQOOP-323
> 
> On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
> >
> > Hello all,
> >
> > I'm testing a few switches for Sqoop import and I'm having the following problems.
> >
> > When I use the -incremental switch, the command fails and shows me the help page.
> > When I use the -compress switch, the command works but when I try to uncompress the results, it says gzip undefined compression code. I also tried to explicitly state the codec and it would still not append the compression extension to the files nor am I able to uncompress the data.
> > When I use the -as-avrodatafile I get ERROR tool.ImportTool: Imported Failed: Cannot convert SQL type -9
> >
> > Any ideas? I am not sure if upgrading Sqoop will fix it because SQL Server connector specifically required a 1.3.0 release.
> >
> > Thanks.
> >
> > Artem Ervits
> > Data Analyst
> > New York Presbyterian Hospital
> >
> >
> >
> > --------------------
> >
> > This electronic message is intended to be for the use only of the named recipient, and may contain information that is confidential or privileged.  If you are not the intended recipient, you are hereby notified that any disclosure, copying, distribution or use of the contents of this message is strictly prohibited.  If you have received this message in error or are not the named recipient, please notify us immediately by contacting the sender at the electronic mail address noted above, and delete and destroy all copies of this message.  Thank you.

RE: SQOOP 1.3.0 questions using SQL Server

Posted by Artem Ervits <ar...@nyp.org>.
I installed SQOOP 1.4.2 and was able to compress and then uncompress the data. I used the bin.tar.gz installer.
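For anyone hitting the same compression problem, this is roughly the shape of what worked; everything below (connection string, table, paths) is illustrative only. With --compress and no explicit codec, Sqoop defaults to gzip and the part files should carry a .gz extension:

```shell
# Illustrative only: connection string, table, and paths are placeholders.
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=mydb" \
  --username sa -P \
  --table patients \
  --compress \
  --target-dir /data/patients_gz

# Sanity check that the output really is gzip:
hadoop fs -cat /data/patients_gz/part-m-00000.gz | gzip -dc | head
```

A different codec can be requested with --compression-codec (e.g. spelling out org.apache.hadoop.io.compress.GzipCodec explicitly).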


1.       I did try to use the git repo to compile the Sqoop jar and was able to compile it with no problem, which resulted in a 1.4.3-SNAPSHOT jar, using the command ant jar-all. I noticed that all the other jars, like Avro, are not present in the build or lib directory. How can I go about downloading the source and compiling it so that Avro, etc. are included?

2.       I also tried to use the incremental switch. Is this the correct command: "sqoop import -incremental 'lastmodified'", or does the command look some other way?

Thank you.
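Regarding question 1, one possible approach is a sketch like the following: the Ant build's packaging target resolves third-party dependencies such as Avro via Ivy and collects them into a distribution directory. The repository URL and target names here are assumptions about the 1.4.x build; run `ant -p` in your checkout to confirm what targets actually exist:

```shell
# Assumptions: the apache/sqoop git mirror and the 1.4.x ant targets;
# verify target names with `ant -p` before relying on them.
git clone https://github.com/apache/sqoop.git
cd sqoop
ant package            # builds a full distribution, pulling dependencies via Ivy
ls build/sqoop-*/lib   # third-party jars (avro, etc.) should appear here
```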


From: Artem Ervits [mailto:are9004@nyp.org]
Sent: Friday, October 12, 2012 5:17 PM
To: user@sqoop.apache.org
Cc: Marc Sturm
Subject: RE: SQOOP 1.3.0 questions using SQL Server

Even better!

Thank you.

From: Chalcy [mailto:chalcy@gmail.com]
Sent: Friday, October 12, 2012 4:18 PM
To: user@sqoop.apache.org
Subject: Re: SQOOP 1.3.0 questions using SQL Server

Sqoop 1.4.3, freshly built, even works very well with SQL Server. Just get the Sqoop jar and replace it, and it should work just fine. So far I have never had any issues changing Sqoop versions :)

Good luck!

Thanks,
Chalcy
On Fri, Oct 12, 2012 at 2:16 PM, Artem Ervits <ar...@nyp.org> wrote:
Actually, I was just able to test 3) below and it succeeded; I cast the NVARCHAR as CHAR.

Thank you again, I will do 1) and 2) and report back the results.

-----Original Message-----
From: Jarek Jarcec Cecho [mailto:jarcec@apache.org]
Sent: Friday, October 12, 2012 1:51 PM
To: user@sqoop.apache.org
Subject: Re: SQOOP 1.3.0 questions using SQL Server
Hi Artem,
Sqoop 1.3.0 is a very old release, so I would definitely recommend upgrading to the latest 1.4.2. Please note that this release includes SQOOP-480 [1], which fixes some issues with connectors compiled against 1.3 running on 1.4. Sqoop is a simple command-line utility, so you might try downloading the release into a temporary folder and verifying that everything works before upgrading Sqoop system-wide.

Specifically, to your questions:

1) --incremental
Sqoop usually prints the error message first, followed by the help page, so I would advise checking the first few lines of output to see what exactly went wrong.

2) --compress
Please try to reproduce this issue on 1.4.2 and file a JIRA if it is still reproducible.

3) --as-avrodatafile
Based on [2], SQL type -9 corresponds to the data type NVARCHAR. We added support for NVARCHAR in SQOOP-323 [3], which has been part of Sqoop since version 1.4.0-incubating; i.e., the 1.3.0 release does not support the NVARCHAR type out of the box.

Jarcec

Links:
1: https://issues.apache.org/jira/browse/SQOOP-480
2: http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.sql.Types.DECIMAL
3: https://issues.apache.org/jira/browse/SQOOP-323

On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
>
> Hello all,
>
> I'm testing a few switches for Sqoop import and I'm having the following problems.
>
> When I use the -incremental switch, the command fails and shows me the help page.
> When I use the -compress switch, the command works but when I try to uncompress the results, it says gzip undefined compression code. I also tried to explicitly state the codec and it would still not append the compression extension to the files nor am I able to uncompress the data.
> When I use the -as-avrodatafile I get ERROR tool.ImportTool: Imported Failed: Cannot convert SQL type -9
>
> Any ideas? I am not sure if upgrading Sqoop will fix it because SQL Server connector specifically required a 1.3.0 release.
>
> Thanks.
>
> Artem Ervits
> Data Analyst
> New York Presbyterian Hospital
>
>
>

RE: SQOOP 1.3.0 questions using SQL Server

Posted by Artem Ervits <ar...@nyp.org>.
Even better!

Thank you.

From: Chalcy [mailto:chalcy@gmail.com]
Sent: Friday, October 12, 2012 4:18 PM
To: user@sqoop.apache.org
Subject: Re: SQOOP 1.3.0 questions using SQL Server

Sqoop 1.4.3, freshly built, even works very well with SQL Server. Just get the Sqoop jar and replace it, and it should work just fine. So far I have never had any issues changing Sqoop versions :)

Good luck!

Thanks,
Chalcy
On Fri, Oct 12, 2012 at 2:16 PM, Artem Ervits <ar...@nyp.org> wrote:
Actually, I was just able to test 3) below and it succeeded; I cast the NVARCHAR as CHAR.

Thank you again, I will do 1) and 2) and report back the results.

-----Original Message-----
From: Jarek Jarcec Cecho [mailto:jarcec@apache.org]
Sent: Friday, October 12, 2012 1:51 PM
To: user@sqoop.apache.org
Subject: Re: SQOOP 1.3.0 questions using SQL Server
Hi Artem,
Sqoop 1.3.0 is a very old release, so I would definitely recommend upgrading to the latest 1.4.2. Please note that this release includes SQOOP-480 [1], which fixes some issues with connectors compiled against 1.3 running on 1.4. Sqoop is a simple command-line utility, so you might try downloading the release into a temporary folder and verifying that everything works before upgrading Sqoop system-wide.

Specifically, to your questions:

1) --incremental
Sqoop usually prints the error message first, followed by the help page, so I would advise checking the first few lines of output to see what exactly went wrong.

2) --compress
Please try to reproduce this issue on 1.4.2 and file a JIRA if it is still reproducible.

3) --as-avrodatafile
Based on [2], SQL type -9 corresponds to the data type NVARCHAR. We added support for NVARCHAR in SQOOP-323 [3], which has been part of Sqoop since version 1.4.0-incubating; i.e., the 1.3.0 release does not support the NVARCHAR type out of the box.

Jarcec

Links:
1: https://issues.apache.org/jira/browse/SQOOP-480
2: http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.sql.Types.DECIMAL
3: https://issues.apache.org/jira/browse/SQOOP-323

On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
>
> Hello all,
>
> I'm testing a few switches for Sqoop import and I'm having the following problems.
>
> When I use the -incremental switch, the command fails and shows me the help page.
> When I use the -compress switch, the command works but when I try to uncompress the results, it says gzip undefined compression code. I also tried to explicitly state the codec and it would still not append the compression extension to the files nor am I able to uncompress the data.
> When I use the -as-avrodatafile I get ERROR tool.ImportTool: Imported Failed: Cannot convert SQL type -9
>
> Any ideas? I am not sure if upgrading Sqoop will fix it because SQL Server connector specifically required a 1.3.0 release.
>
> Thanks.
>
> Artem Ervits
> Data Analyst
> New York Presbyterian Hospital
>
>
>


Re: SQOOP 1.3.0 questions using SQL Server

Posted by Chalcy <ch...@gmail.com>.
Sqoop 1.4.3, freshly built, even works very well with SQL Server. Just get
the Sqoop jar and replace it, and it should work just fine. So far I have
never had any issues changing Sqoop versions :)

Good luck!

Thanks,
Chalcy

On Fri, Oct 12, 2012 at 2:16 PM, Artem Ervits <ar...@nyp.org> wrote:

> Actually, I was just able to test 3) below and it succeeded; I cast the
> NVARCHAR as CHAR.
>
> Thank you again, I will do 1) and 2) and report back the results.
>
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:jarcec@apache.org]
> Sent: Friday, October 12, 2012 1:51 PM
> To: user@sqoop.apache.org
> Subject: Re: SQOOP 1.3.0 questions using SQL Server
>
> Hi Artem,
> Sqoop 1.3.0 is a very old release, so I would definitely recommend
> upgrading to the latest 1.4.2. Please note that this release includes
> SQOOP-480 [1], which fixes some issues with connectors compiled against
> 1.3 running on 1.4. Sqoop is a simple command-line utility, so you might
> try downloading the release into a temporary folder and verifying that
> everything works before upgrading Sqoop system-wide.
>
> Specifically, to your questions:
>
> 1) --incremental
> Sqoop usually prints the error message first, followed by the help page,
> so I would advise checking the first few lines of output to see what
> exactly went wrong.
>
> 2) --compress
> Please try to reproduce this issue on 1.4.2 and file a JIRA if it is
> still reproducible.
>
> 3) --as-avrodatafile
> Based on [2], SQL type -9 corresponds to the data type NVARCHAR. We added
> support for NVARCHAR in SQOOP-323 [3], which has been part of Sqoop since
> version 1.4.0-incubating; i.e., the 1.3.0 release does not support the
> NVARCHAR type out of the box.
>
> Jarcec
>
> Links:
> 1: https://issues.apache.org/jira/browse/SQOOP-480
> 2:
> http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.sql.Types.DECIMAL
> 3: https://issues.apache.org/jira/browse/SQOOP-323
>
> On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
> >
> > Hello all,
> >
> > I'm testing a few switches for Sqoop import and I'm having the following
> problems.
> >
> > When I use the -incremental switch, the command fails and shows me the
> help page.
> > When I use the -compress switch, the command works but when I try to
> uncompress the results, it says gzip undefined compression code. I also
> tried to explicitly state the codec and it would still not append the
> compression extension to the files nor am I able to uncompress the data.
> > When I use the -as-avrodatafile I get ERROR tool.ImportTool: Imported
> Failed: Cannot convert SQL type -9
> >
> > Any ideas? I am not sure if upgrading Sqoop will fix it because SQL
> Server connector specifically required a 1.3.0 release.
> >
> > Thanks.
> >
> > Artem Ervits
> > Data Analyst
> > New York Presbyterian Hospital
> >
> >
> >

RE: SQOOP 1.3.0 questions using SQL Server

Posted by Artem Ervits <ar...@nyp.org>.
Actually, I was just able to test 3) below and it succeeded; I cast the NVARCHAR as CHAR.

Thank you again, I will do 1) and 2) and report back the results.
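For the record, the cast workaround can be expressed as a free-form query import. The table and column names below are made up for illustration, and note the literal $CONDITIONS token that Sqoop requires in --query imports (escaped here so the shell does not expand it):

```shell
# Hypothetical schema: casting the NVARCHAR column server-side keeps it
# away from the SQL type -9 mapping that Sqoop 1.3.0 cannot convert to Avro.
sqoop import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=mydb" \
  --username sa -P \
  --query "SELECT id, CAST(name AS varchar(100)) AS name FROM dbo.people WHERE \$CONDITIONS" \
  --split-by id \
  --target-dir /data/people \
  --as-avrodatafile
```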

-----Original Message-----
From: Jarek Jarcec Cecho [mailto:jarcec@apache.org] 
Sent: Friday, October 12, 2012 1:51 PM
To: user@sqoop.apache.org
Subject: Re: SQOOP 1.3.0 questions using SQL Server

Hi Artem,
Sqoop 1.3.0 is a very old release, so I would definitely recommend upgrading to the latest 1.4.2. Please note that this release includes SQOOP-480 [1], which fixes some issues with connectors compiled against 1.3 running on 1.4. Sqoop is a simple command-line utility, so you might try downloading the release into a temporary folder and verifying that everything works before upgrading Sqoop system-wide.

Specifically, to your questions:

1) --incremental
Sqoop usually prints the error message first, followed by the help page, so I would advise checking the first few lines of output to see what exactly went wrong.

2) --compress
Please try to reproduce this issue on 1.4.2 and file a JIRA if it is still reproducible.

3) --as-avrodatafile
Based on [2], SQL type -9 corresponds to the data type NVARCHAR. We added support for NVARCHAR in SQOOP-323 [3], which has been part of Sqoop since version 1.4.0-incubating; i.e., the 1.3.0 release does not support the NVARCHAR type out of the box.

Jarcec

Links:
1: https://issues.apache.org/jira/browse/SQOOP-480
2: http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.sql.Types.DECIMAL
3: https://issues.apache.org/jira/browse/SQOOP-323

On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
> 
> Hello all,
> 
> I'm testing a few switches for Sqoop import and I'm having the following problems.
> 
> When I use the -incremental switch, the command fails and shows me the help page.
> When I use the -compress switch, the command works but when I try to uncompress the results, it says gzip undefined compression code. I also tried to explicitly state the codec and it would still not append the compression extension to the files nor am I able to uncompress the data.
> When I use the -as-avrodatafile I get ERROR tool.ImportTool: Imported Failed: Cannot convert SQL type -9
> 
> Any ideas? I am not sure if upgrading Sqoop will fix it because SQL Server connector specifically required a 1.3.0 release.
> 
> Thanks.
> 
> Artem Ervits
> Data Analyst
> New York Presbyterian Hospital
> 
> 
> 


Re: SQOOP 1.3.0 questions using SQL Server

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Artem,
Sqoop 1.3.0 is a very old release, so I would definitely recommend upgrading to the latest 1.4.2. Please note that this release includes SQOOP-480 [1], which fixes some issues with connectors compiled against 1.3 running on 1.4. Sqoop is a simple command-line utility, so you might try downloading the release into a temporary folder and verifying that everything works before upgrading Sqoop system-wide.

Specifically, to your questions:

1) --incremental
Sqoop usually prints the error message first, followed by the help page, so I would advise checking the first few lines of output to see what exactly went wrong.

2) --compress
Please try to reproduce this issue on 1.4.2 and file a JIRA if it is still reproducible.

3) --as-avrodatafile
Based on [2], SQL type -9 corresponds to the data type NVARCHAR. We added support for NVARCHAR in SQOOP-323 [3], which has been part of Sqoop since version 1.4.0-incubating; i.e., the 1.3.0 release does not support the NVARCHAR type out of the box.

Jarcec

Links:
1: https://issues.apache.org/jira/browse/SQOOP-480
2: http://docs.oracle.com/javase/6/docs/api/constant-values.html#java.sql.Types.DECIMAL
3: https://issues.apache.org/jira/browse/SQOOP-323

On Fri, Oct 12, 2012 at 05:29:41PM +0000, Artem Ervits wrote:
> 
> Hello all,
> 
> I'm testing a few switches for Sqoop import and I'm having the following problems.
> 
> When I use the -incremental switch, the command fails and shows me the help page.
> When I use the -compress switch, the command works but when I try to uncompress the results, it says gzip undefined compression code. I also tried to explicitly state the codec and it would still not append the compression extension to the files nor am I able to uncompress the data.
> When I use the -as-avrodatafile I get ERROR tool.ImportTool: Imported Failed: Cannot convert SQL type -9
> 
> Any ideas? I am not sure if upgrading Sqoop will fix it because SQL Server connector specifically required a 1.3.0 release.
> 
> Thanks.
> 
> Artem Ervits
> Data Analyst
> New York Presbyterian Hospital
> 
> 
> 