Posted to user@sqoop.apache.org by Chun-fan Ivan Liao <iv...@ivangelion.tw> on 2012/12/04 10:51:31 UTC

Sqoop export failed: Incorrect syntax near ','

Hi,



We are using Sqoop 1.3.0-cdh3u4 with Hadoop version 1.0.3.



We encountered the following error when we tried to export an HDFS file into
MSSQL 2005 (the export only partially succeeded):



12/12/04 16:44:13 INFO mapred.JobClient: Task Id :
attempt_201212041541_0014_m_000000_2, Status : FAILED
java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
Incorrect syntax near ','.
        at com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
syntax near ','.
        at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
        at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
        at com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)



The HDFS file that we want to export was previously imported from SQL Server
2005 using Sqoop with '|' as the field delimiter, and some fields in the
file contain commas (',').



The command I submitted is (generalized with capital letters):

$ sqoop export \
    -D sqoop.export.records.per.statement=75 \
    -D sqoop.export.statements.per.transaction=75 \
    --connect "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME" \
    --table TABLE_NAME -m 1 \
    --input-fields-terminated-by '|' \
    --export-dir /EXPORT/FROM/DIRECTORY



I've adjusted the values of sqoop.export.records.per.statement and
sqoop.export.statements.per.transaction, but that didn't help.
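For context on why tuning those two values cannot fix this particular error, here is a minimal illustration (not Sqoop's actual generated code; the function and sample values are hypothetical). With sqoop.export.records.per.statement=N, the export path batches N parsed rows into a single multi-row INSERT, so a single mis-split row makes the whole statement invalid regardless of how N is tuned:

```python
# Illustration only (not Sqoop's actual code generator): with
# sqoop.export.records.per.statement=N, the export path batches N rows
# into one multi-row INSERT, so a single mis-split row breaks the whole
# statement no matter how N is tuned.
def build_batched_insert(table, columns, rows):
    """Render a multi-row INSERT ... VALUES statement for a batch of rows."""
    col_list = ", ".join(columns)
    value_tuples = ", ".join(
        "(" + ", ".join(repr(v) for v in row) + ")" for row in rows
    )
    return f"INSERT INTO {table} ({col_list}) VALUES {value_tuples}"

# A stray '|' inside the second record yields 4 fields for 3 columns,
# producing a value tuple the server rejects with a syntax error.
records = ["1|Alice|a@x.com", "2|B|ob|b@x.com"]
rows = [r.split("|") for r in records]
print(build_batched_insert("member_main", ["id", "name", "email"], rows))
```

Shrinking the batch only changes how many rows fail alongside the bad one, not whether the bad one fails.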



It will be greatly appreciated if you can offer some help. Thanks.



Ivan

Re: Sqoop export failed: Incorrect syntax near ','

Posted by Chun-fan Ivan Liao <iv...@ivangelion.tw>.
Hi Jarcec,

Sorry for the late reply.

I tried re-importing the data and then exporting it, but to no avail. The
error messages were the same.

However, I think I've found the root cause: some rows contain pipe
characters ('|') in the data itself, and since the pipe is also the field
delimiter, Sqoop split those rows incorrectly.
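Under that hypothesis, the mis-split rows are easy to spot: every well-formed line should contain exactly one fewer delimiter than the column count (39 columns for member_main, per the ClassWriter output later in this thread). A rough sketch, assuming the file has been copied out of HDFS and that the expected field count is adapted to the actual table:

```python
# Hedged sketch: scan a pipe-delimited export file for rows whose field
# count doesn't match the table schema. Rows flagged here contain stray
# '|' characters in the data and will be mis-split by Sqoop on export.
EXPECTED_FIELDS = 39  # member_main column count; adjust for your table

def find_bad_rows(lines, expected=EXPECTED_FIELDS, delim="|"):
    """Yield (line_number, field_count) for every mis-split row."""
    for lineno, line in enumerate(lines, start=1):
        n = line.rstrip("\n").count(delim) + 1
        if n != expected:
            yield lineno, n

sample = ["a|b|c\n", "a|b\n", "x|y|z|extra\n"]
print(list(find_bad_rows(sample, expected=3)))  # [(2, 2), (3, 4)]
```

Running this over the real part-m-* files (e.g. via `hadoop fs -cat`) would confirm whether the failing rows line up with the stray pipes.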

We will try replacing those pipes in the data with other characters,
skipping the rows that contain them, or simply deleting them.
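An alternative to mangling the data would be to let Sqoop escape delimiter characters that occur inside field values. A hedged sketch, with the connection string and names generalized exactly as in the original command, and directories as placeholders; the flags should be checked against the docs for the Sqoop version in use:

```shell
# Sketch only: re-import with an escape character so a literal '|' in
# the data is written as '\|', then tell the export side about the same
# escaping. SERVER-NAME, USER_NAME, PASSWD, DB_NAME, TABLE_NAME and the
# directories are placeholders, as in the original command.
sqoop import \
  --connect "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME" \
  --table TABLE_NAME -m 1 \
  --fields-terminated-by '|' \
  --escaped-by '\\' \
  --target-dir /IMPORT/TO/DIRECTORY

sqoop export \
  --connect "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME" \
  --table TABLE_NAME -m 1 \
  --input-fields-terminated-by '|' \
  --input-escaped-by '\\' \
  --export-dir /IMPORT/TO/DIRECTORY
```

This keeps the original field values intact instead of altering or dropping rows.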

Thank you so much for your kind and generous help!! :)

Kind regards,
Ivan

On Sat, Dec 8, 2012 at 12:02 AM, Jarek Jarcec Cecho <ja...@apache.org> wrote:

> I've just realized what is wrong: the parameter --hive-drop-import-delims
> is import-specific, and it seems that you're trying to use it for export.
> Would you mind re-importing your data with this parameter and trying the
> export?
>
> Jarcec
>
> On Fri, Dec 07, 2012 at 03:45:55PM +0800, Chun-fan Ivan Liao wrote:
> > I've upgraded Sqoop to 1.4.2 and copied hadoop-core-1.0.3.jar and
> > sqljdbc4.jar to /usr/local/sqoop/lib. I've also specified the parameter
> > "--hive-drop-import-delims" in the command, but the same error remained.
> > Arguments specified after --hive-drop-import-delims could not be parsed:
> >
> > ==============
> > 12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Error parsing arguments for export:
> > 12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-drop-import-delims
> > 12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Unrecognized argument: --verbose
> >
> > Try --help for usage instructions.
> > ....
> > ==============
> >
> > Is there anything I can do now, e.g. re-importing the data using the
> > default connector and seeing if the imported data can be exported back
> > to SQL Server?
> >
> >
> > On Fri, Dec 7, 2012 at 12:01 PM, Jarek Jarcec Cecho
> > <jarcec@apache.org> wrote:
> >
> > > I see. Would you mind upgrading your Sqoop to the most recent
> > > version, 1.4.2?
> > >
> > > Jarcec
> > >
> > > On Fri, Dec 07, 2012 at 11:19:31AM +0800, Chun-fan Ivan Liao wrote:
> > > > Hi Jarek,
> > > >
> > > > I've tried to use "--hive-drop-import-delims", but Sqoop reported a
> > > > syntax error:
> > > >
> > > >   ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-drop-import-delims
> > > >
> > > > Also, should I change from OpenJDK to the Oracle JDK in order to
> > > > make Sqoop export work?
> > > >
> > > > Thanks!
> > > > Ivan
> > > >
> > > > On Fri, Dec 7, 2012 at 12:28 AM, Jarek Jarcec Cecho
> > > > <jarcec@apache.org> wrote:
> > > >
> > > > > Hi Ivangelion,
> > > > > I'm glad that you were able to move on with your issue. It seems
> > > > > to me that you're running on OpenJDK - unfortunately, Sqoop is
> > > > > tested and supported only on the Oracle JDK.
> > > > >
> > > > > Based on the exceptions you're hitting:
> > > > >
> > > > >   java.lang.NumberFormatException: For input string: "Male"
> > > > >
> > > > >   java.lang.IllegalArgumentException: Timestamp format must be
> > > > >   yyyy-mm-dd hh:mm:ss[.fffffffff]
> > > > >
> > > > > It seems to me that your input files somehow got corrupted - for
> > > > > example, in the first exception Sqoop is looking for a column that
> > > > > should be a number but found the string "Male" instead. You've
> > > > > mentioned that your data can contain a lot of wild characters -
> > > > > can it happen that your data also contains newline characters?
> > > > > Would you mind re-trying the import with the parameter
> > > > > --hive-drop-import-delims [1] to see if it helps? (This parameter
> > > > > does not depend on Hive in any way, regardless of its name.)
> > > > >
> > > > > Jarcec
> > > > >
> > > > > On Thu, Dec 06, 2012 at 12:03:06PM +0800, Ivangelion wrote:
> > > > > > Hi Jarek,
> > > > > >
> > > > > > It actually worked! Thank you so much~! :D
> > > > > >
> > > > > > However, now we face another problem. The data we previously
> > > > > > tried to export was only test data, with a row count of just 10.
> > > > > > When we tried to export production data back into SQL Server
> > > > > > from an HDFS file that was previously imported using Sqoop from
> > > > > > SQL Server, different errors occurred. The row count is about
> > > > > > 400k, and only about 120k rows were exported. This time we used
> > > > > > "-m 5"; with "-m 1", nothing was exported. A verbose log is at
> > > > > > the bottom of this mail.
> > > > > >
> > > > > > Does this have to do with the fact that we used the MS SQL
> > > > > > connector for the previous import, not the default one?
> > > > > >
> > > > > > Also, should we specify a character encoding, e.g. UTF-8, during
> > > > > > the import/export process? There are characters from many
> > > > > > different languages in our original data in SQL Server, and I'm
> > > > > > not sure what the encoding is after it is imported into HDFS.
> > > > > >
> > > > > > Thanks again, Jarek.
> > > > > >
> > > > > > =====================================
> > > > > > 12/12/05 19:55:27 DEBUG tool.BaseSqoopTool: Enabled debug
> logging.
> > > > > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Loaded manager
> factory:
> > > > > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > > > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > > > > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > > > > 12/12/05 19:55:27 DEBUG manager.DefaultManagerFactory: Trying
> with
> > > > > scheme:
> > > > > > jdbc:sqlserver:
> > > > > > 12/12/05 19:55:27 INFO manager.SqlManager: Using default
> fetchSize of
> > > > > 1000
> > > > > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Instantiated
> ConnManager
> > > > > > com.cloudera.sqoop.manager.SQLServerManager@6766afb3
> > > > > > 12/12/05 19:55:27 INFO tool.CodeGenTool: Beginning code
> generation
> > > > > > 12/12/05 19:55:27 DEBUG manager.SqlManager: No connection
> paramenters
> > > > > > specified. Using regular API for making connection.
> > > > > > 12/12/05 19:55:27 DEBUG manager.SqlManager: Using fetchSize for
> next
> > > > > query:
> > > > > > 1000
> > > > > > 12/12/05 19:55:27 INFO manager.SqlManager: Executing SQL
> statement:
> > > > > SELECT
> > > > > > t.* FROM member_main AS t WHERE 1=0
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: selected columns:
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   MemberId
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   USERNAME
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstName
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastName
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   EmailAddress
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password_E5
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Birthday
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CompanyName
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Gender
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Age
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Education
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Country
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Title
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone1
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone2
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Fax
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   State
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   City
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address1
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address2
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   ZipCode
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   VATID
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Language
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_letter
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_promotion
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_type
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   JointSource
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CustomerLevel
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UpdateDate
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDate
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstLoginDate
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastLoginDate
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastVisit
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   isValid
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   nJoint
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Upd_SubDate
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UnSub_Type
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDateFloat
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Writing source file:
> > > > > >
> > > > >
> > >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Table name: member_main
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Columns: MemberId:4,
> > > > > USERNAME:12,
> > > > > > FirstName:-9, LastName:-9, EmailAddress:12, Password:12,
> > > Password_E5:12,
> > > > > > Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12,
> > > Country:5,
> > > > > > Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9,
> > > Address1:-9,
> > > > > > Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
> > > > > > rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
> > > > > > UpdateDate:93, CreateDate:93, FirstLoginDate:93,
> LastLoginDate:93,
> > > > > > LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
> > > > > > CreateDateFloat:8,
> > > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: sourceFilename is
> > > > > member_main.java
> > > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Found existing
> > > > > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > > > > 12/12/05 19:55:27 INFO orm.CompilationManager: HADOOP_HOME is
> > > > > > /usr/local/hadoop/libexec/..
> > > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Adding source
> file:
> > > > > >
> > > > >
> > >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Invoking javac
> with
> > > args:
> > > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -sourcepath
> > > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > > > > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -d
> > > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > > > > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -classpath
> > > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > > > > >
> > > > >
> > >
> /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/ha
doop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/co
mmons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/
hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoo
p/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > 12/12/05 19:55:28 ERROR orm.CompilationManager: Could not rename
> > > > > >
> > > > >
> > >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > > > > to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > > > > > org.apache.commons.io.FileExistsException: Destination
> > > > > > '/home/hadoop/_scripts/1-hadoop/member/./member_main.java'
> already
> > > exists
> > > > > >  at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > > > > > at
> > > > > >
> > > > >
> > >
> com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> > > > > >  at
> > > com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > > > > > at
> com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> > > > > >  at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > > > > > at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> > > > > >  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > > > > at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> > > > > >  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > > > > > at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> > > > > >  at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > > > > > 12/12/05 19:55:28 INFO orm.CompilationManager: Writing jar file:
> > > > > >
> > > > >
> > >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> > > > > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Scanning for
> .class
> > > files
> > > > > > in directory:
> > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d
> > > > > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Got classfile:
> > > > > >
> > > > >
> > >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.class
> > > > > > -> member_main.class
> > > > > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Finished writing
> jar
> > > file
> > > > > >
> > > > >
> > >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> > > > > > 12/12/05 19:55:28 INFO mapreduce.ExportJobBase: Beginning export
> of
> > > > > > member_main
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Using InputFormat:
> class
> > > > > > com.cloudera.sqoop.mapreduce.ExportInputFormat
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > >
> file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > > > > > file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > > > > > 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths
> to
> > > > > process
> > > > > > : 1
> > > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Target
> > > numMapTasks=5
> > > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Total input
> > > > > > bytes=110140058
> > > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > > > maxSplitSize=22028011
> > > > > > 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths
> to
> > > > > process
> > > > > > : 1
> > > > > > 12/12/05 19:55:31 INFO util.NativeCodeLoader: Loaded the
> > > native-hadoop
> > > > > > library
> > > > > > 12/12/05 19:55:31 WARN snappy.LoadSnappy: Snappy native library
> not
> > > > > loaded
> > > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Generated
> > > splits:
> > > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > > > >
> > > > >
> > >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:0+22028011
> > > > > > Locations:hadoop03:;
> > > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > > > >
> > > > >
> > >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:22028011+22028011
> > > > > > Locations:hadoop03:;
> > > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > > > >
> > > > >
> > >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:44056022+11526421,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:55582443+11526421
> > > > > > Locations:hadoop03:;
> > > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > > > >
> > > > >
> > >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:67108864+21515597,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:88624461+21515597
> > > > > > Locations:hadoop03:;
> > > > > > 12/12/05 19:55:31 INFO mapred.JobClient: Running job:
> > > > > job_201212041541_0245
> > > > > > 12/12/05 19:55:32 INFO mapred.JobClient:  map 0% reduce 0%
> > > > > > 12/12/05 19:55:47 INFO mapred.JobClient: Task Id :
> > > > > > attempt_201212041541_0245_m_000002_0, Status : FAILED
> > > > > > java.lang.NumberFormatException: For input string: "Male"
> > > > > > at
> > > > > >
> > > > >
> > >
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > > > > >  at java.lang.Integer.parseInt(Integer.java:481)
> > > > > > at java.lang.Integer.valueOf(Integer.java:570)
> > > > > >  at member_main.__loadFromFields(member_main.java:1254)
> > > > > > at member_main.parse(member_main.java:1156)
> > > > > >  at
> > > > > >
> > > > >
> > >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > > > at
> > > > > >
> > > > >
> > >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > > > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > > > at
> > > > > >
> > > > >
> > >
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > > > >  at
> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > > at
> > > > > >
> > > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > >
> > > > > > 12/12/05 19:55:51 INFO mapred.JobClient:  map 5% reduce 0%
> > > > > > 12/12/05 19:55:54 INFO mapred.JobClient:  map 8% reduce 0%
> > > > > > 12/12/05 19:55:54 INFO mapred.JobClient: Task Id :
> > > > > > attempt_201212041541_0245_m_000002_1, Status : FAILED
> > > > > > java.lang.NumberFormatException: For input string: "Male"
> > > > > >         at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > > > > >         at java.lang.Integer.parseInt(Integer.java:481)
> > > > > >         at java.lang.Integer.valueOf(Integer.java:570)
> > > > > >         at member_main.__loadFromFields(member_main.java:1254)
> > > > > >         at member_main.parse(member_main.java:1156)
> > > > > >         at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > > >         at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > > > >         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > > >         at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > > > >         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > > >         at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > >
> > > > > > 12/12/05 19:55:57 INFO mapred.JobClient:  map 14% reduce 0%
> > > > > > 12/12/05 19:55:59 INFO mapred.JobClient: Task Id : attempt_201212041541_0245_m_000000_0, Status : FAILED
> > > > > > java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
> > > > > >         at java.sql.Timestamp.valueOf(Timestamp.java:203)
> > > > > >         at member_main.__loadFromFields(member_main.java:1239)
> > > > > >         at member_main.parse(member_main.java:1156)
> > > > > >         at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > > >         at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > > > >         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > > >         at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > > > >         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > > >         at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > >
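[Editorial note] The IllegalArgumentException above comes from java.sql.Timestamp.valueOf, which accepts only the JDBC escape format "yyyy-mm-dd hh:mm:ss[.fffffffff]". Any timestamp serialized in another shape in the HDFS file (or an empty string from a NULL column, if null handling is not configured) will make the generated __loadFromFields throw exactly this error. A small demonstration of the accepted and rejected formats:

```java
import java.sql.Timestamp;

// Timestamp.valueOf requires "yyyy-mm-dd hh:mm:ss[.fffffffff]" exactly;
// a slash-separated date (as one hypothetical example) is rejected with
// the same IllegalArgumentException seen in the task log.
public class TimestampCheck {
    static Timestamp parse(String s) {
        return Timestamp.valueOf(s);
    }

    public static void main(String[] args) {
        System.out.println(parse("2012-12-05 19:55:00"));  // accepted
        try {
            parse("2012/12/05 19:55:00");                  // wrong separator
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```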
> > > > > > 12/12/05 19:56:00 INFO mapred.JobClient:  map 13% reduce 0%
> > > > > > 12/12/05 19:56:01 INFO mapred.JobClient: Task Id : attempt_201212041541_0245_m_000002_2, Status : FAILED
> > > > > > java.lang.NumberFormatException: For input string: "Male"
> > > > > >         at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > > > > >         at java.lang.Integer.parseInt(Integer.java:481)
> > > > > >         at java.lang.Integer.valueOf(Integer.java:570)
> > > > > >         at member_main.__loadFromFields(member_main.java:1254)
> > > > > >         at member_main.parse(member_main.java:1156)
> > > > > >         at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > > >         at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > > > >         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > > >         at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > > > >         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > > >         at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > >
> > > > > > 12/12/05 19:56:03 INFO mapred.JobClient:  map 16% reduce 0%
> > > > > > 12/12/05 19:56:13 INFO mapred.JobClient: Job complete: job_201212041541_0245
> > > > > > 12/12/05 19:56:13 INFO mapred.JobClient: Counters: 8
> > > > > > 12/12/05 19:56:13 INFO mapred.JobClient:   Job Counters
> > > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=91611
> > > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> > > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> > > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Rack-local map tasks=5
> > > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Launched map tasks=8
> > > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Data-local map tasks=3
> > > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Failed map tasks=1
> > > > > > 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 45.077 seconds (0 bytes/sec)
> > > > > > 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Exported 0 records.
> > > > > > 12/12/05 19:56:13 ERROR tool.ExportTool: Error during export: Export job failed!
> > > > > > =====================================
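[Editorial note] Both failure types in the log above are consistent with the record fields shifting relative to the table's columns. A hypothetical illustration (the record content and delimiters below are invented, not taken from the thread): if the file on HDFS uses one delimiter but the export job splits on another, the line does not break into the expected number of fields, and a text value ends up where a numeric column is expected.

```java
import java.util.regex.Pattern;

// Demonstrates how a delimiter mismatch changes the field count: splitting
// a tab-delimited record on tab yields the intended columns, while splitting
// the same record on a comma (the wrong delimiter) yields a single token.
public class DelimiterShift {
    // Split a record on a literal delimiter, keeping trailing empty fields,
    // roughly the way Sqoop's generated parse() tokenizes each line.
    static String[] fields(String line, String delim) {
        return line.split(Pattern.quote(delim), -1);
    }

    public static void main(String[] args) {
        String record = "1\tivan\tMale\t27";              // hypothetical record
        System.out.println(fields(record, "\t").length);  // 4 fields: aligned
        System.out.println(fields(record, ",").length);   // 1 field: misaligned
    }
}
```

In Sqoop 1.x the delimiters an export job assumes can be set explicitly (e.g. with --input-fields-terminated-by), so comparing the file's actual delimiter against the job's setting is a cheap check.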
> > > > > >
> > > > > >
> > > > > > Chun-fan
> > > > > >
> > > > > >
> > > > > > On Thu, Dec 6, 2012 at 12:23 AM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
> > > > > >
> > > > > > > Hi Chun-fan,
> > > > > > > thank you very much for sharing the log with us. You are using the
> > > > > > > Microsoft SQL Connector because you downloaded it manually from the
> > > > > > > Microsoft web pages; you can also confirm that from the following log lines:
> > > > > > >
> > > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > > > > > > ...
> > > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > > > > > >
> > > > > > > I'm not sure what is going wrong, as it seems that the data were parsed
> > > > > > > correctly but submitting the query to the SQL server fails. As a next
> > > > > > > step I would recommend turning the Microsoft Connector off and using the
> > > > > > > built-in one instead, to see whether the issue is specific to Sqoop or
> > > > > > > to the Connector. You can do that by temporarily moving the file
> > > > > > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver somewhere else.
> > > > > > >
> > > > > > > Jarcec
> > > > > > >
> > > > > > > On Wed, Dec 05, 2012 at 12:25:24PM +0800, Chun-fan Ivan Liao wrote:
> > > > > > > > Thank you, Jarcec. I'm not sure which connector we use. I've downloaded
> > > > > > > > "Microsoft SQL Server Connector for Apache Hadoop" from
> > > > > > > > http://www.microsoft.com/en-us/download/details.aspx?id=27584, but I
> > > > > > > > don't remember whether we actually used it. How can I make sure?
> > > > > > > >
> > > > > > > > And here is the verbose log:
> > > > > > > > ===========
> > > > > > > > 12/12/05 12:08:57 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory: com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
> > > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > > > > > > 12/12/05 12:08:57 INFO SqlServer.MSSQLServerManagerFactory: Using Microsoft's SQL Server - Hadoop Connector
> > > > > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Using default fetchSize of 1000
> > > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > > > > > > > 12/12/05 12:08:57 INFO tool.CodeGenTool: Beginning code generation
> > > > > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
> > > > > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> > > > > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [member_main]
> > > > > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> > > > > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [member_main]
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: selected columns:
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   MemberId
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   USERNAME
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstName
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastName
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   EmailAddress
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password_E5
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Birthday
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CompanyName
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Gender
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Age
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Education
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Country
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Title
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone1
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone2
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Fax
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   State
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   City
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address1
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address2
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   ZipCode
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   VATID
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Language
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_letter
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_promotion
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_type
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   JointSource
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CustomerLevel
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UpdateDate
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDate
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstLoginDate
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastLoginDate
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastVisit
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   isValid
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   nJoint
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Upd_SubDate
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UnSub_Type
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDateFloat
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Table name: member_main
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Columns: MemberId:4, USERNAME:12, FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12, Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5, Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9, Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7, rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4, UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93, LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4, CreateDateFloat:8,
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: sourceFilename is member_main.java
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > > > > 12/12/05 12:08:57 INFO orm.CompilationManager: HADOOP_HOME is /usr/local/hadoop/libexec/..
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Invoking javac with args:
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -sourcepath
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -d
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -classpath
> > > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/ha
doop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/co
mmons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/
hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoo
p/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > > > 12/12/05 12:08:58 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > > > > > > > org.apache.commons.io.FileExistsException: Destination '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
> > > > > > > >         at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > > > > > > >         at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> > > > > > > >         at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > > > > > > >         at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> > > > > > > >         at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > > > > > > >         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> > > > > > > >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > > > > > >         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> > > > > > > >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > > > > > > >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> > > > > > > >         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > > > > > > > 12/12/05 12:08:58 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971
> > > > > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.class -> member_main.class
> > > > > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > > > > > > 12/12/05 12:08:58 INFO mapreduce.ExportJobBase: Beginning export of member_main
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Using InputFormat: class com.cloudera.sqoop.mapreduce.ExportInputFormat
> > > > > > > > 12/12/05 12:08:58 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> > > > > > > > 12/12/05 12:08:58 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [member_main]
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > > > > > > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process : 1
> > > > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> > > > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Total input bytes=2611
> > > > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: maxSplitSize=2611
> > > > > > > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process : 1
> > > > > > > > 12/12/05 12:09:00 INFO util.NativeCodeLoader: Loaded the native-hadoop library
> > > > > > > > 12/12/05 12:09:00 WARN snappy.LoadSnappy: Snappy native library not loaded
> > > > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > > > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat:   Paths:/user/hadoop/test-ivan/test:0+2611 Locations:hadoop05:;
> > > > > > > > 12/12/05 12:09:00 INFO mapred.JobClient: Running job: job_201212041541_0107
> > > > > > > > 12/12/05 12:09:01 INFO mapred.JobClient:  map 0% reduce 0%
> > > > > > > > 12/12/05 12:09:18 INFO mapred.JobClient: Task Id : attempt_201212041541_0107_m_000000_0, Status : FAILED
> > > > > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
> > > > > > > >         at com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > > > > >         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > > > > > >         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > > > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > > > > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > > > > >         at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > > > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > > > > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
> > > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > > > > > >         at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > > > > >         at com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > > > > > >
> > > > > > > > 12/12/05 12:09:24 INFO mapred.JobClient: Task Id :
> > > > > > > > attempt_201212041541_0107_m_000000_1, Status : FAILED
> > > > > > > > java.io.IOException:
> > > com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > > > > Incorrect syntax near ','.
> > > > > > > >  at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > > > > > >  at
> > > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > > > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > > > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > > > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > > > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > Incorrect
> > > > > > > > syntax near ','.
> > > > > > > >  at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > > > > > >  at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > > > > > >  at
> > > > > com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > > > > > >  at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > > > > > >  at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > > > > > >
> > > > > > > > 12/12/05 12:09:30 INFO mapred.JobClient: Task Id :
> > > > > > > > attempt_201212041541_0107_m_000000_2, Status : FAILED
> > > > > > > > java.io.IOException:
> > > com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > > > > Incorrect syntax near ','.
> > > > > > > >  at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > > > > > >  at
> > > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > > > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > > > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > > > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > > > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > Incorrect
> > > > > > > > syntax near ','.
> > > > > > > >  at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > > > > > >  at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > > > > > >  at
> > > > > com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > > > > > >  at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > > > > > >  at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > > > > > at
> > > > > > > >
> > > > > > >
> > > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > > > > > >
> > > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient: Job complete:
> > > > > > > job_201212041541_0107
> > > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient: Counters: 8
> > > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:   Job Counters
> > > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:
> > > SLOTS_MILLIS_MAPS=24379
> > > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time
> spent by
> > > all
> > > > > > > > reduces waiting after reserving slots (ms)=0
> > > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time
> spent by
> > > all
> > > > > maps
> > > > > > > > waiting after reserving slots (ms)=0
> > > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Rack-local map
> > > tasks=3
> > > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Launched map
> tasks=4
> > > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Data-local map
> > > tasks=1
> > > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:
> > > SLOTS_MILLIS_REDUCES=0
> > > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Failed map
> tasks=1
> > > > > > > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Transferred 0
> > > bytes
> > > > > in
> > > > > > > > 43.0875 seconds (0 bytes/sec)
> > > > > > > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Exported 0
> > > records.
> > > > > > > > 12/12/05 12:09:41 ERROR tool.ExportTool: Error during export:
> > > Export
> > > > > job
> > > > > > > > failed!
> > > > > > > > ================
> > > > > > > >
> > > > > > > > Kind regards,
> > > > > > > > Chun-fan
> > > > > > > >
On Wed, Dec 5, 2012 at 12:01 AM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:

> Hi Chun-fan,
> would you mind sharing with us the entire Sqoop log generated with the
> parameter --verbose? Are you using the built-in Microsoft SQL connector or
> the connector provided by Microsoft?
>
> Jarcec
>
> On Tue, Dec 04, 2012 at 05:51:31PM +0800, Chun-fan Ivan Liao wrote:
> > Hi,
> >
> > We are using Sqoop 1.3.0-cdh3u4 with Hadoop version 1.0.3.
> >
> > We encountered the following error when we try to export an HDFS file
> > into MSSQL 2005 (partial log):
> >
> > 12/12/04 16:44:13 INFO mapred.JobClient: Task Id : attempt_201212041541_0014_m_000000_2, Status : FAILED
> > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
> >         ... (same stack trace as above)
> >
> > The HDFS file that we want to export was imported with Sqoop from SQL
> > 2005 before and uses '|' as the field delimiter, and there are commas
> > (',') in a field of a line in the file.
> >
> > The command I submitted is (generalized with capital letters):
> >
> > $ sqoop export -D sqoop.export.records.per.statement=75 -D
> > sqoop.export.statements.per.transaction=75 --connect
> > "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME"
> > --table TABLE_NAME -m 1 --input-fields-terminated-by '|' --export-dir
> > /EXPORT/FROM/DIRECTORY
> >
> > I've adjusted the values of sqoop.export.records.per.statement and
> > sqoop.export.statements.per.transaction, but that didn't help.
> >
> > It will be greatly appreciated if you can offer some help. Thanks.
> >
> > Ivan
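The failure described above is consistent with field data that contains the record or field delimiter. A minimal, cluster-free sketch in plain Python (hypothetical sample records, not the poster's data) of how an embedded newline splits one '|'-delimited record into malformed ones, which is how an export ends up generating a syntactically broken INSERT:

```python
# Hypothetical 3-column records: id|name|comment.
good = "1|Alice|hello"
bad = "2|Bob|first line\nsecond line"  # embedded newline inside a field

# An export reader treats '\n' as the record delimiter, so the second
# logical record is parsed as two physical records with the wrong
# column counts.
lines = (good + "\n" + bad).split("\n")
records = [line.split("|") for line in lines]

print([len(r) for r in records])  # column counts per parsed line: [3, 3, 1]
```

With a correct file every parsed line would have 3 columns; here the third "record" has only 1, so the generated SQL no longer matches the table's column list.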

Re: Sqoop export failed: Incorrect syntax near ','

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
I've just realized what is wrong: the parameter --hive-drop-import-delims is import-specific, and it seems that you're trying to use it for export. Would you mind re-importing your data with this parameter and then trying the export again?

Jarcec
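For reference, the suggested sequence looks roughly like the following sketch. The connection string, table name, and paths are the generalized placeholders from earlier in the thread, and the exact flags should be checked against your Sqoop version:

```shell
# 1) Re-import, stripping \n, \r and \01 from string fields so each
#    record stays on a single line in HDFS (import-only flag):
sqoop import \
  --connect "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME" \
  --table TABLE_NAME \
  --fields-terminated-by '|' \
  --hive-drop-import-delims \
  --target-dir /EXPORT/FROM/DIRECTORY

# 2) Export as before -- note that --hive-drop-import-delims is NOT a
#    valid export argument:
sqoop export \
  --connect "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME" \
  --table TABLE_NAME \
  --input-fields-terminated-by '|' \
  --export-dir /EXPORT/FROM/DIRECTORY
```

The point is that the cleaning happens at import time, so the file being exported no longer contains stray record delimiters.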

On Fri, Dec 07, 2012 at 03:45:55PM +0800, Chun-fan Ivan Liao wrote:
> I've upgraded Sqoop to 1.4.2 and copied hadoop-core-1.0.3.jar
> and sqljdbc4.jar to /usr/local/sqoop/lib. I've also specified the parameter
> "--hive-drop-import-delims" in the command, but the same error remained.
> Parameters specified after --hive-drop-import-delims could not be parsed:
> 
> ==============
> 12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Error parsing arguments for
> export:
> 12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Unrecognized argument:
> --hive-drop-import-delims
> 12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Unrecognized argument: --verbose
> 
> Try --help for usage instructions.
> ....
> ==============
> 
> Is there anything I can do now, e.g. re-importing the data using the default
> connector to see if the imported data can be exported back to SQL Server?
> 
> 
> On Fri, Dec 7, 2012 at 12:01 PM, Jarek Jarcec Cecho <ja...@apache.org> wrote:
> 
> > I see. Would you mind upgrading your Sqoop to the most recent version, 1.4.2?
> >
> > Jarcec
> >
> > On Fri, Dec 07, 2012 at 11:19:31AM +0800, Chun-fan Ivan Liao wrote:
> > > Hi Jarek,
> > >
> > > I've tried to use "--hive-drop-import-delims", but sqoop showed it has
> > > syntax error:
> > >
> > >   ERROR tool.BaseSqoopTool: Unrecognized argument:
> > --hive-drop-import-delims
> > >
> > > Also, should I change Java OpenJDK to Oracle JDK in order to make Sqoop
> > > export work?
> > >
> > > Thanks!
> > > Ivan
> > >
> > > On Fri, Dec 7, 2012 at 12:28 AM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
> > >
> > > > Hi Ivangelion,
> > > > I'm glad that you were able to move on with your issue. It seems to me
> > > > that you're running on OpenJDK - unfortunately Sqoop is tested and
> > > > supported only on Oracle JDK.
> > > >
> > > > Based on the exceptions you're hitting:
> > > >
> > > >   java.lang.NumberFormatException: For input string: "Male"
> > > >
> > > >   java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff
> > > >
> > > > It seems to me your input files got somehow corrupted - for example, for
> > > > the first exception Sqoop is looking for a column that should be a number
> > > > but found the string "Male" instead. You've mentioned that your data can
> > > > contain a lot of wild characters; can it happen that your data also
> > > > contains newline characters? Would you mind re-trying the import with the
> > > > parameter --hive-drop-import-delims [1] to see if it helps? (This
> > > > parameter does not depend on Hive in any way, regardless of its name.)
> > > >
> > > > Jarcec
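The effect of that flag can be sketched in a few lines of plain Python (an illustrative approximation of the cleaning, not Sqoop's actual code): it removes the characters that would otherwise be mistaken for record or Hive field delimiters.

```python
def drop_import_delims(field: str) -> str:
    """Approximate what --hive-drop-import-delims does to each string
    field: strip newline, carriage return, and \x01 (Hive's default
    field delimiter)."""
    for ch in ("\n", "\r", "\x01"):
        field = field.replace(ch, "")
    return field

print(drop_import_delims("first line\nsecond\rline\x01end"))
```

After this cleaning, every logical record occupies exactly one physical line in HDFS, so a later export parses the expected number of columns per record.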
> > > >
> > > > On Thu, Dec 06, 2012 at 12:03:06PM +0800, Ivangelion wrote:
> > > > > Hi Jarek,
> > > > >
> > > > > It actually worked! Thank you so much~! :D
> > > > >
> > > > > However, we now face another problem. The data we previously tried to
> > > > > export was only test data, with a row count of only 10. When we tried
> > > > > to export production data back into SQL Server from an HDFS file that
> > > > > was previously imported from SQL Server using Sqoop, different errors
> > > > > occurred. The row count is about 400k, and only about 120k rows were
> > > > > exported. This time we used "-m 5"; with "-m 1", nothing is exported.
> > > > > A verbose log is at the bottom of this mail.
> > > > >
> > > > > Does this have to do with the fact that we used the MS SQL connector,
> > > > > not the default one, for the previous import?
> > > > >
> > > > > Also, should we specify a character encoding, e.g. UTF-8, during the
> > > > > import/export process? There are characters from many different
> > > > > languages in our original data in SQL Server, and I'm not sure what
> > > > > the encoding is after import into HDFS.
> > > > >
> > > > > Thanks again, Jarek.
> > > > >
> > > > > =====================================
> > > > > 12/12/05 19:55:27 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > > > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > > > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > > > 12/12/05 19:55:27 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:sqlserver:
> > > > > 12/12/05 19:55:27 INFO manager.SqlManager: Using default fetchSize of 1000
> > > > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > > > com.cloudera.sqoop.manager.SQLServerManager@6766afb3
> > > > > 12/12/05 19:55:27 INFO tool.CodeGenTool: Beginning code generation
> > > > > 12/12/05 19:55:27 DEBUG manager.SqlManager: No connection paramenters
> > > > > specified. Using regular API for making connection.
> > > > > 12/12/05 19:55:27 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> > > > > 12/12/05 19:55:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM member_main AS t WHERE 1=0
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: selected columns:
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   MemberId
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   USERNAME
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstName
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastName
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   EmailAddress
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password_E5
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Birthday
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CompanyName
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Gender
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Age
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Education
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Country
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Title
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone1
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone2
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Fax
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   State
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   City
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address1
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address2
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   ZipCode
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   VATID
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Language
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_letter
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_promotion
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_type
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   JointSource
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CustomerLevel
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UpdateDate
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDate
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstLoginDate
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastLoginDate
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastVisit
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   isValid
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   nJoint
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Upd_SubDate
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UnSub_Type
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDateFloat
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Table name: member_main
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Columns: MemberId:4, USERNAME:12, FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12, Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5, Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9, Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7, rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4, UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93, LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4, CreateDateFloat:8,
> > > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: sourceFilename is member_main.java
> > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Found existing /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > > > 12/12/05 19:55:27 INFO orm.CompilationManager: HADOOP_HOME is /usr/local/hadoop/libexec/..
> > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Invoking javac with args:
> > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -sourcepath
> > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -d
> > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -classpath
> > > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/
hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/
commons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/loca
l/hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/had
oop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > 12/12/05 19:55:28 ERROR orm.CompilationManager: Could not rename
> > > > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > > > to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > > > > org.apache.commons.io.FileExistsException: Destination
> > > > > '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
> > > > >  at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > > > >  at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> > > > >  at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > > > >  at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> > > > >  at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > > > >  at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> > > > >  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > > >  at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> > > > >  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > > > >  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> > > > >  at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > > > > 12/12/05 19:55:28 INFO orm.CompilationManager: Writing jar file:
> > > > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> > > > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Scanning for .class files
> > > > > in directory: /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d
> > > > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Got classfile:
> > > > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.class
> > > > > -> member_main.class
> > > > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Finished writing jar file
> > > > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> > > > > 12/12/05 19:55:28 INFO mapreduce.ExportJobBase: Beginning export of
> > > > > member_main
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Using InputFormat: class
> > > > > com.cloudera.sqoop.mapreduce.ExportInputFormat
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > > > > 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to process : 1
> > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=5
> > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Total input bytes=110140058
> > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: maxSplitSize=22028011
> > > > > 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to process : 1
> > > > > 12/12/05 19:55:31 INFO util.NativeCodeLoader: Loaded the native-hadoop library
> > > > > 12/12/05 19:55:31 WARN snappy.LoadSnappy: Snappy native library not loaded
> > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > > > Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:0+22028011
> > > > > Locations:hadoop03:;
> > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > > > Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:22028011+22028011
> > > > > Locations:hadoop03:;
> > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > > > Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:44056022+11526421,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:55582443+11526421
> > > > > Locations:hadoop03:;
> > > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > > > Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:67108864+21515597,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:88624461+21515597
> > > > > Locations:hadoop03:;
> > > > > 12/12/05 19:55:31 INFO mapred.JobClient: Running job: job_201212041541_0245
> > > > > 12/12/05 19:55:32 INFO mapred.JobClient:  map 0% reduce 0%
> > > > > 12/12/05 19:55:47 INFO mapred.JobClient: Task Id :
> > > > > attempt_201212041541_0245_m_000002_0, Status : FAILED
> > > > > java.lang.NumberFormatException: For input string: "Male"
> > > > >  at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > > > >  at java.lang.Integer.parseInt(Integer.java:481)
> > > > >  at java.lang.Integer.valueOf(Integer.java:570)
> > > > >  at member_main.__loadFromFields(member_main.java:1254)
> > > > >  at member_main.parse(member_main.java:1156)
> > > > >  at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > >  at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > >  at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > >  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > >  at java.security.AccessController.doPrivileged(Native Method)
> > > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > >  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > >
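The NumberFormatException above comes from the generated member_main class trying to parse the string "Male" as an integer, which usually means the fields in the HDFS file are not lining up with the table's columns (a delimiter or column-order mismatch shifting a text value into a numeric slot). One way to confirm is to scan the file for rows whose numeric fields contain text. The sketch below is illustrative only: the sample data, the comma delimiter, and the assumption that the 11th field is the integer Age column are all hypothetical.

```shell
# Hypothetical pre-export sanity check: flag rows whose 11th field (assumed to
# be the SMALLINT Age column) is not an integer -- the symptom that makes
# "Male" reach Integer.valueOf. Sample data and field position are made up.
cat > /tmp/member_sample.txt <<'EOF'
1,user1,John,Doe,j@x.com,pw,pw5,1980-01-01 00:00:00.0,Acme,Male,30
2,user2,Jane,Doe,d@x.com,pw,pw5,1981-01-01 00:00:00.0,Acme,Female,Male
EOF
awk -F',' '$11 !~ /^[0-9]+$/ {print "bad Age on line " NR ": " $11}' /tmp/member_sample.txt
```

Against the real data this kind of check would be run over the part-m-00000 file (e.g. piped from `hadoop fs -cat`), using whatever delimiter the export was actually given.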
> > > > > 12/12/05 19:55:51 INFO mapred.JobClient:  map 5% reduce 0%
> > > > > 12/12/05 19:55:54 INFO mapred.JobClient:  map 8% reduce 0%
> > > > > 12/12/05 19:55:54 INFO mapred.JobClient: Task Id :
> > > > > attempt_201212041541_0245_m_000002_1, Status : FAILED
> > > > > java.lang.NumberFormatException: For input string: "Male"
> > > > >  at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > > > >  at java.lang.Integer.parseInt(Integer.java:481)
> > > > >  at java.lang.Integer.valueOf(Integer.java:570)
> > > > >  at member_main.__loadFromFields(member_main.java:1254)
> > > > >  at member_main.parse(member_main.java:1156)
> > > > >  at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > >  at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > >  at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > >  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > >  at java.security.AccessController.doPrivileged(Native Method)
> > > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > >  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > >
> > > > > 12/12/05 19:55:57 INFO mapred.JobClient:  map 14% reduce 0%
> > > > > 12/12/05 19:55:59 INFO mapred.JobClient: Task Id :
> > > > > attempt_201212041541_0245_m_000000_0, Status : FAILED
> > > > > java.lang.IllegalArgumentException: Timestamp format must be
> > > > > yyyy-mm-dd hh:mm:ss[.fffffffff]
> > > > >  at java.sql.Timestamp.valueOf(Timestamp.java:203)
> > > > >  at member_main.__loadFromFields(member_main.java:1239)
> > > > >  at member_main.parse(member_main.java:1156)
> > > > >  at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > >  at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > >  at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > >  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > >  at java.security.AccessController.doPrivileged(Native Method)
> > > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > >  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > >
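The Timestamp failure is java.sql.Timestamp.valueOf rejecting a value that is not in the "yyyy-mm-dd hh:mm:ss[.fffffffff]" form it requires. Offending rows can be flagged with a pattern check before re-running the export. As above, the sample data, the comma delimiter, and treating the 8th field (Birthday in this schema) as the timestamp column are assumptions for illustration.

```shell
# Flag rows whose 8th field does not start with "yyyy-mm-dd hh:mm:ss", the
# shape java.sql.Timestamp.valueOf accepts (optionally followed by .fff...).
# Sample data, delimiter, and field position are made up for illustration.
cat > /tmp/ts_sample.txt <<'EOF'
1,user1,John,Doe,j@x.com,pw,pw5,1980-01-01 00:00:00.0
2,user2,Jane,Doe,d@x.com,pw,pw5,01/02/1981
EOF
awk -F',' '$8 !~ /^[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9] [0-9][0-9]:[0-9][0-9]:[0-9][0-9]/ {print "bad timestamp on line " NR ": " $8}' /tmp/ts_sample.txt
```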
> > > > > 12/12/05 19:56:00 INFO mapred.JobClient:  map 13% reduce 0%
> > > > > 12/12/05 19:56:01 INFO mapred.JobClient: Task Id :
> > > > > attempt_201212041541_0245_m_000002_2, Status : FAILED
> > > > > java.lang.NumberFormatException: For input string: "Male"
> > > > >  at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > > > >  at java.lang.Integer.parseInt(Integer.java:481)
> > > > >  at java.lang.Integer.valueOf(Integer.java:570)
> > > > >  at member_main.__loadFromFields(member_main.java:1254)
> > > > >  at member_main.parse(member_main.java:1156)
> > > > >  at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > >  at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > >  at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > >  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > >  at java.security.AccessController.doPrivileged(Native Method)
> > > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > >  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > >
> > > > > 12/12/05 19:56:03 INFO mapred.JobClient:  map 16% reduce 0%
> > > > > 12/12/05 19:56:13 INFO mapred.JobClient: Job complete: job_201212041541_0245
> > > > > 12/12/05 19:56:13 INFO mapred.JobClient: Counters: 8
> > > > > 12/12/05 19:56:13 INFO mapred.JobClient:   Job Counters
> > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=91611
> > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all
> > > > > reduces waiting after reserving slots (ms)=0
> > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all
> > > > > maps waiting after reserving slots (ms)=0
> > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Rack-local map tasks=5
> > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Launched map tasks=8
> > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Data-local map tasks=3
> > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Failed map tasks=1
> > > > > 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Transferred 0 bytes in
> > > > > 45.077 seconds (0 bytes/sec)
> > > > > 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Exported 0 records.
> > > > > 12/12/05 19:56:13 ERROR tool.ExportTool: Error during export: Export job failed!
> > > > > =====================================
> > > > >
> > > > >
> > > > > Chun-fan
> > > > >
> > > > >
> > > > > On Thu, Dec 6, 2012 at 12:23 AM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
> > > > >
> > > > > > Hi Chun-fan,
> > > > > > thank you very much for sharing the log with us. You are using the
> > > > > > Microsoft SQL Connector because you downloaded it manually from
> > > > > > Microsoft's web pages, and you can confirm that from the following
> > > > > > log lines:
> > > > > >
> > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> > > > > > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > > > > > ...
> > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > > > > >
> > > > > > I'm not sure what is going wrong, as it seems that the data were parsed
> > > > > > correctly but submitting the query to SQL Server fails. As a next step
> > > > > > I would recommend turning the Microsoft Connector off and using the
> > > > > > built-in one instead, to see whether the issue is specific to Sqoop or
> > > > > > to the Connector. You can do that by temporarily moving the file
> > > > > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver somewhere else.
> > > > > >
> > > > > > Jarcec
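Jarcec's suggestion, moving the mssqoop-sqlserver descriptor aside so Sqoop falls back to its built-in SQL Server manager, can be sketched as below. To keep the commands safe to dry-run, a scratch directory stands in for /usr/local/sqoop/conf; the scratch setup (SQOOP_CONF variable, placeholder file) is an assumption for illustration, and pointing SQOOP_CONF at the real conf directory would do it for real.

```shell
# Sketch of disabling the Microsoft connector descriptor. A scratch layout is
# created so the commands can be exercised harmlessly; the placeholder file
# stands in for the real /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver.
SQOOP_CONF=${SQOOP_CONF:-/tmp/sqoop-conf-demo}
mkdir -p "$SQOOP_CONF/managers.d"
touch "$SQOOP_CONF/managers.d/mssqoop-sqlserver"
# Move the descriptor out of managers.d; Sqoop then uses its built-in manager.
mv "$SQOOP_CONF/managers.d/mssqoop-sqlserver" "$SQOOP_CONF/mssqoop-sqlserver.disabled"
ls "$SQOOP_CONF"
```

After re-running the export to see whether the built-in manager behaves differently, moving the file back restores the Microsoft connector.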
> > > > > >
> > > > > > On Wed, Dec 05, 2012 at 12:25:24PM +0800, Chun-fan Ivan Liao wrote:
> > > > > > > Thank you, Jarcec. I'm not sure which connector we use. I've downloaded
> > > > > > > the "Microsoft SQL Server Connector for Apache Hadoop" from
> > > > > > > http://www.microsoft.com/en-us/download/details.aspx?id=27584, but I
> > > > > > > don't remember whether we actually used it. How can I make sure?
> > > > > > >
> > > > > > > And here is the verbose log:
> > > > > > >
> > > > > > > ===========
> > > > > > > 12/12/05 12:08:57 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> > > > > > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > > > > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > > > > > 12/12/05 12:08:57 INFO SqlServer.MSSQLServerManagerFactory: Using
> > > > > > > Microsoft's SQL Server - Hadoop Connector
> > > > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Using default fetchSize of 1000
> > > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > > > > > > 12/12/05 12:08:57 INFO tool.CodeGenTool: Beginning code generation
> > > > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: No connection paramenters
> > > > > > > specified. Using regular API for making connection.
> > > > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> > > > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement:
> > > > > > > SELECT TOP 1 * FROM [member_main]
> > > > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> > > > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement:
> > > > > > > SELECT TOP 1 * FROM [member_main]
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: selected columns:
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   MemberId
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   USERNAME
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstName
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastName
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   EmailAddress
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password_E5
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Birthday
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CompanyName
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Gender
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Age
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Education
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Country
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Title
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone1
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone2
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Fax
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   State
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   City
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address1
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address2
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   ZipCode
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   VATID
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Language
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_letter
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_promotion
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_type
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   JointSource
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CustomerLevel
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UpdateDate
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDate
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstLoginDate
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastLoginDate
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastVisit
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   isValid
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   nJoint
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Upd_SubDate
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UnSub_Type
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDateFloat
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Writing source file:
> > > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Table name: member_main
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Columns: MemberId:4, USERNAME:12,
> > > > > > > FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12,
> > > > > > > Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5,
> > > > > > > Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9,
> > > > > > > Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
> > > > > > > rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
> > > > > > > UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93,
> > > > > > > LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
> > > > > > > CreateDateFloat:8,
> > > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: sourceFilename is member_main.java
> > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Found existing
> > > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > > > 12/12/05 12:08:57 INFO orm.CompilationManager: HADOOP_HOME is
> > > > > > > /usr/local/hadoop/libexec/..
> > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Adding source file:
> > > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Invoking javac with args:
> > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -sourcepath
> > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -d
> > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -classpath
> > > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/
hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/
commons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/loca
l/hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/had
oop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > > 12/12/05 12:08:58 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > > > > > > org.apache.commons.io.FileExistsException: Destination '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
> > > > > > >         at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > > > > > >         at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> > > > > > >         at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > > > > > >         at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> > > > > > >         at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > > > > > >         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> > > > > > >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > > > > >         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> > > > > > >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > > > > > >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> > > > > > >         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > > > > > > 12/12/05 12:08:58 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971
> > > > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.class -> member_main.class
> > > > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > > > > > 12/12/05 12:08:58 INFO mapreduce.ExportJobBase: Beginning export of member_main
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Using InputFormat: class com.cloudera.sqoop.mapreduce.ExportInputFormat
> > > > > > > 12/12/05 12:08:58 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> > > > > > > 12/12/05 12:08:58 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [member_main]
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > > > > > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process : 1
> > > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> > > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Total input bytes=2611
> > > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: maxSplitSize=2611
> > > > > > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process : 1
> > > > > > > 12/12/05 12:09:00 INFO util.NativeCodeLoader: Loaded the native-hadoop library
> > > > > > > 12/12/05 12:09:00 WARN snappy.LoadSnappy: Snappy native library not loaded
> > > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat:   Paths:/user/hadoop/test-ivan/test:0+2611 Locations:hadoop05:;
> > > > > > > 12/12/05 12:09:00 INFO mapred.JobClient: Running job: job_201212041541_0107
> > > > > > > 12/12/05 12:09:01 INFO mapred.JobClient:  map 0% reduce 0%
> > > > > > > 12/12/05 12:09:18 INFO mapred.JobClient: Task Id : attempt_201212041541_0107_m_000000_0, Status : FAILED
> > > > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
> > > > > > >         at com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > > > >         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > > > > >         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > > > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > > > >         at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > > > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
> > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > > > > >         at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > > > >         at com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > > > > >
> > > > > > > 12/12/05 12:09:24 INFO mapred.JobClient: Task Id : attempt_201212041541_0107_m_000000_1, Status : FAILED
> > > > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
> > > > > > > [stack trace identical to attempt _0 above]
> > > > > > >
> > > > > > > 12/12/05 12:09:30 INFO mapred.JobClient: Task Id : attempt_201212041541_0107_m_000000_2, Status : FAILED
> > > > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
> > > > > > > [stack trace identical to attempt _0 above]
> > > > > > >
> > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient: Job complete: job_201212041541_0107
> > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient: Counters: 8
> > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:   Job Counters
> > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=24379
> > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Rack-local map tasks=3
> > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Launched map tasks=4
> > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Data-local map tasks=1
> > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Failed map tasks=1
> > > > > > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 43.0875 seconds (0 bytes/sec)
> > > > > > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Exported 0 records.
> > > > > > > 12/12/05 12:09:41 ERROR tool.ExportTool: Error during export: Export job failed!
> > > > > > > ================
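[Editor's sketch, not part of the original thread: one plausible reading of the repeated "Incorrect syntax near ','" failure. With `sqoop.export.records.per.statement` greater than 1, Sqoop batches several rows into one multi-row `INSERT ... VALUES` statement. Row-constructor lists were only added in SQL Server 2008, so SQL Server 2005 fails to parse the comma between the parenthesized row groups. Table and column names below are borrowed from the log for illustration; the exact generated SQL is an assumption.]

```sql
-- Assumed shape of Sqoop's batched export statement when
-- sqoop.export.records.per.statement > 1 (columns are illustrative):
INSERT INTO member_main (MemberId, USERNAME) VALUES (1, 'a'), (2, 'b'), (3, 'c');
-- SQL Server 2005 predates multi-row VALUES lists (added in SQL Server 2008),
-- so parsing stops at the ',' between the row constructors.

-- With -D sqoop.export.records.per.statement=1, each row becomes a
-- single-row INSERT, which SQL Server 2005 can parse:
INSERT INTO member_main (MemberId, USERNAME) VALUES (1, 'a');
```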
> > > > > > >
> > > > > > > Kind regards,
> > > > > > > Chun-fan
> > > > > > >
> > > > > > > On Wed, Dec 5, 2012 at 12:01 AM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
> > > > > > >
> > > > > > > > Hi Chun-fan,
> > > > > > > > would you mind sharing with us the entire Sqoop log generated with the
> > > > > > > > parameter --verbose? Are you using the built-in Microsoft SQL Connector
> > > > > > > > or the connector provided by Microsoft?
> > > > > > > >
> > > > > > > > Jarcec
> > > > > > > >
> > > > > > > > On Tue, Dec 04, 2012 at 05:51:31PM +0800, Chun-fan Ivan Liao wrote:
> > > > > > > > > Hi,
> > > > > > > > >
> > > > > > > > > We are using Sqoop 1.3.0-cdh3u4 with Hadoop version 1.0.3.
> > > > > > > > >
> > > > > > > > > We encountered the following error when we tried to export an HDFS
> > > > > > > > > file into MSSQL 2005 (partially):
> > > > > > > > >
> > > > > > > > > 12/12/04 16:44:13 INFO mapred.JobClient: Task Id : attempt_201212041541_0014_m_000000_2, Status : FAILED
> > > > > > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
> > > > > > > > > [stack trace identical to the attempts quoted above]
> > > > > > > > >
> > > > > > > > > The HDFS file that we want to export was imported from SQL 2005 using
> > > > > > > > > Sqoop before and uses '|' as the field delimiter, and there are
> > > > > > > > > commas (',') in a field of a line in the file.
> > > > > > > > >
> > > > > > > > > The command I submitted is (generalized with capital letters):
> > > > > > > > >
> > > > > > > > > $ sqoop export -D sqoop.export.records.per.statement=75 -D sqoop.export.statements.per.transaction=75 --connect "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME" --table TABLE_NAME -m 1 --input-fields-terminated-by '|' --export-dir /EXPORT/FROM/DIRECTORY
> > > > > > > > >
> > > > > > > > > I've adjusted the values of sqoop.export.records.per.statement &
> > > > > > > > > sqoop.export.statements.per.transaction, but that didn't help.
> > > > > > > > >
> > > > > > > > > It will be greatly appreciated if you can offer some help. Thanks.
> > > > > > > > >
> > > > > > > > > Ivan

Re: Sqoop export failed: Incorrect syntax near ','

Posted by Chun-fan Ivan Liao <iv...@ivangelion.tw>.
I've upgraded Sqoop to 1.4.2 and copied hadoop-core-1.0.3.jar
and sqljdbc4.jar to /usr/local/sqoop/lib. I've also specified the parameter
"--hive-drop-import-delims" in the command, but the same error remained.
Parameters specified after --hive-drop-import-delims could not be parsed:

==============
12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Error parsing arguments for
export:
12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Unrecognized argument:
--hive-drop-import-delims
12/12/06 23:39:07 ERROR tool.BaseSqoopTool: Unrecognized argument: --verbose

Try --help for usage instructions.
....
==============
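[Editor's note on the parse failure above, with a hedged sketch: `--hive-drop-import-delims` is an argument of the `sqoop import` tool, not `sqoop export`, which is why the export argument parser rejects it and everything that follows it (including `--verbose`). The placeholder names below are carried over from the generalized command earlier in the thread; the target directory is hypothetical.]

```shell
# Sketch only: the delimiter-dropping flag belongs on the *import* side.
# SERVER-NAME, USER_NAME, PASSWD, DB_NAME, TABLE_NAME and the target
# directory are placeholders.
$ sqoop import \
    --connect "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME" \
    --table TABLE_NAME \
    --fields-terminated-by '|' \
    --hive-drop-import-delims \
    --target-dir /IMPORT/TO/DIRECTORY \
    --verbose
```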

Is there anything I can do now, e.g. re-import the data using the default
connector and see whether the imported data can be exported back to SQL Server?
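[Editor's sketch: re-importing with the default connector is one option; it may also help to see concretely what the suggested `--hive-drop-import-delims` flag guards against. The snippet below is plain Python with made-up field values, not Sqoop code; it shows how a delimiter character that leaked into a field shifts every later column, which would explain errors like `NumberFormatException: For input string: "Male"` seen later in the thread.]

```python
# Minimal illustration (not Sqoop code) of why stray delimiter characters
# inside field values corrupt a delimited HDFS export file.
def parse_record(line: str, delim: str = "|") -> list:
    """Naively split one record the way a delimiter-based parser would."""
    return line.split(delim)

# A comma inside a field is harmless when '|' is the field delimiter:
clean = "1001|Smith, J.|Male|2012-12-05 19:55:27"
# But a '|' (or a newline) that leaked into a field shifts every later column:
broken = "1001|Smith|J.|Male|2012-12-05 19:55:27"

print(len(parse_record(clean)))   # 4 fields, matching the table schema
print(len(parse_record(broken)))  # 5 fields: later values land in the wrong columns
```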


On Fri, Dec 7, 2012 at 12:01 PM, Jarek Jarcec Cecho <ja...@apache.org> wrote:

> I see. Would you mind upgrading your Sqoop to the most recent version, 1.4.2?
>
> Jarcec
>
> On Fri, Dec 07, 2012 at 11:19:31AM +0800, Chun-fan Ivan Liao wrote:
> > Hi Jarek,
> >
> > I've tried to use "--hive-drop-import-delims", but Sqoop reported a
> > syntax error:
> >
> >   ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-drop-import-delims
> >
> > Also, should I change from OpenJDK to Oracle JDK in order to make Sqoop
> > export work?
> >
> > Thanks!
> > Ivan
> >
> > On Fri, Dec 7, 2012 at 12:28 AM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
> >
> > > Hi Ivangelion,
> > > I'm glad that you were able to move on with your issue. It seems to me
> > > that you're running on OpenJDK - unfortunately, Sqoop is tested and
> > > supported only on Oracle JDK.
> > >
> > > Based on the exceptions you're hitting:
> > >
> > >   java.lang.NumberFormatException: For input string: "Male"
> > >
> > >   java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
> > >
> > > It seems to me that your input files somehow got corrupted; for example,
> > > for the first exception, Sqoop is looking for a column that should be a
> > > number but found the string "Male" instead. You've mentioned that your
> > > data can contain a lot of wild characters - can it happen that your data
> > > also contains newline characters? Would you mind re-trying the import
> > > with the parameter --hive-drop-import-delims [1] to see if it helps?
> > > (This parameter does not depend on Hive in any way, regardless of its
> > > name.)
> > >
> > > Jarcec
> > >
> > > On Thu, Dec 06, 2012 at 12:03:06PM +0800, Ivangelion wrote:
> > > > Hi Jarek,
> > > >
> > > > It actually worked! Thank you so much~! :D
> > > >
> > > > However, now we face another problem. The data we previously tried to
> > > > export was only test data, with a row count of only 10. When we tried
> > > > to export production data back into SQL Server from an HDFS file that
> > > > was previously imported from SQL Server using Sqoop, different errors
> > > > occurred. The row count is about 400k, and only about 120k rows were
> > > > exported. This time we used "-m 5"; with "-m 1", nothing was exported
> > > > at all. The verbose log is at the bottom of this mail.
> > > >
> > > > Does this have to do with the fact that we used the MS SQL connector
> > > > for the previous import, rather than the default one?
> > > >
> > > > Also, should we specify a character encoding, e.g. UTF-8, during the
> > > > import/export process? There are characters from many different
> > > > languages in our original data in SQL Server, and I'm not sure what
> > > > the encoding is after it is imported into HDFS.
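[Editor's sketch on the encoding question: Hadoop's `Text` type stores strings as UTF-8, so text imported through Sqoop generally lands in HDFS as UTF-8 bytes. The snippet below is plain Python with a made-up field value, not Sqoop code; it illustrates the mojibake that appears when those bytes are later decoded with the wrong charset.]

```python
# Illustration (not Sqoop code): a multi-language field value survives a
# round trip only if every reader decodes the same charset it was written in.
original = "メンバー-Mitglied-член"      # made-up multi-language field value
utf8_bytes = original.encode("utf-8")   # what ends up in the HDFS file

assert utf8_bytes.decode("utf-8") == original  # correct reader: round-trips
garbled = utf8_bytes.decode("latin-1")         # wrong reader: mojibake
assert garbled != original
print(garbled)  # unreadable byte-per-character rendering of the UTF-8 data
```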
> > > >
> > > > Thanks again, Jarek.
> > > >
> > > > =====================================
> > > > 12/12/05 19:55:27 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > > 12/12/05 19:55:27 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:sqlserver:
> > > > 12/12/05 19:55:27 INFO manager.SqlManager: Using default fetchSize of 1000
> > > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > > com.cloudera.sqoop.manager.SQLServerManager@6766afb3
> > > > 12/12/05 19:55:27 INFO tool.CodeGenTool: Beginning code generation
> > > > 12/12/05 19:55:27 DEBUG manager.SqlManager: No connection paramenters
> > > > specified. Using regular API for making connection.
> > > > 12/12/05 19:55:27 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> > > > 12/12/05 19:55:27 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM member_main AS t WHERE 1=0
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: selected columns:
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   MemberId
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   USERNAME
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstName
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastName
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   EmailAddress
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password_E5
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Birthday
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CompanyName
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Gender
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Age
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Education
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Country
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Title
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone1
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone2
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Fax
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   State
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   City
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address1
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address2
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   ZipCode
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   VATID
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Language
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_letter
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_promotion
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_type
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   JointSource
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CustomerLevel
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UpdateDate
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDate
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstLoginDate
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastLoginDate
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastVisit
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   isValid
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   nJoint
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Upd_SubDate
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UnSub_Type
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDateFloat
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Writing source file: /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Table name: member_main
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Columns: MemberId:4, USERNAME:12, FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12, Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5, Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9, Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7, rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4, UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93, LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4, CreateDateFloat:8,
> > > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: sourceFilename is member_main.java
> > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Found existing
> > > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > > 12/12/05 19:55:27 INFO orm.CompilationManager: HADOOP_HOME is
> > > > /usr/local/hadoop/libexec/..
> > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Adding source file: /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Invoking javac with args:
> > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -sourcepath
> > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -d
> > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -classpath
> > > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > > >
> > >
> /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/ha
doop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/co
mmons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/
hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoo
p/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > 12/12/05 19:55:28 ERROR orm.CompilationManager: Could not rename
> > > >
> > >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > > to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > > > org.apache.commons.io.FileExistsException: Destination
> > > > '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already
> exists
> > > >  at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> > > >  at
> com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > > > at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> > > >  at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > > > at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> > > >  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > > at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> > > >  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > > > at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> > > >  at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > > > 12/12/05 19:55:28 INFO orm.CompilationManager: Writing jar file:
> > > >
> > >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> > > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Scanning for .class
> files
> > > > in directory:
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d
> > > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Got classfile:
> > > >
> > >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.class
> > > > -> member_main.class
> > > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Finished writing jar
> file
> > > >
> > >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> > > > 12/12/05 19:55:28 INFO mapreduce.ExportJobBase: Beginning export of
> > > > member_main
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Using InputFormat: class
> > > > com.cloudera.sqoop.mapreduce.ExportInputFormat
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > > > 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to
> > > process
> > > > : 1
> > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Target
> numMapTasks=5
> > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Total input
> > > > bytes=110140058
> > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > maxSplitSize=22028011
> > > > 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to
> > > process
> > > > : 1
> > > > 12/12/05 19:55:31 INFO util.NativeCodeLoader: Loaded the
> native-hadoop
> > > > library
> > > > 12/12/05 19:55:31 WARN snappy.LoadSnappy: Snappy native library not
> > > loaded
> > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Generated
> splits:
> > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > >
> > >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:0+22028011
> > > > Locations:hadoop03:;
> > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > >
> > >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:22028011+22028011
> > > > Locations:hadoop03:;
> > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > >
> > >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:44056022+11526421,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:55582443+11526421
> > > > Locations:hadoop03:;
> > > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > > >
> > >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:67108864+21515597,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:88624461+21515597
> > > > Locations:hadoop03:;
> > > > 12/12/05 19:55:31 INFO mapred.JobClient: Running job:
> > > job_201212041541_0245
> > > > 12/12/05 19:55:32 INFO mapred.JobClient:  map 0% reduce 0%
> > > > 12/12/05 19:55:47 INFO mapred.JobClient: Task Id :
> > > > attempt_201212041541_0245_m_000002_0, Status : FAILED
> > > > java.lang.NumberFormatException: For input string: "Male"
> > > > at
> > > >
> > >
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > > >  at java.lang.Integer.parseInt(Integer.java:481)
> > > > at java.lang.Integer.valueOf(Integer.java:570)
> > > >  at member_main.__loadFromFields(member_main.java:1254)
> > > > at member_main.parse(member_main.java:1156)
> > > >  at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > at java.security.AccessController.doPrivileged(Native Method)
> > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > >
> > > > 12/12/05 19:55:51 INFO mapred.JobClient:  map 5% reduce 0%
> > > > 12/12/05 19:55:54 INFO mapred.JobClient:  map 8% reduce 0%
> > > > 12/12/05 19:55:54 INFO mapred.JobClient: Task Id :
> > > > attempt_201212041541_0245_m_000002_1, Status : FAILED
> > > > java.lang.NumberFormatException: For input string: "Male"
> > > > at
> > > >
> > >
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > > >  at java.lang.Integer.parseInt(Integer.java:481)
> > > > at java.lang.Integer.valueOf(Integer.java:570)
> > > >  at member_main.__loadFromFields(member_main.java:1254)
> > > > at member_main.parse(member_main.java:1156)
> > > >  at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > at java.security.AccessController.doPrivileged(Native Method)
> > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > >
> > > > 12/12/05 19:55:57 INFO mapred.JobClient:  map 14% reduce 0%
> > > > 12/12/05 19:55:59 INFO mapred.JobClient: Task Id :
> > > > attempt_201212041541_0245_m_000000_0, Status : FAILED
> > > > java.lang.IllegalArgumentException: Timestamp format must be
> yyyy-mm-dd
> > > > hh:mm:ss[.fffffffff]
> > > > at java.sql.Timestamp.valueOf(Timestamp.java:203)
> > > >  at member_main.__loadFromFields(member_main.java:1239)
> > > > at member_main.parse(member_main.java:1156)
> > > >  at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > at java.security.AccessController.doPrivileged(Native Method)
> > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > >
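[Editor's note: the IllegalArgumentException in the trace above is thrown by java.sql.Timestamp.valueOf(), which accepts only the JDBC escape layout yyyy-mm-dd hh:mm:ss[.fffffffff]. A minimal sketch of what the generated member_main.__loadFromFields() runs into — the sample values here are hypothetical, not taken from the actual data:]

```java
import java.sql.Timestamp;

public class TimestampFormatDemo {
    public static void main(String[] args) {
        // Accepted: the JDBC escape format that Timestamp.valueOf requires.
        Timestamp ok = Timestamp.valueOf("2012-12-05 19:55:31.0");
        System.out.println("parsed: " + ok);

        // Rejected: any other layout (slashes, missing time part, ...) throws
        // IllegalArgumentException: Timestamp format must be yyyy-mm-dd
        // hh:mm:ss[.fffffffff] -- the exact message seen in the task log.
        try {
            Timestamp.valueOf("12/05/2012 19:55:31");
        } catch (IllegalArgumentException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```

[So one of the exported timestamp columns presumably carries a value in some other format — or a shifted text field, if the delimiter problem below is in play.]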
> > > > 12/12/05 19:56:00 INFO mapred.JobClient:  map 13% reduce 0%
> > > > 12/12/05 19:56:01 INFO mapred.JobClient: Task Id :
> > > > attempt_201212041541_0245_m_000002_2, Status : FAILED
> > > > java.lang.NumberFormatException: For input string: "Male"
> > > > at
> > > >
> > >
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > > >  at java.lang.Integer.parseInt(Integer.java:481)
> > > > at java.lang.Integer.valueOf(Integer.java:570)
> > > >  at member_main.__loadFromFields(member_main.java:1254)
> > > > at member_main.parse(member_main.java:1156)
> > > >  at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > at java.security.AccessController.doPrivileged(Native Method)
> > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > >
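[Editor's note: the recurring NumberFormatException: For input string: "Male" means a text token reached Integer.valueOf() for a numeric column. Per the ClassWriter log further down, Gender (a string column) immediately precedes Age (an integer column), so a plausible cause is a field-delimiter mismatch or an unescaped delimiter inside an earlier field, shifting every subsequent field by one. A sketch of that failure mode — the record layout and comma delimiter are hypothetical:]

```java
public class DelimiterShiftDemo {
    public static void main(String[] args) {
        // Well-formed record: Name, Gender, Age -- Age parses fine.
        String[] good = "Doe,Male,32".split(",");
        System.out.println(Integer.valueOf(good[2]));  // prints 32

        // An unescaped comma inside the Name field shifts every later
        // field by one, so the Gender value lands in the Age slot:
        String[] shifted = "Doe, John,Male,32".split(",");
        try {
            Integer.valueOf(shifted[2]);  // "Male" where Age was expected
        } catch (NumberFormatException e) {
            // Message matches the task log: For input string: "Male"
            System.out.println("NumberFormatException: " + e.getMessage());
        }
    }
}
```

[Checking the export's --input-fields-terminated-by / --input-escaped-by settings against the actual HDFS file would confirm or rule this out.]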
> > > > 12/12/05 19:56:03 INFO mapred.JobClient:  map 16% reduce 0%
> > > > 12/12/05 19:56:13 INFO mapred.JobClient: Job complete:
> > > job_201212041541_0245
> > > > 12/12/05 19:56:13 INFO mapred.JobClient: Counters: 8
> > > > 12/12/05 19:56:13 INFO mapred.JobClient:   Job Counters
> > > > 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=91611
> > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all
> > > > reduces waiting after reserving slots (ms)=0
> > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all
> maps
> > > > waiting after reserving slots (ms)=0
> > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Rack-local map tasks=5
> > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Launched map tasks=8
> > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Data-local map tasks=3
> > > > 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > > > 12/12/05 19:56:13 INFO mapred.JobClient:     Failed map tasks=1
> > > > 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Transferred 0 bytes
> in
> > > > 45.077 seconds (0 bytes/sec)
> > > > 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Exported 0 records.
> > > > 12/12/05 19:56:13 ERROR tool.ExportTool: Error during export: Export
> job
> > > > failed!
> > > > =====================================
> > > >
> > > >
> > > > Chun-fan
> > > >
> > > >
> > > > On Thu, Dec 6, 2012 at 12:23 AM, Jarek Jarcec Cecho <
> jarcec@apache.org
> > > >wrote:
> > > >
> > > > > Hi Chun-fan,
> > > > > thank you very much for sharing the log with us. You are using the
> > > > > Microsoft SQL Connector because you downloaded it manually from
> > > > > Microsoft's web pages; you can also confirm that from the following
> > > > > log lines:
> > > > >
> > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> specified by
> > > > > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > > > > ...
> > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated
> ConnManager
> > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > > > >
> > > > > I'm not sure what is going wrong: it seems that the data were parsed
> > > > > correctly, but submitting the query to the SQL server fails. As a
> > > > > next step I would recommend turning the Microsoft Connector off and
> > > > > using the built-in one instead, to see whether the issue is specific
> > > > > to Sqoop or to the Connector. You can do that by temporarily moving
> > > > > the file /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver somewhere
> > > > > else.
> > > > >
> > > > > Jarcec
> > > > >
> > > > > On Wed, Dec 05, 2012 at 12:25:24PM +0800, Chun-fan Ivan Liao wrote:
> > > > > > Thank you, Jarcec. I'm not sure which connector we use. I've
> > > > > > downloaded the "Microsoft SQL Server Connector for Apache Hadoop"
> > > > > > from http://www.microsoft.com/en-us/download/details.aspx?id=27584,
> > > > > > but I don't remember whether we actually used it. How can I make
> > > > > > sure?
> > > > > >
> > > > > > And here is the verbose log:
> > > > > >
> > > > > > ===========
> > > > > > 12/12/05 12:08:57 DEBUG tool.BaseSqoopTool: Enabled debug
> logging.
> > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> specified by
> > > > > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager
> factory:
> > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager
> factory:
> > > > > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > > > > 12/12/05 12:08:57 INFO SqlServer.MSSQLServerManagerFactory: Using
> > > > > > Microsoft's SQL Server - Hadoop Connector
> > > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Using default
> fetchSize of
> > > > > 1000
> > > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated
> ConnManager
> > > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > > > > > 12/12/05 12:08:57 INFO tool.CodeGenTool: Beginning code
> generation
> > > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: No connection
> paramenters
> > > > > > specified. Using regular API for making connection.
> > > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for
> next
> > > > > query:
> > > > > > 1000
> > > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL
> statement:
> > > > > SELECT
> > > > > > TOP 1 * FROM [member_main]
> > > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for
> next
> > > > > query:
> > > > > > 1000
> > > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL
> statement:
> > > > > SELECT
> > > > > > TOP 1 * FROM [member_main]
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: selected columns:
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   MemberId
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   USERNAME
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstName
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastName
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   EmailAddress
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password_E5
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Birthday
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CompanyName
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Gender
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Age
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Education
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Country
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Title
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone1
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone2
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Fax
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   State
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   City
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address1
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address2
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   ZipCode
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   VATID
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Language
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_letter
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_promotion
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_type
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   JointSource
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CustomerLevel
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UpdateDate
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDate
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstLoginDate
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastLoginDate
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastVisit
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   isValid
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   nJoint
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Upd_SubDate
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UnSub_Type
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDateFloat
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Writing source file:
> > > > > >
> > > > >
> > >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Table name: member_main
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Columns: MemberId:4,
> > > > > USERNAME:12,
> > > > > > FirstName:-9, LastName:-9, EmailAddress:12, Password:12,
> > > Password_E5:12,
> > > > > > Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12,
> > > Country:5,
> > > > > > Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9,
> > > Address1:-9,
> > > > > > Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
> > > > > > rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
> > > > > > UpdateDate:93, CreateDate:93, FirstLoginDate:93,
> LastLoginDate:93,
> > > > > > LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
> > > > > > CreateDateFloat:8,
> > > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: sourceFilename is
> > > > > member_main.java
> > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Found existing
> > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > > 12/12/05 12:08:57 INFO orm.CompilationManager: HADOOP_HOME is
> > > > > > /usr/local/hadoop/libexec/..
> > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Adding source
> file:
> > > > > >
> > > > >
> > >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Invoking javac
> with
> > > args:
> > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -sourcepath
> > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -d
> > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -classpath
> > > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > > > >
> > > > >
> > >
> /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/ha
doop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/co
mmons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/
hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoo
p/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > 12/12/05 12:08:58 ERROR orm.CompilationManager: Could not rename
> > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > > > to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > > > > > org.apache.commons.io.FileExistsException: Destination
> > > > > > '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
> > > > > >         at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > > > > >         at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> > > > > >         at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > > > > >         at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> > > > > >         at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > > > > >         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> > > > > >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > > > >         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> > > > > >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > > > > >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> > > > > >         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > > > > > 12/12/05 12:08:58 INFO orm.CompilationManager: Writing jar file:
> > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Scanning for .class files
> > > > > > in directory: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971
> > > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Got classfile:
> > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.class
> > > > > > -> member_main.class
> > > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Finished writing jar file
> > > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > > > > 12/12/05 12:08:58 INFO mapreduce.ExportJobBase: Beginning export of
> > > > > > member_main
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Using InputFormat: class
> > > > > > com.cloudera.sqoop.mapreduce.ExportInputFormat
> > > > > > 12/12/05 12:08:58 DEBUG manager.SqlManager: Using fetchSize for next
> > > > > > query: 1000
> > > > > > 12/12/05 12:08:58 INFO manager.SqlManager: Executing SQL statement:
> > > > > > SELECT TOP 1 * FROM [member_main]
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath: file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > > > > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process : 1
> > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Total input bytes=2611
> > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: maxSplitSize=2611
> > > > > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process : 1
> > > > > > 12/12/05 12:09:00 INFO util.NativeCodeLoader: Loaded the native-hadoop library
> > > > > > 12/12/05 12:09:00 WARN snappy.LoadSnappy: Snappy native library not loaded
> > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Paths:/user/hadoop/test-ivan/test:0+2611 Locations:hadoop05:;
> > > > > > 12/12/05 12:09:00 INFO mapred.JobClient: Running job: job_201212041541_0107
> > > > > > 12/12/05 12:09:01 INFO mapred.JobClient:  map 0% reduce 0%
> > > > > > 12/12/05 12:09:18 INFO mapred.JobClient: Task Id :
> > > > > > attempt_201212041541_0107_m_000000_0, Status : FAILED
> > > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > > Incorrect syntax near ','.
> > > > > >         at com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > > >         at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > > > >         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > > >         at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > > > > syntax near ','.
> > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > > > >         at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > > > >         at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > > >         at com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > > > >
> > > > > > 12/12/05 12:09:24 INFO mapred.JobClient: Task Id :
> > > > > > attempt_201212041541_0107_m_000000_1, Status : FAILED
> > > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > > Incorrect syntax near ','.
> > > > > > [stack trace identical to the first failed attempt]
> > > > > >
> > > > > > 12/12/05 12:09:30 INFO mapred.JobClient: Task Id :
> > > > > > attempt_201212041541_0107_m_000000_2, Status : FAILED
> > > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > > Incorrect syntax near ','.
> > > > > > [stack trace identical to the first failed attempt]
> > > > > >
> > > > > > 12/12/05 12:09:41 INFO mapred.JobClient: Job complete: job_201212041541_0107
> > > > > > 12/12/05 12:09:41 INFO mapred.JobClient: Counters: 8
> > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:   Job Counters
> > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=24379
> > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Rack-local map tasks=3
> > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Launched map tasks=4
> > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Data-local map tasks=1
> > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Failed map tasks=1
> > > > > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 43.0875 seconds (0 bytes/sec)
> > > > > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Exported 0 records.
> > > > > > 12/12/05 12:09:41 ERROR tool.ExportTool: Error during export: Export job failed!
> > > > > > ================
> > > > > >
> > > > > > Kind regards,
> > > > > > Chun-fan
> > > > > >
> > > > > > On Wed, Dec 5, 2012 at 12:01 AM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
> > > > > >
> > > > > > > Hi Chun-fan,
> > > > > > > would you mind sharing with us the entire Sqoop log generated with the
> > > > > > > parameter --verbose? Are you using the built-in Microsoft SQL Connector
> > > > > > > or the connector provided by Microsoft?
> > > > > > >
> > > > > > > Jarcec
> > > > > > >
> > > > > > > On Tue, Dec 04, 2012 at 05:51:31PM +0800, Chun-fan Ivan Liao wrote:
> > > > > > > > Hi,
> > > > > > > >
> > > > > > > > We are using Sqoop 1.3.0-cdh3u4 with Hadoop version 1.0.3.
> > > > > > > >
> > > > > > > > We encountered the following error when we tried to export an HDFS
> > > > > > > > file into MSSQL 2005 (partially):
> > > > > > > >
> > > > > > > > 12/12/04 16:44:13 INFO mapred.JobClient: Task Id :
> > > > > > > > attempt_201212041541_0014_m_000000_2, Status : FAILED
> > > > > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > > > > Incorrect syntax near ','.
> > > > > > > > [stack trace identical to the one at the top of this thread]
> > > > > > > >
> > > > > > > > The HDFS file that we want to export was imported using Sqoop from
> > > > > > > > SQL 2005 before and uses ‘|’ as the field delimiter, and there are
> > > > > > > > commas (‘,’) in a field of a line in the file.
> > > > > > > >
> > > > > > > > The command I submitted is (generalized with capital letters):
> > > > > > > >
> > > > > > > > $ sqoop export -D sqoop.export.records.per.statement=75 -D
> > > > > > > > sqoop.export.statements.per.transaction=75 --connect
> > > > > > > > "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME"
> > > > > > > > --table TABLE_NAME -m 1 --input-fields-terminated-by '|' --export-dir
> > > > > > > > /EXPORT/FROM/DIRECTORY
> > > > > > > >
> > > > > > > > I’ve adjusted the values of sqoop.export.records.per.statement &
> > > > > > > > sqoop.export.statements.per.transaction, but that didn’t help.
> > > > > > > >
> > > > > > > > It will be greatly appreciated if you can offer some help. Thanks.
> > > > > > > >
> > > > > > > > Ivan
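As background for the -D flags in the quoted command above: sqoop.export.records.per.statement controls how many rows Sqoop packs into each prepared INSERT statement. The sketch below is a hypothetical illustration of the resulting statement shape, not Sqoop's actual code generation; one relevant fact is that multi-row VALUES lists were only introduced in SQL Server 2008, so SQL Server 2005 cannot parse the comma between row groups.

```python
def build_insert(table: str, columns: list, records_per_statement: int) -> str:
    """Sketch of a batched, parameterized INSERT with one placeholder group
    per row. With records_per_statement > 1 this produces a multi-row VALUES
    list, which SQL Server 2005 (unlike 2008+) does not accept."""
    row = "(" + ", ".join("?" for _ in columns) + ")"
    rows = ", ".join(row for _ in range(records_per_statement))
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES {rows}"

print(build_insert("member_main", ["MemberId", "USERNAME"], 2))
# -> INSERT INTO member_main (MemberId, USERNAME) VALUES (?, ?), (?, ?)
```

Whether this batching is the actual trigger of the "Incorrect syntax near ','" error in this thread is not confirmed; the poster reports that adjusting the value did not help.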

Re: Sqoop export failed: Incorrect syntax near ','

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
I see. Would you mind upgrading your Sqoop to the most recent version, 1.4.2?

Jarcec

On Fri, Dec 07, 2012 at 11:19:31AM +0800, Chun-fan Ivan Liao wrote:
> Hi Jarek,
> 
> I've tried to use "--hive-drop-import-delims", but sqoop reported an error:
> 
>   ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-drop-import-delims
> 
> Also, should I change from OpenJDK to Oracle JDK in order to make the Sqoop
> export work?
> 
> Thanks!
> Ivan
> 
> On Fri, Dec 7, 2012 at 12:28 AM, Jarek Jarcec Cecho <ja...@apache.org> wrote:
> 
> > Hi Ivangelion,
> > I'm glad that you were able to move on with your issue. It seems to me
> > that you're running on OpenJDK - unfortunately, Sqoop is tested and
> > supported only on Oracle JDK.
> >
> > Based on the exceptions you're hitting:
> >
> >   java.lang.NumberFormatException: For input string: "Male"
> >
> >   java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd
> > hh:mm:ss[.fffffffff
> >
> > It seems to me that your input files somehow got corrupted: for example, in
> > the first exception Sqoop is looking for a column that should be a number
> > but found the string "Male" instead. You've mentioned that your data can
> > contain a lot of wild characters; could it be that your data also contains
> > newline characters? Would you mind re-trying the import with the parameter
> > --hive-drop-import-delims [1] to see if it helps? (This parameter does not
> > depend on Hive in any way, regardless of its name.)
> >
> > Jarcec
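For readers unfamiliar with the flag discussed above, here is a minimal Python sketch of the effect of --hive-drop-import-delims. It is an illustration of the documented behavior, not Sqoop's implementation: the option strips the characters Hive treats as delimiters (newline, carriage return, and Ctrl-A) from string fields during import, so a stray newline inside a field cannot split one record across two lines.

```python
def drop_import_delims(field: str) -> str:
    """Remove the Hive delimiter characters \n, \r and \x01 from a field,
    mimicking what Sqoop's --hive-drop-import-delims does to string columns."""
    for delim in ("\n", "\r", "\x01"):
        field = field.replace(delim, "")
    return field

print(drop_import_delims("Male\nextra"))  # -> Maleextra
```

This is why the flag is relevant to the corrupted-row symptoms above even though no Hive import is involved.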
> >
> > On Thu, Dec 06, 2012 at 12:03:06PM +0800, Ivangelion wrote:
> > > Hi Jarek,
> > >
> > > It actually worked! Thank you so much~! :D
> > >
> > > However, now we face another problem. The data we previously tried to
> > > export was only test data, with a row count of only 10. When we tried to
> > > export production data back into SQL Server from an HDFS file which was
> > > previously imported using Sqoop from SQL Server, different errors
> > > occurred. The row count is about 400k, and only about 120k rows were
> > > exported. This time we used "-m 5"; if we use "-m 1", nothing is
> > > exported. The verbose log is at the bottom of this mail.
> > >
> > > Does this have to do with the fact that we used the MS SQL connector for
> > > the previous import, not the default one?
> > >
> > > Also, should we specify a character encoding, e.g. UTF-8, during the
> > > import/export process? There are characters of many different languages
> > > in our original data in SQL Server, and I'm not sure what the encoding
> > > is after it is imported into HDFS.
> > >
> > > Thanks again, Jarek.
> > >
> > > =====================================
> > > 12/12/05 19:55:27 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > 12/12/05 19:55:27 DEBUG manager.DefaultManagerFactory: Trying with
> > > scheme: jdbc:sqlserver:
> > > 12/12/05 19:55:27 INFO manager.SqlManager: Using default fetchSize of 1000
> > > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > com.cloudera.sqoop.manager.SQLServerManager@6766afb3
> > > 12/12/05 19:55:27 INFO tool.CodeGenTool: Beginning code generation
> > > 12/12/05 19:55:27 DEBUG manager.SqlManager: No connection paramenters
> > > specified. Using regular API for making connection.
> > > 12/12/05 19:55:27 DEBUG manager.SqlManager: Using fetchSize for next
> > > query: 1000
> > > 12/12/05 19:55:27 INFO manager.SqlManager: Executing SQL statement:
> > > SELECT t.* FROM member_main AS t WHERE 1=0
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: selected columns:
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   MemberId
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   USERNAME
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstName
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastName
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   EmailAddress
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password_E5
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Birthday
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CompanyName
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Gender
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Age
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Education
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Country
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Title
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone1
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone2
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Fax
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   State
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   City
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address1
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address2
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   ZipCode
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   VATID
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Language
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_letter
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_promotion
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_type
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   JointSource
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CustomerLevel
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UpdateDate
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDate
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstLoginDate
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastLoginDate
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastVisit
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   isValid
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   nJoint
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Upd_SubDate
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UnSub_Type
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDateFloat
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Writing source file:
> > >
> > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Table name: member_main
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Columns: MemberId:4,
> > USERNAME:12,
> > > FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12,
> > > Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5,
> > > Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9,
> > > Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
> > > rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
> > > UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93,
> > > LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
> > > CreateDateFloat:8,
> > > 12/12/05 19:55:27 DEBUG orm.ClassWriter: sourceFilename is
> > member_main.java
> > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Found existing
> > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > 12/12/05 19:55:27 INFO orm.CompilationManager: HADOOP_HOME is
> > > /usr/local/hadoop/libexec/..
> > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Adding source file:
> > >
> > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Invoking javac with args:
> > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -sourcepath
> > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -d
> > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -classpath
> > > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > >
> > /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/
hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/
commons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/loca
l/hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/had
oop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > 12/12/05 19:55:28 ERROR orm.CompilationManager: Could not rename
> > >
> > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > > to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > > org.apache.commons.io.FileExistsException: Destination
> > > '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
> > >  at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > > at
> > >
> > com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> > >  at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > > at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> > >  at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > > at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> > >  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> > >  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > > at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> > >  at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > > 12/12/05 19:55:28 INFO orm.CompilationManager: Writing jar file:
> > >
> > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Scanning for .class files
> > > in directory: /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d
> > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Got classfile:
> > >
> > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.class
> > > -> member_main.class
> > > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Finished writing jar file
> > >
> > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> > > 12/12/05 19:55:28 INFO mapreduce.ExportJobBase: Beginning export of
> > > member_main
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Using InputFormat: class
> > > com.cloudera.sqoop.mapreduce.ExportInputFormat
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > > 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to
> > process
> > > : 1
> > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=5
> > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Total input
> > > bytes=110140058
> > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > maxSplitSize=22028011
> > > 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to
> > process
> > > : 1
> > > 12/12/05 19:55:31 INFO util.NativeCodeLoader: Loaded the native-hadoop
> > > library
> > > 12/12/05 19:55:31 WARN snappy.LoadSnappy: Snappy native library not
> > loaded
> > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > >
> > Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:0+22028011
> > > Locations:hadoop03:;
> > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > >
> > Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:22028011+22028011
> > > Locations:hadoop03:;
> > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > >
> > Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:44056022+11526421,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:55582443+11526421
> > > Locations:hadoop03:;
> > > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> > >
> > Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:67108864+21515597,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:88624461+21515597
> > > Locations:hadoop03:;
> > > 12/12/05 19:55:31 INFO mapred.JobClient: Running job:
> > job_201212041541_0245
> > > 12/12/05 19:55:32 INFO mapred.JobClient:  map 0% reduce 0%
> > > 12/12/05 19:55:47 INFO mapred.JobClient: Task Id :
> > > attempt_201212041541_0245_m_000002_0, Status : FAILED
> > > java.lang.NumberFormatException: For input string: "Male"
> > > at
> > >
> > java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > >  at java.lang.Integer.parseInt(Integer.java:481)
> > > at java.lang.Integer.valueOf(Integer.java:570)
> > >  at member_main.__loadFromFields(member_main.java:1254)
> > > at member_main.parse(member_main.java:1156)
> > >  at
> > >
> > com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > at
> > >
> > com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > at
> > >
> > com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > at java.security.AccessController.doPrivileged(Native Method)
> > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > at
> > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > >
> > > 12/12/05 19:55:51 INFO mapred.JobClient:  map 5% reduce 0%
> > > 12/12/05 19:55:54 INFO mapred.JobClient:  map 8% reduce 0%
> > > 12/12/05 19:55:54 INFO mapred.JobClient: Task Id :
> > > attempt_201212041541_0245_m_000002_1, Status : FAILED
> > > java.lang.NumberFormatException: For input string: "Male"
> > > at
> > >
> > java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > >  at java.lang.Integer.parseInt(Integer.java:481)
> > > at java.lang.Integer.valueOf(Integer.java:570)
> > >  at member_main.__loadFromFields(member_main.java:1254)
> > > at member_main.parse(member_main.java:1156)
> > >  at
> > >
> > com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > at
> > >
> > com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > at
> > >
> > com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > at java.security.AccessController.doPrivileged(Native Method)
> > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > at
> > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > >
> > > 12/12/05 19:55:57 INFO mapred.JobClient:  map 14% reduce 0%
> > > 12/12/05 19:55:59 INFO mapred.JobClient: Task Id :
> > > attempt_201212041541_0245_m_000000_0, Status : FAILED
> > > java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd
> > > hh:mm:ss[.fffffffff]
> > > at java.sql.Timestamp.valueOf(Timestamp.java:203)
> > >  at member_main.__loadFromFields(member_main.java:1239)
> > > at member_main.parse(member_main.java:1156)
> > >  at
> > >
> > com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > at
> > >
> > com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > at
> > >
> > com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > at java.security.AccessController.doPrivileged(Native Method)
> > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > at
> > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > >
> > > 12/12/05 19:56:00 INFO mapred.JobClient:  map 13% reduce 0%
> > > 12/12/05 19:56:01 INFO mapred.JobClient: Task Id :
> > > attempt_201212041541_0245_m_000002_2, Status : FAILED
> > > java.lang.NumberFormatException: For input string: "Male"
> > > at
> > >
> > java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> > >  at java.lang.Integer.parseInt(Integer.java:481)
> > > at java.lang.Integer.valueOf(Integer.java:570)
> > >  at member_main.__loadFromFields(member_main.java:1254)
> > > at member_main.parse(member_main.java:1156)
> > >  at
> > >
> > com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > > at
> > >
> > com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> > >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > at
> > >
> > com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > at java.security.AccessController.doPrivileged(Native Method)
> > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > at
> > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > >
> > > 12/12/05 19:56:03 INFO mapred.JobClient:  map 16% reduce 0%
> > > 12/12/05 19:56:13 INFO mapred.JobClient: Job complete:
> > job_201212041541_0245
> > > 12/12/05 19:56:13 INFO mapred.JobClient: Counters: 8
> > > 12/12/05 19:56:13 INFO mapred.JobClient:   Job Counters
> > > 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=91611
> > > 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all
> > > reduces waiting after reserving slots (ms)=0
> > > 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all maps
> > > waiting after reserving slots (ms)=0
> > > 12/12/05 19:56:13 INFO mapred.JobClient:     Rack-local map tasks=5
> > > 12/12/05 19:56:13 INFO mapred.JobClient:     Launched map tasks=8
> > > 12/12/05 19:56:13 INFO mapred.JobClient:     Data-local map tasks=3
> > > 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > > 12/12/05 19:56:13 INFO mapred.JobClient:     Failed map tasks=1
> > > 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Transferred 0 bytes in
> > > 45.077 seconds (0 bytes/sec)
> > > 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Exported 0 records.
> > > 12/12/05 19:56:13 ERROR tool.ExportTool: Error during export: Export job
> > > failed!
> > > =====================================
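
Two independent parse failures are visible in the task logs above: the string "Male" reaching Integer.valueOf (which suggests a delimiter or column-order mismatch between the HDFS file and the table schema, so a text field lands in a numeric slot), and a timestamp value that does not match yyyy-mm-dd hh:mm:ss[.fffffffff]. A minimal Python sketch of the first failure mode — hypothetical, simplified field names; the real conversion happens in the Sqoop-generated member_main.java:

```python
# Hypothetical, simplified schema: Sqoop's generated record class splits
# each HDFS line on the field delimiter and converts each token to the
# column's mapped type, in schema order.
schema = [("MemberId", int), ("Gender", str), ("Age", int)]

def parse(line, delim=","):
    tokens = line.split(delim)
    return [conv(tok) for (_name, conv), tok in zip(schema, tokens)]

# Field order matches the schema: parses cleanly.
assert parse("123,Male,35") == [123, "Male", 35]

# Shifted columns: the text token "Male" hits int(), the Python
# analogue of the NumberFormatException in the log above.
try:
    parse("Male,123,35")
except ValueError as exc:
    print("parse failed:", exc)
```

If the export file was produced with a non-default delimiter, passing the matching --input-fields-terminated-by to sqoop export is the usual fix for this class of error.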
> > >
> > >
> > > Chun-fan
> > >
> > >
> > > On Thu, Dec 6, 2012 at 12:23 AM, Jarek Jarcec Cecho <jarcec@apache.org>
> > > wrote:
> > >
> > > > Hi Chun-fan,
> > > > thank you very much for sharing the log with us. You are using the
> > > > Microsoft SQL Connector because you downloaded it manually from the
> > > > Microsoft web pages; you can also confirm that from the following log
> > > > lines:
> > > >
> > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> > > > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > > > ...
> > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > > >
> > > > I'm not sure what is going wrong, as it seems that the data were parsed
> > > > correctly but submitting the query to SQL Server fails. As a next step
> > > > I would recommend turning the Microsoft Connector off and using the
> > > > built-in one instead, to see whether the issue is specific to Sqoop or
> > > > to the Connector. You can do that by temporarily moving the file
> > > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver somewhere else.
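
One way to carry out that suggestion, assuming the paths quoted in this thread (a configuration change, so back up the descriptor rather than deleting it):

```shell
# Stash the connector descriptor so Sqoop falls back to its built-in
# SQL Server manager, then restore it after the comparison run.
CONF=/usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
mv "$CONF" "$CONF.disabled"
# ...rerun the sqoop export here and compare behaviour...
mv "$CONF.disabled" "$CONF"   # re-enable the Microsoft connector
```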
> > > >
> > > > Jarcec
> > > >
> > > > On Wed, Dec 05, 2012 at 12:25:24PM +0800, Chun-fan Ivan Liao wrote:
> > > > > Thank you, Jarcec. I'm not sure which connector we use. I've
> > > > > downloaded "Microsoft SQL Server Connector for Apache Hadoop" from
> > > > > http://www.microsoft.com/en-us/download/details.aspx?id=27584, but I
> > > > > don't remember whether we actually used it. How can I make sure?
> > > > >
> > > > > And here is the verbose log:
> > > > >
> > > > > ===========
> > > > > 12/12/05 12:08:57 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> > > > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > > > 12/12/05 12:08:57 INFO SqlServer.MSSQLServerManagerFactory: Using
> > > > > Microsoft's SQL Server - Hadoop Connector
> > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Using default fetchSize of
> > > > 1000
> > > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > > > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > > > > 12/12/05 12:08:57 INFO tool.CodeGenTool: Beginning code generation
> > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: No connection paramenters
> > > > > specified. Using regular API for making connection.
> > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next
> > > > query:
> > > > > 1000
> > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement:
> > > > SELECT
> > > > > TOP 1 * FROM [member_main]
> > > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next
> > > > query:
> > > > > 1000
> > > > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement:
> > > > SELECT
> > > > > TOP 1 * FROM [member_main]
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: selected columns:
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   MemberId
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   USERNAME
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstName
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastName
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   EmailAddress
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password_E5
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Birthday
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CompanyName
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Gender
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Age
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Education
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Country
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Title
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone1
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone2
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Fax
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   State
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   City
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address1
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address2
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   ZipCode
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   VATID
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Language
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_letter
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_promotion
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_type
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   JointSource
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CustomerLevel
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UpdateDate
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDate
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstLoginDate
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastLoginDate
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastVisit
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   isValid
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   nJoint
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Upd_SubDate
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UnSub_Type
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDateFloat
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Writing source file:
> > > > >
> > > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Table name: member_main
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Columns: MemberId:4,
> > > > USERNAME:12,
> > > > > FirstName:-9, LastName:-9, EmailAddress:12, Password:12,
> > Password_E5:12,
> > > > > Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12,
> > Country:5,
> > > > > Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9,
> > Address1:-9,
> > > > > Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
> > > > > rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
> > > > > UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93,
> > > > > LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
> > > > > CreateDateFloat:8,
> > > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: sourceFilename is
> > > > member_main.java
> > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Found existing
> > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > 12/12/05 12:08:57 INFO orm.CompilationManager: HADOOP_HOME is
> > > > > /usr/local/hadoop/libexec/..
> > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Adding source file:
> > > > >
> > > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Invoking javac with
> > args:
> > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -sourcepath
> > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -d
> > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -classpath
> > > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > > >
> > > >
> > /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/
hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/
commons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/loca
l/hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/had
oop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > 12/12/05 12:08:58 ERROR orm.CompilationManager: Could not rename
> > > > >
> > > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > > to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > > > > org.apache.commons.io.FileExistsException: Destination
> > > > > '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already
> > exists
> > > > >  at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > > > > at
> > > > >
> > > >
> > com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> > > > >  at
> > com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > > > > at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> > > > >  at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > > > > at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> > > > >  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > > > at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> > > > >  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > > > > at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> > > > >  at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > > > > 12/12/05 12:08:58 INFO orm.CompilationManager: Writing jar file:
> > > > >
> > > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Scanning for .class
> > files
> > > > > in directory:
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971
> > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Got classfile:
> > > > >
> > > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.class
> > > > > -> member_main.class
> > > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Finished writing jar
> > file
> > > > >
> > > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > > > 12/12/05 12:08:58 INFO mapreduce.ExportJobBase: Beginning export of
> > > > > member_main
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Using InputFormat: class
> > > > > com.cloudera.sqoop.mapreduce.ExportInputFormat
> > > > > 12/12/05 12:08:58 DEBUG manager.SqlManager: Using fetchSize for next
> > > > query:
> > > > > 1000
> > > > > 12/12/05 12:08:58 INFO manager.SqlManager: Executing SQL statement:
> > > > SELECT
> > > > > TOP 1 * FROM [member_main]
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > > file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > > > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to
> > > > process
> > > > > : 1
> > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Target
> > numMapTasks=1
> > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Total input
> > > > bytes=2611
> > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat:
> > maxSplitSize=2611
> > > > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to
> > > > process
> > > > > : 1
> > > > > 12/12/05 12:09:00 INFO util.NativeCodeLoader: Loaded the
> > native-hadoop
> > > > > library
> > > > > 12/12/05 12:09:00 WARN snappy.LoadSnappy: Snappy native library not
> > > > loaded
> > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Generated
> > splits:
> > > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat:
> > > > > Paths:/user/hadoop/test-ivan/test:0+2611 Locations:hadoop05:;
> > > > > 12/12/05 12:09:00 INFO mapred.JobClient: Running job:
> > > > job_201212041541_0107
> > > > > 12/12/05 12:09:01 INFO mapred.JobClient:  map 0% reduce 0%
> > > > > 12/12/05 12:09:18 INFO mapred.JobClient: Task Id :
> > > > > attempt_201212041541_0107_m_000000_0, Status : FAILED
> > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > Incorrect syntax near ','.
> > > > >  at
> > > > >
> > > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > > at
> > > > >
> > > >
> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > at
> > > > >
> > > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > > > syntax near ','.
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > > >  at
> > com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > > at
> > > > >
> > > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > > >
> > > > > 12/12/05 12:09:24 INFO mapred.JobClient: Task Id :
> > > > > attempt_201212041541_0107_m_000000_1, Status : FAILED
> > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > Incorrect syntax near ','.
> > > > >  at
> > > > >
> > > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > > at
> > > > >
> > > >
> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > at
> > > > >
> > > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > > > syntax near ','.
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > > >  at
> > com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > > at
> > > > >
> > > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > > >
> > > > > 12/12/05 12:09:30 INFO mapred.JobClient: Task Id :
> > > > > attempt_201212041541_0107_m_000000_2, Status : FAILED
> > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > Incorrect syntax near ','.
> > > > >  at
> > > > >
> > > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > > at
> > > > >
> > > >
> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > at
> > > > >
> > > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > > > syntax near ','.
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > > >  at
> > com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > > at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > > >  at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > > at
> > > > >
> > > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > > >
> > > > > 12/12/05 12:09:41 INFO mapred.JobClient: Job complete:
> > > > job_201212041541_0107
> > > > > 12/12/05 12:09:41 INFO mapred.JobClient: Counters: 8
> > > > > 12/12/05 12:09:41 INFO mapred.JobClient:   Job Counters
> > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=24379
> > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all
> > > > > reduces waiting after reserving slots (ms)=0
> > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all
> > maps
> > > > > waiting after reserving slots (ms)=0
> > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Rack-local map tasks=3
> > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Launched map tasks=4
> > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Data-local map tasks=1
> > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Failed map tasks=1
> > > > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Transferred 0 bytes
> > in
> > > > > 43.0875 seconds (0 bytes/sec)
> > > > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Exported 0 records.
> > > > > 12/12/05 12:09:41 ERROR tool.ExportTool: Error during export: Export
> > job
> > > > > failed!
> > > > > ================
> > > > >
> > > > > Kind regards,
> > > > > Chun-fan
> > > > >
> > > > > On Wed, Dec 5, 2012 at 12:01 AM, Jarek Jarcec Cecho <
> > jarcec@apache.org
> > > > >wrote:
> > > > >
> > > > > > Hi Chun-fan,
> > > > > > would you mind sharing with us the entire Sqoop log generated with
> > > > > > the parameter --verbose? Are you using the built-in Microsoft SQL
> > > > > > Connector or the connector provided by Microsoft?
> > > > > >
> > > > > > Jarcec
> > > > > >
> > > > > > On Tue, Dec 04, 2012 at 05:51:31PM +0800, Chun-fan Ivan Liao wrote:
> > > > > > > Hi,
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > > We are using Sqoop 1.3.0-cdh3u4 with Hadoop version 1.0.3.
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > > We encountered the following error when we try to export HDFS
> > file
> > > > into
> > > > > > > MSSQL 2005 (partially):
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > > 12/12/04 16:44:13 INFO mapred.JobClient: Task Id :
> > > > > > > attempt_201212041541_0014_m_000000_2, Status : FAILED
> > > > > > >
> > > > > > > java.io.IOException:
> > com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > > > Incorrect syntax near ','.
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > > > > >
> > > > > > >         at
> > > > > > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > > > >
> > > > > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > > > >
> > > > > > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > > >
> > > > > > >         at java.security.AccessController.doPrivileged(Native
> > Method)
> > > > > > >
> > > > > > >         at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > > > >
> > > > > > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > > >
> > > > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException:
> > Incorrect
> > > > > > > syntax near ','.
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > > > > >
> > > > > > >         at
> > > > > > >
> > com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > > > >
> > > > > > >         at
> > > > > > >
> > > > > >
> > > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > > The HDFS file that we want to export was previously imported from
> > > > > > > SQL Server 2005 using Sqoop and uses ‘|’ as the field delimiter,
> > > > > > > and some fields in the file contain commas (‘,’).
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > > The command I submitted is (generalized with capital letters):
> > > > > > >
> > > > > > > $ sqoop export -D sqoop.export.records.per.statement=75 -D
> > > > > > > sqoop.export.statements.per.transaction=75 --connect
> > > > > > >
> > > > > >
> > > >
> > "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME"
> > > > > > > --table TABLE_NAME -m 1 --input-fields-terminated-by '|'
> > --export-dir
> > > > > > > /EXPORT/FROM/DIRECTORY
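
A hedged sketch of the same export with Sqoop's input-parsing options added
(SERVER-NAME, USER_NAME, PASSWD, DB_NAME, TABLE_NAME and the directory are the
placeholders from the message above). This only helps if the HDFS file was
written with matching enclosure/escape characters at import time:

```shell
# Sketch only -- all capitalized values are placeholders, not real settings.
# The --input-* options tell `sqoop export` how fields in the HDFS file are
# quoted/escaped, so embedded ',' or '|' characters are not read as delimiters.
sqoop export \
  --connect "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME" \
  --table TABLE_NAME -m 1 \
  --export-dir /EXPORT/FROM/DIRECTORY \
  --input-fields-terminated-by '|' \
  --input-optionally-enclosed-by '"' \
  --input-escaped-by '\\'
```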
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > > I’ve adjusted the values of sqoop.export.records.per.statement and
> > > > > > > sqoop.export.statements.per.transaction, but that didn’t help.
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > > It will be greatly appreciated if you can offer some help.
> > Thanks.
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > > Ivan
> > > > > >
> > > >
> >

Re: Sqoop export failed: Incorrect syntax near ','

Posted by Chun-fan Ivan Liao <iv...@ivangelion.tw>.
Hi Jarek,

I've tried to use "--hive-drop-import-delims", but Sqoop reported a syntax
error:

  ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-drop-import-delims
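
One plausible reason for this error, assuming the Sqoop 1.3 option layout:
--hive-drop-import-delims belongs to `sqoop import`, not `sqoop export`, so
it has to be applied when the data is (re-)imported. A hedged sketch, with the
connection string values as placeholders:

```shell
# Hypothetical re-import; SERVER-NAME, USER_NAME, PASSWD and DB_NAME are
# placeholders. --hive-drop-import-delims strips \n, \r and \01 from string
# fields at import time, so the resulting HDFS file parses cleanly line by line.
sqoop import \
  --connect "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME" \
  --table member_main -m 1 \
  --fields-terminated-by '|' \
  --hive-drop-import-delims \
  --target-dir /user/hadoop/test-ivan/test
```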

Also, should I change Java OpenJDK to Oracle JDK in order to make Sqoop
export work?

Thanks!
Ivan

On Fri, Dec 7, 2012 at 12:28 AM, Jarek Jarcec Cecho <ja...@apache.org>wrote:

> Hi Ivangelion,
> I'm glad that you were able to move on with your issue. It seems to me
> that you're running on OpenJDK - unfortunately, Sqoop is tested and
> supported only on the Oracle JDK.
>
> Based on the exceptions you're hitting:
>
>   java.lang.NumberFormatException: For input string: "Male"
>
>   java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd
> hh:mm:ss[.fffffffff
>
> It seems to me your input files somehow got corrupted; for example, for
> the first exception, Sqoop is looking for a column that should be a number
> but found the string "Male" instead. You've mentioned that your data can
> contain a lot of wild characters - can it happen that your data also
> contains newline characters? Would you mind re-trying the import with the
> parameter --hive-drop-import-delims [1] to see if it helps? (This parameter
> does not depend on Hive in any way, regardless of its name.)
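
The failure mode described above can be demonstrated with a tiny shell
experiment (the record values are invented): an embedded newline turns two
records into three "lines", shifting every later field into the wrong column.

```shell
# Two '|'-delimited records; the second has a newline inside its second
# field (invented data). Line-oriented parsing now sees three records,
# which is how a text value like "Male" ends up in a numeric column.
printf '1001|Male|2012-01-01 00:00:00\n1002|Ma\nle|2012-01-02 00:00:00\n' \
  | wc -l   # counts 3 lines for 2 records
```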
>
> Jarcec
>
> On Thu, Dec 06, 2012 at 12:03:06PM +0800, Ivangelion wrote:
> > Hi Jarek,
> >
> > It actually worked! Thank you so much~! :D
> >
> > However, we now face another problem. The data we previously tried to
> > export was only test data, with a row count of only 10. When we tried to
> > export production data back into SQL Server from an HDFS file that was
> > previously imported from SQL Server using Sqoop, different errors
> > occurred. The row count is about 400k, and only about 120k rows were
> > exported. This time we used "-m 5"; with "-m 1", nothing is exported at
> > all. A verbose log is at the bottom of this mail.
> >
> > Does this have to do with the fact that we used the MS SQL connector for
> > the previous import, not the default one?
> >
> > Also, should we specify a character encoding, e.g. UTF-8, during the
> > import/export process? There are characters from many different languages
> > in our original data in SQL Server, and I'm not sure what the encoding is
> > after import into HDFS.
> >
> > Thanks again, Jarek.
> >
> > =====================================
> > 12/12/05 19:55:27 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > com.cloudera.sqoop.manager.DefaultManagerFactory
> > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > com.cloudera.sqoop.manager.DefaultManagerFactory
> > 12/12/05 19:55:27 DEBUG manager.DefaultManagerFactory: Trying with
> scheme:
> > jdbc:sqlserver:
> > 12/12/05 19:55:27 INFO manager.SqlManager: Using default fetchSize of
> 1000
> > 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > com.cloudera.sqoop.manager.SQLServerManager@6766afb3
> > 12/12/05 19:55:27 INFO tool.CodeGenTool: Beginning code generation
> > 12/12/05 19:55:27 DEBUG manager.SqlManager: No connection paramenters
> > specified. Using regular API for making connection.
> > 12/12/05 19:55:27 DEBUG manager.SqlManager: Using fetchSize for next
> query:
> > 1000
> > 12/12/05 19:55:27 INFO manager.SqlManager: Executing SQL statement:
> SELECT
> > t.* FROM member_main AS t WHERE 1=0
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter: selected columns:
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   MemberId
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   USERNAME
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstName
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastName
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   EmailAddress
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password_E5
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Birthday
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CompanyName
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Gender
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Age
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Education
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Country
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Title
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone1
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone2
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Fax
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   State
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   City
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address1
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address2
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   ZipCode
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   VATID
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Language
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_letter
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_promotion
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_type
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   JointSource
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CustomerLevel
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UpdateDate
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDate
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstLoginDate
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastLoginDate
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastVisit
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   isValid
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   nJoint
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Upd_SubDate
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UnSub_Type
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDateFloat
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Writing source file:
> >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Table name: member_main
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter: Columns: MemberId:4,
> USERNAME:12,
> > FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12,
> > Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5,
> > Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9,
> > Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
> > rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
> > UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93,
> > LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
> > CreateDateFloat:8,
> > 12/12/05 19:55:27 DEBUG orm.ClassWriter: sourceFilename is
> member_main.java
> > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Found existing
> > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > 12/12/05 19:55:27 INFO orm.CompilationManager: HADOOP_HOME is
> > /usr/local/hadoop/libexec/..
> > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Adding source file:
> >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > 12/12/05 19:55:27 DEBUG orm.CompilationManager: Invoking javac with args:
> > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -sourcepath
> > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -d
> > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> > /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> > 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -classpath
> > 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> >
> /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/ha
doop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/co
mmons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/
hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoo
p/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > 12/12/05 19:55:28 ERROR orm.CompilationManager: Could not rename
> >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> > to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > org.apache.commons.io.FileExistsException: Destination
> > '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
> >  at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > at
> >
> com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> >  at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> >  at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> >  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> >  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> >  at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > 12/12/05 19:55:28 INFO orm.CompilationManager: Writing jar file:
> >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Scanning for .class files
> > in directory: /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d
> > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Got classfile:
> >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.class
> > -> member_main.class
> > 12/12/05 19:55:28 DEBUG orm.CompilationManager: Finished writing jar file
> >
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> > 12/12/05 19:55:28 INFO mapreduce.ExportJobBase: Beginning export of
> > member_main
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Using InputFormat: class
> > com.cloudera.sqoop.mapreduce.ExportInputFormat
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to
> process
> > : 1
> > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=5
> > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Total input
> > bytes=110140058
> > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> maxSplitSize=22028011
> > 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to
> process
> > : 1
> > 12/12/05 19:55:31 INFO util.NativeCodeLoader: Loaded the native-hadoop
> > library
> > 12/12/05 19:55:31 WARN snappy.LoadSnappy: Snappy native library not
> loaded
> > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:0+22028011
> > Locations:hadoop03:;
> > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:22028011+22028011
> > Locations:hadoop03:;
> > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:44056022+11526421,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:55582443+11526421
> > Locations:hadoop03:;
> > 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> >
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:67108864+21515597,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:88624461+21515597
> > Locations:hadoop03:;
> > 12/12/05 19:55:31 INFO mapred.JobClient: Running job:
> job_201212041541_0245
> > 12/12/05 19:55:32 INFO mapred.JobClient:  map 0% reduce 0%
> > 12/12/05 19:55:47 INFO mapred.JobClient: Task Id :
> > attempt_201212041541_0245_m_000002_0, Status : FAILED
> > java.lang.NumberFormatException: For input string: "Male"
> > at
> >
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> >  at java.lang.Integer.parseInt(Integer.java:481)
> > at java.lang.Integer.valueOf(Integer.java:570)
> >  at member_main.__loadFromFields(member_main.java:1254)
> > at member_main.parse(member_main.java:1156)
> >  at
> >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > at
> >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > at
> >
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > at java.security.AccessController.doPrivileged(Native Method)
> >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> >
> > 12/12/05 19:55:51 INFO mapred.JobClient:  map 5% reduce 0%
> > 12/12/05 19:55:54 INFO mapred.JobClient:  map 8% reduce 0%
> > 12/12/05 19:55:54 INFO mapred.JobClient: Task Id :
> > attempt_201212041541_0245_m_000002_1, Status : FAILED
> > java.lang.NumberFormatException: For input string: "Male"
> > at
> >
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> >  at java.lang.Integer.parseInt(Integer.java:481)
> > at java.lang.Integer.valueOf(Integer.java:570)
> >  at member_main.__loadFromFields(member_main.java:1254)
> > at member_main.parse(member_main.java:1156)
> >  at
> >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > at
> >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > at
> >
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > at java.security.AccessController.doPrivileged(Native Method)
> >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> >
> > 12/12/05 19:55:57 INFO mapred.JobClient:  map 14% reduce 0%
> > 12/12/05 19:55:59 INFO mapred.JobClient: Task Id :
> > attempt_201212041541_0245_m_000000_0, Status : FAILED
> > java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd
> > hh:mm:ss[.fffffffff]
> > at java.sql.Timestamp.valueOf(Timestamp.java:203)
> >  at member_main.__loadFromFields(member_main.java:1239)
> > at member_main.parse(member_main.java:1156)
> >  at
> >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > at
> >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > at
> >
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > at java.security.AccessController.doPrivileged(Native Method)
> >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> >
> > 12/12/05 19:56:00 INFO mapred.JobClient:  map 13% reduce 0%
> > 12/12/05 19:56:01 INFO mapred.JobClient: Task Id :
> > attempt_201212041541_0245_m_000002_2, Status : FAILED
> > java.lang.NumberFormatException: For input string: "Male"
> > at
> >
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
> >  at java.lang.Integer.parseInt(Integer.java:481)
> > at java.lang.Integer.valueOf(Integer.java:570)
> >  at member_main.__loadFromFields(member_main.java:1254)
> > at member_main.parse(member_main.java:1156)
> >  at
> >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> > at
> >
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
> >  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > at
> >
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
> >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > at java.security.AccessController.doPrivileged(Native Method)
> >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> >
> > 12/12/05 19:56:03 INFO mapred.JobClient:  map 16% reduce 0%
> > 12/12/05 19:56:13 INFO mapred.JobClient: Job complete:
> job_201212041541_0245
> > 12/12/05 19:56:13 INFO mapred.JobClient: Counters: 8
> > 12/12/05 19:56:13 INFO mapred.JobClient:   Job Counters
> > 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=91611
> > 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all
> > reduces waiting after reserving slots (ms)=0
> > 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all maps
> > waiting after reserving slots (ms)=0
> > 12/12/05 19:56:13 INFO mapred.JobClient:     Rack-local map tasks=5
> > 12/12/05 19:56:13 INFO mapred.JobClient:     Launched map tasks=8
> > 12/12/05 19:56:13 INFO mapred.JobClient:     Data-local map tasks=3
> > 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > 12/12/05 19:56:13 INFO mapred.JobClient:     Failed map tasks=1
> > 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Transferred 0 bytes in
> > 45.077 seconds (0 bytes/sec)
> > 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Exported 0 records.
> > 12/12/05 19:56:13 ERROR tool.ExportTool: Error during export: Export job
> > failed!
> > =====================================
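One hedged reading of the NumberFormatException in the log above: in member_main the column order runs ... CompanyName, Gender, Age, ... If an earlier field such as CompanyName contains an unescaped delimiter, every later field shifts left by one and the generated __loadFromFields ends up calling Integer.valueOf on the Gender value. The rows below are invented to illustrate the shift:

```python
# Hedged sketch of the field-shift theory behind
# java.lang.NumberFormatException: For input string: "Male".
columns = ["CompanyName", "Gender", "Age"]

def parse_age(row):
    fields = row.split(",")
    return int(fields[columns.index("Age")])   # what the generated code does

print(parse_age("ACME,Male,42"))               # 42: a clean row parses fine
try:
    parse_age("ACME, Inc.,Male,42")            # comma inside CompanyName
except ValueError as err:
    print(err)                                 # invalid literal for int(): 'Male'
```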
> >
> >
> > Chun-fan
> >
> >
> > On Thu, Dec 6, 2012 at 12:23 AM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
> >
> > > Hi Chun-fan,
> > > thank you very much for sharing the log with us. You are using the
> > > Microsoft SQL Connector because you downloaded it manually from
> > > Microsoft's web pages, and you can confirm that from the following log
> > > lines:
> > >
> > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> > > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > > ...
> > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > >
> > > I'm not sure what is going wrong, as it seems the data were parsed
> > > correctly but submitting the query to SQL Server fails. As a next step
> > > I would recommend turning the Microsoft Connector off and using the
> > > built-in one instead, to see whether the issue is specific to Sqoop or
> > > to the Connector. You can do that by temporarily moving the file
> > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver somewhere else.
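Jarcec's suggestion amounts to moving one descriptor file aside and later moving it back. A sketch against a scratch directory so it runs anywhere; in a real install the directory would be /usr/local/sqoop/conf/managers.d:

```python
# Disabling a third-party connector: move its descriptor out of
# conf/managers.d; restoring it is moving the file back. Demonstrated
# against a scratch directory; substitute the real Sqoop conf path.
import os, shutil, tempfile

conf = os.path.join(tempfile.mkdtemp(), "managers.d")
backup = tempfile.mkdtemp()
os.makedirs(conf)
desc = os.path.join(conf, "mssqoop-sqlserver")
with open(desc, "w") as f:
    f.write("com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory\n")

shutil.move(desc, backup)          # connector disabled: Sqoop falls back to built-in
print(os.listdir(conf))            # []

# ...re-run the sqoop export here, then restore:
shutil.move(os.path.join(backup, "mssqoop-sqlserver"), conf)
print(os.listdir(conf))            # ['mssqoop-sqlserver']
```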
> > >
> > > Jarcec
> > >
> > > On Wed, Dec 05, 2012 at 12:25:24PM +0800, Chun-fan Ivan Liao wrote:
> > > > Thank you, Jarcec. I'm not sure which connector we use. I've
> > > > downloaded "Microsoft SQL Server Connector for Apache Hadoop" from
> > > > http://www.microsoft.com/en-us/download/details.aspx?id=27584, but I
> > > > don't remember whether we actually used it. How can I make sure?
> > > >
> > > > And here is the verbose log:
> > > >
> > > > ===========
> > > > 12/12/05 12:08:57 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> > > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > > 12/12/05 12:08:57 INFO SqlServer.MSSQLServerManagerFactory: Using
> > > > Microsoft's SQL Server - Hadoop Connector
> > > > 12/12/05 12:08:57 INFO manager.SqlManager: Using default fetchSize of
> > > 1000
> > > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > > > 12/12/05 12:08:57 INFO tool.CodeGenTool: Beginning code generation
> > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: No connection paramenters
> > > > specified. Using regular API for making connection.
> > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next
> > > query:
> > > > 1000
> > > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement:
> > > SELECT
> > > > TOP 1 * FROM [member_main]
> > > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next
> > > query:
> > > > 1000
> > > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement:
> > > SELECT
> > > > TOP 1 * FROM [member_main]
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: selected columns:
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   MemberId
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   USERNAME
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstName
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastName
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   EmailAddress
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password_E5
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Birthday
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CompanyName
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Gender
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Age
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Education
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Country
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Title
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone1
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone2
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Fax
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   State
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   City
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address1
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address2
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   ZipCode
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   VATID
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Language
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_letter
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_promotion
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_type
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   JointSource
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CustomerLevel
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UpdateDate
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDate
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstLoginDate
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastLoginDate
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastVisit
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   isValid
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   nJoint
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Upd_SubDate
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UnSub_Type
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDateFloat
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Writing source file:
> > > >
> > >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Table name: member_main
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Columns: MemberId:4,
> > > USERNAME:12,
> > > > FirstName:-9, LastName:-9, EmailAddress:12, Password:12,
> Password_E5:12,
> > > > Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12,
> Country:5,
> > > > Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9,
> Address1:-9,
> > > > Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
> > > > rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
> > > > UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93,
> > > > LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
> > > > CreateDateFloat:8,
> > > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: sourceFilename is
> > > member_main.java
> > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Found existing
> > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > 12/12/05 12:08:57 INFO orm.CompilationManager: HADOOP_HOME is
> > > > /usr/local/hadoop/libexec/..
> > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Adding source file:
> > > >
> > >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Invoking javac with
> args:
> > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -sourcepath
> > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -d
> > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -classpath
> > > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > >
> > >
> /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/ha
doop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/co
mmons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/
hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoo
p/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > 12/12/05 12:08:58 ERROR orm.CompilationManager: Could not rename
> > > >
> > >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > > to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > > > org.apache.commons.io.FileExistsException: Destination
> > > > '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already
> exists
> > > >  at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> > > >  at
> com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > > > at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> > > >  at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > > > at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> > > >  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > > at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> > > >  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > > > at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> > > >  at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > > > 12/12/05 12:08:58 INFO orm.CompilationManager: Writing jar file:
> > > >
> > >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Scanning for .class
> files
> > > > in directory:
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971
> > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Got classfile:
> > > >
> > >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.class
> > > > -> member_main.class
> > > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Finished writing jar
> file
> > > >
> > >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > > 12/12/05 12:08:58 INFO mapreduce.ExportJobBase: Beginning export of
> > > > member_main
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Using InputFormat: class
> > > > com.cloudera.sqoop.mapreduce.ExportInputFormat
> > > > 12/12/05 12:08:58 DEBUG manager.SqlManager: Using fetchSize for next
> > > query:
> > > > 1000
> > > > 12/12/05 12:08:58 INFO manager.SqlManager: Executing SQL statement:
> > > SELECT
> > > > TOP 1 * FROM [member_main]
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to
> > > process
> > > > : 1
> > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Target
> numMapTasks=1
> > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Total input
> > > bytes=2611
> > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat:
> maxSplitSize=2611
> > > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to
> > > process
> > > > : 1
> > > > 12/12/05 12:09:00 INFO util.NativeCodeLoader: Loaded the
> native-hadoop
> > > > library
> > > > 12/12/05 12:09:00 WARN snappy.LoadSnappy: Snappy native library not
> > > loaded
> > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Generated
> splits:
> > > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat:
> > > > Paths:/user/hadoop/test-ivan/test:0+2611 Locations:hadoop05:;
> > > > 12/12/05 12:09:00 INFO mapred.JobClient: Running job:
> > > job_201212041541_0107
> > > > 12/12/05 12:09:01 INFO mapred.JobClient:  map 0% reduce 0%
> > > > 12/12/05 12:09:18 INFO mapred.JobClient: Task Id :
> > > > attempt_201212041541_0107_m_000000_0, Status : FAILED
> > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > Incorrect syntax near ','.
> > > >  at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > at
> > > >
> > >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > at java.security.AccessController.doPrivileged(Native Method)
> > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > > syntax near ','.
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > >  at
> com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > >
> > > > 12/12/05 12:09:24 INFO mapred.JobClient: Task Id :
> > > > attempt_201212041541_0107_m_000000_1, Status : FAILED
> > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > Incorrect syntax near ','.
> > > >  at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > at
> > > >
> > >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > at java.security.AccessController.doPrivileged(Native Method)
> > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > > syntax near ','.
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > >  at
> com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > >
> > > > 12/12/05 12:09:30 INFO mapred.JobClient: Task Id :
> > > > attempt_201212041541_0107_m_000000_2, Status : FAILED
> > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > Incorrect syntax near ','.
> > > >  at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > at
> > > >
> > >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > at java.security.AccessController.doPrivileged(Native Method)
> > > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > > syntax near ','.
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > >  at
> com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > >  at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > >
> > > > 12/12/05 12:09:41 INFO mapred.JobClient: Job complete:
> > > job_201212041541_0107
> > > > 12/12/05 12:09:41 INFO mapred.JobClient: Counters: 8
> > > > 12/12/05 12:09:41 INFO mapred.JobClient:   Job Counters
> > > > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=24379
> > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all
> > > > reduces waiting after reserving slots (ms)=0
> > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all
> maps
> > > > waiting after reserving slots (ms)=0
> > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Rack-local map tasks=3
> > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Launched map tasks=4
> > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Data-local map tasks=1
> > > > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > > > 12/12/05 12:09:41 INFO mapred.JobClient:     Failed map tasks=1
> > > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Transferred 0 bytes
> in
> > > > 43.0875 seconds (0 bytes/sec)
> > > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Exported 0 records.
> > > > 12/12/05 12:09:41 ERROR tool.ExportTool: Error during export: Export
> job
> > > > failed!
> > > > ================
> > > >
> > > > Kind regards,
> > > > Chun-fan
> > > >
> > > > On Wed, Dec 5, 2012 at 12:01 AM, Jarek Jarcec Cecho <
> jarcec@apache.org
> > > >wrote:
> > > >
> > > > > HiChun-fan,
> > > > > would you mind sharing with us entire Sqoop log generated with
> > > parameter
> > > > > --verbose? Are you using build-in Microsoft SQL Connector or
> connector
> > > > > provided by Microsoft?
> > > > >
> > > > > Jarcec
> > > > >
> > > > > On Tue, Dec 04, 2012 at 05:51:31PM +0800, Chun-fan Ivan Liao wrote:
> > > > > > Hi,
> > > > > >
> > > > > >
> > > > > >
> > > > > > We are using Sqoop 1.3.0-cdh3u4 with Hadoop version 1.0.3.
> > > > > >
> > > > > >
> > > > > >
> > > > > > We encountered the following error when we try to export HDFS
> file
> > > into
> > > > > > MSSQL 2005 (partially):
> > > > > >
> > > > > >
> > > > > >
> > > > > > 12/12/04 16:44:13 INFO mapred.JobClient: Task Id :
> > > > > > attempt_201212041541_0014_m_000000_2, Status : FAILED
> > > > > >
> > > > > > java.io.IOException:
> com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > > Incorrect syntax near ','.
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > > > >
> > > > > >         at
> > > > > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > > >
> > > > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > > >
> > > > > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > > >
> > > > > >         at java.security.AccessController.doPrivileged(Native
> Method)
> > > > > >
> > > > > >         at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > > >
> > > > > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > > >
> > > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException:
> Incorrect
> > > > > > syntax near ','.
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > > > >
> > > > > >         at
> > > > > >
> com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > > >
> > > > > >         at
> > > > > >
> > > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > > > >
> > > > > >
> > > > > >
> > > > > > The HDFS file that we want to export was imported using sqoop
> from
> > > SQL
> > > > > 2005
> > > > > > before and uses ‘|’ as field delimiter, and there are commas
> (‘,’)
> > > in a
> > > > > > field of a line in the file.
> > > > > >
> > > > > >
> > > > > >
> > > > > > The commands I submitted is (generalized with capital letters):
> > > > > >
> > > > > > $ sqoop export -D sqoop.export.records.per.statement=75 -D
> > > > > > sqoop.export.statements.per.transaction=75 --connect
> > > > > >
> > > > >
> > >
> "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME"
> > > > > > --table TABLE_NAME -m 1 --input-fields-terminated-by '|'
> --export-dir
> > > > > > /EXPORT/FROM/DIRECTORY
> > > > > >
> > > > > >
> > > > > >
> > > > > > I’ve adjusted values of sqoop.export.records.per.statement &
> > > > > > sqoop.export.statements.per.transaction, but that didn’t help.
> > > > > >
> > > > > >
> > > > > >
> > > > > > It will be greatly appreciated if you can offer some help.
> Thanks.
> > > > > >
> > > > > >
> > > > > >
> > > > > > Ivan
> > > > >
> > >
>

Re: Sqoop export failed: Incorrect syntax near ','

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Ivangelion,
I'm glad that you were able to move on with your issue. It seems to me that you're running on OpenJDK - unfortunately Sqoop is tested and supported only on the Oracle JDK.

Based on the exceptions you're hitting:

  java.lang.NumberFormatException: For input string: "Male"

  java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff

It seems to me that your input files somehow got corrupted. For example, for the first exception Sqoop is parsing a column that should be a number but found the string "Male" instead. You've mentioned that your data can contain a lot of wild characters - can it also contain newline characters? Would you mind re-trying the import with the parameter --hive-drop-import-delims [1] to see if it helps? (Despite its name, this parameter does not depend on Hive in any way.)
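To illustrate the failure mode Jarcec describes (a minimal sketch - the file name, delimiter, and 4-column layout here are made up for the example, not taken from the member_main table): a line-oriented reader splits a '|'-delimited file at every newline, so a newline embedded inside a field turns one record into two short ones and shifts every later column, which is exactly how a string like "Male" can land where a number is expected.

```shell
# Hypothetical sample: record 2 has a newline embedded in a text field.
printf '1|john|Male|35\n2|ja\nne|Female|28\n' > /tmp/member_sample.txt

# Quick diagnostic: intact records have 4 fields; anything else got split.
awk -F'|' 'NF != 4 {print "line " NR ": " NF " fields"}' /tmp/member_sample.txt
# -> line 2: 2 fields
# -> line 3: 3 fields
```

The same field-count check run against the real export directory (e.g. piping `hadoop fs -cat` through the awk one-liner) is one way to confirm whether embedded delimiters or newlines are present before re-importing with --hive-drop-import-delims.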

Jarcec

On Thu, Dec 06, 2012 at 12:03:06PM +0800, Ivangelion wrote:
> Hi Jarek,
> 
> It actually worked! Thank you so much~! :D
> 
> However, now we face another problem. The data we previously tried to export
> was only test data, with a row count of just 10. When we tried to export
> production data back into SQL Server from an HDFS file that had previously
> been imported from SQL Server using Sqoop, different errors occurred. The row
> count is about 400k, and only about 120k rows were exported. This time we
> used "-m 5"; with "-m 1", nothing was exported at all. The verbose log is at
> the bottom of this mail.
> 
> Could this be related to the fact that we used the MS SQL connector for the
> previous import, rather than the default one?
> 
> Also, should we specify a character encoding, e.g. UTF-8, during the
> import/export process? Our original data in SQL Server contains characters
> from many different languages, and I'm not sure what the encoding is after
> import into HDFS.
> 
> Thanks again, Jarek.
> 
> =====================================
> 12/12/05 19:55:27 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Loaded manager factory:
> com.cloudera.sqoop.manager.DefaultManagerFactory
> 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> com.cloudera.sqoop.manager.DefaultManagerFactory
> 12/12/05 19:55:27 DEBUG manager.DefaultManagerFactory: Trying with scheme:
> jdbc:sqlserver:
> 12/12/05 19:55:27 INFO manager.SqlManager: Using default fetchSize of 1000
> 12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> com.cloudera.sqoop.manager.SQLServerManager@6766afb3
> 12/12/05 19:55:27 INFO tool.CodeGenTool: Beginning code generation
> 12/12/05 19:55:27 DEBUG manager.SqlManager: No connection paramenters
> specified. Using regular API for making connection.
> 12/12/05 19:55:27 DEBUG manager.SqlManager: Using fetchSize for next query:
> 1000
> 12/12/05 19:55:27 INFO manager.SqlManager: Executing SQL statement: SELECT
> t.* FROM member_main AS t WHERE 1=0
> 12/12/05 19:55:27 DEBUG orm.ClassWriter: selected columns:
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   MemberId
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   USERNAME
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstName
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastName
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   EmailAddress
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password_E5
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Birthday
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CompanyName
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Gender
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Age
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Education
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Country
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Title
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone1
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone2
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Fax
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   State
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   City
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address1
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address2
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   ZipCode
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   VATID
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Language
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_letter
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_promotion
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_type
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   JointSource
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CustomerLevel
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UpdateDate
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDate
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstLoginDate
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastLoginDate
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastVisit
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   isValid
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   nJoint
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   Upd_SubDate
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   UnSub_Type
> 12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDateFloat
> 12/12/05 19:55:27 DEBUG orm.ClassWriter: Writing source file:
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> 12/12/05 19:55:27 DEBUG orm.ClassWriter: Table name: member_main
> 12/12/05 19:55:27 DEBUG orm.ClassWriter: Columns: MemberId:4, USERNAME:12,
> FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12,
> Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5,
> Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9,
> Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
> rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
> UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93,
> LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
> CreateDateFloat:8,
> 12/12/05 19:55:27 DEBUG orm.ClassWriter: sourceFilename is member_main.java
> 12/12/05 19:55:27 DEBUG orm.CompilationManager: Found existing
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> 12/12/05 19:55:27 INFO orm.CompilationManager: HADOOP_HOME is
> /usr/local/hadoop/libexec/..
> 12/12/05 19:55:27 DEBUG orm.CompilationManager: Adding source file:
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> 12/12/05 19:55:27 DEBUG orm.CompilationManager: Invoking javac with args:
> 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -sourcepath
> 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -d
> 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
> 12/12/05 19:55:27 DEBUG orm.CompilationManager:   -classpath
> 12/12/05 19:55:27 DEBUG orm.CompilationManager:
> /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/ha
doop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/co
mmons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/
hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoo
p/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> 12/12/05 19:55:28 ERROR orm.CompilationManager: Could not rename
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
> to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> org.apache.commons.io.FileExistsException: Destination
> '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
>  at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> at
> com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
>  at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
>  at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
>  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
>  at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> 12/12/05 19:55:28 INFO orm.CompilationManager: Writing jar file:
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> 12/12/05 19:55:28 DEBUG orm.CompilationManager: Scanning for .class files
> in directory: /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d
> 12/12/05 19:55:28 DEBUG orm.CompilationManager: Got classfile:
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.class
> -> member_main.class
> 12/12/05 19:55:28 DEBUG orm.CompilationManager: Finished writing jar file
> /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
> 12/12/05 19:55:28 INFO mapreduce.ExportJobBase: Beginning export of
> member_main
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Using InputFormat: class
> com.cloudera.sqoop.mapreduce.ExportInputFormat
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/sqljdbc4.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/avro-1.5.4.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/sqljdbc4.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/paranamer-2.3.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/commons-io-1.4.jar
> 12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to process
> : 1
> 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=5
> 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Total input
> bytes=110140058
> 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: maxSplitSize=22028011
> 12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to process
> : 1
> 12/12/05 19:55:31 INFO util.NativeCodeLoader: Loaded the native-hadoop
> library
> 12/12/05 19:55:31 WARN snappy.LoadSnappy: Snappy native library not loaded
> 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Generated splits:
> 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:0+22028011
> Locations:hadoop03:;
> 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:22028011+22028011
> Locations:hadoop03:;
> 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:44056022+11526421,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:55582443+11526421
> Locations:hadoop03:;
> 12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
> Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:67108864+21515597,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:88624461+21515597
> Locations:hadoop03:;
> 12/12/05 19:55:31 INFO mapred.JobClient: Running job: job_201212041541_0245
> 12/12/05 19:55:32 INFO mapred.JobClient:  map 0% reduce 0%
> 12/12/05 19:55:47 INFO mapred.JobClient: Task Id :
> attempt_201212041541_0245_m_000002_0, Status : FAILED
> java.lang.NumberFormatException: For input string: "Male"
> at
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>  at java.lang.Integer.parseInt(Integer.java:481)
> at java.lang.Integer.valueOf(Integer.java:570)
>  at member_main.__loadFromFields(member_main.java:1254)
> at member_main.parse(member_main.java:1156)
>  at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
>  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> at
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:416)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> 
> 12/12/05 19:55:51 INFO mapred.JobClient:  map 5% reduce 0%
> 12/12/05 19:55:54 INFO mapred.JobClient:  map 8% reduce 0%
> 12/12/05 19:55:54 INFO mapred.JobClient: Task Id :
> attempt_201212041541_0245_m_000002_1, Status : FAILED
> java.lang.NumberFormatException: For input string: "Male"
> at
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>  at java.lang.Integer.parseInt(Integer.java:481)
> at java.lang.Integer.valueOf(Integer.java:570)
>  at member_main.__loadFromFields(member_main.java:1254)
> at member_main.parse(member_main.java:1156)
>  at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
>  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> at
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:416)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> 
> 12/12/05 19:55:57 INFO mapred.JobClient:  map 14% reduce 0%
> 12/12/05 19:55:59 INFO mapred.JobClient: Task Id :
> attempt_201212041541_0245_m_000000_0, Status : FAILED
> java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd
> hh:mm:ss[.fffffffff]
> at java.sql.Timestamp.valueOf(Timestamp.java:203)
>  at member_main.__loadFromFields(member_main.java:1239)
> at member_main.parse(member_main.java:1156)
>  at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
>  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> at
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:416)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> 
> 12/12/05 19:56:00 INFO mapred.JobClient:  map 13% reduce 0%
> 12/12/05 19:56:01 INFO mapred.JobClient: Task Id :
> attempt_201212041541_0245_m_000002_2, Status : FAILED
> java.lang.NumberFormatException: For input string: "Male"
> at
> java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>  at java.lang.Integer.parseInt(Integer.java:481)
> at java.lang.Integer.valueOf(Integer.java:570)
>  at member_main.__loadFromFields(member_main.java:1254)
> at member_main.parse(member_main.java:1156)
>  at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
> at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
>  at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> at
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:416)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> 
> 12/12/05 19:56:03 INFO mapred.JobClient:  map 16% reduce 0%
> 12/12/05 19:56:13 INFO mapred.JobClient: Job complete: job_201212041541_0245
> 12/12/05 19:56:13 INFO mapred.JobClient: Counters: 8
> 12/12/05 19:56:13 INFO mapred.JobClient:   Job Counters
> 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=91611
> 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all
> reduces waiting after reserving slots (ms)=0
> 12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all maps
> waiting after reserving slots (ms)=0
> 12/12/05 19:56:13 INFO mapred.JobClient:     Rack-local map tasks=5
> 12/12/05 19:56:13 INFO mapred.JobClient:     Launched map tasks=8
> 12/12/05 19:56:13 INFO mapred.JobClient:     Data-local map tasks=3
> 12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> 12/12/05 19:56:13 INFO mapred.JobClient:     Failed map tasks=1
> 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Transferred 0 bytes in
> 45.077 seconds (0 bytes/sec)
> 12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Exported 0 records.
> 12/12/05 19:56:13 ERROR tool.ExportTool: Error during export: Export job
> failed!
> =====================================
> 
> 
> Chun-fan
> 
> 
> On Thu, Dec 6, 2012 at 12:23 AM, Jarek Jarcec Cecho <ja...@apache.org>wrote:
> 
> > Hi Chun-fan,
> > thank you very much for sharing the log with us. You are using the
> > Microsoft SQL Connector because you downloaded it manually from the
> > Microsoft web pages; you can confirm that from the following log lines:
> >
> > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > ...
> > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> >
> > I'm not sure what is going wrong, as it seems the data were parsed
> > correctly but submitting the query to SQL Server fails. As a next step I
> > would recommend turning the Microsoft Connector off and using the
> > built-in one instead, to see whether the issue is specific to Sqoop or to
> > the Connector. You can do that by temporarily moving the file
> > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver somewhere else.
> >
> > Jarcec
> >
> > On Wed, Dec 05, 2012 at 12:25:24PM +0800, Chun-fan Ivan Liao wrote:
> > > Thank you, Jarcec. I'm not sure which connector we use. I downloaded
> > > "Microsoft SQL Server Connector for Apache Hadoop" from
> > > http://www.microsoft.com/en-us/download/details.aspx?id=27584, but I
> > > don't remember whether we actually used it. How can I make sure?
> > >
> > > And here is the verbose log:
> > >
> > > ===========
> > > 12/12/05 12:08:57 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> > > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > 12/12/05 12:08:57 INFO SqlServer.MSSQLServerManagerFactory: Using
> > > Microsoft's SQL Server - Hadoop Connector
> > > 12/12/05 12:08:57 INFO manager.SqlManager: Using default fetchSize of
> > 1000
> > > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > > 12/12/05 12:08:57 INFO tool.CodeGenTool: Beginning code generation
> > > 12/12/05 12:08:57 DEBUG manager.SqlManager: No connection paramenters
> > > specified. Using regular API for making connection.
> > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next
> > query:
> > > 1000
> > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement:
> > SELECT
> > > TOP 1 * FROM [member_main]
> > > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next
> > query:
> > > 1000
> > > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement:
> > SELECT
> > > TOP 1 * FROM [member_main]
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: selected columns:
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   MemberId
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   USERNAME
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstName
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastName
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   EmailAddress
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password_E5
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Birthday
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CompanyName
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Gender
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Age
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Education
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Country
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Title
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone1
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone2
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Fax
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   State
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   City
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address1
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address2
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   ZipCode
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   VATID
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Language
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_letter
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_promotion
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_type
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   JointSource
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CustomerLevel
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UpdateDate
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDate
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstLoginDate
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastLoginDate
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastVisit
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   isValid
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   nJoint
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Upd_SubDate
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UnSub_Type
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDateFloat
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Writing source file:
> > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Table name: member_main
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Columns: MemberId:4,
> > USERNAME:12,
> > > FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12,
> > > Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5,
> > > Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9,
> > > Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
> > > rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
> > > UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93,
> > > LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
> > > CreateDateFloat:8,
> > > 12/12/05 12:08:57 DEBUG orm.ClassWriter: sourceFilename is
> > member_main.java
> > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Found existing
> > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > 12/12/05 12:08:57 INFO orm.CompilationManager: HADOOP_HOME is
> > > /usr/local/hadoop/libexec/..
> > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Adding source file:
> > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Invoking javac with args:
> > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -sourcepath
> > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -d
> > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -classpath
> > > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > >
> > /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/
hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/
commons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/loca
l/hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/had
oop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > 12/12/05 12:08:58 ERROR orm.CompilationManager: Could not rename
> > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > > to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > > org.apache.commons.io.FileExistsException: Destination
> > > '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
> > >  at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > > at
> > >
> > com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> > >  at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > > at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> > >  at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > > at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> > >  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> > >  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > > at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> > >  at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > > 12/12/05 12:08:58 INFO orm.CompilationManager: Writing jar file:
> > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Scanning for .class files
> > > in directory: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971
> > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Got classfile:
> > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.class
> > > -> member_main.class
> > > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Finished writing jar file
> > >
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > > 12/12/05 12:08:58 INFO mapreduce.ExportJobBase: Beginning export of
> > > member_main
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Using InputFormat: class
> > > com.cloudera.sqoop.mapreduce.ExportInputFormat
> > > 12/12/05 12:08:58 DEBUG manager.SqlManager: Using fetchSize for next
> > query:
> > > 1000
> > > 12/12/05 12:08:58 INFO manager.SqlManager: Executing SQL statement:
> > SELECT
> > > TOP 1 * FROM [member_main]
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to
> > process
> > > : 1
> > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Total input
> > bytes=2611
> > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: maxSplitSize=2611
> > > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to
> > process
> > > : 1
> > > 12/12/05 12:09:00 INFO util.NativeCodeLoader: Loaded the native-hadoop
> > > library
> > > 12/12/05 12:09:00 WARN snappy.LoadSnappy: Snappy native library not
> > loaded
> > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat:
> > > Paths:/user/hadoop/test-ivan/test:0+2611 Locations:hadoop05:;
> > > 12/12/05 12:09:00 INFO mapred.JobClient: Running job:
> > job_201212041541_0107
> > > 12/12/05 12:09:01 INFO mapred.JobClient:  map 0% reduce 0%
> > > 12/12/05 12:09:18 INFO mapred.JobClient: Task Id :
> > > attempt_201212041541_0107_m_000000_0, Status : FAILED
> > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > Incorrect syntax near ','.
> > >  at
> > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > at
> > >
> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > at java.security.AccessController.doPrivileged(Native Method)
> > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > at
> > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > syntax near ','.
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > >  at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > at
> > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > >
> > > 12/12/05 12:09:24 INFO mapred.JobClient: Task Id :
> > > attempt_201212041541_0107_m_000000_1, Status : FAILED
> > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > Incorrect syntax near ','.
> > >  at
> > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > at
> > >
> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > at java.security.AccessController.doPrivileged(Native Method)
> > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > at
> > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > syntax near ','.
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > >  at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > at
> > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > >
> > > 12/12/05 12:09:30 INFO mapred.JobClient: Task Id :
> > > attempt_201212041541_0107_m_000000_2, Status : FAILED
> > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > Incorrect syntax near ','.
> > >  at
> > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > at
> > >
> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > at java.security.AccessController.doPrivileged(Native Method)
> > >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > > at
> > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > syntax near ','.
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > >  at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > >  at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > at
> > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > >
> > > 12/12/05 12:09:41 INFO mapred.JobClient: Job complete:
> > job_201212041541_0107
> > > 12/12/05 12:09:41 INFO mapred.JobClient: Counters: 8
> > > 12/12/05 12:09:41 INFO mapred.JobClient:   Job Counters
> > > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=24379
> > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all
> > > reduces waiting after reserving slots (ms)=0
> > > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all maps
> > > waiting after reserving slots (ms)=0
> > > 12/12/05 12:09:41 INFO mapred.JobClient:     Rack-local map tasks=3
> > > 12/12/05 12:09:41 INFO mapred.JobClient:     Launched map tasks=4
> > > 12/12/05 12:09:41 INFO mapred.JobClient:     Data-local map tasks=1
> > > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > > 12/12/05 12:09:41 INFO mapred.JobClient:     Failed map tasks=1
> > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Transferred 0 bytes in
> > > 43.0875 seconds (0 bytes/sec)
> > > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Exported 0 records.
> > > 12/12/05 12:09:41 ERROR tool.ExportTool: Error during export: Export job
> > > failed!
> > > ================
> > >
> > > Kind regards,
> > > Chun-fan
> > >
> > > On Wed, Dec 5, 2012 at 12:01 AM, Jarek Jarcec Cecho <jarcec@apache.org
> > >wrote:
> > >
> > > > Hi Chun-fan,
> > > > would you mind sharing with us the entire Sqoop log generated with the
> > > > parameter --verbose? Are you using the built-in Microsoft SQL connector
> > > > or the connector provided by Microsoft?
> > > >
> > > > Jarcec
> > > >
> > > > On Tue, Dec 04, 2012 at 05:51:31PM +0800, Chun-fan Ivan Liao wrote:
> > > > > Hi,
> > > > >
> > > > >
> > > > >
> > > > > We are using Sqoop 1.3.0-cdh3u4 with Hadoop version 1.0.3.
> > > > >
> > > > >
> > > > >
> > > > > We encountered the following error when we try to export HDFS file
> > into
> > > > > MSSQL 2005 (partially):
> > > > >
> > > > >
> > > > >
> > > > > 12/12/04 16:44:13 INFO mapred.JobClient: Task Id :
> > > > > attempt_201212041541_0014_m_000000_2, Status : FAILED
> > > > >
> > > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > > Incorrect syntax near ','.
> > > > >
> > > > >         at
> > > > >
> > > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > > >
> > > > >         at
> > > > >
> > > >
> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > > >
> > > > >         at
> > > > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > > >
> > > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > > >
> > > > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > > >
> > > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > > >
> > > > >         at javax.security.auth.Subject.doAs(Subject.java:416)
> > > > >
> > > > >         at
> > > > >
> > > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > > >
> > > > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > >
> > > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > > > syntax near ','.
> > > > >
> > > > >         at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > > >
> > > > >         at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > > >
> > > > >         at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > > >
> > > > >         at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > > >
> > > > >         at
> > > > > com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > > >
> > > > >         at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > > >
> > > > >         at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > > >
> > > > >         at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > > >
> > > > >         at
> > > > >
> > > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > > >
> > > > >         at
> > > > >
> > > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > > >
> > > > >
> > > > >
> > > > > The HDFS file that we want to export was previously imported from SQL
> > > > > Server 2005 using Sqoop; it uses '|' as the field delimiter, and some
> > > > > fields contain commas (',').
> > > > >
> > > > >
> > > > >
> > > > > The command I submitted is (generalized with capital letters):
> > > > >
> > > > > $ sqoop export -D sqoop.export.records.per.statement=75 -D
> > > > > sqoop.export.statements.per.transaction=75 --connect
> > > > >
> > > >
> > "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME"
> > > > > --table TABLE_NAME -m 1 --input-fields-terminated-by '|' --export-dir
> > > > > /EXPORT/FROM/DIRECTORY
> > > > >
> > > > >
> > > > >
> > > > > I’ve adjusted values of sqoop.export.records.per.statement &
> > > > > sqoop.export.statements.per.transaction, but that didn’t help.
> > > > >
> > > > >
> > > > >
> > > > > It will be greatly appreciated if you can offer some help. Thanks.
> > > > >
> > > > >
> > > > >
> > > > > Ivan
> > > >
> >

Re: Sqoop export failed: Incorrect syntax near ','

Posted by Ivangelion <ad...@ivangelion.tw>.
Hi Jarek,

It actually worked! Thank you so much~! :D

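For reference, my understanding of why the batching settings mattered: with
sqoop.export.records.per.statement above 1, Sqoop appears to generate a
multi-row INSERT ... VALUES (...), (...) statement, and SQL Server 2005 does
not accept that row-constructor syntax (it was added in SQL Server 2008) --
which would explain the "Incorrect syntax near ','" error. A rough sketch of
the idea (a hypothetical helper, not Sqoop's actual code):

```python
# Sketch: how a batched export INSERT might be assembled. With a batch size
# above 1 the statement contains comma-separated value tuples, which SQL
# Server 2005 rejects; with records_per_statement=1 each statement carries a
# single tuple and is accepted.
def build_insert(table, columns, rows, records_per_statement):
    """Return one INSERT statement per batch of rows."""
    placeholder = "(" + ", ".join("?" for _ in columns) + ")"
    statements = []
    for i in range(0, len(rows), records_per_statement):
        batch = rows[i:i + records_per_statement]
        values = ", ".join(placeholder for _ in batch)
        statements.append(
            "INSERT INTO %s (%s) VALUES %s"
            % (table, ", ".join(columns), values))
    return statements

rows = [("a", 1), ("b", 2), ("c", 3)]
# Batched form -- the commas between value tuples are what SQL Server 2005
# trips over:
print(build_insert("member_main", ["USERNAME", "Age"], rows, 75)[0])
# → INSERT INTO member_main (USERNAME, Age) VALUES (?, ?), (?, ?), (?, ?)
# One-tuple-per-statement form is accepted by SQL Server 2005:
print(build_insert("member_main", ["USERNAME", "Age"], rows, 1)[0])
# → INSERT INTO member_main (USERNAME, Age) VALUES (?, ?)
```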
However, now we've faced another problem. The data we previously tried to
export was only test data, with a row count of just 10. When we tried to
export production data back into SQL Server from an HDFS file that had
previously been imported from SQL Server using Sqoop, different errors
occurred. The row count is about 400k, and only about 120k rows were
exported. This time we used "-m 5"; with "-m 1", nothing is exported at all.
The verbose log is at the bottom of this mail.

Does this have to do with the fact that we used the MS SQL connector, rather
than the default one, for the previous import?

Also, should we specify a character encoding, e.g. UTF-8, during the
import/export process? There are characters from many different languages in
our original data in SQL Server, and I'm not sure what the encoding is after
the data is imported into HDFS.

Thanks again, Jarek.

=====================================
12/12/05 19:55:27 DEBUG tool.BaseSqoopTool: Enabled debug logging.
12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Loaded manager factory:
com.cloudera.sqoop.manager.DefaultManagerFactory
12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
com.cloudera.sqoop.manager.DefaultManagerFactory
12/12/05 19:55:27 DEBUG manager.DefaultManagerFactory: Trying with scheme:
jdbc:sqlserver:
12/12/05 19:55:27 INFO manager.SqlManager: Using default fetchSize of 1000
12/12/05 19:55:27 DEBUG sqoop.ConnFactory: Instantiated ConnManager
com.cloudera.sqoop.manager.SQLServerManager@6766afb3
12/12/05 19:55:27 INFO tool.CodeGenTool: Beginning code generation
12/12/05 19:55:27 DEBUG manager.SqlManager: No connection paramenters
specified. Using regular API for making connection.
12/12/05 19:55:27 DEBUG manager.SqlManager: Using fetchSize for next query:
1000
12/12/05 19:55:27 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM member_main AS t WHERE 1=0
12/12/05 19:55:27 DEBUG orm.ClassWriter: selected columns:
12/12/05 19:55:27 DEBUG orm.ClassWriter:   MemberId
12/12/05 19:55:27 DEBUG orm.ClassWriter:   USERNAME
12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstName
12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastName
12/12/05 19:55:27 DEBUG orm.ClassWriter:   EmailAddress
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Password_E5
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Birthday
12/12/05 19:55:27 DEBUG orm.ClassWriter:   CompanyName
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Gender
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Age
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Education
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Country
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Title
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone1
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Phone2
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Fax
12/12/05 19:55:27 DEBUG orm.ClassWriter:   State
12/12/05 19:55:27 DEBUG orm.ClassWriter:   City
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address1
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Address2
12/12/05 19:55:27 DEBUG orm.ClassWriter:   ZipCode
12/12/05 19:55:27 DEBUG orm.ClassWriter:   VATID
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Language
12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_letter
12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_promotion
12/12/05 19:55:27 DEBUG orm.ClassWriter:   rec_type
12/12/05 19:55:27 DEBUG orm.ClassWriter:   JointSource
12/12/05 19:55:27 DEBUG orm.ClassWriter:   CustomerLevel
12/12/05 19:55:27 DEBUG orm.ClassWriter:   UpdateDate
12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDate
12/12/05 19:55:27 DEBUG orm.ClassWriter:   FirstLoginDate
12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastLoginDate
12/12/05 19:55:27 DEBUG orm.ClassWriter:   LastVisit
12/12/05 19:55:27 DEBUG orm.ClassWriter:   isValid
12/12/05 19:55:27 DEBUG orm.ClassWriter:   nJoint
12/12/05 19:55:27 DEBUG orm.ClassWriter:   Upd_SubDate
12/12/05 19:55:27 DEBUG orm.ClassWriter:   UnSub_Type
12/12/05 19:55:27 DEBUG orm.ClassWriter:   CreateDateFloat
12/12/05 19:55:27 DEBUG orm.ClassWriter: Writing source file:
/tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
12/12/05 19:55:27 DEBUG orm.ClassWriter: Table name: member_main
12/12/05 19:55:27 DEBUG orm.ClassWriter: Columns: MemberId:4, USERNAME:12,
FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12,
Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5,
Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9,
Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93,
LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
CreateDateFloat:8,
12/12/05 19:55:27 DEBUG orm.ClassWriter: sourceFilename is member_main.java
12/12/05 19:55:27 DEBUG orm.CompilationManager: Found existing
/tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
12/12/05 19:55:27 INFO orm.CompilationManager: HADOOP_HOME is
/usr/local/hadoop/libexec/..
12/12/05 19:55:27 DEBUG orm.CompilationManager: Adding source file:
/tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
12/12/05 19:55:27 DEBUG orm.CompilationManager: Invoking javac with args:
12/12/05 19:55:27 DEBUG orm.CompilationManager:   -sourcepath
12/12/05 19:55:27 DEBUG orm.CompilationManager:
/tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
12/12/05 19:55:27 DEBUG orm.CompilationManager:   -d
12/12/05 19:55:27 DEBUG orm.CompilationManager:
/tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/
12/12/05 19:55:27 DEBUG orm.CompilationManager:   -classpath
12/12/05 19:55:27 DEBUG orm.CompilationManager:
/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hado
op/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/comm
ons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/hb
ase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/
libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
12/12/05 19:55:28 ERROR orm.CompilationManager: Could not rename
/tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.java
to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
org.apache.commons.io.FileExistsException: Destination
'/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
        at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
        at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
        at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
        at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
        at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
        at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
        at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
        at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
        at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
12/12/05 19:55:28 INFO orm.CompilationManager: Writing jar file:
/tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
12/12/05 19:55:28 DEBUG orm.CompilationManager: Scanning for .class files
in directory: /tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d
12/12/05 19:55:28 DEBUG orm.CompilationManager: Got classfile:
/tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.class
-> member_main.class
12/12/05 19:55:28 DEBUG orm.CompilationManager: Finished writing jar file
/tmp/sqoop-hadoop/compile/7131fa8fb957892b4af354982da9e57d/member_main.jar
12/12/05 19:55:28 INFO mapreduce.ExportJobBase: Beginning export of
member_main
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Using InputFormat: class
com.cloudera.sqoop.mapreduce.ExportInputFormat
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/sqljdbc4.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/avro-1.5.4.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/sqljdbc4.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/paranamer-2.3.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/commons-io-1.4.jar
12/12/05 19:55:28 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to process
: 1
12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=5
12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Total input
bytes=110140058
12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: maxSplitSize=22028011
12/12/05 19:55:31 INFO input.FileInputFormat: Total input paths to process
: 1
12/12/05 19:55:31 INFO util.NativeCodeLoader: Loaded the native-hadoop
library
12/12/05 19:55:31 WARN snappy.LoadSnappy: Snappy native library not loaded
12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat: Generated splits:
12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:0+22028011
Locations:hadoop03:;
12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:22028011+22028011
Locations:hadoop03:;
12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:44056022+11526421,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:55582443+11526421
Locations:hadoop03:;
12/12/05 19:55:31 DEBUG mapreduce.ExportInputFormat:
Paths:/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:67108864+21515597,/user/hadoop/CyberlinkMemberData/Member_Main/CHS/part-m-00000:88624461+21515597
Locations:hadoop03:;
12/12/05 19:55:31 INFO mapred.JobClient: Running job: job_201212041541_0245
12/12/05 19:55:32 INFO mapred.JobClient:  map 0% reduce 0%
12/12/05 19:55:47 INFO mapred.JobClient: Task Id :
attempt_201212041541_0245_m_000002_0, Status : FAILED
java.lang.NumberFormatException: For input string: "Male"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Integer.parseInt(Integer.java:481)
        at java.lang.Integer.valueOf(Integer.java:570)
        at member_main.__loadFromFields(member_main.java:1254)
        at member_main.parse(member_main.java:1156)
        at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
        at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/12/05 19:55:51 INFO mapred.JobClient:  map 5% reduce 0%
12/12/05 19:55:54 INFO mapred.JobClient:  map 8% reduce 0%
12/12/05 19:55:54 INFO mapred.JobClient: Task Id :
attempt_201212041541_0245_m_000002_1, Status : FAILED
java.lang.NumberFormatException: For input string: "Male"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Integer.parseInt(Integer.java:481)
        at java.lang.Integer.valueOf(Integer.java:570)
        at member_main.__loadFromFields(member_main.java:1254)
        at member_main.parse(member_main.java:1156)
        at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
        at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/12/05 19:55:57 INFO mapred.JobClient:  map 14% reduce 0%
12/12/05 19:55:59 INFO mapred.JobClient: Task Id :
attempt_201212041541_0245_m_000000_0, Status : FAILED
java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
        at java.sql.Timestamp.valueOf(Timestamp.java:203)
        at member_main.__loadFromFields(member_main.java:1239)
        at member_main.parse(member_main.java:1156)
        at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
        at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/12/05 19:56:00 INFO mapred.JobClient:  map 13% reduce 0%
12/12/05 19:56:01 INFO mapred.JobClient: Task Id :
attempt_201212041541_0245_m_000002_2, Status : FAILED
java.lang.NumberFormatException: For input string: "Male"
        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
        at java.lang.Integer.parseInt(Integer.java:481)
        at java.lang.Integer.valueOf(Integer.java:570)
        at member_main.__loadFromFields(member_main.java:1254)
        at member_main.parse(member_main.java:1156)
        at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:81)
        at com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:40)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:189)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)

12/12/05 19:56:03 INFO mapred.JobClient:  map 16% reduce 0%
12/12/05 19:56:13 INFO mapred.JobClient: Job complete: job_201212041541_0245
12/12/05 19:56:13 INFO mapred.JobClient: Counters: 8
12/12/05 19:56:13 INFO mapred.JobClient:   Job Counters
12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=91611
12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
12/12/05 19:56:13 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0
12/12/05 19:56:13 INFO mapred.JobClient:     Rack-local map tasks=5
12/12/05 19:56:13 INFO mapred.JobClient:     Launched map tasks=8
12/12/05 19:56:13 INFO mapred.JobClient:     Data-local map tasks=3
12/12/05 19:56:13 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
12/12/05 19:56:13 INFO mapred.JobClient:     Failed map tasks=1
12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Transferred 0 bytes in
45.077 seconds (0 bytes/sec)
12/12/05 19:56:13 INFO mapreduce.ExportJobBase: Exported 0 records.
12/12/05 19:56:13 ERROR tool.ExportTool: Error during export: Export job
failed!
=====================================


Chun-fan
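
[Editor's note: a hypothetical diagnostic sketch, not part of the original thread. The NumberFormatException on "Male" and the Timestamp parse failure in the log above usually mean the export mapper is splitting records on the wrong field delimiter, so a text value lands in a numeric or timestamp column. Counting fields per record against the 39 columns of member_main (per the ClassWriter log) can confirm a delimiter mismatch before re-running the export.]

```python
EXPECTED_FIELDS = 39  # member_main column count, per the ClassWriter log above

def misaligned_records(lines, delimiter=","):
    """Return (line_number, field_count) for records whose field count
    does not match EXPECTED_FIELDS."""
    bad = []
    for lineno, line in enumerate(lines, start=1):
        fields = line.rstrip("\n").split(delimiter)
        if len(fields) != EXPECTED_FIELDS:
            bad.append((lineno, len(fields)))
    return bad

# A record with only three comma-separated fields is flagged immediately:
print(misaligned_records(["1,someuser,Male\n"]))  # -> [(1, 3)]
```

Running this over a sample of the HDFS part files would show whether the delimiter Sqoop assumes matches the one the data actually uses.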


On Thu, Dec 6, 2012 at 12:23 AM, Jarek Jarcec Cecho <ja...@apache.org> wrote:

> Hi Chun-fan,
> thank you very much for sharing the log with us. You are using the
> Microsoft SQL Connector because you downloaded it manually from
> Microsoft's web pages; you can also confirm that from the following log
> lines:
>
> > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> ...
> > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
>
> I'm not sure what is going wrong, as it seems the data were parsed
> correctly but submitting the query to SQL Server fails. As a next step I
> would recommend turning the Microsoft Connector off and using the
> built-in one instead, to see whether the issue is specific to Sqoop or
> to the Connector. You can do that by temporarily moving the file
> /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver somewhere else.
>
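
[Editor's note: a sketch of the suggestion above, using the paths as they appear in this thread; verify your own layout before running it.]

```shell
# Temporarily disable the Microsoft connector so Sqoop falls back to its
# built-in JDBC manager; move the descriptor back to re-enable it.
mv /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver \
   /usr/local/sqoop/mssqoop-sqlserver.bak
```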
> Jarcec
>
> On Wed, Dec 05, 2012 at 12:25:24PM +0800, Chun-fan Ivan Liao wrote:
> > Thank you, Jarcec. I'm not sure which connector we use. I've downloaded
> > "Microsoft SQL Server Connector for Apache Hadoop" from
> > http://www.microsoft.com/en-us/download/details.aspx?id=27584, but I
> > don't remember whether we actually used it. How can I make sure?
> >
> > And here is the verbose log:
> >
> > ===========
> > 12/12/05 12:08:57 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> > /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > com.cloudera.sqoop.manager.DefaultManagerFactory
> > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > 12/12/05 12:08:57 INFO SqlServer.MSSQLServerManagerFactory: Using
> > Microsoft's SQL Server - Hadoop Connector
> > 12/12/05 12:08:57 INFO manager.SqlManager: Using default fetchSize of
> 1000
> > 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> > 12/12/05 12:08:57 INFO tool.CodeGenTool: Beginning code generation
> > 12/12/05 12:08:57 DEBUG manager.SqlManager: No connection paramenters
> > specified. Using regular API for making connection.
> > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next
> query:
> > 1000
> > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement:
> SELECT
> > TOP 1 * FROM [member_main]
> > 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next
> query:
> > 1000
> > 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement:
> SELECT
> > TOP 1 * FROM [member_main]
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter: selected columns:
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   MemberId
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   USERNAME
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstName
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastName
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   EmailAddress
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password_E5
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Birthday
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CompanyName
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Gender
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Age
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Education
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Country
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Title
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone1
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone2
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Fax
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   State
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   City
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address1
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address2
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   ZipCode
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   VATID
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Language
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_letter
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_promotion
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_type
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   JointSource
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CustomerLevel
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UpdateDate
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDate
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstLoginDate
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastLoginDate
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastVisit
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   isValid
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   nJoint
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Upd_SubDate
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UnSub_Type
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDateFloat
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Writing source file:
> >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Table name: member_main
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter: Columns: MemberId:4,
> USERNAME:12,
> > FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12,
> > Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5,
> > Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9,
> > Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
> > rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
> > UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93,
> > LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
> > CreateDateFloat:8,
> > 12/12/05 12:08:57 DEBUG orm.ClassWriter: sourceFilename is
> member_main.java
> > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Found existing
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > 12/12/05 12:08:57 INFO orm.CompilationManager: HADOOP_HOME is
> > /usr/local/hadoop/libexec/..
> > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Adding source file:
> >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > 12/12/05 12:08:57 DEBUG orm.CompilationManager: Invoking javac with args:
> > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -sourcepath
> > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -d
> > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> > /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> > 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -classpath
> > 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> >
> /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/ha
doop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/co
mmons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/
hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoo
p/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > 12/12/05 12:08:58 ERROR orm.CompilationManager: Could not rename
> >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> > to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> > org.apache.commons.io.FileExistsException: Destination
> > '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
> >  at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> > at
> >
> com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
> >  at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> > at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
> >  at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> > at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
> >  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
> >  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> > at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
> >  at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> > 12/12/05 12:08:58 INFO orm.CompilationManager: Writing jar file:
> >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Scanning for .class files
> > in directory: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971
> > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Got classfile:
> >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.class
> > -> member_main.class
> > 12/12/05 12:08:58 DEBUG orm.CompilationManager: Finished writing jar file
> >
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> > 12/12/05 12:08:58 INFO mapreduce.ExportJobBase: Beginning export of
> > member_main
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Using InputFormat: class
> > com.cloudera.sqoop.mapreduce.ExportInputFormat
> > 12/12/05 12:08:58 DEBUG manager.SqlManager: Using fetchSize for next
> query:
> > 1000
> > 12/12/05 12:08:58 INFO manager.SqlManager: Executing SQL statement:
> SELECT
> > TOP 1 * FROM [member_main]
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/avro-1.5.4.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/sqljdbc4.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/paranamer-2.3.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/commons-io-1.4.jar
> > 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to
> process
> > : 1
> > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Total input
> bytes=2611
> > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: maxSplitSize=2611
> > 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to
> process
> > : 1
> > 12/12/05 12:09:00 INFO util.NativeCodeLoader: Loaded the native-hadoop
> > library
> > 12/12/05 12:09:00 WARN snappy.LoadSnappy: Snappy native library not
> loaded
> > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat:
> > Paths:/user/hadoop/test-ivan/test:0+2611 Locations:hadoop05:;
> > 12/12/05 12:09:00 INFO mapred.JobClient: Running job:
> job_201212041541_0107
> > 12/12/05 12:09:01 INFO mapred.JobClient:  map 0% reduce 0%
> > 12/12/05 12:09:18 INFO mapred.JobClient: Task Id :
> > attempt_201212041541_0107_m_000000_0, Status : FAILED
> > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > Incorrect syntax near ','.
> >  at
> >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > at
> >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > at java.security.AccessController.doPrivileged(Native Method)
> >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > syntax near ','.
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> >  at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > at
> >
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> >
> > 12/12/05 12:09:24 INFO mapred.JobClient: Task Id :
> > attempt_201212041541_0107_m_000000_1, Status : FAILED
> > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > Incorrect syntax near ','.
> >  at
> >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > at
> >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > at java.security.AccessController.doPrivileged(Native Method)
> >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > syntax near ','.
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> >  at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > at
> >
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> >
> > 12/12/05 12:09:30 INFO mapred.JobClient: Task Id :
> > attempt_201212041541_0107_m_000000_2, Status : FAILED
> > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > Incorrect syntax near ','.
> >  at
> >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > at
> >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> >  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > at java.security.AccessController.doPrivileged(Native Method)
> >  at javax.security.auth.Subject.doAs(Subject.java:416)
> > at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > syntax near ','.
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> >  at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> >  at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > at
> >
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> >
> > 12/12/05 12:09:41 INFO mapred.JobClient: Job complete:
> job_201212041541_0107
> > 12/12/05 12:09:41 INFO mapred.JobClient: Counters: 8
> > 12/12/05 12:09:41 INFO mapred.JobClient:   Job Counters
> > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=24379
> > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all
> > reduces waiting after reserving slots (ms)=0
> > 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all maps
> > waiting after reserving slots (ms)=0
> > 12/12/05 12:09:41 INFO mapred.JobClient:     Rack-local map tasks=3
> > 12/12/05 12:09:41 INFO mapred.JobClient:     Launched map tasks=4
> > 12/12/05 12:09:41 INFO mapred.JobClient:     Data-local map tasks=1
> > 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> > 12/12/05 12:09:41 INFO mapred.JobClient:     Failed map tasks=1
> > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Transferred 0 bytes in
> > 43.0875 seconds (0 bytes/sec)
> > 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Exported 0 records.
> > 12/12/05 12:09:41 ERROR tool.ExportTool: Error during export: Export job
> > failed!
> > ================
> >
> > Kind regards,
> > Chun-fan
> >
> > On Wed, Dec 5, 2012 at 12:01 AM, Jarek Jarcec Cecho <jarcec@apache.org>
> > wrote:
> >
> > > Hi Chun-fan,
> > > would you mind sharing with us the entire Sqoop log generated with the
> > > --verbose parameter? Are you using the built-in Microsoft SQL Connector
> > > or the connector provided by Microsoft?
> > >
> > > Jarcec
> > >
> > > On Tue, Dec 04, 2012 at 05:51:31PM +0800, Chun-fan Ivan Liao wrote:
> > > > Hi,
> > > >
> > > >
> > > >
> > > > We are using Sqoop 1.3.0-cdh3u4 with Hadoop version 1.0.3.
> > > >
> > > >
> > > >
> > > > We encountered the following error when we try to export HDFS file
> into
> > > > MSSQL 2005 (partially):
> > > >
> > > >
> > > >
> > > > 12/12/04 16:44:13 INFO mapred.JobClient: Task Id :
> > > > attempt_201212041541_0014_m_000000_2, Status : FAILED
> > > >
> > > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > > Incorrect syntax near ','.
> > > >
> > > >         at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > > >
> > > >         at
> > > >
> > >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > > >
> > > >         at
> > > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > > >
> > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >
> > > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > >
> > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > >
> > > >         at javax.security.auth.Subject.doAs(Subject.java:416)
> > > >
> > > >         at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >
> > > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > >
> > > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > > syntax near ','.
> > > >
> > > >         at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > > >
> > > >         at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > > >
> > > >         at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > > >
> > > >         at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > > >
> > > >         at
> > > > com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > > >
> > > >         at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > > >
> > > >         at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > > >
> > > >         at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > > >
> > > >         at
> > > >
> > >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > > >
> > > >         at
> > > >
> > >
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > > >
> > > >
> > > >
> > > > The HDFS file that we want to export was previously imported from SQL
> > > > Server 2005 using Sqoop. It uses '|' as the field delimiter, and some
> > > > fields contain commas (',').
> > > >
> > > >
> > > >
> > > > The command I submitted is (generalized with capital letters):
> > > >
> > > > $ sqoop export -D sqoop.export.records.per.statement=75 -D
> > > > sqoop.export.statements.per.transaction=75 --connect
> > > > "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME"
> > > > --table TABLE_NAME -m 1 --input-fields-terminated-by '|' --export-dir
> > > > /EXPORT/FROM/DIRECTORY
> > > >
> > > >
> > > >
> > > > I've adjusted the values of sqoop.export.records.per.statement and
> > > > sqoop.export.statements.per.transaction, but that didn't help.
> > > >
> > > >
> > > >
> > > > It would be greatly appreciated if you could offer some help. Thanks.
> > > >
> > > >
> > > >
> > > > Ivan
> > >
>
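The tuning of sqoop.export.records.per.statement above points at one plausible, unconfirmed culprit. As an assumption about Sqoop's export path (not stated anywhere in this thread): with records.per.statement greater than 1, rows are batched into a single INSERT carrying several VALUES tuples, and SQL Server 2005 predates the multi-row VALUES syntax (added in SQL Server 2008), which would fail with exactly "Incorrect syntax near ','". A sketch of the statement shape such batching would produce (table and column names borrowed from the log for illustration):

```shell
# Hedged sketch: build the multi-row INSERT shape that batched export
# (records.per.statement > 1) is assumed to generate.
set -eu
ROWS=3
TUPLE="(?, ?, ?)"
VALUES="$TUPLE"
i=1
while [ "$i" -lt "$ROWS" ]; do
  # Each additional row appends ", (?, ?, ?)" — the comma between tuples
  # is the first token SQL Server 2005 cannot parse.
  VALUES="$VALUES, $TUPLE"
  i=$((i + 1))
done
STMT="INSERT INTO member_main (MemberId, USERNAME, FirstName) VALUES $VALUES"
echo "$STMT"
```

If this is indeed the cause, passing -D sqoop.export.records.per.statement=1 would sidestep the multi-row syntax entirely, at the cost of export throughput.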

Re: Sqoop export failed: Incorrect syntax near ','

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Chun-fan,
thank you very much for sharing the log with us. You are using the Microsoft SQL Connector that you downloaded manually from Microsoft's web pages; you can confirm that from the following log lines:

> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
...
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
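Lines like these can be pulled out of a saved verbose log with a quick grep; a small self-contained sketch (the log file name is hypothetical, its contents reproduced from the thread):

```shell
# Sketch: extract the ConnManager line from a verbose Sqoop log to see
# which connector was actually instantiated.
set -eu
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
EOF
# Print the manager class, dropping the object hash after '@':
grep -o 'Instantiated ConnManager [^@]*' "$LOG"
```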

I'm not sure what is going wrong, as it seems the data were parsed correctly but submitting the query to SQL Server fails. As a next step I would recommend turning the Microsoft Connector off and using the built-in one instead, to see whether the issue is specific to Sqoop or to the Connector. You can do that by temporarily moving the file /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver somewhere else.
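A minimal sketch of that file move, demonstrated here in a throwaway sandbox directory (on the real cluster the path is /usr/local/sqoop/conf/managers.d; the descriptor contents below are a stand-in):

```shell
# Sketch: disable the Microsoft connector descriptor so Sqoop falls back
# to its built-in SQL Server support, then restore it afterwards.
set -eu
CONF_D=$(mktemp -d)/managers.d
mkdir -p "$CONF_D"
# Stand-in for the real factory descriptor file:
echo "com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory" > "$CONF_D/mssqoop-sqlserver"

# Move the descriptor aside before re-running the export...
mv "$CONF_D/mssqoop-sqlserver" "$CONF_D/mssqoop-sqlserver.disabled"

# ...re-run `sqoop export` here, then restore the descriptor:
mv "$CONF_D/mssqoop-sqlserver.disabled" "$CONF_D/mssqoop-sqlserver"
```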

Jarcec

On Wed, Dec 05, 2012 at 12:25:24PM +0800, Chun-fan Ivan Liao wrote:
> Thank you, Jarcec. I'm not sure which connector we use. I've downloaded
> "Microsoft SQL Server Connector for Apache Hadoop" from
> http://www.microsoft.com/en-us/download/details.aspx?id=27584, but I don't
> remember whether we really used it. How can I make sure?
> 
> And here is the verbose log:
> 
> ===========
> 12/12/05 12:08:57 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
> /usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
> com.cloudera.sqoop.manager.DefaultManagerFactory
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> 12/12/05 12:08:57 INFO SqlServer.MSSQLServerManagerFactory: Using
> Microsoft's SQL Server - Hadoop Connector
> 12/12/05 12:08:57 INFO manager.SqlManager: Using default fetchSize of 1000
> 12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
> 12/12/05 12:08:57 INFO tool.CodeGenTool: Beginning code generation
> 12/12/05 12:08:57 DEBUG manager.SqlManager: No connection paramenters
> specified. Using regular API for making connection.
> 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next query:
> 1000
> 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement: SELECT
> TOP 1 * FROM [member_main]
> 12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next query:
> 1000
> 12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement: SELECT
> TOP 1 * FROM [member_main]
> 12/12/05 12:08:57 DEBUG orm.ClassWriter: selected columns:
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   MemberId
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   USERNAME
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstName
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastName
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   EmailAddress
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password_E5
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Birthday
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CompanyName
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Gender
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Age
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Education
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Country
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Title
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone1
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone2
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Fax
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   State
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   City
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address1
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address2
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   ZipCode
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   VATID
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Language
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_letter
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_promotion
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_type
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   JointSource
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CustomerLevel
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UpdateDate
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDate
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstLoginDate
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastLoginDate
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastVisit
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   isValid
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   nJoint
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   Upd_SubDate
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   UnSub_Type
> 12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDateFloat
> 12/12/05 12:08:57 DEBUG orm.ClassWriter: Writing source file:
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> 12/12/05 12:08:57 DEBUG orm.ClassWriter: Table name: member_main
> 12/12/05 12:08:57 DEBUG orm.ClassWriter: Columns: MemberId:4, USERNAME:12,
> FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12,
> Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5,
> Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9,
> Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
> rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
> UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93,
> LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
> CreateDateFloat:8,
> 12/12/05 12:08:57 DEBUG orm.ClassWriter: sourceFilename is member_main.java
> 12/12/05 12:08:57 DEBUG orm.CompilationManager: Found existing
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> 12/12/05 12:08:57 INFO orm.CompilationManager: HADOOP_HOME is
> /usr/local/hadoop/libexec/..
> 12/12/05 12:08:57 DEBUG orm.CompilationManager: Adding source file:
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> 12/12/05 12:08:57 DEBUG orm.CompilationManager: Invoking javac with args:
> 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -sourcepath
> 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -d
> 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
> 12/12/05 12:08:57 DEBUG orm.CompilationManager:   -classpath
> 12/12/05 12:08:57 DEBUG orm.CompilationManager:
> /usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/ha
doop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/co
mmons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/
hbase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoo
p/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> 12/12/05 12:08:58 ERROR orm.CompilationManager: Could not rename
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
> to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
> org.apache.commons.io.FileExistsException: Destination
> '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
>  at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
> at
> com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
>  at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
> at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
>  at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
> at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
>  at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
> at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
>  at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
> 12/12/05 12:08:58 INFO orm.CompilationManager: Writing jar file:
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> 12/12/05 12:08:58 DEBUG orm.CompilationManager: Scanning for .class files
> in directory: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971
> 12/12/05 12:08:58 DEBUG orm.CompilationManager: Got classfile:
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.class
> -> member_main.class
> 12/12/05 12:08:58 DEBUG orm.CompilationManager: Finished writing jar file
> /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
> 12/12/05 12:08:58 INFO mapreduce.ExportJobBase: Beginning export of
> member_main
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Using InputFormat: class
> com.cloudera.sqoop.mapreduce.ExportInputFormat
> 12/12/05 12:08:58 DEBUG manager.SqlManager: Using fetchSize for next query:
> 1000
> 12/12/05 12:08:58 INFO manager.SqlManager: Executing SQL statement: SELECT
> TOP 1 * FROM [member_main]
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/sqljdbc4.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/avro-1.5.4.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/sqljdbc4.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/paranamer-2.3.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/commons-io-1.4.jar
> 12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
> 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process
> : 1
> 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Total input bytes=2611
> 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: maxSplitSize=2611
> 12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process
> : 1
> 12/12/05 12:09:00 INFO util.NativeCodeLoader: Loaded the native-hadoop
> library
> 12/12/05 12:09:00 WARN snappy.LoadSnappy: Snappy native library not loaded
> 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Generated splits:
> 12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat:
> Paths:/user/hadoop/test-ivan/test:0+2611 Locations:hadoop05:;
> 12/12/05 12:09:00 INFO mapred.JobClient: Running job: job_201212041541_0107
> 12/12/05 12:09:01 INFO mapred.JobClient:  map 0% reduce 0%
> 12/12/05 12:09:18 INFO mapred.JobClient: Task Id :
> attempt_201212041541_0107_m_000000_0, Status : FAILED
> java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> Incorrect syntax near ','.
>  at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> at
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:416)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> syntax near ','.
>  at
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> at
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
>  at
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> at
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
>  at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> at
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
>  at
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> at
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
>  at
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> at
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> 
> 12/12/05 12:09:24 INFO mapred.JobClient: Task Id :
> attempt_201212041541_0107_m_000000_1, Status : FAILED
> java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> Incorrect syntax near ','.
>  at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> at
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:416)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> syntax near ','.
>  at
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> at
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
>  at
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> at
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
>  at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> at
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
>  at
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> at
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
>  at
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> at
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> 
> 12/12/05 12:09:30 INFO mapred.JobClient: Task Id :
> attempt_201212041541_0107_m_000000_2, Status : FAILED
> java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> Incorrect syntax near ','.
>  at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> at
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
>  at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>  at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> at java.security.AccessController.doPrivileged(Native Method)
>  at javax.security.auth.Subject.doAs(Subject.java:416)
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
>  at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> syntax near ','.
>  at
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> at
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
>  at
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> at
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
>  at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> at
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
>  at
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> at
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
>  at
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> at
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> 
> 12/12/05 12:09:41 INFO mapred.JobClient: Job complete: job_201212041541_0107
> 12/12/05 12:09:41 INFO mapred.JobClient: Counters: 8
> 12/12/05 12:09:41 INFO mapred.JobClient:   Job Counters
> 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=24379
> 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all
> reduces waiting after reserving slots (ms)=0
> 12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all maps
> waiting after reserving slots (ms)=0
> 12/12/05 12:09:41 INFO mapred.JobClient:     Rack-local map tasks=3
> 12/12/05 12:09:41 INFO mapred.JobClient:     Launched map tasks=4
> 12/12/05 12:09:41 INFO mapred.JobClient:     Data-local map tasks=1
> 12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> 12/12/05 12:09:41 INFO mapred.JobClient:     Failed map tasks=1
> 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Transferred 0 bytes in
> 43.0875 seconds (0 bytes/sec)
> 12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Exported 0 records.
> 12/12/05 12:09:41 ERROR tool.ExportTool: Error during export: Export job
> failed!
> ================
> 
> Kind regards,
> Chun-fan
> 
> On Wed, Dec 5, 2012 at 12:01 AM, Jarek Jarcec Cecho <ja...@apache.org> wrote:
> 
> > Hi Chun-fan,
> > would you mind sharing with us the entire Sqoop log generated with the
> > parameter --verbose? Are you using the built-in Microsoft SQL Connector
> > or the connector provided by Microsoft?
> >
> > Jarcec
> >
> > On Tue, Dec 04, 2012 at 05:51:31PM +0800, Chun-fan Ivan Liao wrote:
> > > Hi,
> > >
> > >
> > >
> > > We are using Sqoop 1.3.0-cdh3u4 with Hadoop version 1.0.3.
> > >
> > >
> > >
> > > We encountered the following error when we try to export HDFS file into
> > > MSSQL 2005 (partially):
> > >
> > >
> > >
> > > 12/12/04 16:44:13 INFO mapred.JobClient: Task Id :
> > > attempt_201212041541_0014_m_000000_2, Status : FAILED
> > >
> > > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > > Incorrect syntax near ','.
> > >
> > >         at
> > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> > >
> > >         at
> > >
> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> > >
> > >         at
> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> > >
> > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >
> > >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > >
> > >         at java.security.AccessController.doPrivileged(Native Method)
> > >
> > >         at javax.security.auth.Subject.doAs(Subject.java:416)
> > >
> > >         at
> > >
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >
> > >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > >
> > > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > > syntax near ','.
> > >
> > >         at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> > >
> > >         at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> > >
> > >         at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> > >
> > >         at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> > >
> > >         at
> > > com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> > >
> > >         at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> > >
> > >         at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> > >
> > >         at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> > >
> > >         at
> > >
> > com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> > >
> > >         at
> > >
> > com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> > >
> > >
> > >
> > > The HDFS file that we want to export was previously imported from SQL
> > 2005
> > > using Sqoop. It uses ‘|’ as the field delimiter, and there are commas
> > > (‘,’) inside some fields.
> > >
> > >
> > >
> > > The command I submitted is (generalized with capital letters):
> > >
> > > $ sqoop export -D sqoop.export.records.per.statement=75 -D
> > > sqoop.export.statements.per.transaction=75 --connect
> > >
> > "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME"
> > > --table TABLE_NAME -m 1 --input-fields-terminated-by '|' --export-dir
> > > /EXPORT/FROM/DIRECTORY
> > >
> > >
> > >
> > > I’ve adjusted the values of sqoop.export.records.per.statement and
> > > sqoop.export.statements.per.transaction, but that didn’t help.
> > >
> > >
> > >
> > > It will be greatly appreciated if you can offer some help. Thanks.
> > >
> > >
> > >
> > > Ivan
> >

Re: Sqoop export failed: Incorrect syntax near ','

Posted by Chun-fan Ivan Liao <iv...@ivangelion.tw>.
Thank you, Jarcec. I'm not sure which connector we use. I've downloaded
"Microsoft SQL Server Connector for Apache Hadoop" from
http://www.microsoft.com/en-us/download/details.aspx?id=27584, but I don't
remember whether we actually used it. How can I make sure?
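
(One way to check, I believe, is to look for the ManagerFactory lines near
the top of the --verbose output; a minimal sketch, assuming the output was
saved to a file named sqoop.log -- the file name is hypothetical, and the
sample lines are copied from the verbose log below:)

```shell
# Hypothetical saved copy of the --verbose output (sqoop.log is an assumed
# file name). The Microsoft connector announces itself near the top:
cat > sqoop.log <<'EOF'
12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
12/12/05 12:08:57 INFO SqlServer.MSSQLServerManagerFactory: Using Microsoft's SQL Server - Hadoop Connector
EOF
# These lines show which connector Sqoop actually instantiated:
grep -E 'ManagerFactory|Hadoop Connector' sqoop.log
```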

And here is the verbose log:

===========
12/12/05 12:08:57 DEBUG tool.BaseSqoopTool: Enabled debug logging.
12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Added factory
com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory specified by
/usr/local/sqoop/conf/managers.d/mssqoop-sqlserver
12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Loaded manager factory:
com.cloudera.sqoop.manager.DefaultManagerFactory
12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
12/12/05 12:08:57 INFO SqlServer.MSSQLServerManagerFactory: Using
Microsoft's SQL Server - Hadoop Connector
12/12/05 12:08:57 INFO manager.SqlManager: Using default fetchSize of 1000
12/12/05 12:08:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager
com.microsoft.sqoop.SqlServer.MSSQLServerManager@736921fd
12/12/05 12:08:57 INFO tool.CodeGenTool: Beginning code generation
12/12/05 12:08:57 DEBUG manager.SqlManager: No connection paramenters
specified. Using regular API for making connection.
12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next query:
1000
12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement: SELECT
TOP 1 * FROM [member_main]
12/12/05 12:08:57 DEBUG manager.SqlManager: Using fetchSize for next query:
1000
12/12/05 12:08:57 INFO manager.SqlManager: Executing SQL statement: SELECT
TOP 1 * FROM [member_main]
12/12/05 12:08:57 DEBUG orm.ClassWriter: selected columns:
12/12/05 12:08:57 DEBUG orm.ClassWriter:   MemberId
12/12/05 12:08:57 DEBUG orm.ClassWriter:   USERNAME
12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstName
12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastName
12/12/05 12:08:57 DEBUG orm.ClassWriter:   EmailAddress
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Password_E5
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Birthday
12/12/05 12:08:57 DEBUG orm.ClassWriter:   CompanyName
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Gender
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Age
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Education
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Country
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Title
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone1
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Phone2
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Fax
12/12/05 12:08:57 DEBUG orm.ClassWriter:   State
12/12/05 12:08:57 DEBUG orm.ClassWriter:   City
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address1
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Address2
12/12/05 12:08:57 DEBUG orm.ClassWriter:   ZipCode
12/12/05 12:08:57 DEBUG orm.ClassWriter:   VATID
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Language
12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_letter
12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_promotion
12/12/05 12:08:57 DEBUG orm.ClassWriter:   rec_type
12/12/05 12:08:57 DEBUG orm.ClassWriter:   JointSource
12/12/05 12:08:57 DEBUG orm.ClassWriter:   CustomerLevel
12/12/05 12:08:57 DEBUG orm.ClassWriter:   UpdateDate
12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDate
12/12/05 12:08:57 DEBUG orm.ClassWriter:   FirstLoginDate
12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastLoginDate
12/12/05 12:08:57 DEBUG orm.ClassWriter:   LastVisit
12/12/05 12:08:57 DEBUG orm.ClassWriter:   isValid
12/12/05 12:08:57 DEBUG orm.ClassWriter:   nJoint
12/12/05 12:08:57 DEBUG orm.ClassWriter:   Upd_SubDate
12/12/05 12:08:57 DEBUG orm.ClassWriter:   UnSub_Type
12/12/05 12:08:57 DEBUG orm.ClassWriter:   CreateDateFloat
12/12/05 12:08:57 DEBUG orm.ClassWriter: Writing source file:
/tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
12/12/05 12:08:57 DEBUG orm.ClassWriter: Table name: member_main
12/12/05 12:08:57 DEBUG orm.ClassWriter: Columns: MemberId:4, USERNAME:12,
FirstName:-9, LastName:-9, EmailAddress:12, Password:12, Password_E5:12,
Birthday:93, CompanyName:-9, Gender:12, Age:5, Education:12, Country:5,
Title:-9, Phone1:12, Phone2:12, Fax:12, State:-9, City:-9, Address1:-9,
Address2:-9, ZipCode:12, VATID:12, Language:12, rec_letter:-7,
rec_promotion:-7, rec_type:5, JointSource:12, CustomerLevel:4,
UpdateDate:93, CreateDate:93, FirstLoginDate:93, LastLoginDate:93,
LastVisit:93, isValid:-7, nJoint:4, Upd_SubDate:93, UnSub_Type:4,
CreateDateFloat:8,
12/12/05 12:08:57 DEBUG orm.ClassWriter: sourceFilename is member_main.java
12/12/05 12:08:57 DEBUG orm.CompilationManager: Found existing
/tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
12/12/05 12:08:57 INFO orm.CompilationManager: HADOOP_HOME is
/usr/local/hadoop/libexec/..
12/12/05 12:08:57 DEBUG orm.CompilationManager: Adding source file:
/tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
12/12/05 12:08:57 DEBUG orm.CompilationManager: Invoking javac with args:
12/12/05 12:08:57 DEBUG orm.CompilationManager:   -sourcepath
12/12/05 12:08:57 DEBUG orm.CompilationManager:
/tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
12/12/05 12:08:57 DEBUG orm.CompilationManager:   -d
12/12/05 12:08:57 DEBUG orm.CompilationManager:
/tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/
12/12/05 12:08:57 DEBUG orm.CompilationManager:   -classpath
12/12/05 12:08:57 DEBUG orm.CompilationManager:
/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hado
op/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib:/usr/local/sqoop/conf::/usr/local/sqoop/lib/ant-contrib-1.0b3.jar:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/local/sqoop/lib/avro-1.5.4.jar:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar:/usr/local/sqoop/lib/commons-io-1.4.jar:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/local/sqoop/lib/jopt-simple-3.2.jar:/usr/local/sqoop/lib/paranamer-2.3.jar:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar:/usr/local/sqoop/lib/sqljdbc4.jar:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar:/usr/local/hbase/conf/:/usr/lib/jvm/java-6-openjdk-amd64//lib/tools.jar:/usr/local/hbase:/usr/local/hbase/hbase-0.94.1.jar:/usr/local/hbase/hbase-0.94.1-tests.jar:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.5.3.jar:/usr/local/hbase/lib/avro-ipc-1.5.3.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.4.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/comm
ons-configuration-1.6.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.1.jar:/usr/local/hbase/lib/commons-lang-2.5.jar:/usr/local/hbase/lib/commons-logging-1.1.1.jar:/usr/local/hbase/lib/commons-math-2.1.jar:/usr/local/hbase/lib/commons-net-1.4.1.jar:/usr/local/hbase/lib/core-3.1.1.jar:/usr/local/hbase/lib/guava-11.0.2.jar:/usr/local/hbase/lib/hadoop-core-1.0.3.jar:/usr/local/hbase/lib/high-scale-lib-1.1.1.jar:/usr/local/hbase/lib/httpclient-4.1.2.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-jaxrs-1.8.8.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hbase/lib/jackson-xc-1.8.8.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/jaxb-api-2.1.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jersey-core-1.8.jar:/usr/local/hbase/lib/jersey-json-1.8.jar:/usr/local/hbase/lib/jersey-server-1.8.jar:/usr/local/hbase/lib/jettison-1.1.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/jruby-complete-1.6.5.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.10-HBASE-1.jar:/usr/local/hbase/lib/libthrift-0.8.0.jar:/usr/local/hbase/lib/log4j-1.2.16.jar:/usr/local/hbase/lib/metrics-core-2.1.2.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/protobuf-java-2.4.0a.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/slf4j-api-1.4.3.jar:/usr/local/hbase/lib/slf4j-log4j12-1.4.3.jar:/usr/local/hbase/lib/snappy-java-1.0.3.2.jar:/usr/local/hbase/lib/stax-api-1.0.1.jar:/usr/local/hbase/lib/velocity-1.7.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/hb
ase/lib/zookeeper-3.4.3.jar::/usr/local/hadoop/conf:/usr/local/hadoop/libexec/../conf:/usr/lib/jvm/java-6-openjdk-amd64/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/
libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/local/hadoop/lib::/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar:/usr/local/sqoop/sqoop-test-1.3.0-cdh3u4.jar::/usr/local/hadoop/hadoop-core-1.0.3.jar:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
12/12/05 12:08:58 ERROR orm.CompilationManager: Could not rename
/tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.java
to /home/hadoop/_scripts/1-hadoop/member/./member_main.java
org.apache.commons.io.FileExistsException: Destination '/home/hadoop/_scripts/1-hadoop/member/./member_main.java' already exists
        at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
        at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
        at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
        at com.cloudera.sqoop.tool.ExportTool.exportTable(ExportTool.java:66)
        at com.cloudera.sqoop.tool.ExportTool.run(ExportTool.java:99)
        at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
        at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
        at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
        at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
12/12/05 12:08:58 INFO orm.CompilationManager: Writing jar file:
/tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
12/12/05 12:08:58 DEBUG orm.CompilationManager: Scanning for .class files
in directory: /tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971
12/12/05 12:08:58 DEBUG orm.CompilationManager: Got classfile:
/tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.class
-> member_main.class
12/12/05 12:08:58 DEBUG orm.CompilationManager: Finished writing jar file
/tmp/sqoop-hadoop/compile/14ce35e69f66546d9d0d41065fac0971/member_main.jar
12/12/05 12:08:58 INFO mapreduce.ExportJobBase: Beginning export of
member_main
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Using InputFormat: class
com.cloudera.sqoop.mapreduce.ExportInputFormat
12/12/05 12:08:58 DEBUG manager.SqlManager: Using fetchSize for next query:
1000
12/12/05 12:08:58 INFO manager.SqlManager: Executing SQL statement: SELECT
TOP 1 * FROM [member_main]
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/sqljdbc4.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/sqoop-1.3.0-cdh3u4.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/jackson-core-asl-1.7.3.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/snappy-java-1.0.3.2.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/avro-1.5.4.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/jackson-mapper-asl-1.7.3.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/sqljdbc4.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/ant-contrib-1.0b3.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/sqoop-sqlserver-1.0.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/paranamer-2.3.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/avro-ipc-1.5.4.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/avro-mapred-1.5.4.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/commons-io-1.4.jar
12/12/05 12:08:58 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/local/sqoop/lib/jopt-simple-3.2.jar
12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process
: 1
12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Total input bytes=2611
12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: maxSplitSize=2611
12/12/05 12:09:00 INFO input.FileInputFormat: Total input paths to process
: 1
12/12/05 12:09:00 INFO util.NativeCodeLoader: Loaded the native-hadoop
library
12/12/05 12:09:00 WARN snappy.LoadSnappy: Snappy native library not loaded
12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat: Generated splits:
12/12/05 12:09:00 DEBUG mapreduce.ExportInputFormat:
Paths:/user/hadoop/test-ivan/test:0+2611 Locations:hadoop05:;
12/12/05 12:09:00 INFO mapred.JobClient: Running job: job_201212041541_0107
12/12/05 12:09:01 INFO mapred.JobClient:  map 0% reduce 0%
12/12/05 12:09:18 INFO mapred.JobClient: Task Id :
attempt_201212041541_0107_m_000000_0, Status : FAILED
java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
        at com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
        at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
        at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
        at com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)

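The repeated "Incorrect syntax near ','" failure is consistent with the multi-row `INSERT ... VALUES (...), (...)` statement that Sqoop generates when `sqoop.export.records.per.statement` is greater than 1; SQL Server only accepts that row-constructor syntax from version 2008 on, so SQL Server 2005 fails at the first comma between row groups. A minimal Python sketch of the statement shape (the table and column names are hypothetical, not taken from this job):

```python
# Sketch of the batched INSERT shape (assumption: Sqoop concatenates rows
# into one statement when records-per-statement > 1). Table and column
# names are made up for illustration.
def build_insert(table, columns, records_per_statement):
    row = "(" + ", ".join("?" for _ in columns) + ")"
    rows = ", ".join(row for _ in range(records_per_statement))
    return "INSERT INTO %s (%s) VALUES %s" % (table, ", ".join(columns), rows)

# 75 records per statement -> 74 commas between row groups, which
# SQL Server 2005 cannot parse:
print(build_insert("member_main", ["id", "name"], 75).count("), ("))  # 74

# 1 record per statement -> a plain single-row INSERT:
print(build_insert("member_main", ["id", "name"], 1))
```

If this is indeed the cause, setting `sqoop.export.records.per.statement=1` forces single-row INSERTs that SQL Server 2005 can parse.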
12/12/05 12:09:24 INFO mapred.JobClient: Task Id :
attempt_201212041541_0107_m_000000_1, Status : FAILED
java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
        at com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
        at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
        at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
        at com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)

12/12/05 12:09:30 INFO mapred.JobClient: Task Id :
attempt_201212041541_0107_m_000000_2, Status : FAILED
java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
        at com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect syntax near ','.
        at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
        at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
        at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
        at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
        at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
        at com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)

12/12/05 12:09:41 INFO mapred.JobClient: Job complete: job_201212041541_0107
12/12/05 12:09:41 INFO mapred.JobClient: Counters: 8
12/12/05 12:09:41 INFO mapred.JobClient:   Job Counters
12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=24379
12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
12/12/05 12:09:41 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0
12/12/05 12:09:41 INFO mapred.JobClient:     Rack-local map tasks=3
12/12/05 12:09:41 INFO mapred.JobClient:     Launched map tasks=4
12/12/05 12:09:41 INFO mapred.JobClient:     Data-local map tasks=1
12/12/05 12:09:41 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
12/12/05 12:09:41 INFO mapred.JobClient:     Failed map tasks=1
12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Transferred 0 bytes in
43.0875 seconds (0 bytes/sec)
12/12/05 12:09:41 INFO mapreduce.ExportJobBase: Exported 0 records.
12/12/05 12:09:41 ERROR tool.ExportTool: Error during export: Export job
failed!
================

Kind regards,
Chun-fan

On Wed, Dec 5, 2012 at 12:01 AM, Jarek Jarcec Cecho <ja...@apache.org>wrote:

> Hi Chun-fan,
> would you mind sharing with us the entire Sqoop log generated with the
> parameter --verbose? Are you using the built-in Microsoft SQL Connector or
> the connector provided by Microsoft?
>
> Jarcec
>
> On Tue, Dec 04, 2012 at 05:51:31PM +0800, Chun-fan Ivan Liao wrote:
> > Hi,
> >
> >
> >
> > We are using Sqoop 1.3.0-cdh3u4 with Hadoop version 1.0.3.
> >
> >
> >
> > We encountered the following error when we tried to export an HDFS file
> > into MSSQL 2005 (partial output below):
> >
> >
> >
> > 12/12/04 16:44:13 INFO mapred.JobClient: Task Id :
> > attempt_201212041541_0014_m_000000_2, Status : FAILED
> >
> > java.io.IOException: com.microsoft.sqlserver.jdbc.SQLServerException:
> > Incorrect syntax near ','.
> >
> >         at
> >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:195)
> >
> >         at
> >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:651)
> >
> >         at
> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:766)
> >
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >
> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >
> >         at java.security.AccessController.doPrivileged(Native Method)
> >
> >         at javax.security.auth.Subject.doAs(Subject.java:416)
> >
> >         at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >
> >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> >
> > Caused by: com.microsoft.sqlserver.jdbc.SQLServerException: Incorrect
> > syntax near ','.
> >
> >         at
> >
> com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
> >
> >         at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
> >
> >         at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:390)
> >
> >         at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:340)
> >
> >         at
> > com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
> >
> >         at
> >
> com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
> >
> >         at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
> >
> >         at
> >
> com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
> >
> >         at
> >
> com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.execute(SQLServerPreparedStatement.java:322)
> >
> >         at
> >
> com.cloudera.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:234)
> >
> >
> >
> > The HDFS file we want to export was originally imported from SQL 2005
> > using Sqoop. It uses ‘|’ as the field delimiter, and some fields in the
> > file contain commas (‘,’).
> >
> >
> >
> > The command I submitted is (generalized, with capitalized placeholders):
> >
> > $ sqoop export -D sqoop.export.records.per.statement=75 -D
> > sqoop.export.statements.per.transaction=75 --connect
> >
> "jdbc:sqlserver://SERVER-NAME:1433;username=USER_NAME;password=PASSWD;database=DB_NAME"
> > --table TABLE_NAME -m 1 --input-fields-terminated-by '|' --export-dir
> > /EXPORT/FROM/DIRECTORY
> >
> >
> >
> > I’ve adjusted the values of sqoop.export.records.per.statement and
> > sqoop.export.statements.per.transaction, but that didn’t help.
> >
> >
> >
> > Any help would be greatly appreciated. Thanks.
> >
> >
> >
> > Ivan
>
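One detail worth separating out from the quoted report: a ‘|’-delimited record parses cleanly even when its fields contain commas, so the comma in the SQL error most likely comes from the statement Sqoop generates rather than from the data itself. A small illustration with an invented sample record:

```python
# Invented sample record: '|' is the field terminator; commas are plain data.
sample = "42|Liao, Chun-fan|Taipei, Taiwan|2012-12-04"
fields = sample.split("|")
print(fields)       # ['42', 'Liao, Chun-fan', 'Taipei, Taiwan', '2012-12-04']
print(len(fields))  # 4
```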

Re: Sqoop export failed: Incorrect syntax near ','

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Chun-fan,
would you mind sharing with us the entire Sqoop log generated with the parameter --verbose? Are you using the built-in Microsoft SQL Connector or the connector provided by Microsoft?

Jarcec
