Posted to user@sqoop.apache.org by Cyril Bogus <cy...@gmail.com> on 2013/05/02 18:43:40 UTC

Re: Export

Yup, the map task logs did not show any error. It is as if the error was
suggested but never actually written to the file.

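The repeated `Caused by: java.util.NoSuchElementException` at `kmeansclusterIds.__loadFromFields` in the traces quoted below is the real failure: the generated `__loadFromFields` pulls one token per expected column from the split input line, and `Iterator.next()` throws as soon as a line yields fewer fields than the class expects (an empty trailing line, or a record that does not match `--fields-terminated-by`, is enough). A loose sketch of that failure mode, not Sqoop's actual generated code:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.NoSuchElementException;

public class LoadFromFieldsSketch {
    // Loose sketch of what Sqoop's generated __loadFromFields does:
    // consume one token per expected column from the split input line.
    static void loadFromFields(String line, int expectedColumns) {
        Iterator<String> it = Arrays.asList(line.split(",", -1)).iterator();
        for (int i = 0; i < expectedColumns; i++) {
            // Iterator.next() throws NoSuchElementException once the line
            // runs out of fields -- the "Caused by" in the traces below.
            String field = it.next();
        }
    }

    public static void main(String[] args) {
        loadFromFields("JTDKN3DU0B0261494,345", 2); // two fields, two columns: fine
        try {
            // an empty line splits into a single empty field, not two
            loadFromFields("", 2);
        } catch (NoSuchElementException e) {
            System.out.println("NoSuchElementException, as in the failing map tasks");
        }
    }
}
```

With the two-column layout from the log (`driver_license:12`, `clusterId:4`), any line of `drivers/output.txt` that does not split into exactly two comma-separated fields would fail this way.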

On Sun, Apr 28, 2013 at 11:03 AM, Jarek Jarcec Cecho <ja...@apache.org> wrote:

> Hi Cyril,
> did you by any chance check the map task logs, as the exception
> suggests?
>
> > java.io.IOException: Can't export data, please check task tracker logs
>
> Jarcec
>
> On Fri, Apr 26, 2013 at 03:21:21PM -0400, Cyril Bogus wrote:
> > Hi Jarek, Thank you so much for the replies.
> >
> > I made some changes to my export queries. It seems that it did not like
> > the fact that I had set up both update-key and columns. After those
> > changes, the export actually started, but now I am having a MapReduce
> > issue.
> >
> > Here is the output I am getting with --verbose on. And attached is the
> > class generated with the export command.
> >
> > Warning: /home/hbase does not exist! HBase imports will fail.
> > Please set $HBASE_HOME to the root of your HBase installation.
> > Warning: $HADOOP_HOME is deprecated.
> >
> > 13/04/26 15:17:52 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > 13/04/26 15:17:52 DEBUG util.ClassLoaderStack: Checking for existing
> class:
> > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > 13/04/26 15:17:52 DEBUG util.ClassLoaderStack: Class is already
> available.
> > Skipping jar /home/sqoop/lib/sqljdbc4.jar
> > 13/04/26 15:17:52 DEBUG sqoop.ConnFactory: Added factory
> > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory in jar
> > /home/sqoop/lib/sqljdbc4.jar specified by
> > /home/sqoop/conf/managers.d/mssqoop-sqlserver
> > 13/04/26 15:17:52 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > 13/04/26 15:17:52 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > com.cloudera.sqoop.manager.DefaultManagerFactory
> > 13/04/26 15:17:52 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > 13/04/26 15:17:52 INFO SqlServer.MSSQLServerManagerFactory: Using
> > Microsoft's SQL Server - Hadoop Connector
> > 13/04/26 15:17:52 INFO manager.SqlManager: Using default fetchSize of
> 1000
> > 13/04/26 15:17:52 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > com.microsoft.sqoop.SqlServer.MSSQLServerManager@4cb9e45a
> > 13/04/26 15:17:52 INFO tool.CodeGenTool: Beginning code generation
> > 13/04/26 15:17:53 DEBUG manager.SqlManager: No connection paramenters
> > specified. Using regular API for making connection.
> > 13/04/26 15:17:53 DEBUG manager.SqlManager: Using fetchSize for next
> query:
> > 1000
> > 13/04/26 15:17:53 INFO manager.SqlManager: Executing SQL statement:
> SELECT
> > TOP 1 * FROM [kmeansclusterIds]
> > 13/04/26 15:17:53 DEBUG manager.SqlManager: Using fetchSize for next
> query:
> > 1000
> > 13/04/26 15:17:53 INFO manager.SqlManager: Executing SQL statement:
> SELECT
> > TOP 1 * FROM [kmeansclusterIds]
> > 13/04/26 15:17:53 DEBUG orm.ClassWriter: selected columns:
> > 13/04/26 15:17:53 DEBUG orm.ClassWriter:   driver_license
> > 13/04/26 15:17:53 DEBUG orm.ClassWriter:   clusterId
> > 13/04/26 15:17:53 DEBUG orm.ClassWriter: Writing source file:
> >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.java
> > 13/04/26 15:17:53 DEBUG orm.ClassWriter: Table name: kmeansclusterIds
> > 13/04/26 15:17:53 DEBUG orm.ClassWriter: Columns: driver_license:12,
> > clusterId:4,
> > 13/04/26 15:17:53 DEBUG orm.ClassWriter: sourceFilename is
> > kmeansclusterIds.java
> > 13/04/26 15:17:53 DEBUG orm.CompilationManager: Found existing
> > /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/
> > 13/04/26 15:17:53 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
> > /home/hadoop/mapred
> > 13/04/26 15:17:53 DEBUG orm.CompilationManager: Adding source file:
> >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.java
> > 13/04/26 15:17:53 DEBUG orm.CompilationManager: Invoking javac with args:
> > 13/04/26 15:17:53 DEBUG orm.CompilationManager:   -sourcepath
> > 13/04/26 15:17:53 DEBUG orm.CompilationManager:
> > /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/
> > 13/04/26 15:17:53 DEBUG orm.CompilationManager:   -d
> > 13/04/26 15:17:53 DEBUG orm.CompilationManager:
> > /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/
> > 13/04/26 15:17:53 DEBUG orm.CompilationManager:   -classpath
> > 13/04/26 15:17:53 DEBUG orm.CompilationManager:
> >
> /home/hadoop/conf:/usr/lib/tools.jar:/home/hadoop:/home/hadoop/hadoop-core-1.0.4.jar:/home/hadoop/lib/antlr-2.7.7.jar:/home/hadoop/lib/antlr-3.2.jar:/home/hadoop/lib/antlr-runtime-3.2.jar:/home/hadoop/lib/asm-3.2.jar:/home/hadoop/lib/aspectjrt-1.6.5.jar:/home/hadoop/lib/aspectjtools-1.6.5.jar:/home/hadoop/lib/avro-1.4.0-cassandra-1.jar:/home/hadoop/lib/bson-2.5.jar:/home/hadoop/lib/cassandra-all-0.8.1.jar:/home/hadoop/lib/cassandra-thrift-0.8.1.jar:/home/hadoop/lib/cglib-nodep-2.2.jar:/home/hadoop/lib/commons-beanutils-1.7.0.jar:/home/hadoop/lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/lib/commons-cli-1.2.jar:/home/hadoop/lib/commons-cli-2.0-mahout.jar:/home/hadoop/lib/commons-codec-1.4.jar:/home/hadoop/lib/commons-collections-3.2.1.jar:/home/hadoop/lib/commons-compress-1.2.jar:/home/hadoop/lib/commons-configuration-1.6.jar:/home/hadoop/lib/commons-daemon-1.0.1.jar:/home/hadoop/lib/commons-dbcp-1.4.jar:/home/hadoop/lib/commons-digester-1.8.jar:/home/hadoop/lib/commons-el-1.0.jar:/home/hadoop/lib/commons-httpclient-3.0.1.jar:/home/hadoop/lib/commons-io-2.0.1.jar:/home/hadoop/lib/commons-io-2.1.jar:/home/hadoop/lib/commons-lang-2.4.jar:/home/hadoop/lib/commons-lang-2.6.jar:/home/hadoop/lib/commons-logging-1.1.1.jar:/home/hadoop/lib/commons-logging-api-1.0.4.jar:/home/hadoop/lib/commons-math-2.1.jar:/home/hadoop/lib/commons-math-2.2.jar:/home/hadoop/lib/commons-net-1.4.1.jar:/home/hadoop/lib/commons-pool-1.5.6.jar:/home/hadoop/lib/concurrentlinkedhashmap-lru-1.1.jar:/home/hadoop/lib/core-3.1.1.jar:/home/hadoop/lib/easymock-3.0.jar:/home/hadoop/lib/guava-r09.jar:/home/hadoop/lib/hadoop-capacity-scheduler-1.0.4.jar:/home/hadoop/lib/hadoop-fairscheduler-1.0.4.jar:/home/hadoop/lib/hadoop-thriftfs-1.0.4.jar:/home/hadoop/lib/hector-core-0.8.0-2.jar:/home/hadoop/lib/high-scale-lib-1.1.2.jar:/home/hadoop/lib/hsqldb-1.8.0.10.jar:/home/hadoop/lib/httpclient-4.0.1.jar:/home/hadoop/lib/httpcore-4.0.1.jar:/home/hadoop/lib/icu4j-4.8.1.1.jar:/home/hadoop/lib/jackson-core-asl-1
.8.2.jar:/home/hadoop/lib/jackson-core-asl-1.8.8.jar:/home/hadoop/lib/jackson-mapper-asl-1.8.2.jar:/home/hadoop/lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/lib/jakarta-regexp-1.4.jar:/home/hadoop/lib/jamm-0.2.2.jar:/home/hadoop/lib/jasper-compiler-5.5.12.jar:/home/hadoop/lib/jasper-runtime-5.5.12.jar:/home/hadoop/lib/jcommon-1.0.12.jar:/home/hadoop/lib/jdeb-0.8.jar:/home/hadoop/lib/jersey-core-1.8.jar:/home/hadoop/lib/jersey-json-1.8.jar:/home/hadoop/lib/jersey-server-1.8.jar:/home/hadoop/lib/jets3t-0.6.1.jar:/home/hadoop/lib/jetty-6.1.22.jar:/home/hadoop/lib/jetty-6.1.26.jar:/home/hadoop/lib/jetty-util-6.1.22.jar:/home/hadoop/lib/jetty-util-6.1.26.jar:/home/hadoop/lib/jline-0.9.94.jar:/home/hadoop/lib/jsch-0.1.42.jar:/home/hadoop/lib/json-simple-1.1.jar:/home/hadoop/lib/jul-to-slf4j-1.6.1.jar:/home/hadoop/lib/junit-4.5.jar:/home/hadoop/lib/junit-4.8.2.jar:/home/hadoop/lib/kfs-0.2.2.jar:/home/hadoop/lib/libthrift-0.6.1.jar:/home/hadoop/lib/log4j-1.2.15.jar:/home/hadoop/lib/log4j-1.2.16.jar:/home/hadoop/lib/lucene-analyzers-3.6.0.jar:/home/hadoop/lib/lucene-benchmark-3.6.0.jar:/home/hadoop/lib/lucene-core-3.6.0.jar:/home/hadoop/lib/lucene-facet-3.6.0.jar:/home/hadoop/lib/lucene-highlighter-3.6.0.jar:/home/hadoop/lib/lucene-memory-3.6.0.jar:/home/hadoop/lib/lucene-queries-3.6.0.jar:/home/hadoop/lib/mahout-core-0.7.jar:/home/hadoop/lib/mahout-core-0.7-job.jar:/home/hadoop/lib/mahout-integration-0.7.jar:/home/hadoop/lib/mahout-math-0.7.jar:/home/hadoop/lib/mockito-all-1.8.5.jar:/home/hadoop/lib/mongo-java-driver-2.5.jar:/home/hadoop/lib/objenesis-1.2.jar:/home/hadoop/lib/oro-2.0.8.jar:/home/hadoop/lib/servlet-api-2.5-20081211.jar:/home/hadoop/lib/servlet-api-2.5.jar:/home/hadoop/lib/slf4j-api-1.6.1.jar:/home/hadoop/lib/slf4j-log4j12-1.6.1.jar:/home/hadoop/lib/snakeyaml-1.6.jar:/home/hadoop/lib/solr-commons-csv-3.5.0.jar:/home/hadoop/lib/speed4j-0.9.jar:/home/hadoop/lib/stringtemplate-3.2.jar:/home/hadoop/lib/uncommons-maths-1.2.2.jar:/home/hadoop/lib/uuid-3.2.0.jar:/h
ome/hadoop/lib/xercesImpl-2.9.1.jar:/home/hadoop/lib/xml-apis-1.3.04.jar:/home/hadoop/lib/xmlenc-0.52.jar:/home/hadoop/lib/xpp3_min-1.1.4c.jar:/home/hadoop/lib/xstream-1.3.1.jar:/home/hadoop/lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/home/sqoop/conf::/home/sqoop/lib/ant-contrib-1.0b3.jar:/home/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/home/sqoop/lib/avro-1.5.3.jar:/home/sqoop/lib/avro-ipc-1.5.3.jar:/home/sqoop/lib/avro-mapred-1.5.3.jar:/home/sqoop/lib/commons-io-1.4.jar:/home/sqoop/lib/hsqldb-1.8.0.10.jar:/home/sqoop/lib/jackson-core-asl-1.7.3.jar:/home/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/home/sqoop/lib/jopt-simple-3.2.jar:/home/sqoop/lib/paranamer-2.3.jar:/home/sqoop/lib/snappy-java-1.0.3.2.jar:/home/sqoop/lib/sqljdbc4.jar:/home/sqoop/lib/sqoop-sqlserver-1.0.jar:/home/sqoop/sqoop-1.4.3.jar:/home/sqoop/sqoop-test-1.4.3.jar::/home/hadoop/hadoop-core-1.0.4.jar:/home/sqoop/sqoop-1.4.3.jar
> > Note:
> >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.java
> > uses or overrides a deprecated API.
> > Note: Recompile with -Xlint:deprecation for details.
> > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Could not rename
> >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.java
> > to /home/cyrille/workspace/AutomaticClusterer/./kmeansclusterIds.java
> > org.apache.commons.io.FileExistsException: Destination
> > '/home/cyrille/workspace/AutomaticClusterer/./kmeansclusterIds.java'
> > already exists
> >     at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2099)
> >     at
> >
> org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:228)
> >     at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
> >     at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
> >     at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
> >     at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
> >     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
> >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
> >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
> >     at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> > 13/04/26 15:17:54 INFO orm.CompilationManager: Writing jar file:
> >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.jar
> > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Scanning for .class files
> > in directory: /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2
> > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Got classfile:
> >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.class
> > -> kmeansclusterIds.class
> > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Finished writing jar file
> >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.jar
> > 13/04/26 15:17:54 INFO mapreduce.ExportJobBase: Beginning export of
> > kmeansclusterIds
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Using InputFormat: class
> > org.apache.sqoop.mapreduce.ExportInputFormat
> > 13/04/26 15:17:55 DEBUG manager.SqlManager: Using fetchSize for next
> query:
> > 1000
> > 13/04/26 15:17:55 INFO manager.SqlManager: Executing SQL statement:
> SELECT
> > TOP 1 * FROM [kmeansclusterIds]
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/sqoop-1.4.3.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/sqljdbc4.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/sqoop-sqlserver-1.0.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/sqoop-1.4.3.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/ant-contrib-1.0b3.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/avro-ipc-1.5.3.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/jackson-core-asl-1.7.3.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/commons-io-1.4.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/sqoop-sqlserver-1.0.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/paranamer-2.3.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/jopt-simple-3.2.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/avro-1.5.3.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/hsqldb-1.8.0.10.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/avro-mapred-1.5.3.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/sqljdbc4.jar
> > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/home/sqoop/lib/snappy-java-1.0.3.2.jar
> > 13/04/26 15:17:56 INFO input.FileInputFormat: Total input paths to
> process
> > : 1
> > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=4
> > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Total input
> bytes=11388
> > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: maxSplitSize=2847
> > 13/04/26 15:17:56 INFO input.FileInputFormat: Total input paths to
> process
> > : 1
> > 13/04/26 15:17:56 INFO util.NativeCodeLoader: Loaded the native-hadoop
> > library
> > 13/04/26 15:17:56 WARN snappy.LoadSnappy: Snappy native library not
> loaded
> > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat:
> > Paths:/user/cyrille/drivers/output.txt:0+2847 Locations:Agnik-17:;
> > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat:
> > Paths:/user/cyrille/drivers/output.txt:2847+2847 Locations:Agnik-17:;
> > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat:
> > Paths:/user/cyrille/drivers/output.txt:5694+2847 Locations:Agnik-17:;
> > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat:
> > Paths:/user/cyrille/drivers/output.txt:8541+2847 Locations:Agnik-17:;
> > 13/04/26 15:17:56 INFO mapred.JobClient: Running job:
> job_201304111243_0289
> > 13/04/26 15:17:57 INFO mapred.JobClient:  map 0% reduce 0%
> > 13/04/26 15:18:09 INFO mapred.JobClient: Task Id :
> > attempt_201304111243_0289_m_000000_0, Status : FAILED
> > java.io.IOException: Can't export data, please check task tracker logs
> >     at
> >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> >     at
> > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >     at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:416)
> >     at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > Caused by: java.util.NoSuchElementException
> >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> >     at
> > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> >     ... 10 more
> >
> > 13/04/26 15:18:11 INFO mapred.JobClient: Task Id :
> > attempt_201304111243_0289_m_000001_0, Status : FAILED
> > java.io.IOException: Can't export data, please check task tracker logs
> >     at
> >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> >     at
> > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >     at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:416)
> >     at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > Caused by: java.util.NoSuchElementException
> >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> >     at
> > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> >     ... 10 more
> >
> > 13/04/26 15:18:14 INFO mapred.JobClient: Task Id :
> > attempt_201304111243_0289_m_000000_1, Status : FAILED
> > java.io.IOException: Can't export data, please check task tracker logs
> >     at
> >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> >     at
> > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >     at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:416)
> >     at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > Caused by: java.util.NoSuchElementException
> >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> >     at
> > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> >     ... 10 more
> >
> > 13/04/26 15:18:17 INFO mapred.JobClient: Task Id :
> > attempt_201304111243_0289_m_000001_1, Status : FAILED
> > java.io.IOException: Can't export data, please check task tracker logs
> >     at
> >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> >     at
> > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >     at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:416)
> >     at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > Caused by: java.util.NoSuchElementException
> >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> >     at
> > org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> >     ... 10 more
> >
> >
> >
> > On Fri, Apr 26, 2013 at 3:05 PM, Jarek Jarcec Cecho <jarcec@apache.org>
> > wrote:
> >
> > > Hi Cyril,
> > > would you mind running your Sqoop command with argument --verbose? It
> > > should print out much more information. Please gather the generated
> > > class after Sqoop ends and send it to the mailing list as well.
> > >
> > > Jarcec
> > >
> > > On Fri, Apr 26, 2013 at 01:52:29PM -0400, Cyril Bogus wrote:
> > > > Here is the sqoop log
> > > >
> > > > 13/04/26 13:36:46 WARN tool.SqoopTool: $SQOOP_CONF_DIR has not been
> set
> > > in
> > > > the environment. Cannot check for additional configuration.
> > > > 13/04/26 13:36:46 WARN sqoop.ConnFactory: $SQOOP_CONF_DIR has not
> been
> > > set
> > > > in the environment. Cannot check for additional configuration.
> > > > 13/04/26 13:36:46 INFO manager.SqlManager: Using default fetchSize of
> > > 1000
> > > > 13/04/26 13:36:46 INFO tool.CodeGenTool: Beginning code generation
> > > > 13/04/26 13:36:46 INFO manager.SqlManager: Executing SQL statement:
> > > SELECT
> > > > t.* FROM [kmeansclusterIds] AS t WHERE 1=0
> > > > 13/04/26 13:36:46 INFO orm.CompilationManager: $HADOOP_HOME is not
> set
> > > >
> > >
> /tmp/sqoop-cyrille/compile/2a3ab7b9299edcac783039a7addc9666/kmeansclusterIds.java:73:
> > > > cannot find symbol
> > > > symbol  : variable driver_license
> > > > location: class kmeansclusterIds
> > > >     JdbcWritableBridge.writeString(driver_license, 2 + __off, 12,
> > > __dbStmt);
> > > >                                    ^
> > > > Note:
> > > >
> > >
> /tmp/sqoop-cyrille/compile/2a3ab7b9299edcac783039a7addc9666/kmeansclusterIds.java
> > > > uses or overrides a deprecated API.
> > > > Note: Recompile with -Xlint:deprecation for details.
> > > > 1 error
> > > > 13/04/26 13:36:47 ERROR tool.ExportTool: Encountered IOException
> running
> > > > export job: java.io.IOException: Error returned by javac
> > > >
> > > > the kmeansclusterIds class is generated in a temp file, so I cannot
> > > > show what the class does beyond the point of error stated above.
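The `cannot find symbol: variable driver_license` failure above is consistent with `--columns clusterId` excluding the update key from the generated class while the update path still writes it. A loose sketch of the shape of the problem (the names mirror the log; this is not Sqoop's actual generated code):

```java
// Hypothetical sketch: with --columns clusterId, only clusterId becomes a
// field of the generated class, but the --update-key machinery still emits
// a write of driver_license, leaving javac with no declaration to resolve.
public class GeneratedClassSketch {
    private Integer clusterId; // generated: listed in --columns

    // driver_license was never generated as a field, so a generated line like
    //   JdbcWritableBridge.writeString(driver_license, 2 + __off, 12, __dbStmt);
    // cannot compile. Including the update key in --columns (or dropping
    // --columns entirely) keeps the generated code self-consistent.
    public Integer write() {
        return clusterId; // only the declared column can be written
    }
}
```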
> > > >
> > > > Thank you for your reply Jarek
> > > >
> > > >
> > > > On Fri, Apr 26, 2013 at 1:38 PM, Jarek Jarcec Cecho
> > > > <jarcec@apache.org> wrote:
> > > >
> > > > > Hi Cyril,
> > > > > would you mind sharing entire Sqoop log and the generated java
> class?
> > > > >
> > > > > Jarcec
> > > > >
> > > > > On Fri, Apr 26, 2013 at 12:53:49PM -0400, Cyril Bogus wrote:
> > > > > > UPDATE!!!
> > > > > >
> > > > > > Now I get the following error
> > > > > >
> > > > > >
> > > > >
> > >
> /tmp/sqoop-cyril/compile/b156fd4f270274b11320d007472bbfe7/kmeansclusterIds.java:73:
> > > > > > cannot find symbol
> > > > > > symbol  : variable driver_license
> > > > > > location: class kmeansclusterIds
> > > > > >     JdbcWritableBridge.writeString(driver_license, 2 + __off, 12,
> > > > > __dbStmt);
> > > > > >                                    ^
> > > > > > Note:
> > > > > >
> > > > >
> > >
> /tmp/sqoop-cyrille/compile/b156fd4f270274b11320d007472bbfe7/kmeansclusterIds.java
> > > > > > uses or overrides a deprecated API.
> > > > > > Note: Recompile with -Xlint:deprecation for details.
> > > > > > 1 error
> > > > > > 13/04/26 12:52:26 ERROR tool.ExportTool: Encountered IOException
> > > running
> > > > > > export job: java.io.IOException: Error returned by javac
> > > > > >
> > > > > >
> > > > > >
> > > > > >
> > > > > > On Fri, Apr 26, 2013 at 12:52 PM, Cyril Bogus
> > > > > > <cyrilbogus@gmail.com> wrote:
> > > > > >
> > > > > > > Hi everyone,
> > > > > > >
> > > > > > > I am trying to do an export from HDFS to MSSQL using Sqoop
> > > > > > >
> > > > > > > my data is in the following format
> > > > > > >
> > > > > > > JTDKN3DU0B0261494,345
> > > > > > > JTEBU14R840022700,340
> > > > > > > JTEEP21A770208029,314
> > > > > > > JTHBF5C24A5125359,348
> > > > > > > jthbk1eg6a2395028,341
> > > > > > > JTMBD31V565007305,355
> > > > > > > KL1PM5C5XAK700838,352
> > > > > > > KMHCG45C41U225885,352
> > > > > > > KMHDC86EX9U037746,304
> > > > > > > NM0LS6BN8CT123712,354
> > > > > > >
> > > > > > > my export statement is the following
> > > > > > >
> > > > > > > export
> > > > > > > --connect
> > > > > > >
> > > 'jdbc:sqlserver://server:port;username=sa;password=pass;database=db'
> > > > > > > --table
> > > > > > > kmeansclusterIds
> > > > > > > --update-key
> > > > > > > driver_license
> > > > > > > --columns
> > > > > > > clusterId
> > > > > > > --update-mode
> > > > > > > allowinsert
> > > > > > > --export-dir
> > > > > > > drivers/output.txt
> > > > > > > --fields-terminated-by
> > > > > > > ','
> > > > > > > --lines-terminated-by
> > > > > > > \n
> > > > > > >
> > > > > > > I created a table named kmeansclusterIds on the server.
> > > > > > > I get the following error:
> > > > > > >
> > > > > > > Exception in thread "main" java.lang.NoSuchMethodError:
> > > > > > >
> > > > >
> > >
> com.cloudera.sqoop.manager.ExportJobContext.setConnManager(Lcom/cloudera/sqoop/manager/ConnManager;)V
> > > > > > >     at
> > > > > > >
> > > > >
> > >
> com.microsoft.sqoop.SqlServer.MSSQLServerManager.exportTable(MSSQLServerManager.java:151)
> > > > > > >     at
> > > org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:78)
> > > > > > >     at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:97)
> > > > > > >     at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
> > > > > > >     at
> org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > > > > >     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
> > > > > > >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
> > > > > > >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
> > > > > > >     at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> > > > > > >     at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> > > > > > >
> > > > > > > Any insight into what the real issue might be?
> > > > > > >
> > > > > > > Thank you in advance for a reply.
> > > > > > >
> > > > >
> > >
>
>
>

Re: Export

Posted by Venkat Ranganathan <vr...@hortonworks.com>.
Hi Cyril

13/04/26 15:17:54 DEBUG orm.CompilationManager: Could not rename
/tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.java
to /home/cyrille/workspace/AutomaticClusterer/./kmeansclusterIds.java
org.apache.commons.io.FileExistsException: Destination
'/home/cyrille/workspace/AutomaticClusterer/./kmeansclusterIds.java'
already exists

You said you made changes to the export job. It looks like you are
changing the columns being exported or the table definitions. Can you
remove that file, rerun the export job, and provide the logs that Jarcec
mentioned? That might provide more clues.


Thanks

Venkat



On Thu, May 2, 2013 at 9:00 PM, Jarek Jarcec Cecho <ja...@apache.org> wrote:

> Hi Cyril,
> I think that the exception is a bit misleading; you should check the map
> task logs, not the task tracker logs. Would you mind sharing with us one
> failed map task log?
>
> Jarcec
>
> On Thu, May 02, 2013 at 12:43:40PM -0400, Cyril Bogus wrote:
> > Yup, the map task logs did not show any error. It is as if the error
> > was suggested but never actually written to the file.
> >
.8.2.jar:/home/hadoop/lib/jackson-core-asl-1.8.8.jar:/home/hadoop/lib/jackson-mapper-asl-1.8.2.jar:/home/hadoop/lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/lib/jakarta-regexp-1.4.jar:/home/hadoop/lib/jamm-0.2.2.jar:/home/hadoop/lib/jasper-compiler-5.5.12.jar:/home/hadoop/lib/jasper-runtime-5.5.12.jar:/home/hadoop/lib/jcommon-1.0.12.jar:/home/hadoop/lib/jdeb-0.8.jar:/home/hadoop/lib/jersey-core-1.8.jar:/home/hadoop/lib/jersey-json-1.8.jar:/home/hadoop/lib/jersey-server-1.8.jar:/home/hadoop/lib/jets3t-0.6.1.jar:/home/hadoop/lib/jetty-6.1.22.jar:/home/hadoop/lib/jetty-6.1.26.jar:/home/hadoop/lib/jetty-util-6.1.22.jar:/home/hadoop/lib/jetty-util-6.1.26.jar:/home/hadoop/lib/jline-0.9.94.jar:/home/hadoop/lib/jsch-0.1.42.jar:/home/hadoop/lib/json-simple-1.1.jar:/home/hadoop/lib/jul-to-slf4j-1.6.1.jar:/home/hadoop/lib/junit-4.5.jar:/home/hadoop/lib/junit-4.8.2.jar:/home/hadoop/lib/kfs-0.2.2.jar:/home/hadoop/lib/libthrift-0.6.1.jar:/home/hadoop/lib/log4j-1.2.15.jar:/home/hadoop/lib/log4j-1.2.16.jar:/home/hadoop/lib/lucene-analyzers-3.6.0.jar:/home/hadoop/lib/lucene-benchmark-3.6.0.jar:/home/hadoop/lib/lucene-core-3.6.0.jar:/home/hadoop/lib/lucene-facet-3.6.0.jar:/home/hadoop/lib/lucene-highlighter-3.6.0.jar:/home/hadoop/lib/lucene-memory-3.6.0.jar:/home/hadoop/lib/lucene-queries-3.6.0.jar:/home/hadoop/lib/mahout-core-0.7.jar:/home/hadoop/lib/mahout-core-0.7-job.jar:/home/hadoop/lib/mahout-integration-0.7.jar:/home/hadoop/lib/mahout-math-0.7.jar:/home/hadoop/lib/mockito-all-1.8.5.jar:/home/hadoop/lib/mongo-java-driver-2.5.jar:/home/hadoop/lib/objenesis-1.2.jar:/home/hadoop/lib/oro-2.0.8.jar:/home/hadoop/lib/servlet-api-2.5-20081211.jar:/home/hadoop/lib/servlet-api-2.5.jar:/home/hadoop/lib/slf4j-api-1.6.1.jar:/home/hadoop/lib/slf4j-log4j12-1.6.1.jar:/home/hadoop/lib/snakeyaml-1.6.jar:/home/hadoop/lib/solr-commons-csv-3.5.0.jar:/home/hadoop/lib/speed4j-0.9.jar:/home/hadoop/lib/stringtemplate-3.2.jar:/home/hadoop/lib/uncommons-maths-1.2.2.jar:/home/hadoop/lib/uuid-3.2.0.jar:/h
ome/hadoop/lib/xercesImpl-2.9.1.jar:/home/hadoop/lib/xml-apis-1.3.04.jar:/home/hadoop/lib/xmlenc-0.52.jar:/home/hadoop/lib/xpp3_min-1.1.4c.jar:/home/hadoop/lib/xstream-1.3.1.jar:/home/hadoop/lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/home/sqoop/conf::/home/sqoop/lib/ant-contrib-1.0b3.jar:/home/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/home/sqoop/lib/avro-1.5.3.jar:/home/sqoop/lib/avro-ipc-1.5.3.jar:/home/sqoop/lib/avro-mapred-1.5.3.jar:/home/sqoop/lib/commons-io-1.4.jar:/home/sqoop/lib/hsqldb-1.8.0.10.jar:/home/sqoop/lib/jackson-core-asl-1.7.3.jar:/home/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/home/sqoop/lib/jopt-simple-3.2.jar:/home/sqoop/lib/paranamer-2.3.jar:/home/sqoop/lib/snappy-java-1.0.3.2.jar:/home/sqoop/lib/sqljdbc4.jar:/home/sqoop/lib/sqoop-sqlserver-1.0.jar:/home/sqoop/sqoop-1.4.3.jar:/home/sqoop/sqoop-test-1.4.3.jar::/home/hadoop/hadoop-core-1.0.4.jar:/home/sqoop/sqoop-1.4.3.jar
> > > > Note:
> > > >
> > >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.java
> > > > uses or overrides a deprecated API.
> > > > Note: Recompile with -Xlint:deprecation for details.
> > > > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Could not rename
> > > >
> > >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.java
> > > > to /home/cyrille/workspace/AutomaticClusterer/./kmeansclusterIds.java
> > > > org.apache.commons.io.FileExistsException: Destination
> > > > '/home/cyrille/workspace/AutomaticClusterer/./kmeansclusterIds.java'
> > > > already exists
> > > >     at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2099)
> > > >     at
> > > >
> > >
> org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:228)
> > > >     at
> org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
> > > >     at
> org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
> > > >     at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
> > > >     at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
> > > >     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > >     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
> > > >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
> > > >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
> > > >     at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> > > > 13/04/26 15:17:54 INFO orm.CompilationManager: Writing jar file:
> > > >
> > >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.jar
> > > > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Scanning for .class
> files
> > > > in directory:
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2
> > > > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Got classfile:
> > > >
> > >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.class
> > > > -> kmeansclusterIds.class
> > > > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Finished writing jar
> file
> > > >
> > >
> /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.jar
> > > > 13/04/26 15:17:54 INFO mapreduce.ExportJobBase: Beginning export of
> > > > kmeansclusterIds
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Using InputFormat: class
> > > > org.apache.sqoop.mapreduce.ExportInputFormat
> > > > 13/04/26 15:17:55 DEBUG manager.SqlManager: Using fetchSize for next
> > > query:
> > > > 1000
> > > > 13/04/26 15:17:55 INFO manager.SqlManager: Executing SQL statement:
> > > SELECT
> > > > TOP 1 * FROM [kmeansclusterIds]
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/sqoop-1.4.3.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/sqljdbc4.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/sqoop-1.4.3.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/ant-contrib-1.0b3.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/avro-ipc-1.5.3.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/commons-io-1.4.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/paranamer-2.3.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/jopt-simple-3.2.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/avro-1.5.3.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/hsqldb-1.8.0.10.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/avro-mapred-1.5.3.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/sqljdbc4.jar
> > > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath:
> > > > file:/home/sqoop/lib/snappy-java-1.0.3.2.jar
> > > > 13/04/26 15:17:56 INFO input.FileInputFormat: Total input paths to
> > > process
> > > > : 1
> > > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Target
> numMapTasks=4
> > > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Total input
> > > bytes=11388
> > > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat:
> maxSplitSize=2847
> > > > 13/04/26 15:17:56 INFO input.FileInputFormat: Total input paths to
> > > process
> > > > : 1
> > > > 13/04/26 15:17:56 INFO util.NativeCodeLoader: Loaded the
> native-hadoop
> > > > library
> > > > 13/04/26 15:17:56 WARN snappy.LoadSnappy: Snappy native library not
> > > loaded
> > > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Generated
> splits:
> > > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat:
> > > > Paths:/user/cyrille/drivers/output.txt:0+2847 Locations:Agnik-17:;
> > > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat:
> > > > Paths:/user/cyrille/drivers/output.txt:2847+2847 Locations:Agnik-17:;
> > > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat:
> > > > Paths:/user/cyrille/drivers/output.txt:5694+2847 Locations:Agnik-17:;
> > > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat:
> > > > Paths:/user/cyrille/drivers/output.txt:8541+2847 Locations:Agnik-17:;
> > > > 13/04/26 15:17:56 INFO mapred.JobClient: Running job:
> > > job_201304111243_0289
> > > > 13/04/26 15:17:57 INFO mapred.JobClient:  map 0% reduce 0%
> > > > 13/04/26 15:18:09 INFO mapred.JobClient: Task Id :
> > > > attempt_201304111243_0289_m_000000_0, Status : FAILED
> > > > java.io.IOException: Can't export data, please check task tracker
> logs
> > > >     at
> > > >
> > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> > > >     at
> > > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> > > >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > >     at
> > > >
> > >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> > > >     at
> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > >     at java.security.AccessController.doPrivileged(Native Method)
> > > >     at javax.security.auth.Subject.doAs(Subject.java:416)
> > > >     at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > Caused by: java.util.NoSuchElementException
> > > >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> > > >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> > > >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> > > >     at
> > > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> > > >     ... 10 more
> > > >
> > > > 13/04/26 15:18:11 INFO mapred.JobClient: Task Id :
> > > > attempt_201304111243_0289_m_000001_0, Status : FAILED
> > > > java.io.IOException: Can't export data, please check task tracker
> logs
> > > >     at
> > > >
> > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> > > >     at
> > > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> > > >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > >     at
> > > >
> > >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> > > >     at
> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > >     at java.security.AccessController.doPrivileged(Native Method)
> > > >     at javax.security.auth.Subject.doAs(Subject.java:416)
> > > >     at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > Caused by: java.util.NoSuchElementException
> > > >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> > > >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> > > >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> > > >     at
> > > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> > > >     ... 10 more
> > > >
> > > > 13/04/26 15:18:14 INFO mapred.JobClient: Task Id :
> > > > attempt_201304111243_0289_m_000000_1, Status : FAILED
> > > > java.io.IOException: Can't export data, please check task tracker
> logs
> > > >     at
> > > >
> > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> > > >     at
> > > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> > > >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > >     at
> > > >
> > >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> > > >     at
> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > >     at java.security.AccessController.doPrivileged(Native Method)
> > > >     at javax.security.auth.Subject.doAs(Subject.java:416)
> > > >     at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > Caused by: java.util.NoSuchElementException
> > > >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> > > >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> > > >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> > > >     at
> > > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> > > >     ... 10 more
> > > >
> > > > 13/04/26 15:18:17 INFO mapred.JobClient: Task Id :
> > > > attempt_201304111243_0289_m_000001_1, Status : FAILED
> > > > java.io.IOException: Can't export data, please check task tracker
> logs
> > > >     at
> > > >
> > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> > > >     at
> > > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> > > >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > > >     at
> > > >
> > >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> > > >     at
> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > > >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > > >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > > >     at java.security.AccessController.doPrivileged(Native Method)
> > > >     at javax.security.auth.Subject.doAs(Subject.java:416)
> > > >     at
> > > >
> > >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > > >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > > Caused by: java.util.NoSuchElementException
> > > >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> > > >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> > > >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> > > >     at
> > > >
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> > > >     ... 10 more
> > > >
> > > >
> > > >
> > > > On Fri, Apr 26, 2013 at 3:05 PM, Jarek Jarcec Cecho <
> jarcec@apache.org
> > > >wrote:
> > > >
> > > > > Hi Cyril,
> > > > > would you mind running your Sqoop command with argument --verbose?
> It
> > > > > should print out much more information. Please gather the generated
> > > class
> > > > > after Sqoop will end and sent it to the mailing list as well.
> > > > >
> > > > > Jarcec
> > > > >
> > > > > On Fri, Apr 26, 2013 at 01:52:29PM -0400, Cyril Bogus wrote:
> > > > > > Here is the sqoop log
> > > > > >
> > > > > > 13/04/26 13:36:46 WARN tool.SqoopTool: $SQOOP_CONF_DIR has not
> been
> > > set
> > > > > in
> > > > > > the environment. Cannot check for additional configuration.
> > > > > > 13/04/26 13:36:46 WARN sqoop.ConnFactory: $SQOOP_CONF_DIR has not
> > > been
> > > > > set
> > > > > > in the environment. Cannot check for additional configuration.
> > > > > > 13/04/26 13:36:46 INFO manager.SqlManager: Using default
> fetchSize of
> > > > > 1000
> > > > > > 13/04/26 13:36:46 INFO tool.CodeGenTool: Beginning code
> generation
> > > > > > 13/04/26 13:36:46 INFO manager.SqlManager: Executing SQL
> statement:
> > > > > SELECT
> > > > > > t.* FROM [kmeansclusterIds] AS t WHERE 1=0
> > > > > > 13/04/26 13:36:46 INFO orm.CompilationManager: $HADOOP_HOME is
> not
> > > set
> > > > > >
> > > > >
> > >
> /tmp/sqoop-cyrille/compile/2a3ab7b9299edcac783039a7addc9666/kmeansclusterIds.java:73:
> > > > > > cannot find symbol
> > > > > > symbol  : variable driver_license
> > > > > > location: class kmeansclusterIds
> > > > > >     JdbcWritableBridge.writeString(driver_license, 2 + __off, 12,
> > > > > __dbStmt);
> > > > > >                                    ^
> > > > > > Note:
> > > > > >
> > > > >
> > >
> /tmp/sqoop-cyrille/compile/2a3ab7b9299edcac783039a7addc9666/kmeansclusterIds.java
> > > > > > uses or overrides a deprecated API.
> > > > > > Note: Recompile with -Xlint:deprecation for details.
> > > > > > 1 error
> > > > > > 13/04/26 13:36:47 ERROR tool.ExportTool: Encountered IOException
> > > running
> > > > > > export job: java.io.IOException: Error returned by javac
> > > > > >
> > > > > > the kmeansClusterId class is set in a temp file so I cannot show
> > > what the
> > > > > > class do beside the point of error as stated above.
> > > > > >
> > > > > > Thank you for your reply Jarek
> > > > > >
> > > > > >
> > > > > > On Fri, Apr 26, 2013 at 1:38 PM, Jarek Jarcec Cecho <
> > > jarcec@apache.org
> > > > > >wrote:
> > > > > >
> > > > > > > Hi Cyril,
> > > > > > > would you mind sharing entire Sqoop log and the generated java
> > > class?
> > > > > > >
> > > > > > > Jarcec
> > > > > > >
> > > > > > > On Fri, Apr 26, 2013 at 12:53:49PM -0400, Cyril Bogus wrote:
> > > > > > > > UPDATE!!!
> > > > > > > >
> > > > > > > > Now I get the following error
> > > > > > > >
> > > > > > > >
> > > > > > >
> > > > >
> > >
> /tmp/sqoop-cyril/compile/b156fd4f270274b11320d007472bbfe7/kmeansclusterIds.java:73:
> > > > > > > > cannot find symbol
> > > > > > > > symbol  : variable driver_license
> > > > > > > > location: class kmeansclusterIds
> > > > > > > >     JdbcWritableBridge.writeString(driver_license, 2 +
> __off, 12,
> > > > > > > __dbStmt);
> > > > > > > >                                    ^
> > > > > > > > Note:
> > > > > > > >
> > > > > > >
> > > > >
> > >
> /tmp/sqoop-cyrille/compile/b156fd4f270274b11320d007472bbfe7/kmeansclusterIds.java
> > > > > > > > uses or overrides a deprecated API.
> > > > > > > > Note: Recompile with -Xlint:deprecation for details.
> > > > > > > > 1 error
> > > > > > > > 13/04/26 12:52:26 ERROR tool.ExportTool: Encountered
> IOException
> > > > > running
> > > > > > > > export job: java.io.IOException: Error returned by javac
> > > > > > > >
> > > > > > > >
> > > > > > > >
> > > > > > > >
> > > > > > > > On Fri, Apr 26, 2013 at 12:52 PM, Cyril Bogus <
> > > cyrilbogus@gmail.com>
> > > > > > > wrote:
> > > > > > > >
> > > > > > > > > Hi everyone,
> > > > > > > > >
> > > > > > > > > I am trying to do an export from HDFS to MSSQL using Sqoop
> > > > > > > > >
> > > > > > > > > my data is in the following format
> > > > > > > > >
> > > > > > > > > JTDKN3DU0B0261494,345
> > > > > > > > > JTEBU14R840022700,340
> > > > > > > > > JTEEP21A770208029,314
> > > > > > > > > JTHBF5C24A5125359,348
> > > > > > > > > jthbk1eg6a2395028,341
> > > > > > > > > JTMBD31V565007305,355
> > > > > > > > > KL1PM5C5XAK700838,352
> > > > > > > > > KMHCG45C41U225885,352
> > > > > > > > > KMHDC86EX9U037746,304
> > > > > > > > > NM0LS6BN8CT123712,354
> > > > > > > > >
> > > > > > > > > my export statement is the following
> > > > > > > > >
> > > > > > > > > export
> > > > > > > > > --connect
> > > > > > > > >
> > > > >
> 'jdbc:sqlserver://server:port;username=sa;password=pass;database=db'
> > > > > > > > > --table
> > > > > > > > > kmeansclusterIds
> > > > > > > > > --update-key
> > > > > > > > > driver_license
> > > > > > > > > --columns
> > > > > > > > > clusterId
> > > > > > > > > --update-mode
> > > > > > > > > allowinsert
> > > > > > > > > --export-dir
> > > > > > > > > drivers/output.txt
> > > > > > > > > --fields-terminated-by
> > > > > > > > > ','
> > > > > > > > > --lines-terminated-by
> > > > > > > > > \n
> > > > > > > > >
> > > > > > > > > I created a table named kmeansclusterIds on the server.
> > > > > > > > > I get the following error:
> > > > > > > > >
> > > > > > > > > Exception in thread "main" java.lang.NoSuchMethodError:
> > > > > > > > >
> > > > > > >
> > > > >
> > >
> com.cloudera.sqoop.manager.ExportJobContext.setConnManager(Lcom/cloudera/sqoop/manager/ConnManager;)V
> > > > > > > > >     at
> > > > > > > > >
> > > > > > >
> > > > >
> > >
> com.microsoft.sqoop.SqlServer.MSSQLServerManager.exportTable(MSSQLServerManager.java:151)
> > > > > > > > >     at
> > > > > org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:78)
> > > > > > > > >     at
> org.apache.sqoop.tool.ExportTool.run(ExportTool.java:97)
> > > > > > > > >     at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
> > > > > > > > >     at
> > > org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > > > > > > >     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
> > > > > > > > >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
> > > > > > > > >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
> > > > > > > > >     at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> > > > > > > > >     at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> > > > > > > > >
> > > > > > > > > Any insight in what the real issue might be?
> > > > > > > > >
> > > > > > > > > Thank you in advance for a reply.
> > > > > > > > >
> > > > > > >
> > > > >
> > >
> > >
> > >
>

Re: Export

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Cyril,
I think the exception is a bit misleading; you should check the map task logs, not the task tracker logs. Would you mind sharing one failed map task log with us?

Jarcec
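
For anyone landing on this thread from the same stack trace: the NoSuchElementException is thrown inside the generated __loadFromFields method, which consumes one field per column from an iterator over the split record. Any record with fewer delimited fields than expected columns (a blank line, a trailing newline, or data split with the wrong --fields-terminated-by delimiter) exhausts the iterator. The sketch below is a simplified stand-in, not Sqoop's actual generated code; the field names mirror the kmeansclusterIds class from the log:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;

public class LoadFromFieldsSketch {
    static String driverLicense;
    static Integer clusterId;

    // Simplified stand-in for the generated __loadFromFields(List<String>):
    // one it.next() call per expected column.
    static void loadFromFields(List<String> fields) {
        Iterator<String> it = fields.iterator();
        driverLicense = it.next();               // column 1: driver_license
        clusterId = Integer.valueOf(it.next());  // column 2: clusterId
    }

    public static void main(String[] args) {
        // A well-formed record from the data sample parses fine:
        loadFromFields(Arrays.asList("JTDKN3DU0B0261494,345".split(",")));
        System.out.println(driverLicense + " -> " + clusterId);

        // A blank line splits into a single empty field, so the second
        // it.next() throws -- the same exception seen in the map tasks:
        try {
            loadFromFields(Arrays.asList("".split(",")));
        } catch (NoSuchElementException e) {
            System.out.println("NoSuchElementException, as in the failed map task");
        }
    }
}
```

So the first things worth checking are blank or truncated lines in drivers/output.txt and whether the file really is comma-delimited throughout.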

On Thu, May 02, 2013 at 12:43:40PM -0400, Cyril Bogus wrote:
> Yup, the map task logs did not show any error; it is as if the error was
> suggested but never actually written to the log file.
> 
> 
> On Sun, Apr 28, 2013 at 11:03 AM, Jarek Jarcec Cecho <ja...@apache.org>wrote:
> 
> > Hi Cyril,
> > did you by any chance check the map task logs, as the exception is
> > suggesting?
> >
> > > java.io.IOException: Can't export data, please check task tracker logs
> >
> > Jarcec
> >
> > On Fri, Apr 26, 2013 at 03:21:21PM -0400, Cyril Bogus wrote:
> > > Hi Jarek, thank you so much for the replies.
> > >
> > > I made some changes to my export queries. It seems that it did not like
> > > the fact that I had set up both update-key and columns.
> > > Once I made those changes the export actually started, but now I am
> > > having a MapReduce issue.
> > >
> > > Here is the output I am getting with --verbose on. And attached is the
> > > class generated with the export command.
> > >
> > > Warning: /home/hbase does not exist! HBase imports will fail.
> > > Please set $HBASE_HOME to the root of your HBase installation.
> > > Warning: $HADOOP_HOME is deprecated.
> > >
> > > 13/04/26 15:17:52 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > > 13/04/26 15:17:52 DEBUG util.ClassLoaderStack: Checking for existing
> > class:
> > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > 13/04/26 15:17:52 DEBUG util.ClassLoaderStack: Class is already
> > available.
> > > Skipping jar /home/sqoop/lib/sqljdbc4.jar
> > > 13/04/26 15:17:52 DEBUG sqoop.ConnFactory: Added factory
> > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory in jar
> > > /home/sqoop/lib/sqljdbc4.jar specified by
> > > /home/sqoop/conf/managers.d/mssqoop-sqlserver
> > > 13/04/26 15:17:52 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > 13/04/26 15:17:52 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > > com.cloudera.sqoop.manager.DefaultManagerFactory
> > > 13/04/26 15:17:52 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > > com.microsoft.sqoop.SqlServer.MSSQLServerManagerFactory
> > > 13/04/26 15:17:52 INFO SqlServer.MSSQLServerManagerFactory: Using
> > > Microsoft's SQL Server - Hadoop Connector
> > > 13/04/26 15:17:52 INFO manager.SqlManager: Using default fetchSize of
> > 1000
> > > 13/04/26 15:17:52 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > > com.microsoft.sqoop.SqlServer.MSSQLServerManager@4cb9e45a
> > > 13/04/26 15:17:52 INFO tool.CodeGenTool: Beginning code generation
> > > 13/04/26 15:17:53 DEBUG manager.SqlManager: No connection paramenters
> > > specified. Using regular API for making connection.
> > > 13/04/26 15:17:53 DEBUG manager.SqlManager: Using fetchSize for next
> > query:
> > > 1000
> > > 13/04/26 15:17:53 INFO manager.SqlManager: Executing SQL statement:
> > SELECT
> > > TOP 1 * FROM [kmeansclusterIds]
> > > 13/04/26 15:17:53 DEBUG manager.SqlManager: Using fetchSize for next
> > query:
> > > 1000
> > > 13/04/26 15:17:53 INFO manager.SqlManager: Executing SQL statement:
> > SELECT
> > > TOP 1 * FROM [kmeansclusterIds]
> > > 13/04/26 15:17:53 DEBUG orm.ClassWriter: selected columns:
> > > 13/04/26 15:17:53 DEBUG orm.ClassWriter:   driver_license
> > > 13/04/26 15:17:53 DEBUG orm.ClassWriter:   clusterId
> > > 13/04/26 15:17:53 DEBUG orm.ClassWriter: Writing source file:
> > >
> > /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.java
> > > 13/04/26 15:17:53 DEBUG orm.ClassWriter: Table name: kmeansclusterIds
> > > 13/04/26 15:17:53 DEBUG orm.ClassWriter: Columns: driver_license:12,
> > > clusterId:4,
> > > 13/04/26 15:17:53 DEBUG orm.ClassWriter: sourceFilename is
> > > kmeansclusterIds.java
> > > 13/04/26 15:17:53 DEBUG orm.CompilationManager: Found existing
> > > /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/
> > > 13/04/26 15:17:53 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
> > > /home/hadoop/mapred
> > > 13/04/26 15:17:53 DEBUG orm.CompilationManager: Adding source file:
> > >
> > /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.java
> > > 13/04/26 15:17:53 DEBUG orm.CompilationManager: Invoking javac with args:
> > > 13/04/26 15:17:53 DEBUG orm.CompilationManager:   -sourcepath
> > > 13/04/26 15:17:53 DEBUG orm.CompilationManager:
> > > /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/
> > > 13/04/26 15:17:53 DEBUG orm.CompilationManager:   -d
> > > 13/04/26 15:17:53 DEBUG orm.CompilationManager:
> > > /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/
> > > 13/04/26 15:17:53 DEBUG orm.CompilationManager:   -classpath
> > > 13/04/26 15:17:53 DEBUG orm.CompilationManager:
> > >
> > /home/hadoop/conf:/usr/lib/tools.jar:/home/hadoop:/home/hadoop/hadoop-core-1.0.4.jar:/home/hadoop/lib/antlr-2.7.7.jar:/home/hadoop/lib/antlr-3.2.jar:/home/hadoop/lib/antlr-runtime-3.2.jar:/home/hadoop/lib/asm-3.2.jar:/home/hadoop/lib/aspectjrt-1.6.5.jar:/home/hadoop/lib/aspectjtools-1.6.5.jar:/home/hadoop/lib/avro-1.4.0-cassandra-1.jar:/home/hadoop/lib/bson-2.5.jar:/home/hadoop/lib/cassandra-all-0.8.1.jar:/home/hadoop/lib/cassandra-thrift-0.8.1.jar:/home/hadoop/lib/cglib-nodep-2.2.jar:/home/hadoop/lib/commons-beanutils-1.7.0.jar:/home/hadoop/lib/commons-beanutils-core-1.8.0.jar:/home/hadoop/lib/commons-cli-1.2.jar:/home/hadoop/lib/commons-cli-2.0-mahout.jar:/home/hadoop/lib/commons-codec-1.4.jar:/home/hadoop/lib/commons-collections-3.2.1.jar:/home/hadoop/lib/commons-compress-1.2.jar:/home/hadoop/lib/commons-configuration-1.6.jar:/home/hadoop/lib/commons-daemon-1.0.1.jar:/home/hadoop/lib/commons-dbcp-1.4.jar:/home/hadoop/lib/commons-digester-1.8.jar:/home/hadoop/lib/commons-el-1.0.jar:/home/hadoop/lib/commons-httpclient-3.0.1.jar:/home/hadoop/lib/commons-io-2.0.1.jar:/home/hadoop/lib/commons-io-2.1.jar:/home/hadoop/lib/commons-lang-2.4.jar:/home/hadoop/lib/commons-lang-2.6.jar:/home/hadoop/lib/commons-logging-1.1.1.jar:/home/hadoop/lib/commons-logging-api-1.0.4.jar:/home/hadoop/lib/commons-math-2.1.jar:/home/hadoop/lib/commons-math-2.2.jar:/home/hadoop/lib/commons-net-1.4.1.jar:/home/hadoop/lib/commons-pool-1.5.6.jar:/home/hadoop/lib/concurrentlinkedhashmap-lru-1.1.jar:/home/hadoop/lib/core-3.1.1.jar:/home/hadoop/lib/easymock-3.0.jar:/home/hadoop/lib/guava-r09.jar:/home/hadoop/lib/hadoop-capacity-scheduler-1.0.4.jar:/home/hadoop/lib/hadoop-fairscheduler-1.0.4.jar:/home/hadoop/lib/hadoop-thriftfs-1.0.4.jar:/home/hadoop/lib/hector-core-0.8.0-2.jar:/home/hadoop/lib/high-scale-lib-1.1.2.jar:/home/hadoop/lib/hsqldb-1.8.0.10.jar:/home/hadoop/lib/httpclient-4.0.1.jar:/home/hadoop/lib/httpcore-4.0.1.jar:/home/hadoop/lib/icu4j-4.8.1.1.jar:/home/hadoop/lib/jackson-core-asl
-1.8.2.jar:/home/hadoop/lib/jackson-core-asl-1.8.8.jar:/home/hadoop/lib/jackson-mapper-asl-1.8.2.jar:/home/hadoop/lib/jackson-mapper-asl-1.8.8.jar:/home/hadoop/lib/jakarta-regexp-1.4.jar:/home/hadoop/lib/jamm-0.2.2.jar:/home/hadoop/lib/jasper-compiler-5.5.12.jar:/home/hadoop/lib/jasper-runtime-5.5.12.jar:/home/hadoop/lib/jcommon-1.0.12.jar:/home/hadoop/lib/jdeb-0.8.jar:/home/hadoop/lib/jersey-core-1.8.jar:/home/hadoop/lib/jersey-json-1.8.jar:/home/hadoop/lib/jersey-server-1.8.jar:/home/hadoop/lib/jets3t-0.6.1.jar:/home/hadoop/lib/jetty-6.1.22.jar:/home/hadoop/lib/jetty-6.1.26.jar:/home/hadoop/lib/jetty-util-6.1.22.jar:/home/hadoop/lib/jetty-util-6.1.26.jar:/home/hadoop/lib/jline-0.9.94.jar:/home/hadoop/lib/jsch-0.1.42.jar:/home/hadoop/lib/json-simple-1.1.jar:/home/hadoop/lib/jul-to-slf4j-1.6.1.jar:/home/hadoop/lib/junit-4.5.jar:/home/hadoop/lib/junit-4.8.2.jar:/home/hadoop/lib/kfs-0.2.2.jar:/home/hadoop/lib/libthrift-0.6.1.jar:/home/hadoop/lib/log4j-1.2.15.jar:/home/hadoop/lib/log4j-1.2.16.jar:/home/hadoop/lib/lucene-analyzers-3.6.0.jar:/home/hadoop/lib/lucene-benchmark-3.6.0.jar:/home/hadoop/lib/lucene-core-3.6.0.jar:/home/hadoop/lib/lucene-facet-3.6.0.jar:/home/hadoop/lib/lucene-highlighter-3.6.0.jar:/home/hadoop/lib/lucene-memory-3.6.0.jar:/home/hadoop/lib/lucene-queries-3.6.0.jar:/home/hadoop/lib/mahout-core-0.7.jar:/home/hadoop/lib/mahout-core-0.7-job.jar:/home/hadoop/lib/mahout-integration-0.7.jar:/home/hadoop/lib/mahout-math-0.7.jar:/home/hadoop/lib/mockito-all-1.8.5.jar:/home/hadoop/lib/mongo-java-driver-2.5.jar:/home/hadoop/lib/objenesis-1.2.jar:/home/hadoop/lib/oro-2.0.8.jar:/home/hadoop/lib/servlet-api-2.5-20081211.jar:/home/hadoop/lib/servlet-api-2.5.jar:/home/hadoop/lib/slf4j-api-1.6.1.jar:/home/hadoop/lib/slf4j-log4j12-1.6.1.jar:/home/hadoop/lib/snakeyaml-1.6.jar:/home/hadoop/lib/solr-commons-csv-3.5.0.jar:/home/hadoop/lib/speed4j-0.9.jar:/home/hadoop/lib/stringtemplate-3.2.jar:/home/hadoop/lib/uncommons-maths-1.2.2.jar:/home/hadoop/lib/uuid-3.2.0.jar:
/home/hadoop/lib/xercesImpl-2.9.1.jar:/home/hadoop/lib/xml-apis-1.3.04.jar:/home/hadoop/lib/xmlenc-0.52.jar:/home/hadoop/lib/xpp3_min-1.1.4c.jar:/home/hadoop/lib/xstream-1.3.1.jar:/home/hadoop/lib/jsp-2.1/jsp-2.1.jar:/home/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/home/sqoop/conf::/home/sqoop/lib/ant-contrib-1.0b3.jar:/home/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/home/sqoop/lib/avro-1.5.3.jar:/home/sqoop/lib/avro-ipc-1.5.3.jar:/home/sqoop/lib/avro-mapred-1.5.3.jar:/home/sqoop/lib/commons-io-1.4.jar:/home/sqoop/lib/hsqldb-1.8.0.10.jar:/home/sqoop/lib/jackson-core-asl-1.7.3.jar:/home/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/home/sqoop/lib/jopt-simple-3.2.jar:/home/sqoop/lib/paranamer-2.3.jar:/home/sqoop/lib/snappy-java-1.0.3.2.jar:/home/sqoop/lib/sqljdbc4.jar:/home/sqoop/lib/sqoop-sqlserver-1.0.jar:/home/sqoop/sqoop-1.4.3.jar:/home/sqoop/sqoop-test-1.4.3.jar::/home/hadoop/hadoop-core-1.0.4.jar:/home/sqoop/sqoop-1.4.3.jar
> > > Note: /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.java uses or overrides a deprecated API.
> > > Note: Recompile with -Xlint:deprecation for details.
> > > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Could not rename /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.java to /home/cyrille/workspace/AutomaticClusterer/./kmeansclusterIds.java
> > > org.apache.commons.io.FileExistsException: Destination '/home/cyrille/workspace/AutomaticClusterer/./kmeansclusterIds.java' already exists
> > >     at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2099)
> > >     at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:228)
> > >     at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
> > >     at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
> > >     at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
> > >     at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
> > >     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > >     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
> > >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
> > >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
> > >     at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> > > 13/04/26 15:17:54 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.jar
> > > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2
> > > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.class -> kmeansclusterIds.class
> > > 13/04/26 15:17:54 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-cyrille/compile/31f5d725de6c671fa550489f492e31a2/kmeansclusterIds.jar
> > > 13/04/26 15:17:54 INFO mapreduce.ExportJobBase: Beginning export of kmeansclusterIds
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Using InputFormat: class org.apache.sqoop.mapreduce.ExportInputFormat
> > > 13/04/26 15:17:55 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> > > 13/04/26 15:17:55 INFO manager.SqlManager: Executing SQL statement: SELECT TOP 1 * FROM [kmeansclusterIds]
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/sqoop-1.4.3.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/sqljdbc4.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/sqoop-1.4.3.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/ant-contrib-1.0b3.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/avro-ipc-1.5.3.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/jackson-core-asl-1.7.3.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/commons-io-1.4.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/sqoop-sqlserver-1.0.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/paranamer-2.3.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/jopt-simple-3.2.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/avro-1.5.3.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/hsqldb-1.8.0.10.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/avro-mapred-1.5.3.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/sqljdbc4.jar
> > > 13/04/26 15:17:55 DEBUG mapreduce.JobBase: Adding to job classpath: file:/home/sqoop/lib/snappy-java-1.0.3.2.jar
> > > 13/04/26 15:17:56 INFO input.FileInputFormat: Total input paths to process : 1
> > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=4
> > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Total input bytes=11388
> > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: maxSplitSize=2847
> > > 13/04/26 15:17:56 INFO input.FileInputFormat: Total input paths to process : 1
> > > 13/04/26 15:17:56 INFO util.NativeCodeLoader: Loaded the native-hadoop library
> > > 13/04/26 15:17:56 WARN snappy.LoadSnappy: Snappy native library not loaded
> > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Paths:/user/cyrille/drivers/output.txt:0+2847 Locations:Agnik-17:;
> > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Paths:/user/cyrille/drivers/output.txt:2847+2847 Locations:Agnik-17:;
> > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Paths:/user/cyrille/drivers/output.txt:5694+2847 Locations:Agnik-17:;
> > > 13/04/26 15:17:56 DEBUG mapreduce.ExportInputFormat: Paths:/user/cyrille/drivers/output.txt:8541+2847 Locations:Agnik-17:;
> > > 13/04/26 15:17:56 INFO mapred.JobClient: Running job: job_201304111243_0289
> > > 13/04/26 15:17:57 INFO mapred.JobClient:  map 0% reduce 0%
> > > 13/04/26 15:18:09 INFO mapred.JobClient: Task Id : attempt_201304111243_0289_m_000000_0, Status : FAILED
> > > java.io.IOException: Can't export data, please check task tracker logs
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> > >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > >     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> > >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > >     at java.security.AccessController.doPrivileged(Native Method)
> > >     at javax.security.auth.Subject.doAs(Subject.java:416)
> > >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > Caused by: java.util.NoSuchElementException
> > >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> > >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> > >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> > >     ... 10 more
> > >
> > > 13/04/26 15:18:11 INFO mapred.JobClient: Task Id : attempt_201304111243_0289_m_000001_0, Status : FAILED
> > > java.io.IOException: Can't export data, please check task tracker logs
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> > >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > >     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> > >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > >     at java.security.AccessController.doPrivileged(Native Method)
> > >     at javax.security.auth.Subject.doAs(Subject.java:416)
> > >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > Caused by: java.util.NoSuchElementException
> > >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> > >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> > >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> > >     ... 10 more
> > >
> > > 13/04/26 15:18:14 INFO mapred.JobClient: Task Id : attempt_201304111243_0289_m_000000_1, Status : FAILED
> > > java.io.IOException: Can't export data, please check task tracker logs
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> > >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > >     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> > >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > >     at java.security.AccessController.doPrivileged(Native Method)
> > >     at javax.security.auth.Subject.doAs(Subject.java:416)
> > >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > Caused by: java.util.NoSuchElementException
> > >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> > >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> > >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> > >     ... 10 more
> > >
> > > 13/04/26 15:18:17 INFO mapred.JobClient: Task Id : attempt_201304111243_0289_m_000001_1, Status : FAILED
> > > java.io.IOException: Can't export data, please check task tracker logs
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> > >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > >     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> > >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >     at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> > >     at java.security.AccessController.doPrivileged(Native Method)
> > >     at javax.security.auth.Subject.doAs(Subject.java:416)
> > >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> > >     at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > > Caused by: java.util.NoSuchElementException
> > >     at java.util.ArrayList$Itr.next(ArrayList.java:757)
> > >     at kmeansclusterIds.__loadFromFields(kmeansclusterIds.java:198)
> > >     at kmeansclusterIds.parse(kmeansclusterIds.java:147)
> > >     at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> > >     ... 10 more
> > >
> > >
> > >
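[Editor's note] The repeated `Caused by: java.util.NoSuchElementException` inside the generated `__loadFromFields` usually means a record in the export file yields fewer fields than the generated class expects, e.g. a field-delimiter mismatch, or a target table with more columns than the two-column input shown later in this thread. A minimal, hypothetical sketch of that failure mode (this is not Sqoop's actual generated code; the class and method names here are illustrative only):

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.NoSuchElementException;

public class LoadFromFieldsSketch {
    // Mimics the shape of the generated __loadFromFields(): it calls
    // it.next() once per table column, so a line with too few fields
    // exhausts the iterator and throws NoSuchElementException.
    static void loadFromFields(String line, int tableColumns) {
        Iterator<String> it = Arrays.asList(line.split(",")).iterator();
        for (int i = 0; i < tableColumns; i++) {
            it.next(); // throws when the line has fewer fields than columns
        }
    }

    public static void main(String[] args) {
        // Two fields parsed against two columns: succeeds.
        loadFromFields("JTDKN3DU0B0261494,345", 2);
        try {
            // Two fields parsed against three columns: fails like the log.
            loadFromFields("JTDKN3DU0B0261494,345", 3);
        } catch (NoSuchElementException e) {
            System.out.println("Caused by: java.util.NoSuchElementException");
        }
    }
}
```

Comparing the number of fields per line in `drivers/output.txt` against the column list Sqoop generates for the table is one way to confirm this locally.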
> > > On Fri, Apr 26, 2013 at 3:05 PM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
> > >
> > > > Hi Cyril,
> > > > would you mind running your Sqoop command with the --verbose argument? It
> > > > should print out much more information. Please also gather the generated
> > > > class after Sqoop finishes and send it to the mailing list.
> > > >
> > > > Jarcec
> > > >
> > > > On Fri, Apr 26, 2013 at 01:52:29PM -0400, Cyril Bogus wrote:
> > > > > Here is the sqoop log
> > > > >
> > > > > 13/04/26 13:36:46 WARN tool.SqoopTool: $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
> > > > > 13/04/26 13:36:46 WARN sqoop.ConnFactory: $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
> > > > > 13/04/26 13:36:46 INFO manager.SqlManager: Using default fetchSize of 1000
> > > > > 13/04/26 13:36:46 INFO tool.CodeGenTool: Beginning code generation
> > > > > 13/04/26 13:36:46 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [kmeansclusterIds] AS t WHERE 1=0
> > > > > 13/04/26 13:36:46 INFO orm.CompilationManager: $HADOOP_HOME is not set
> > > > > /tmp/sqoop-cyrille/compile/2a3ab7b9299edcac783039a7addc9666/kmeansclusterIds.java:73: cannot find symbol
> > > > > symbol  : variable driver_license
> > > > > location: class kmeansclusterIds
> > > > >     JdbcWritableBridge.writeString(driver_license, 2 + __off, 12, __dbStmt);
> > > > >                                    ^
> > > > > Note: /tmp/sqoop-cyrille/compile/2a3ab7b9299edcac783039a7addc9666/kmeansclusterIds.java uses or overrides a deprecated API.
> > > > > Note: Recompile with -Xlint:deprecation for details.
> > > > > 1 error
> > > > > 13/04/26 13:36:47 ERROR tool.ExportTool: Encountered IOException running export job: java.io.IOException: Error returned by javac
> > > > >
> > > > > The kmeansclusterIds class is written to a temp file, so I cannot show
> > > > > what the class does beyond the point of error stated above.
> > > > >
> > > > > Thank you for your reply Jarek
> > > > >
> > > > >
> > > > > On Fri, Apr 26, 2013 at 1:38 PM, Jarek Jarcec Cecho <jarcec@apache.org> wrote:
> > > > >
> > > > > > Hi Cyril,
> > > > > > would you mind sharing entire Sqoop log and the generated java
> > class?
> > > > > >
> > > > > > Jarcec
> > > > > >
> > > > > > On Fri, Apr 26, 2013 at 12:53:49PM -0400, Cyril Bogus wrote:
> > > > > > > UPDATE!!!
> > > > > > >
> > > > > > > Now I get the following error
> > > > > > >
> > > > > > > /tmp/sqoop-cyril/compile/b156fd4f270274b11320d007472bbfe7/kmeansclusterIds.java:73: cannot find symbol
> > > > > > > symbol  : variable driver_license
> > > > > > > location: class kmeansclusterIds
> > > > > > >     JdbcWritableBridge.writeString(driver_license, 2 + __off, 12, __dbStmt);
> > > > > > >                                    ^
> > > > > > > Note: /tmp/sqoop-cyrille/compile/b156fd4f270274b11320d007472bbfe7/kmeansclusterIds.java uses or overrides a deprecated API.
> > > > > > > Note: Recompile with -Xlint:deprecation for details.
> > > > > > > 1 error
> > > > > > > 13/04/26 12:52:26 ERROR tool.ExportTool: Encountered IOException running export job: java.io.IOException: Error returned by javac
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > >
> > > > > > > On Fri, Apr 26, 2013 at 12:52 PM, Cyril Bogus <cyrilbogus@gmail.com> wrote:
> > > > > > >
> > > > > > > > Hi everyone,
> > > > > > > >
> > > > > > > > I am trying to do an export from HDFS to MSSQL using Sqoop
> > > > > > > >
> > > > > > > > my data is in the following format
> > > > > > > >
> > > > > > > > JTDKN3DU0B0261494,345
> > > > > > > > JTEBU14R840022700,340
> > > > > > > > JTEEP21A770208029,314
> > > > > > > > JTHBF5C24A5125359,348
> > > > > > > > jthbk1eg6a2395028,341
> > > > > > > > JTMBD31V565007305,355
> > > > > > > > KL1PM5C5XAK700838,352
> > > > > > > > KMHCG45C41U225885,352
> > > > > > > > KMHDC86EX9U037746,304
> > > > > > > > NM0LS6BN8CT123712,354
> > > > > > > >
> > > > > > > > my export statement is the following
> > > > > > > >
> > > > > > > > export
> > > > > > > > --connect
> > > > > > > > 'jdbc:sqlserver://server:port;username=sa;password=pass;database=db'
> > > > > > > > --table
> > > > > > > > kmeansclusterIds
> > > > > > > > --update-key
> > > > > > > > driver_license
> > > > > > > > --columns
> > > > > > > > clusterId
> > > > > > > > --update-mode
> > > > > > > > allowinsert
> > > > > > > > --export-dir
> > > > > > > > drivers/output.txt
> > > > > > > > --fields-terminated-by
> > > > > > > > ','
> > > > > > > > --lines-terminated-by
> > > > > > > > \n
> > > > > > > >
> > > > > > > > I created a table named kmeansclusterIds on the server.
> > > > > > > > I get the following error:
> > > > > > > >
> > > > > > > > Exception in thread "main" java.lang.NoSuchMethodError: com.cloudera.sqoop.manager.ExportJobContext.setConnManager(Lcom/cloudera/sqoop/manager/ConnManager;)V
> > > > > > > >     at com.microsoft.sqoop.SqlServer.MSSQLServerManager.exportTable(MSSQLServerManager.java:151)
> > > > > > > >     at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:78)
> > > > > > > >     at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:97)
> > > > > > > >     at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
> > > > > > > >     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> > > > > > > >     at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
> > > > > > > >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
> > > > > > > >     at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
> > > > > > > >     at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> > > > > > > >     at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> > > > > > > >
> > > > > > > > Any insight into what the real issue might be?
> > > > > > > >
> > > > > > > > Thank you in advance for a reply.
> > > > > > > >
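[Editor's note] Reading the thread as a whole, the javac error quoted above ("cannot find symbol: variable driver_license") is consistent with listing only clusterId under --columns: the generated kmeansclusterIds class then never declares driver_license, while --update-key driver_license still makes the generated code reference it. A hedged sketch of one possible fix, with the key column included in --columns and written as a single command rather than an options file (the server, credentials, and paths are the poster's placeholders, not verified values):

```shell
# Assumption: the update key must appear in the exported column list.
# Alternatively, dropping both --update-key and --columns (as a later
# message in the thread reports trying) yields a plain insert export.
sqoop export \
  --connect 'jdbc:sqlserver://server:port;username=sa;password=pass;database=db' \
  --table kmeansclusterIds \
  --update-key driver_license \
  --columns driver_license,clusterId \
  --update-mode allowinsert \
  --export-dir drivers/output.txt \
  --fields-terminated-by ',' \
  --lines-terminated-by '\n'
```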
> > > > > >
> > > >
> >
> >
> >