Posted to user@sqoop.apache.org by Kate Ting <ka...@cloudera.com> on 2011/09/16 17:55:32 UTC

Re: [sqoop-user] Problem using sqoop with --direct (mysqldump)

[Moving the conversation to sqoop-user@incubator.apache.org. Please
subscribe (and post questions) to the new mailing list.]

Hi Eric -

(1) Is the mysqldump utility installed on individual node machines?
(2) If so, can you pastebin your task log as well as verbose output?

Regards, Kate
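
With --direct, each map task shells out to a local copy of mysqldump rather
than reading over JDBC, so the binary has to be on the PATH of every
TaskTracker node, not just the machine where the sqoop command is launched.
A minimal sketch for checking point (1), with hypothetical hostnames, is:

    for host in node1 node2 node3; do
        ssh "$host" 'which mysqldump && mysqldump --version'
    done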

On Fri, Sep 16, 2011 at 8:04 AM, Eric <er...@gmail.com> wrote:
> Hi all,
>
> I cannot sqoop in using the --direct option; my sqoop job works fine
> until I add --direct.
>
> I am using Sqoop 1.3.0-cdh3u1
> git commit id 3a60cc809b14d538dd1eb0e90ffa9767e8d06a43
> Compiled by jenkins@ubuntu-slave01 on Mon Jul 18 08:38:49 PDT 2011
>
> Please advise,
>
> -Eric
>
>
> error message:
>
> 11/09/16 07:57:39 INFO manager.MySQLManager: Preparing to use a MySQL
> streaming resultset.
> 11/09/16 07:57:39 INFO tool.CodeGenTool: Beginning code generation
> 11/09/16 07:57:40 INFO manager.SqlManager: Executing SQL statement:
> SELECT t.* FROM `table1` AS t LIMIT 1
> 11/09/16 07:57:40 INFO manager.SqlManager: Executing SQL statement:
> SELECT t.* FROM `table1` AS t LIMIT 1
> 11/09/16 07:57:40 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/
> hadoop
> 11/09/16 07:57:40 INFO orm.CompilationManager: Found hadoop core jar
> at: /usr/lib/hadoop/hadoop-0.20.2-cdh3u1-core.jar
> 11/09/16 07:57:41 INFO orm.CompilationManager: Writing jar file: /tmp/
> sqoop-root/compile/aef5c62d2156aeae5338ee272de42d26/table1.jar
> 11/09/16 07:57:41 INFO manager.DirectMySQLManager: Beginning mysqldump
> fast path import
> 11/09/16 07:57:41 INFO mapreduce.ImportJobBase: Beginning import of
> table1
> 11/09/16 07:57:41 INFO manager.SqlManager: Executing SQL statement:
> SELECT t.* FROM `table1` AS t LIMIT 1
> 11/09/16 07:57:43 INFO mapred.JobClient: Running job:
> job_201109160744_0004
> 11/09/16 07:57:44 INFO mapred.JobClient:  map 0% reduce 0%
> 11/09/16 07:57:50 INFO mapred.JobClient: Task Id :
> attempt_201109160744_0004_m_000000_0, Status : FAILED
> java.io.IOException: mysqldump terminated with status 5
>        at
> com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
> 476)
>        at
> com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
> 49)
>        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>        at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at javax.security.auth.Subject.doAs(Subject.java:396)
>        at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:
> 1127)
>        at org.apache.hadoop.mapred.Child.main(Child.java:264)
>
> attempt_201109160744_0004_m_000000_0: Exception in thread "Thread-12"
> java.lang.IndexOutOfBoundsException
> attempt_201109160744_0004_m_000000_0:   at
> java.nio.CharBuffer.wrap(CharBuffer.java:445)
> attempt_201109160744_0004_m_000000_0:   at
> com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink
> $ReparsingStreamThread.run(MySQLDumpMapper.java:253)
> attempt_201109160744_0004_m_000000_0: log4j:WARN No appenders could be
> found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201109160744_0004_m_000000_0: log4j:WARN Please initialize the
> log4j system properly.
> 11/09/16 07:57:55 INFO mapred.JobClient: Task Id :
> attempt_201109160744_0004_m_000000_1, Status : FAILED
> java.io.IOException: mysqldump terminated with status 5
>        at
> com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
> 476)
>        at
> com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
> 49)
>        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>        at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at javax.security.auth.Subject.doAs(Subject.java:396)
>        at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:
> 1127)
>        at org.apache.hadoop.mapred.Child.main(Child.java:264)
>
> attempt_201109160744_0004_m_000000_1: Exception in thread "Thread-12"
> java.lang.IndexOutOfBoundsException
> attempt_201109160744_0004_m_000000_1:   at
> java.nio.CharBuffer.wrap(CharBuffer.java:445)
> attempt_201109160744_0004_m_000000_1:   at
> com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink
> $ReparsingStreamThread.run(MySQLDumpMapper.java:253)
> attempt_201109160744_0004_m_000000_1: log4j:WARN No appenders could be
> found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201109160744_0004_m_000000_1: log4j:WARN Please initialize the
> log4j system properly.
> 11/09/16 07:58:01 INFO mapred.JobClient: Task Id :
> attempt_201109160744_0004_m_000000_2, Status : FAILED
> java.io.IOException: mysqldump terminated with status 5
>        at
> com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
> 476)
>        at
> com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
> 49)
>        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>        at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>        at java.security.AccessController.doPrivileged(Native Method)
>        at javax.security.auth.Subject.doAs(Subject.java:396)
>        at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:
> 1127)
>        at org.apache.hadoop.mapred.Child.main(Child.java:264)
>
> attempt_201109160744_0004_m_000000_2: Exception in thread "Thread-12"
> java.lang.IndexOutOfBoundsException
> attempt_201109160744_0004_m_000000_2:   at
> java.nio.CharBuffer.wrap(CharBuffer.java:445)
> attempt_201109160744_0004_m_000000_2:   at
> com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink
> $ReparsingStreamThread.run(MySQLDumpMapper.java:253)
> attempt_201109160744_0004_m_000000_2: log4j:WARN No appenders could be
> found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201109160744_0004_m_000000_2: log4j:WARN Please initialize the
> log4j system properly.
> 11/09/16 07:58:07 INFO mapred.JobClient: Job complete:
> job_201109160744_0004
> 11/09/16 07:58:07 INFO mapred.JobClient: Counters: 6
> 11/09/16 07:58:07 INFO mapred.JobClient:   Job Counters
> 11/09/16 07:58:07 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=19165
> 11/09/16 07:58:07 INFO mapred.JobClient:     Total time spent by all
> reduces waiting after reserving slots (ms)=0
> 11/09/16 07:58:07 INFO mapred.JobClient:     Total time spent by all
> maps waiting after reserving slots (ms)=0
> 11/09/16 07:58:07 INFO mapred.JobClient:     Launched map tasks=4
> 11/09/16 07:58:07 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> 11/09/16 07:58:07 INFO mapred.JobClient:     Failed map tasks=1
> 11/09/16 07:58:07 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
> 25.1844 seconds (0 bytes/sec)
> 11/09/16 07:58:07 INFO mapreduce.ImportJobBase: Retrieved 0 records.
> 11/09/16 07:58:07 ERROR tool.ImportTool: Error during import: Import
> job failed!
>
> --
> NOTE: The mailing list sqoop-user@cloudera.org is deprecated in favor of Apache Sqoop mailing list sqoop-user@incubator.apache.org. Please subscribe to it by sending an email to incubator-sqoop-user-subscribe@apache.org.
>

Re: [sqoop-user] Re: Problem using sqoop with --direct (mysqldump)

Posted by Kathleen Ting <ka...@cloudera.com>.
Eric, Priya - FYI, https://issues.apache.org/jira/browse/SQOOP-450 will fix
this issue.

Regards, Kathleen
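
Until that fix is released, the workaround both reporters describe is to
drop --direct. For example, Eric's command from this thread without it (and
with -P in place of --password, as the warning in his verbose log suggests):

    sqoop import --connect jdbc:mysql://192.168.0.100:3307/test --verbose -m 1 \
        --username sqoop -P --table table1 \
        --hive-import --hive-overwrite --create-hive-table --hive-table table1 \
        --fields-terminated-by '\t' --lines-terminated-by '\n' --append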

On Tue, Nov 8, 2011 at 5:15 PM, Priya <pr...@ecofactor.com> wrote:

> I have the same issue using Sqoop 1.3.0-cdh3u1
> The import works; however, I see some junk characters at the end of
> each imported part, like below:
> cter_set_client */
> cter_set_results */
> tion_connection */
>
>
> neral_ci */
> ode */
> Is there a solution?
> If I exclude the --direct option, the data import is as expected.
> Thanks!
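
The fragments Priya reports look like the tails of the conditional SET
comments mysqldump wraps around a dump (for example
"/*!40101 SET character_set_client = @saved_cs_client */;"), left behind
when the --direct reparsing code fails to strip a line cleanly. One way to
confirm is to inspect the end of an imported part file; the HDFS path here
is hypothetical, so substitute the actual target directory:

    hadoop fs -cat /user/hive/warehouse/table1/part-m-00000 | tail -5
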
>
> On Sep 19, 5:15 pm, Kate Ting <k....@cloudera.com> wrote:
> > [To: sqoop-u...@incubator.apache.org; Bcc: sqoop-u...@cloudera.org]
> >
> > Eric - to rule out permission issues and mysqldump issues:
> >
> > 1. Does username 'sqoop' have permission to access table1?
> > 2. Can you ssh into one of the node machines, run mysqldump, and then
> > paste the output?
> >
> > Regards, Kate
> >
> >
> >
> >
> >
> >
> >
> > On Sat, Sep 17, 2011 at 1:05 PM, eric hernandez <er...@gmail.com>
> wrote:
> > > Kate,
> > > I am sqooping from MySQL Server version 5.0.58-enterprise-gpl (MySQL
> > > Enterprise Server).
> >
> > > Currently I am using, on all data nodes,
> > > mysqldump  Ver 10.13 Distrib 5.5.15, for Linux (x86_64)
> >
> > > But I also tried using the
> > > mysql-5.0.77-4.el5_6.6.x86_64 mysqldump from the CentOS 5.5 repo.
> >
> > > Both fail.
> >
> > > -Eric
> >
> > > On Sat, Sep 17, 2011 at 11:43 AM, Kate Ting <k....@cloudera.com> wrote:
> >
> > >> Eric, what is your mysql version and your mysqldump version?
> >
> > >> Regards, Kate
> >
> > >> On Fri, Sep 16, 2011 at 7:30 PM, eric hernandez <
> eric.hard...@gmail.com>
> > >> wrote:
> > >> > Kate,
> >
> > >> > sqoop import --connect jdbc:mysql://192.168.0.100:3307/test --verbose -m
> > >> > 1
> > >> > --username sqoop --password sanitized --hive-overwrite --direct
> --table
> > >> > table1 --hive-import --create-hive-table --hive-table table1
> > >> > --fields-terminated-by '\t' --lines-terminated-by '\n' --append
> >
> > >> > Please note this works fine if I remove the --direct option. I am also
> > >> > limiting it to 1 mapper because if not the output of the failure is
> > >> > very long.
> >
> > >> > On Fri, Sep 16, 2011 at 3:01 PM, Kate Ting <k....@cloudera.com>
> wrote:
> >
> > >> >> Eric - what is the exact Sqoop command that you ran (including, if
> > >> >> applicable, contents of the options-file)?
> >
> > >> >> Regards, Kate
> >
> > >> >> On Fri, Sep 16, 2011 at 9:27 AM, eric hernandez
> > >> >> <er...@gmail.com>
> > >> >> wrote:
> > >> >> > Yes, I have mysqldump on all nodes.
> >
> > >> >> > Verbose output
> >
> > >> >> > 11/09/16 09:22:42 DEBUG tool.BaseSqoopTool: Enabled debug
> logging.
> > >> >> > 11/09/16 09:22:42 WARN tool.BaseSqoopTool: Setting your password
> on
> > >> >> > the
> > >> >> > command-line is insecure. Consider using -P instead.
> > >> >> > 11/09/16 09:22:42 DEBUG sqoop.ConnFactory: Loaded manager
> factory:
> > >> >> > com.cloudera.sqoop.manager.DefaultManagerFactory
> > >> >> > 11/09/16 09:22:42 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > >> >> > com.cloudera.sqoop.manager.DefaultManagerFactory
> > >> >> > 11/09/16 09:22:42 DEBUG manager.DefaultManagerFactory: Trying
> with
> > >> >> > scheme:
> > >> >> > jdbc:mysql:
> > >> >> > 11/09/16 09:22:42 INFO manager.MySQLManager: Preparing to use a
> MySQL
> > >> >> > streaming resultset.
> > >> >> > 11/09/16 09:22:42 DEBUG sqoop.ConnFactory: Instantiated
> ConnManager
> > >> >> > com.cloudera.sqoop.manager.DirectMySQLManager@7ad81784
> > >> >> > 11/09/16 09:22:42 INFO tool.CodeGenTool: Beginning code
> generation
> > >> >> > 11/09/16 09:22:42 DEBUG manager.SqlManager: No connection
> paramenters
> > >> >> > specified. Using regular API for making connection.
> > >> >> > 11/09/16 09:22:43 DEBUG manager.SqlManager: Using fetchSize for
> next
> > >> >> > query:
> > >> >> > -2147483648
> > >> >> > 11/09/16 09:22:43 INFO manager.SqlManager: Executing SQL
> statement:
> > >> >> > SELECT
> > >> >> > t.* FROM `table1` AS t LIMIT 1
> > >> >> > 11/09/16 09:22:43 DEBUG manager.SqlManager: Using fetchSize for
> next
> > >> >> > query:
> > >> >> > -2147483648
> > >> >> > 11/09/16 09:22:43 INFO manager.SqlManager: Executing SQL
> statement:
> > >> >> > SELECT
> > >> >> > t.* FROM `table1` AS t LIMIT 1
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: selected columns:
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   id
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   application_id
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   event_id
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   response_id
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   target_id
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   mode
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   date_created
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: Writing source file:
> > >> >> >
> /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.java
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: Table name: table1
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: Columns: id:4,
> > >> >> > application_id:4,
> > >> >> > event_id:4, response_id:4, target_id:4, mode:1, date_created:93,
> > >> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: sourceFilename is
> > >> >> > table1.java
> > >> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager: Found existing
> > >> >> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/
> > >> >> > 11/09/16 09:22:43 INFO orm.CompilationManager: HADOOP_HOME is
> > >> >> > /usr/lib/hadoop
> > >> >> > 11/09/16 09:22:43 INFO orm.CompilationManager: Found hadoop core
> jar
> > >> >> > at:
> > >> >> > /usr/lib/hadoop/hadoop-0.20.2-cdh3u1-core.jar
> > >> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager: Adding source
> file:
> > >> >> >
> /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.java
> > >> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager: Invoking javac
> with
> > >> >> > args:
> > >> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:   -sourcepath
> > >> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:
> > >> >> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/
> > >> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:   -d
> > >> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:
> > >> >> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/
> > >> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:   -classpath
> > >> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:
> >
> > >> >> >
> /etc/hadoop/conf:/usr/java/jdk1.6.0_21/lib/tools.jar:/usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u1.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u1.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/hue-plugins-1.2.0-cdh3u1.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/etc/zookeeper::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/avro-1.5.1.jar:/usr/lib/sqoop/lib/avro-ipc-1.5.1.jar:/usr/lib/sqoop/lib/avro-mapred-1.5.1.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/lib/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/lib/sqoop/lib/jopt-simple-3.2.jar:/usr/lib/sqoop/lib/mysql-connector-java-5.1.15-bin.jar:/usr/lib/sqoop/lib/paranamer-2.3.jar:/usr/lib/sqoop/lib/snappy-java-1.0.3-rc2.jar:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar:/usr/lib/sqoop/sqoop-test-1.3.0-cdh3u1.jar::/usr/lib/hadoop/hadoop-0.20.2-cdh3u1-core.jar:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
> > >> >> > 11/09/16 09:22:44 INFO orm.CompilationManager: Writing jar file:
> > >> >> >
> /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.jar
> > >> >> > 11/09/16 09:22:44 DEBUG orm.CompilationManager: Scanning for
> .class
> > >> >> > files in
> > >> >> > directory:
> /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b
> > >> >> > 11/09/16 09:22:44 DEBUG orm.CompilationManager: Got classfile:
> > >> >> >
> /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.class
> > >> >> > ->
> > >> >> > table1.class
> > >> >> > 11/09/16 09:22:44 DEBUG orm.CompilationManager: Finished writing
> jar
> > >> >> > file
> > >> >> >
> /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.jar
> > >> >> > 11/09/16 09:22:44 DEBUG tool.ImportTool: Using temporary folder:
> > >> >> > 16092244516table1
> > >> >> > 11/09/16 09:22:44 INFO manager.DirectMySQLManager: Beginning
> > >> >> > mysqldump
> > >> >> > fast
> > >> >> > path import
> > >> >> > 11/09/16 09:22:44 INFO mapreduce.ImportJobBase: Beginning import
> of
> > >> >> > table1
> > >> >> > 11/09/16 09:22:44 DEBUG manager.SqlManager: Using fetchSize for
> next
> > >> >> > query:
> > >> >> > -2147483648
> > >> >> > 11/09/16 09:22:44 INFO manager.SqlManager: Executing SQL
> statement:
> > >> >> > SELECT
> > >> >> > t.* FROM `table1` AS t LIMIT 1
> > >> >> > 11/09/16 09:22:44 DEBUG mapreduce.MySQLDumpImportJob: Using
> > >> >> > InputFormat:
> > >> >> > class com.cloudera.sqoop.mapreduce.MySQLDumpInputFormat
> > >> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > >> >> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
> > >> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > >> >> > file:/usr/lib/sqoop/lib/mysql-connector-java-5.1.15-bin.jar
> > >> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > >> >> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
> > >> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > >> >> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
> > >> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job
> classpath:
> > >> >> > file:/usr/lib/sqoop/lib/jackson-mapper-asl-1.7.3.jar
> > >> >> > 11/09/16 09:22:44 DEBUG
> >
> > ...
> >
>
> --
> NOTE: The mailing list sqoop-user@cloudera.org is deprecated in favor of
> Apache Sqoop mailing list sqoop-user@incubator.apache.org. Please
> subscribe to it by sending an email to
> incubator-sqoop-user-subscribe@apache.org.
>

Re: [sqoop-user] Problem using sqoop with --direct (mysqldump)

Posted by Kate Ting <ka...@cloudera.com>.
[To: sqoop-user@incubator.apache.org; Bcc: sqoop-user@cloudera.org]

Eric - to rule out permission issues and mysqldump issues:

1. Does username 'sqoop' have permission to access table1?
2. Can you ssh into one of the node machines, run mysqldump, and then
paste the output?

Regards, Kate
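
One detail in the task log quoted below stands out: the syslog shows
"mysqldump: Got errno 32 on write" just before the failure. Errno 32 is
EPIPE, so a plausible reading is that the ReparsingStreamThread dies first
with the IndexOutOfBoundsException, the pipe feeding it closes, and
mysqldump then exits with status 5 because its writes start failing.
Separately, question (2) can be sketched with the connection details from
Eric's command (the exit-status echo is just for the paste):

    mysqldump --host=192.168.0.100 --port=3307 --user=sqoop -p test table1 > /dev/null
    echo "mysqldump exit status: $?"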

On Sat, Sep 17, 2011 at 1:05 PM, eric hernandez <er...@gmail.com> wrote:
> Kate,
> I am sqooping from MySQL Server version 5.0.58-enterprise-gpl (MySQL
> Enterprise Server).
>
> Currently I am using, on all data nodes,
> mysqldump  Ver 10.13 Distrib 5.5.15, for Linux (x86_64)
>
> But I also tried using the
> mysql-5.0.77-4.el5_6.6.x86_64 mysqldump from the CentOS 5.5 repo.
>
> Both fail.
>
> -Eric
>
> On Sat, Sep 17, 2011 at 11:43 AM, Kate Ting <ka...@cloudera.com> wrote:
>>
>> Eric, what is your mysql version and your mysqldump version?
>>
>> Regards, Kate
>>
>> On Fri, Sep 16, 2011 at 7:30 PM, eric hernandez <er...@gmail.com>
>> wrote:
>> > Kate,
>> >
>> > sqoop import --connect jdbc:mysql://192.168.0.100:3307/test --verbose -m
>> > 1
>> > --username sqoop --password sanitized --hive-overwrite --direct --table
>> > table1 --hive-import --create-hive-table --hive-table table1
>> > --fields-terminated-by '\t' --lines-terminated-by '\n' --append
>> >
>> > Please note this works fine if I remove the --direct option. I am also
>> > limiting it to 1 mapper because if not the output of the failure is very
>> > long.
>> >
>> >
>> > On Fri, Sep 16, 2011 at 3:01 PM, Kate Ting <ka...@cloudera.com> wrote:
>> >>
>> >> Eric - what is the exact Sqoop command that you ran (including, if
>> >> applicable, contents of the options-file)?
>> >>
>> >> Regards, Kate
>> >>
>> >> On Fri, Sep 16, 2011 at 9:27 AM, eric hernandez
>> >> <er...@gmail.com>
>> >> wrote:
>> >> > Yes, I have mysqldump on all nodes.
>> >> >
>> >> > Verbose output
>> >> >
>> >> >
>> >> >
>> >> > 11/09/16 09:22:42 DEBUG tool.BaseSqoopTool: Enabled debug logging.
>> >> > 11/09/16 09:22:42 WARN tool.BaseSqoopTool: Setting your password on
>> >> > the
>> >> > command-line is insecure. Consider using -P instead.
>> >> > 11/09/16 09:22:42 DEBUG sqoop.ConnFactory: Loaded manager factory:
>> >> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> >> > 11/09/16 09:22:42 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
>> >> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> >> > 11/09/16 09:22:42 DEBUG manager.DefaultManagerFactory: Trying with
>> >> > scheme:
>> >> > jdbc:mysql:
>> >> > 11/09/16 09:22:42 INFO manager.MySQLManager: Preparing to use a MySQL
>> >> > streaming resultset.
>> >> > 11/09/16 09:22:42 DEBUG sqoop.ConnFactory: Instantiated ConnManager
>> >> > com.cloudera.sqoop.manager.DirectMySQLManager@7ad81784
>> >> > 11/09/16 09:22:42 INFO tool.CodeGenTool: Beginning code generation
>> >> > 11/09/16 09:22:42 DEBUG manager.SqlManager: No connection paramenters
>> >> > specified. Using regular API for making connection.
>> >> > 11/09/16 09:22:43 DEBUG manager.SqlManager: Using fetchSize for next
>> >> > query:
>> >> > -2147483648
>> >> > 11/09/16 09:22:43 INFO manager.SqlManager: Executing SQL statement:
>> >> > SELECT
>> >> > t.* FROM `table1` AS t LIMIT 1
>> >> > 11/09/16 09:22:43 DEBUG manager.SqlManager: Using fetchSize for next
>> >> > query:
>> >> > -2147483648
>> >> > 11/09/16 09:22:43 INFO manager.SqlManager: Executing SQL statement:
>> >> > SELECT
>> >> > t.* FROM `table1` AS t LIMIT 1
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: selected columns:
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   id
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   application_id
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   event_id
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   response_id
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   target_id
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   mode
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   date_created
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: Writing source file:
>> >> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.java
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: Table name: table1
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: Columns: id:4,
>> >> > application_id:4,
>> >> > event_id:4, response_id:4, target_id:4, mode:1, date_created:93,
>> >> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: sourceFilename is
>> >> > table1.java
>> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager: Found existing
>> >> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/
>> >> > 11/09/16 09:22:43 INFO orm.CompilationManager: HADOOP_HOME is
>> >> > /usr/lib/hadoop
>> >> > 11/09/16 09:22:43 INFO orm.CompilationManager: Found hadoop core jar
>> >> > at:
>> >> > /usr/lib/hadoop/hadoop-0.20.2-cdh3u1-core.jar
>> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager: Adding source file:
>> >> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.java
>> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager: Invoking javac with
>> >> > args:
>> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:   -sourcepath
>> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:
>> >> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/
>> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:   -d
>> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:
>> >> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/
>> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:   -classpath
>> >> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:
>> >> >
>> >> >
>> >> > /etc/hadoop/conf:/usr/java/jdk1.6.0_21/lib/tools.jar:/usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u1.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u1.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/hue-plugins-1.2.0-cdh3u1.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/etc/zookeeper::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/avro-1.5.1.jar:/usr/lib/sqoop/lib/avro-ipc-1.5.1.jar:/usr/lib/sqoop/lib/avro-mapred-1.5.1.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/lib/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/lib/sqoop/lib/jopt-simple-3.2.jar:/usr/lib/sqoop/lib/mysql-connector-java-5.1.15-bin.jar:/usr/lib/sqoop/lib/paranamer-2.3.jar:/usr/lib/sqoop/lib/snappy-java-1.0.3-rc2.jar:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar:/usr/lib/sqoop/sqoop-test-1.3.0-cdh3u1.jar::/usr/lib/hadoop/hadoop-0.20.2-cdh3u1-core.jar:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> >> > 11/09/16 09:22:44 INFO orm.CompilationManager: Writing jar file:
>> >> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.jar
>> >> > 11/09/16 09:22:44 DEBUG orm.CompilationManager: Scanning for .class
>> >> > files in
>> >> > directory: /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b
>> >> > 11/09/16 09:22:44 DEBUG orm.CompilationManager: Got classfile:
>> >> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.class
>> >> > ->
>> >> > table1.class
>> >> > 11/09/16 09:22:44 DEBUG orm.CompilationManager: Finished writing jar
>> >> > file
>> >> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.jar
>> >> > 11/09/16 09:22:44 DEBUG tool.ImportTool: Using temporary folder:
>> >> > 16092244516table1
>> >> > 11/09/16 09:22:44 INFO manager.DirectMySQLManager: Beginning
>> >> > mysqldump
>> >> > fast
>> >> > path import
>> >> > 11/09/16 09:22:44 INFO mapreduce.ImportJobBase: Beginning import of
>> >> > table1
>> >> > 11/09/16 09:22:44 DEBUG manager.SqlManager: Using fetchSize for next
>> >> > query:
>> >> > -2147483648
>> >> > 11/09/16 09:22:44 INFO manager.SqlManager: Executing SQL statement:
>> >> > SELECT
>> >> > t.* FROM `table1` AS t LIMIT 1
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.MySQLDumpImportJob: Using
>> >> > InputFormat:
>> >> > class com.cloudera.sqoop.mapreduce.MySQLDumpInputFormat
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/mysql-connector-java-5.1.15-bin.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/jackson-mapper-asl-1.7.3.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/snappy-java-1.0.3-rc2.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/mysql-connector-java-5.1.15-bin.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/paranamer-2.3.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/jackson-core-asl-1.7.3.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/avro-mapred-1.5.1.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/avro-1.5.1.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/jopt-simple-3.2.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/commons-io-1.4.jar
>> >> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/avro-ipc-1.5.1.jar
>> >> > 11/09/16 09:22:46 INFO mapred.JobClient: Running job:
>> >> > job_201109160744_0006
>> >> > 11/09/16 09:22:47 INFO mapred.JobClient:  map 0% reduce 0%
>> >> > 11/09/16 09:22:53 INFO mapred.JobClient: Task Id :
>> >> > attempt_201109160744_0006_m_000000_0, Status : FAILED
>> >> > java.io.IOException: mysqldump terminated with status 5
>> >> >     at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:476)
>> >> >     at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:49)
>> >> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >> >     at
>> >> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>> >> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>> >> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> >> >     at java.security.AccessController.doPrivileged(Native Method)
>> >> >     at javax.security.auth.Subject.doAs(Subject.java:396)
>> >> >     at
>> >> >
>> >> >
>> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>> >> >     at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> >> >
>> >> > attempt_201109160744_0006_m_000000_0: Exception in thread "Thread-12"
>> >> > java.lang.IndexOutOfBoundsException
>> >> > attempt_201109160744_0006_m_000000_0:     at
>> >> > java.nio.CharBuffer.wrap(CharBuffer.java:445)
>> >> > attempt_201109160744_0006_m_000000_0:     at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink$ReparsingStreamThread.run(MySQLDumpMapper.java:253)
>> >> > attempt_201109160744_0006_m_000000_0: log4j:WARN No appenders could
>> >> > be
>> >> > found
>> >> > for logger (org.apache.hadoop.hdfs.DFSClient).
>> >> > attempt_201109160744_0006_m_000000_0: log4j:WARN Please initialize
>> >> > the
>> >> > log4j
>> >> > system properly.
>> >> > 11/09/16 09:22:58 INFO mapred.JobClient: Task Id :
>> >> > attempt_201109160744_0006_m_000000_1, Status : FAILED
>> >> > java.io.IOException: mysqldump terminated with status 5
>> >> >     at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:476)
>> >> >     at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:49)
>> >> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >> >     at
>> >> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>> >> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>> >> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> >> >     at java.security.AccessController.doPrivileged(Native Method)
>> >> >     at javax.security.auth.Subject.doAs(Subject.java:396)
>> >> >     at
>> >> >
>> >> >
>> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>> >> >     at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> >> >
>> >> > attempt_201109160744_0006_m_000000_1: Exception in thread "Thread-12"
>> >> > java.lang.IndexOutOfBoundsException
>> >> > attempt_201109160744_0006_m_000000_1:     at
>> >> > java.nio.CharBuffer.wrap(CharBuffer.java:445)
>> >> > attempt_201109160744_0006_m_000000_1:     at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink$ReparsingStreamThread.run(MySQLDumpMapper.java:253)
>> >> > attempt_201109160744_0006_m_000000_1: log4j:WARN No appenders could
>> >> > be
>> >> > found
>> >> > for logger (org.apache.hadoop.hdfs.DFSClient).
>> >> > attempt_201109160744_0006_m_000000_1: log4j:WARN Please initialize
>> >> > the
>> >> > log4j
>> >> > system properly.
>> >> > 11/09/16 09:23:03 INFO mapred.JobClient: Task Id :
>> >> > attempt_201109160744_0006_m_000000_2, Status : FAILED
>> >> > java.io.IOException: mysqldump terminated with status 5
>> >> >     at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:476)
>> >> >     at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:49)
>> >> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >> >     at
>> >> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>> >> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>> >> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> >> >     at java.security.AccessController.doPrivileged(Native Method)
>> >> >     at javax.security.auth.Subject.doAs(Subject.java:396)
>> >> >     at
>> >> >
>> >> >
>> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>> >> >     at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> >> >
>> >> > attempt_201109160744_0006_m_000000_2: Exception in thread "Thread-12"
>> >> > java.lang.IndexOutOfBoundsException
>> >> > attempt_201109160744_0006_m_000000_2:     at
>> >> > java.nio.CharBuffer.wrap(CharBuffer.java:445)
>> >> > attempt_201109160744_0006_m_000000_2:     at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink$ReparsingStreamThread.run(MySQLDumpMapper.java:253)
>> >> > attempt_201109160744_0006_m_000000_2: log4j:WARN No appenders could
>> >> > be
>> >> > found
>> >> > for logger (org.apache.hadoop.hdfs.DFSClient).
>> >> > attempt_201109160744_0006_m_000000_2: log4j:WARN Please initialize
>> >> > the
>> >> > log4j
>> >> > system properly.
>> >> > 11/09/16 09:23:09 INFO mapred.JobClient: Job complete:
>> >> > job_201109160744_0006
>> >> > 11/09/16 09:23:09 INFO mapred.JobClient: Counters: 6
>> >> > 11/09/16 09:23:09 INFO mapred.JobClient:   Job Counters
>> >> > 11/09/16 09:23:09 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=19196
>> >> > 11/09/16 09:23:09 INFO mapred.JobClient:     Total time spent by all
>> >> > reduces
>> >> > waiting after reserving slots (ms)=0
>> >> > 11/09/16 09:23:09 INFO mapred.JobClient:     Total time spent by all
>> >> > maps
>> >> > waiting after reserving slots (ms)=0
>> >> > 11/09/16 09:23:09 INFO mapred.JobClient:     Launched map tasks=4
>> >> > 11/09/16 09:23:09 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
>> >> > 11/09/16 09:23:09 INFO mapred.JobClient:     Failed map tasks=1
>> >> > 11/09/16 09:23:09 INFO mapreduce.ImportJobBase: Transferred 0 bytes
>> >> > in
>> >> > 24.8354 seconds (0 bytes/sec)
>> >> > 11/09/16 09:23:09 INFO mapreduce.ImportJobBase: Retrieved 0 records.
>> >> > 11/09/16 09:23:09 ERROR tool.ImportTool: Error during import: Import
>> >> > job
>> >> > failed!
>> >> >
>> >> >
>> >> > --- Task log
>> >> >
>> >> > Task Logs: 'attempt_201109160744_0006_m_000000_1'
>> >> >
>> >> > stdout logs
>> >> > ________________________________
>> >> >
>> >> > stderr logs
>> >> >
>> >> > Exception in thread "Thread-12" java.lang.
>> >> > IndexOutOfBoundsException
>> >> >       at java.nio.CharBuffer.wrap(CharBuffer.java:445)
>> >> >       at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink$ReparsingStreamThread.run(MySQLDumpMapper.java:253)
>> >> > log4j:WARN No appenders could be found for logger
>> >> > (org.apache.hadoop.hdfs.DFSClient).
>> >> > log4j:WARN Please initialize the log4j system properly.
>> >> >
>> >> > ________________________________
>> >> >
>> >> > syslog logs
>> >> >
>> >> > 2011-09-16 09:22:54,194 WARN org.apache.hadoop.util.NativeCodeLoader:
>> >> > Unable
>> >> > to load native-hadoop library for your platform... using builtin-java
>> >> > classes where applicable
>> >> > 2011-09-16 09:22:54,326 INFO
>> >> > org.apache.hadoop.metrics.jvm.JvmMetrics:
>> >> > Initializing JVM Metrics with processName=MAP, sessionId=
>> >> > 2011-09-16 09:22:54,687 INFO
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> >> > Beginning mysqldump fast path import
>> >> > 2011-09-16 09:22:54,690 INFO
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> >> > Performing import of table table1 from database test
>> >> > 2011-09-16 09:22:54,696 INFO
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> >> > Converting data to use specified delimiters.
>> >> > 2011-09-16 09:22:54,696 INFO
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> >> > (For the fastest possible import, use
>> >> > 2011-09-16 09:22:54,696 INFO
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> >> > --mysql-delimiters to specify the same field
>> >> > 2011-09-16 09:22:54,696 INFO
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> >> > delimiters as are used by mysqldump.)
>> >> > 2011-09-16 09:22:54,710 INFO
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> >> > mysqldump: Got errno 32 on write
>> >> > 2011-09-16 09:22:54,710 INFO
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> >> > Transfer loop complete.
>> >> > 2011-09-16 09:22:54,740 INFO
>> >> > org.apache.hadoop.mapred.TaskLogsTruncater:
>> >> > Initializing logs' truncater with mapRetainSize=-1 and
>> >> > reduceRetainSize=-1
>> >> > 2011-09-16 09:22:54,746 WARN org.apache.hadoop.mapred.Child: Error
>> >> > running
>> >> > child
>> >> > java.io.IOException: mysqldump terminated with status 5
>> >> >       at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:476)
>> >> >       at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:49)
>> >> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >> >       at
>> >> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>> >> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>> >> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> >> >       at java.security.AccessController.doPrivileged(Native Method)
>> >> >       at javax.security.auth.Subject.doAs(Subject.java:396)
>> >> >       at
>> >> >
>> >> >
>> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>> >> >       at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> >> > 2011-09-16 09:22:54,750 INFO org.apache.hadoop.mapred.Task: Runnning
>> >> > cleanup
>> >> > for the task
>> >> >
>> >> >
>> >> > On Fri, Sep 16, 2011 at 8:55 AM, Kate Ting <ka...@cloudera.com> wrote:
>> >> >>
>> >> >> [Moving the conversation to sqoop-user@incubator.apache.org. Please
>> >> >> subscribe (and post questions) to the new mailing list.]
>> >> >>
>> >> >> Hi Eric -
>> >> >>
>> >> >> (1) Is the mysqldump utility installed on individual node machines?
>> >> >> (2) If so, can you pastebin your task log as well as verbose output?
>> >> >>
>> >> >> Regards, Kate
>> >> >>
>> >> >> On Fri, Sep 16, 2011 at 8:04 AM, Eric <er...@gmail.com>
>> >> >> wrote:
>> >> >> > Hi all,
>> >> >> >
>> >> >> > I cannot sqoop in using the --direct option; my sqoop job works
>> >> >> > fine until I add --direct.
>> >> >> >
>> >> >> > I am using Sqoop 1.3.0-cdh3u1
>> >> >> > git commit id 3a60cc809b14d538dd1eb0e90ffa9767e8d06a43
>> >> >> > Compiled by jenkins@ubuntu-slave01 on Mon Jul 18 08:38:49 PDT 2011
>> >> >> >
>> >> >> > Please advise,
>> >> >> >
>> >> >> > -Eric
>> >> >> >
>> >> >> >
>> >> >> > error message:
>> >> >> >
>> >> >> > 11/09/16 07:57:39 INFO manager.MySQLManager: Preparing to use a
>> >> >> > MySQL
>> >> >> > streaming resultset.
>> >> >> > 11/09/16 07:57:39 INFO tool.CodeGenTool: Beginning code generation
>> >> >> > 11/09/16 07:57:40 INFO manager.SqlManager: Executing SQL
>> >> >> > statement:
>> >> >> > SELECT t.* FROM `table1` AS t LIMIT 1
>> >> >> > 11/09/16 07:57:40 INFO manager.SqlManager: Executing SQL
>> >> >> > statement:
>> >> >> > SELECT t.* FROM `table1` AS t LIMIT 1
>> >> >> > 11/09/16 07:57:40 INFO orm.CompilationManager: HADOOP_HOME is
>> >> >> > /usr/lib/
>> >> >> > hadoop
>> >> >> > 11/09/16 07:57:40 INFO orm.CompilationManager: Found hadoop core
>> >> >> > jar
>> >> >> > at: /usr/lib/hadoop/hadoop-0.20.2-cdh3u1-core.jar
>> >> >> > 11/09/16 07:57:41 INFO orm.CompilationManager: Writing jar file:
>> >> >> > /tmp/
>> >> >> > sqoop-root/compile/aef5c62d2156aeae5338ee272de42d26/table1.jar
>> >> >> > 11/09/16 07:57:41 INFO manager.DirectMySQLManager: Beginning
>> >> >> > mysqldump
>> >> >> > fast path import
>> >> >> > 11/09/16 07:57:41 INFO mapreduce.ImportJobBase: Beginning import
>> >> >> > of
>> >> >> > table1
>> >> >> > 11/09/16 07:57:41 INFO manager.SqlManager: Executing SQL
>> >> >> > statement:
>> >> >> > SELECT t.* FROM `table1` AS t LIMIT 1
>> >> >> > 11/09/16 07:57:43 INFO mapred.JobClient: Running job:
>> >> >> > job_201109160744_0004
>> >> >> > 11/09/16 07:57:44 INFO mapred.JobClient:  map 0% reduce 0%
>> >> >> > 11/09/16 07:57:50 INFO mapred.JobClient: Task Id :
>> >> >> > attempt_201109160744_0004_m_000000_0, Status : FAILED
>> >> >> > java.io.IOException: mysqldump terminated with status 5
>> >> >> >        at
>> >> >> >
>> >> >> >
>> >> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
>> >> >> > 476)
>> >> >> >        at
>> >> >> >
>> >> >> >
>> >> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
>> >> >> > 49)
>> >> >> >        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >> >> >        at
>> >> >> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>> >> >> >        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>> >> >> >        at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> >> >> >        at java.security.AccessController.doPrivileged(Native
>> >> >> > Method)
>> >> >> >        at javax.security.auth.Subject.doAs(Subject.java:396)
>> >> >> >        at
>> >> >> >
>> >> >> >
>> >> >> >
>> >> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:
>> >> >> > 1127)
>> >> >> >        at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> >> >> >
>> >> >> > attempt_201109160744_0004_m_000000_0: Exception in thread
>> >> >> > "Thread-12"
>> >> >> > java.lang.IndexOutOfBoundsException
>> >> >> > attempt_201109160744_0004_m_000000_0:   at
>> >> >> > java.nio.CharBuffer.wrap(CharBuffer.java:445)
>> >> >> > attempt_201109160744_0004_m_000000_0:   at
>> >> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink
>> >> >> > $ReparsingStreamThread.run(MySQLDumpMapper.java:253)
>> >> >> > attempt_201109160744_0004_m_000000_0: log4j:WARN No appenders
>> >> >> > could
>> >> >> > be
>> >> >> > found for logger (org.apache.hadoop.hdfs.DFSClient).
>> >> >> > attempt_201109160744_0004_m_000000_0: log4j:WARN Please initialize
>> >> >> > the
>> >> >> > log4j system properly.
>> >> >> > 11/09/16 07:57:55 INFO mapred.JobClient: Task Id :
>> >> >> > attempt_201109160744_0004_m_000000_1, Status : FAILED
>> >> >> > java.io.IOException: mysqldump terminated with status 5
>> >> >> >        at
>> >> >> >
>> >> >> >
>> >> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
>> >> >> > 476)
>> >> >> >        at
>> >> >> >
>> >> >> >
>> >> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
>> >> >> > 49)
>> >> >> >        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >> >> >        at
>> >> >> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>> >> >> >        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>> >> >> >        at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> >> >> >        at java.security.AccessController.doPrivileged(Native
>> >> >> > Method)
>> >> >> >        at javax.security.auth.Subject.doAs(Subject.java:396)
>> >> >> >        at
>> >> >> >
>> >> >> >
>> >> >> >
>> >> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:
>> >> >> > 1127)
>> >> >> >        at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> >> >> >
>> >> >> > attempt_201109160744_0004_m_000000_1: Exception in thread
>> >> >> > "Thread-12"
>> >> >> > java.lang.IndexOutOfBoundsException
>> >> >> > attempt_201109160744_0004_m_000000_1:   at
>> >> >> > java.nio.CharBuffer.wrap(CharBuffer.java:445)
>> >> >> > attempt_201109160744_0004_m_000000_1:   at
>> >> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink
>> >> >> > $ReparsingStreamThread.run(MySQLDumpMapper.java:253)
>> >> >> > attempt_201109160744_0004_m_000000_1: log4j:WARN No appenders
>> >> >> > could
>> >> >> > be
>> >> >> > found for logger (org.apache.hadoop.hdfs.DFSClient).
>> >> >> > attempt_201109160744_0004_m_000000_1: log4j:WARN Please initialize
>> >> >> > the
>> >> >> > log4j system properly.
>> >> >> > 11/09/16 07:58:01 INFO mapred.JobClient: Task Id :
>> >> >> > attempt_201109160744_0004_m_000000_2, Status : FAILED
>> >> >> > java.io.IOException: mysqldump terminated with status 5
>> >> >> >        at
>> >> >> >
>> >> >> >
>> >> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
>> >> >> > 476)
>> >> >> >        at
>> >> >> >
>> >> >> >
>> >> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:
>> >> >> > 49)
>> >> >> >        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >> >> >        at
>> >> >> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>> >> >> >        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>> >> >> >        at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> >> >> >        at java.security.AccessController.doPrivileged(Native
>> >> >> > Method)
>> >> >> >        at javax.security.auth.Subject.doAs(Subject.java:396)
>> >> >> >        at
>> >> >> >
>> >> >> >
>> >> >> >
>> >> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:
>> >> >> > 1127)
>> >> >> >        at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> >> >> >
>> >> >> > attempt_201109160744_0004_m_000000_2: Exception in thread
>> >> >> > "Thread-12"
>> >> >> > java.lang.IndexOutOfBoundsException
>> >> >> > attempt_201109160744_0004_m_000000_2:   at
>> >> >> > java.nio.CharBuffer.wrap(CharBuffer.java:445)
>> >> >> > attempt_201109160744_0004_m_000000_2:   at
>> >> >> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink
>> >> >> > $ReparsingStreamThread.run(MySQLDumpMapper.java:253)
>> >> >> > attempt_201109160744_0004_m_000000_2: log4j:WARN No appenders
>> >> >> > could
>> >> >> > be
>> >> >> > found for logger (org.apache.hadoop.hdfs.DFSClient).
>> >> >> > attempt_201109160744_0004_m_000000_2: log4j:WARN Please initialize
>> >> >> > the
>> >> >> > log4j system properly.
>> >> >> > 11/09/16 07:58:07 INFO mapred.JobClient: Job complete:
>> >> >> > job_201109160744_0004
>> >> >> > 11/09/16 07:58:07 INFO mapred.JobClient: Counters: 6
>> >> >> > 11/09/16 07:58:07 INFO mapred.JobClient:   Job Counters
>> >> >> > 11/09/16 07:58:07 INFO mapred.JobClient:
>> >> >> > SLOTS_MILLIS_MAPS=19165
>> >> >> > 11/09/16 07:58:07 INFO mapred.JobClient:     Total time spent by
>> >> >> > all
>> >> >> > reduces waiting after reserving slots (ms)=0
>> >> >> > 11/09/16 07:58:07 INFO mapred.JobClient:     Total time spent by
>> >> >> > all
>> >> >> > maps waiting after reserving slots (ms)=0
>> >> >> > 11/09/16 07:58:07 INFO mapred.JobClient:     Launched map tasks=4
>> >> >> > 11/09/16 07:58:07 INFO mapred.JobClient:
>> >> >> > SLOTS_MILLIS_REDUCES=0
>> >> >> > 11/09/16 07:58:07 INFO mapred.JobClient:     Failed map tasks=1
>> >> >> > 11/09/16 07:58:07 INFO mapreduce.ImportJobBase: Transferred 0
>> >> >> > bytes
>> >> >> > in
>> >> >> > 25.1844 seconds (0 bytes/sec)
>> >> >> > 11/09/16 07:58:07 INFO mapreduce.ImportJobBase: Retrieved 0
>> >> >> > records.
>> >> >> > 11/09/16 07:58:07 ERROR tool.ImportTool: Error during import:
>> >> >> > Import
>> >> >> > job failed!
>> >> >> >
>> >> >> > --
>> >> >> > NOTE: The mailing list sqoop-user@cloudera.org is deprecated in
>> >> >> > favor
>> >> >> > of
>> >> >> > Apache Sqoop mailing list sqoop-user@incubator.apache.org. Please
>> >> >> > subscribe
>> >> >> > to it by sending an email to
>> >> >> > incubator-sqoop-user-subscribe@apache.org.
>> >> >> >
>> >> >>
>> >> >> --
>> >> >> NOTE: The mailing list sqoop-user@cloudera.org is deprecated in
>> >> >> favor
>> >> >> of
>> >> >> Apache Sqoop mailing list sqoop-user@incubator.apache.org. Please
>> >> >> subscribe
>> >> >> to it by sending an email to
>> >> >> incubator-sqoop-user-subscribe@apache.org.
>> >> >
>> >> >
>> >> >
>> >> > --
>> >> > Eric H.
>> >> > eric.hardway@gmail.com
>> >> >
>> >> > --
>> >> > NOTE: The mailing list sqoop-user@cloudera.org is deprecated in favor
>> >> > of
>> >> > Apache Sqoop mailing list sqoop-user@incubator.apache.org. Please
>> >> > subscribe
>> >> > to it by sending an email to
>> >> > incubator-sqoop-user-subscribe@apache.org.
>> >> >
>> >>
>> >> --
>> >> NOTE: The mailing list sqoop-user@cloudera.org is deprecated in favor
>> >> of
>> >> Apache Sqoop mailing list sqoop-user@incubator.apache.org. Please
>> >> subscribe
>> >> to it by sending an email to incubator-sqoop-user-subscribe@apache.org.
>> >
>> >
>> >
>> > --
>> > Eric H.
>> > eric.hardway@gmail.com
>> >
>> > --
>> > NOTE: The mailing list sqoop-user@cloudera.org is deprecated in favor of
>> > Apache Sqoop mailing list sqoop-user@incubator.apache.org. Please
>> > subscribe
>> > to it by sending an email to incubator-sqoop-user-subscribe@apache.org.
>> >
>>
>> --
>> NOTE: The mailing list sqoop-user@cloudera.org is deprecated in favor of
>> Apache Sqoop mailing list sqoop-user@incubator.apache.org. Please subscribe
>> to it by sending an email to incubator-sqoop-user-subscribe@apache.org.
>
>
>
> --
> Eric H.
> eric.hardway@gmail.com
>
> --
> NOTE: The mailing list sqoop-user@cloudera.org is deprecated in favor of
> Apache Sqoop mailing list sqoop-user@incubator.apache.org. Please subscribe
> to it by sending an email to incubator-sqoop-user-subscribe@apache.org.
>

Re: [sqoop-user] Problem using sqoop with --direct (mysqldump)

Posted by Kate Ting <ka...@cloudera.com>.
Eric, what is your mysql version and your mysqldump version?

Regards, Kate
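
Both are quick to capture from a shell on the database host and on one of
the data nodes:

    mysql --version
    mysqldump --version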

On Fri, Sep 16, 2011 at 7:30 PM, eric hernandez <er...@gmail.com> wrote:
> Kate,
>
> sqoop import --connect jdbc:mysql://192.168.0.100:3307/test --verbose -m 1
> --username sqoop --password sanitized --hive-overwrite --direct --table
> table1 --hive-import --create-hive-table --hive-table table1
> --fields-terminated-by '\t' --lines-terminated-by '\n' --append
>
>> > Please note this works fine if I remove the --direct option. I am also
> limiting it to 1 mapper because if not the output of the failure is very
> long.
>
>
> On Fri, Sep 16, 2011 at 3:01 PM, Kate Ting <ka...@cloudera.com> wrote:
>>
>> Eric - what is the exact Sqoop command that you ran (including, if
>> applicable, contents of the options-file)?
>>
>> Regards, Kate
>>
>> On Fri, Sep 16, 2011 at 9:27 AM, eric hernandez <er...@gmail.com>
>> wrote:
>> > Yes, I have mysqldump on all nodes.
>> >
>> > Verbose output
>> >
>> >
>> >
>> > 11/09/16 09:22:42 DEBUG tool.BaseSqoopTool: Enabled debug logging.
>> > 11/09/16 09:22:42 WARN tool.BaseSqoopTool: Setting your password on the
>> > command-line is insecure. Consider using -P instead.
>> > 11/09/16 09:22:42 DEBUG sqoop.ConnFactory: Loaded manager factory:
>> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> > 11/09/16 09:22:42 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
>> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> > 11/09/16 09:22:42 DEBUG manager.DefaultManagerFactory: Trying with
>> > scheme:
>> > jdbc:mysql:
>> > 11/09/16 09:22:42 INFO manager.MySQLManager: Preparing to use a MySQL
>> > streaming resultset.
>> > 11/09/16 09:22:42 DEBUG sqoop.ConnFactory: Instantiated ConnManager
>> > com.cloudera.sqoop.manager.DirectMySQLManager@7ad81784
>> > 11/09/16 09:22:42 INFO tool.CodeGenTool: Beginning code generation
>> > 11/09/16 09:22:42 DEBUG manager.SqlManager: No connection paramenters
>> > specified. Using regular API for making connection.
>> > 11/09/16 09:22:43 DEBUG manager.SqlManager: Using fetchSize for next
>> > query:
>> > -2147483648
>> > 11/09/16 09:22:43 INFO manager.SqlManager: Executing SQL statement:
>> > SELECT
>> > t.* FROM `table1` AS t LIMIT 1
>> > 11/09/16 09:22:43 DEBUG manager.SqlManager: Using fetchSize for next
>> > query:
>> > -2147483648
>> > 11/09/16 09:22:43 INFO manager.SqlManager: Executing SQL statement:
>> > SELECT
>> > t.* FROM `table1` AS t LIMIT 1
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: selected columns:
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   id
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   application_id
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   event_id
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   response_id
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   target_id
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   mode
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter:   date_created
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: Writing source file:
>> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.java
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: Table name: table1
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: Columns: id:4,
>> > application_id:4,
>> > event_id:4, response_id:4, target_id:4, mode:1, date_created:93,
>> > 11/09/16 09:22:43 DEBUG orm.ClassWriter: sourceFilename is table1.java
>> > 11/09/16 09:22:43 DEBUG orm.CompilationManager: Found existing
>> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/
>> > 11/09/16 09:22:43 INFO orm.CompilationManager: HADOOP_HOME is
>> > /usr/lib/hadoop
>> > 11/09/16 09:22:43 INFO orm.CompilationManager: Found hadoop core jar at:
>> > /usr/lib/hadoop/hadoop-0.20.2-cdh3u1-core.jar
>> > 11/09/16 09:22:43 DEBUG orm.CompilationManager: Adding source file:
>> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.java
>> > 11/09/16 09:22:43 DEBUG orm.CompilationManager: Invoking javac with
>> > args:
>> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:   -sourcepath
>> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:
>> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/
>> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:   -d
>> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:
>> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/
>> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:   -classpath
>> > 11/09/16 09:22:43 DEBUG orm.CompilationManager:
>> >
>> > /etc/hadoop/conf:/usr/java/jdk1.6.0_21/lib/tools.jar:/usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u1.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u1.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/hue-plugins-1.2.0-cdh3u1.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/etc/zookeeper::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/avro-1.5.1.jar:/usr/lib/sqoop/lib/avro-ipc-1.5.1.jar:/usr/lib/sqoop/lib/avro-mapred-1.5.1.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/jackson-core-asl-1.7.3.jar:/usr/lib/sqoop/lib/jackson-mapper-asl-1.7.3.jar:/usr/lib/sqoop/lib/jopt-simple-3.2.jar:/usr/lib/sqoop/lib/mysql-connector-java-5.1.15-bin.jar:/usr/lib/sqoop/lib/paranamer-2.3.jar:/usr/lib/sqoop/lib/snappy-java-1.0.3-rc2.jar:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar:/usr/lib/sqoop/sqoop-test-1.3.0-cdh3u1.jar::/usr/lib/hadoop/hadoop-0.20.2-cdh3u1-core.jar:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> > 11/09/16 09:22:44 INFO orm.CompilationManager: Writing jar file:
>> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.jar
>> > 11/09/16 09:22:44 DEBUG orm.CompilationManager: Scanning for .class
>> > files in
>> > directory: /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b
>> > 11/09/16 09:22:44 DEBUG orm.CompilationManager: Got classfile:
>> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.class ->
>> > table1.class
>> > 11/09/16 09:22:44 DEBUG orm.CompilationManager: Finished writing jar
>> > file
>> > /tmp/sqoop-root/compile/4da62fc9c254b3faa3ba5115ef61783b/table1.jar
>> > 11/09/16 09:22:44 DEBUG tool.ImportTool: Using temporary folder:
>> > 16092244516table1
>> > 11/09/16 09:22:44 INFO manager.DirectMySQLManager: Beginning mysqldump
>> > fast
>> > path import
>> > 11/09/16 09:22:44 INFO mapreduce.ImportJobBase: Beginning import of
>> > table1
>> > 11/09/16 09:22:44 DEBUG manager.SqlManager: Using fetchSize for next
>> > query:
>> > -2147483648
>> > 11/09/16 09:22:44 INFO manager.SqlManager: Executing SQL statement:
>> > SELECT
>> > t.* FROM `table1` AS t LIMIT 1
>> > 11/09/16 09:22:44 DEBUG mapreduce.MySQLDumpImportJob: Using InputFormat:
>> > class com.cloudera.sqoop.mapreduce.MySQLDumpInputFormat
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/mysql-connector-java-5.1.15-bin.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/sqoop-1.3.0-cdh3u1.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/jackson-mapper-asl-1.7.3.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/snappy-java-1.0.3-rc2.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/mysql-connector-java-5.1.15-bin.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/paranamer-2.3.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/jackson-core-asl-1.7.3.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/avro-mapred-1.5.1.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/avro-1.5.1.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/jopt-simple-3.2.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/commons-io-1.4.jar
>> > 11/09/16 09:22:44 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/avro-ipc-1.5.1.jar
>> > 11/09/16 09:22:46 INFO mapred.JobClient: Running job:
>> > job_201109160744_0006
>> > 11/09/16 09:22:47 INFO mapred.JobClient:  map 0% reduce 0%
>> > 11/09/16 09:22:53 INFO mapred.JobClient: Task Id :
>> > attempt_201109160744_0006_m_000000_0, Status : FAILED
>> > java.io.IOException: mysqldump terminated with status 5
>> >     at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:476)
>> >     at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:49)
>> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> >     at java.security.AccessController.doPrivileged(Native Method)
>> >     at javax.security.auth.Subject.doAs(Subject.java:396)
>> >     at
>> >
>> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>> >     at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> >
>> > attempt_201109160744_0006_m_000000_0: Exception in thread "Thread-12"
>> > java.lang.IndexOutOfBoundsException
>> > attempt_201109160744_0006_m_000000_0:     at
>> > java.nio.CharBuffer.wrap(CharBuffer.java:445)
>> > attempt_201109160744_0006_m_000000_0:     at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink$ReparsingStreamThread.run(MySQLDumpMapper.java:253)
>> > attempt_201109160744_0006_m_000000_0: log4j:WARN No appenders could be
>> > found
>> > for logger (org.apache.hadoop.hdfs.DFSClient).
>> > attempt_201109160744_0006_m_000000_0: log4j:WARN Please initialize the
>> > log4j
>> > system properly.
>> > 11/09/16 09:22:58 INFO mapred.JobClient: Task Id :
>> > attempt_201109160744_0006_m_000000_1, Status : FAILED
>> > java.io.IOException: mysqldump terminated with status 5
>> >     at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:476)
>> >     at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:49)
>> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> >     at java.security.AccessController.doPrivileged(Native Method)
>> >     at javax.security.auth.Subject.doAs(Subject.java:396)
>> >     at
>> >
>> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>> >     at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> >
>> > attempt_201109160744_0006_m_000000_1: Exception in thread "Thread-12"
>> > java.lang.IndexOutOfBoundsException
>> > attempt_201109160744_0006_m_000000_1:     at
>> > java.nio.CharBuffer.wrap(CharBuffer.java:445)
>> > attempt_201109160744_0006_m_000000_1:     at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink$ReparsingStreamThread.run(MySQLDumpMapper.java:253)
>> > attempt_201109160744_0006_m_000000_1: log4j:WARN No appenders could be
>> > found
>> > for logger (org.apache.hadoop.hdfs.DFSClient).
>> > attempt_201109160744_0006_m_000000_1: log4j:WARN Please initialize the
>> > log4j
>> > system properly.
>> > 11/09/16 09:23:03 INFO mapred.JobClient: Task Id :
>> > attempt_201109160744_0006_m_000000_2, Status : FAILED
>> > java.io.IOException: mysqldump terminated with status 5
>> >     at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:476)
>> >     at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:49)
>> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> >     at java.security.AccessController.doPrivileged(Native Method)
>> >     at javax.security.auth.Subject.doAs(Subject.java:396)
>> >     at
>> >
>> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>> >     at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> >
>> > attempt_201109160744_0006_m_000000_2: Exception in thread "Thread-12"
>> > java.lang.IndexOutOfBoundsException
>> > attempt_201109160744_0006_m_000000_2:     at
>> > java.nio.CharBuffer.wrap(CharBuffer.java:445)
>> > attempt_201109160744_0006_m_000000_2:     at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink$ReparsingStreamThread.run(MySQLDumpMapper.java:253)
>> > attempt_201109160744_0006_m_000000_2: log4j:WARN No appenders could be
>> > found
>> > for logger (org.apache.hadoop.hdfs.DFSClient).
>> > attempt_201109160744_0006_m_000000_2: log4j:WARN Please initialize the
>> > log4j
>> > system properly.
>> > 11/09/16 09:23:09 INFO mapred.JobClient: Job complete:
>> > job_201109160744_0006
>> > 11/09/16 09:23:09 INFO mapred.JobClient: Counters: 6
>> > 11/09/16 09:23:09 INFO mapred.JobClient:   Job Counters
>> > 11/09/16 09:23:09 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=19196
>> > 11/09/16 09:23:09 INFO mapred.JobClient:     Total time spent by all
>> > reduces
>> > waiting after reserving slots (ms)=0
>> > 11/09/16 09:23:09 INFO mapred.JobClient:     Total time spent by all
>> > maps
>> > waiting after reserving slots (ms)=0
>> > 11/09/16 09:23:09 INFO mapred.JobClient:     Launched map tasks=4
>> > 11/09/16 09:23:09 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
>> > 11/09/16 09:23:09 INFO mapred.JobClient:     Failed map tasks=1
>> > 11/09/16 09:23:09 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
>> > 24.8354 seconds (0 bytes/sec)
>> > 11/09/16 09:23:09 INFO mapreduce.ImportJobBase: Retrieved 0 records.
>> > 11/09/16 09:23:09 ERROR tool.ImportTool: Error during import: Import job
>> > failed!
>> >
>> >
>> > --- Task log
>> >
>> > Task Logs: 'attempt_201109160744_0006_m_000000_1'
>> >
>> > stdout logs
>> > ________________________________
>> >
>> > stderr logs
>> >
>> > Exception in thread "Thread-12" java.lang.IndexOutOfBoundsException
>> >       at java.nio.CharBuffer.wrap(CharBuffer.java:445)
>> >       at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper$ReparsingAsyncSink$ReparsingStreamThread.run(MySQLDumpMapper.java:253)
>> > log4j:WARN No appenders could be found for logger
>> > (org.apache.hadoop.hdfs.DFSClient).
>> > log4j:WARN Please initialize the log4j system properly.
>> >
>> > ________________________________
>> >
>> > syslog logs
>> >
>> > 2011-09-16 09:22:54,194 WARN org.apache.hadoop.util.NativeCodeLoader:
>> > Unable
>> > to load native-hadoop library for your platform... using builtin-java
>> > classes where applicable
>> > 2011-09-16 09:22:54,326 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
>> > Initializing JVM Metrics with processName=MAP, sessionId=
>> > 2011-09-16 09:22:54,687 INFO
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> > Beginning mysqldump fast path import
>> > 2011-09-16 09:22:54,690 INFO
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> > Performing import of table table1 from database test
>> > 2011-09-16 09:22:54,696 INFO
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> > Converting data to use specified delimiters.
>> > 2011-09-16 09:22:54,696 INFO
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> > (For the fastest possible import, use
>> > 2011-09-16 09:22:54,696 INFO
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> > --mysql-delimiters to specify the same field
>> > 2011-09-16 09:22:54,696 INFO
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> > delimiters as are used by mysqldump.)
>> > 2011-09-16 09:22:54,710 INFO
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> > mysqldump: Got errno 32 on write
>> > 2011-09-16 09:22:54,710 INFO
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper:
>> > Transfer loop complete.
>> > 2011-09-16 09:22:54,740 INFO org.apache.hadoop.mapred.TaskLogsTruncater:
>> > Initializing logs' truncater with mapRetainSize=-1 and
>> > reduceRetainSize=-1
>> > 2011-09-16 09:22:54,746 WARN org.apache.hadoop.mapred.Child: Error
>> > running
>> > child
>> > java.io.IOException: mysqldump terminated with status 5
>> >       at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:476)
>> >       at
>> >
>> > com.cloudera.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:49)
>> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:647)
>> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:323)
>> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
>> >       at java.security.AccessController.doPrivileged(Native Method)
>> >       at javax.security.auth.Subject.doAs(Subject.java:396)
>> >       at
>> >
>> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
>> >       at org.apache.hadoop.mapred.Child.main(Child.java:264)
>> > 2011-09-16 09:22:54,750 INFO org.apache.hadoop.mapred.Task: Runnning
>> > cleanup
>> > for the task
>> >
>> >
>> > --
>> > Eric H.
>> > eric.hardway@gmail.com
>> >
>> >
>>
>
>
>
> --
> Eric H.
> eric.hardway@gmail.com
>
>

Re: [sqoop-user] Problem using sqoop with --direct (mysqldump)

Posted by Kate Ting <ka...@cloudera.com>.
Eric - what is the exact Sqoop command that you ran (including, if
applicable, contents of the options-file)?
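
For example, if you use an options-file, it is one option or value per
line ("#" starts a comment line), e.g. a hypothetical import.txt:

  import
  --connect
  jdbc:mysql://dbhost/test
  --table
  table1
  --direct

invoked as: sqoop --options-file import.txt -m 1

(For what it's worth, errno 32 in your task syslog is EPIPE, i.e. a
broken pipe: the ReparsingStreamThread reading mysqldump's output died
with the IndexOutOfBoundsException, so mysqldump's next write likely
failed and it exited with status 5.)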

Regards, Kate
