Posted to user@sqoop.apache.org by "Peleg, Eyal" <ey...@intel.com> on 2013/06/09 13:39:51 UTC

sqoop export - parse exception

My export command is as follows:


sqoop export --connect 'jdbc:sqlserver://papdb-dev.intel.com:3180;username=epmsysadm;password=s!peruser; DATABASE=AdvancedBIsystem' --table testing --export-dir  /user/eyapeleg/test --input-fields-terminated-by '\t'  --input-escaped-by '\t'  --lines-terminated-by '\n' --username epmsysadm --password 's!peruser'

*Note: I used a tab-delimited format.

I'm able to export the following table:

a              a
b             b

but I fail to export the next one:

aa           a
bb           a

I get a parse exception:

java.io.IOException: Could not parse record: aa a
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:80)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
        at org.apache.hadoop.mapred.MapTask.runNewMapper_aroundBody4(MapTask.java:813)
        at org.apache.hadoop.mapred.MapTask$AjcClosure5.run(MapTask.java:1)
        at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
...
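One thing worth checking before digging further: the exception prints the record as "aa a", where the separator looks like a space rather than a tab. A minimal check (a hypothetical diagnostic, not part of Sqoop) is to print the code point of each character in a sample record; a real tab is code point 9, a space is 32:

```java
// Hypothetical diagnostic (not part of Sqoop): print the code point of each
// character in a sample record to confirm whether the separator really is a
// tab (code point 9) or something else, such as spaces (code point 32).
public class SeparatorCheck {
    public static String describe(String record) {
        StringBuilder sb = new StringBuilder();
        for (char c : record.toCharArray()) {
            sb.append((int) c).append(' ');
        }
        return sb.toString().trim();
    }

    public static void main(String[] args) {
        // "aa" separated from "a" by a real tab vs. by a space
        System.out.println(describe("aa\ta")); // 97 97 9 97
        System.out.println(describe("aa a"));  // 97 97 32 97
    }
}
```

If the separator in the HDFS file turns out not to be a literal tab, --input-fields-terminated-by '\t' would have nothing to split on.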

Best Regards,




---------------------------------------------------------------------
Intel Electronics Ltd.

This e-mail and any attachments may contain confidential material for
the sole use of the intended recipient(s). Any review or distribution
by others is strictly prohibited. If you are not the intended
recipient, please contact the sender and delete all copies.

Re: sqoop export - parse exception

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Peleg,
what issues are you seeing with getting the files from HDFS? The way to get a table's create statement differs depending on your target database, so I would advise contacting your DBA if you do not know how.

Jarcec

On Sun, Jun 09, 2013 at 04:29:05PM +0000, Peleg, Eyal wrote:
> As I'm dealing with an HDFS files I'm unable to attach the actual file, just the java file that the sqoop creates while exporting the data.
> As for the exported table's DDL, Not sure where to find it..
> 



RE: sqoop export - parse exception

Posted by "Peleg, Eyal" <ey...@intel.com>.
As I'm dealing with HDFS files, I'm unable to attach the actual file, only the Java file that Sqoop creates while exporting the data.
As for the exported table's DDL, I'm not sure where to find it.


Re: sqoop export - parse exception

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Peleg,
please do attach the real files and not their content. Various email editors might transform the data, so we need the vanilla data that you are transferring. Also, in order to reproduce the issue, we will need the exported table's DDL.

Jarcec

On Sun, Jun 09, 2013 at 04:10:49PM +0000, Peleg, Eyal wrote:
> I'm not sure about the sqoop version yet, I will come back to you with an answer later.
> 
> The example goes as follows:
> 
> A Table which was transferred successfully:
> 
> 	a    b
> 	a    b
> 
> 
> A Table which failed transfer:
> 
> 	aa   b
> 	aa   b
> 
> Important things to mention:
> 	*tab delimited
> 	*the hdfs file I want to export was produced using a java application
> 	* I already tried other delimiters
> 

RE: sqoop export - parse exception

Posted by "Peleg, Eyal" <ey...@intel.com>.
I'm not sure about the Sqoop version yet; I will come back to you with an answer later.

The example goes as follows:

A Table which was transferred successfully:

	a    b
	a    b


A Table which failed transfer:

	aa   b
	aa   b

Important things to mention:
	*tab delimited
	*the hdfs file I want to export was produced using a java application
	* I already tried other delimiters


Re: sqoop export - parse exception

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Peleg,
this seems to me like a data corruption issue; would you mind sharing with us the table schema and a small example input file that triggers the issue? I expect that the number of columns in the table and in the file will be different.

What Sqoop version are you using? We've made significant improvements in Sqoop 1.4.3 (for example SQOOP-720) that simplify triaging such issues, so you might consider upgrading.

Jarcec

On Sun, Jun 09, 2013 at 03:52:46PM +0000, Peleg, Eyal wrote:
> First, thx for the quick response..
> 
> 
> The command:
> sqoop export ... --table testing --export-dir  /user/eyapeleg/test --input-fields-terminated-by '\t' --lines-terminated-by '\n' --verbose ...
> 
> 
> the exception:
> java.util.NoSuchElementException
> 
> 
> the log:
> 
> 13/06/09 08:48:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> 13/06/09 08:48:19 ERROR sqoop.ConnFactory: Error loading ManagerFactory information from file /usr/lib/sqoop/conf/managers.d/connectors: java.io.IOException: Could not load jar /usr/lib/ into JVM. (Could not find class com.cloudera.sqoop.manager.NetezzaManagerFactory.)
>         at org.apache.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:92)
>         at com.cloudera.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:36)
>         at org.apache.sqoop.ConnFactory.addManagersFromFile(ConnFactory.java:159)
>         at org.apache.sqoop.ConnFactory.loadManagersFromConfDir(ConnFactory.java:218)
>         at org.apache.sqoop.ConnFactory.instantiateFactories(ConnFactory.java:83)
>         at org.apache.sqoop.ConnFactory.<init>(ConnFactory.java:60)
>         at com.cloudera.sqoop.ConnFactory.<init>(ConnFactory.java:36)
>         at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:202)
>         at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:86)
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> Caused by: java.lang.ClassNotFoundException: com.cloudera.sqoop.manager.NetezzaManagerFactory
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>         at java.net.FactoryURLClassLoader.loadClass(URLClassLoader.java:627)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:247)
>         at org.apache.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:88)
>         ... 15 more
> 
> 13/06/09 08:48:19 ERROR sqoop.ConnFactory: Could not load ManagerFactory com.cloudera.sqoop.manager.NetezzaManagerFactory (not found)
> 13/06/09 08:48:19 INFO manager.SqlManager: Using default fetchSize of 1000
> 13/06/09 08:48:19 INFO tool.CodeGenTool: Beginning code generation
> 13/06/09 08:48:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testing AS t WHERE 1=0
> 13/06/09 08:48:19 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop/libexec/..
> 13/06/09 08:48:19 INFO orm.CompilationManager: Found hadoop core jar at: /usr/lib/hadoop/libexec/../hadoop-core.jar
> Note: /tmp/sqoop-eyapeleg/compile/637c112af558b7f914cfbd4024cbdb82/testing.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 13/06/09 08:48:21 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-eyapeleg/compile/637c112af558b7f914cfbd4024cbdb82/testing.java to /home/eyapeleg/./testing.java
> org.apache.commons.io.FileExistsException: Destination '/home/eyapeleg/./testing.java' already exists
>         at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
>         at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
>         at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
>         at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
>         at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:97)
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
>         at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
>         at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> 13/06/09 08:48:21 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-eyapeleg/compile/637c112af558b7f914cfbd4024cbdb82/testing.jar
> 13/06/09 08:48:21 INFO mapreduce.ExportJobBase: Beginning export of testing
> 13/06/09 08:48:26 INFO input.FileInputFormat: Total input paths to process : 1
> 13/06/09 08:48:26 INFO input.FileInputFormat: Total input paths to process : 1
> 13/06/09 08:48:26 INFO mapred.JobClient: Running job: job_201305130941_2945
> 13/06/09 08:48:27 INFO mapred.JobClient:  map 0% reduce 0%
> 13/06/09 08:48:46 INFO mapred.JobClient: Task Id : attempt_201305130941_2945_m_000000_0, Status : FAILED
> java.util.NoSuchElementException
>         at java.util.AbstractList$Itr.next(AbstractList.java:350)
>         at testing.__loadFromFields(testing.java:194)
>         at testing.parse(testing.java:143)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper_aroundBody4(MapTask.java:813)
>         at org.apache.hadoop.mapred.MapTask$AjcClosure5.run(MapTask.java:1)
>         at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
> ...
> 13/06/09 08:48:52 INFO mapred.JobClient: Task Id : attempt_201305130941_2945_m_000000_1, Status : FAILED
> java.util.NoSuchElementException
>         at java.util.AbstractList$Itr.next(AbstractList.java:350)
>         at testing.__loadFromFields(testing.java:194)
>         at testing.parse(testing.java:143)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper_aroundBody4(MapTask.java:813)
>         at org.apache.hadoop.mapred.MapTask$AjcClosure5.run(MapTask.java:1)
>         at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
> ...
> 13/06/09 08:49:02 INFO mapred.JobClient: Task Id : attempt_201305130941_2945_m_000000_2, Status : FAILED
> java.util.NoSuchElementException
>         at java.util.AbstractList$Itr.next(AbstractList.java:350)
>         at testing.__loadFromFields(testing.java:194)
>         at testing.parse(testing.java:143)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper_aroundBody4(MapTask.java:813)
>         at org.apache.hadoop.mapred.MapTask$AjcClosure5.run(MapTask.java:1)
>         at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
> ...
> 13/06/09 08:49:17 INFO mapred.JobClient: Job complete: job_201305130941_2945
> 13/06/09 08:49:17 INFO mapred.JobClient: Counters: 8
> 13/06/09 08:49:17 INFO mapred.JobClient:   Job Counters
> 13/06/09 08:49:17 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=34661
> 13/06/09 08:49:17 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 13/06/09 08:49:17 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 13/06/09 08:49:17 INFO mapred.JobClient:     Rack-local map tasks=3
> 13/06/09 08:49:17 INFO mapred.JobClient:     Launched map tasks=4
> 13/06/09 08:49:17 INFO mapred.JobClient:     Data-local map tasks=1
> 13/06/09 08:49:17 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
> 13/06/09 08:49:17 INFO mapred.JobClient:     Failed map tasks=1
> 13/06/09 08:49:17 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 55.4027 seconds (0 bytes/sec)
> 13/06/09 08:49:17 INFO mapreduce.ExportJobBase: Exported 0 records.
> 13/06/09 08:49:17 ERROR tool.ExportTool: Error during export: Export job failed! 
> 

RE: sqoop export - parse exception

Posted by "Peleg, Eyal" <ey...@intel.com>.
First, thanks for the quick response.


The command:
sqoop export ... --table testing --export-dir  /user/eyapeleg/test --input-fields-terminated-by '\t' --lines-terminated-by '\n' --verbose ...


the exception:
java.util.NoSuchElementException


the log:

13/06/09 08:48:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/06/09 08:48:19 ERROR sqoop.ConnFactory: Error loading ManagerFactory information from file /usr/lib/sqoop/conf/managers.d/connectors: java.io.IOException: Could not load jar /usr/lib/ into JVM. (Could not find class com.cloudera.sqoop.manager.NetezzaManagerFactory.)
        at org.apache.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:92)
        at com.cloudera.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:36)
        at org.apache.sqoop.ConnFactory.addManagersFromFile(ConnFactory.java:159)
        at org.apache.sqoop.ConnFactory.loadManagersFromConfDir(ConnFactory.java:218)
        at org.apache.sqoop.ConnFactory.instantiateFactories(ConnFactory.java:83)
        at org.apache.sqoop.ConnFactory.<init>(ConnFactory.java:60)
        at com.cloudera.sqoop.ConnFactory.<init>(ConnFactory.java:36)
        at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:202)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:86)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
        at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
Caused by: java.lang.ClassNotFoundException: com.cloudera.sqoop.manager.NetezzaManagerFactory
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at java.net.FactoryURLClassLoader.loadClass(URLClassLoader.java:627)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:247)
        at org.apache.sqoop.util.ClassLoaderStack.addJarFile(ClassLoaderStack.java:88)
        ... 15 more

13/06/09 08:48:19 ERROR sqoop.ConnFactory: Could not load ManagerFactory com.cloudera.sqoop.manager.NetezzaManagerFactory (not found)
13/06/09 08:48:19 INFO manager.SqlManager: Using default fetchSize of 1000
13/06/09 08:48:19 INFO tool.CodeGenTool: Beginning code generation
13/06/09 08:48:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM testing AS t WHERE 1=0
13/06/09 08:48:19 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop/libexec/..
13/06/09 08:48:19 INFO orm.CompilationManager: Found hadoop core jar at: /usr/lib/hadoop/libexec/../hadoop-core.jar
Note: /tmp/sqoop-eyapeleg/compile/637c112af558b7f914cfbd4024cbdb82/testing.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/06/09 08:48:21 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-eyapeleg/compile/637c112af558b7f914cfbd4024cbdb82/testing.java to /home/eyapeleg/./testing.java
org.apache.commons.io.FileExistsException: Destination '/home/eyapeleg/./testing.java' already exists
        at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:2378)
        at org.apache.sqoop.orm.CompilationManager.compile(CompilationManager.java:227)
        at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:83)
        at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
        at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:97)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
        at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
13/06/09 08:48:21 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-eyapeleg/compile/637c112af558b7f914cfbd4024cbdb82/testing.jar
13/06/09 08:48:21 INFO mapreduce.ExportJobBase: Beginning export of testing
13/06/09 08:48:26 INFO input.FileInputFormat: Total input paths to process : 1
13/06/09 08:48:26 INFO input.FileInputFormat: Total input paths to process : 1
13/06/09 08:48:26 INFO mapred.JobClient: Running job: job_201305130941_2945
13/06/09 08:48:27 INFO mapred.JobClient:  map 0% reduce 0%
13/06/09 08:48:46 INFO mapred.JobClient: Task Id : attempt_201305130941_2945_m_000000_0, Status : FAILED
java.util.NoSuchElementException
        at java.util.AbstractList$Itr.next(AbstractList.java:350)
        at testing.__loadFromFields(testing.java:194)
        at testing.parse(testing.java:143)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
        at org.apache.hadoop.mapred.MapTask.runNewMapper_aroundBody4(MapTask.java:813)
        at org.apache.hadoop.mapred.MapTask$AjcClosure5.run(MapTask.java:1)
        at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
...
13/06/09 08:48:52 INFO mapred.JobClient: Task Id : attempt_201305130941_2945_m_000000_1, Status : FAILED
java.util.NoSuchElementException
        at java.util.AbstractList$Itr.next(AbstractList.java:350)
        at testing.__loadFromFields(testing.java:194)
        at testing.parse(testing.java:143)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
        at org.apache.hadoop.mapred.MapTask.runNewMapper_aroundBody4(MapTask.java:813)
        at org.apache.hadoop.mapred.MapTask$AjcClosure5.run(MapTask.java:1)
        at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
...
13/06/09 08:49:02 INFO mapred.JobClient: Task Id : attempt_201305130941_2945_m_000000_2, Status : FAILED
java.util.NoSuchElementException
        at java.util.AbstractList$Itr.next(AbstractList.java:350)
        at testing.__loadFromFields(testing.java:194)
        at testing.parse(testing.java:143)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:77)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
        at org.apache.hadoop.mapred.MapTask.runNewMapper_aroundBody4(MapTask.java:813)
        at org.apache.hadoop.mapred.MapTask$AjcClosure5.run(MapTask.java:1)
        at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
...
13/06/09 08:49:17 INFO mapred.JobClient: Job complete: job_201305130941_2945
13/06/09 08:49:17 INFO mapred.JobClient: Counters: 8
13/06/09 08:49:17 INFO mapred.JobClient:   Job Counters
13/06/09 08:49:17 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=34661
13/06/09 08:49:17 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
13/06/09 08:49:17 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
13/06/09 08:49:17 INFO mapred.JobClient:     Rack-local map tasks=3
13/06/09 08:49:17 INFO mapred.JobClient:     Launched map tasks=4
13/06/09 08:49:17 INFO mapred.JobClient:     Data-local map tasks=1
13/06/09 08:49:17 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=0
13/06/09 08:49:17 INFO mapred.JobClient:     Failed map tasks=1
13/06/09 08:49:17 INFO mapreduce.ExportJobBase: Transferred 0 bytes in 55.4027 seconds (0 bytes/sec)
13/06/09 08:49:17 INFO mapreduce.ExportJobBase: Exported 0 records.
13/06/09 08:49:17 ERROR tool.ExportTool: Error during export: Export job failed! 
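For what it's worth, the java.util.NoSuchElementException in the failed map attempts above comes from the generated testing.java pulling one value per table column from the list of parsed fields. The sketch below is a simplified, hypothetical rendering of that __loadFromFields logic (not Sqoop's actual generated code): when a record parses into fewer fields than the table has columns, the iterator runs dry:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;

// Simplified, hypothetical sketch of Sqoop's generated __loadFromFields logic:
// one it.next() call per expected table column.
public class LoadFromFieldsSketch {
    public static int loadFromFields(List<String> fields, int expectedColumns) {
        Iterator<String> it = fields.iterator();
        int assigned = 0;
        for (int col = 0; col < expectedColumns; col++) {
            // Throws NoSuchElementException when the record parsed into
            // fewer fields than the table has columns.
            it.next();
            assigned++;
        }
        return assigned;
    }

    public static void main(String[] args) {
        // A record that split correctly into two fields loads fine...
        System.out.println(loadFromFields(Arrays.asList("aa", "b"), 2));
        // ...but a record that never split ("aa b" stayed one field) blows up,
        // matching the failed map task attempts in the log above.
        try {
            loadFromFields(Arrays.asList("aa b"), 2);
        } catch (NoSuchElementException e) {
            System.out.println("NoSuchElementException");
        }
    }
}
```

So the exception is consistent with the record reaching the parser as a single unsplit token rather than two tab-separated fields.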


Re: sqoop export - parse exception

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Peleg,
thank you very much for your feedback! Would you mind sharing with us the entire Sqoop log generated with the parameter --verbose, but without --input-escaped-by? I would be interested to see what exception was thrown. If the job is failing on the MapReduce side, please also share the failed map task's log, as it might contain additional information.

Jarcec

On Sun, Jun 09, 2013 at 03:36:10PM +0000, Peleg, Eyal wrote:
> I tried without using the " input-escaped-by" statement, and it threw an exception.
> Only when I did use this statement, and even though they both had the same value, the first table I mentioned went through successfully.
> 
> -----Original Message-----
> From: Jarek Jarcec Cecho [mailto:jarcec@apache.org] 
> Sent: Sunday, June 09, 2013 17:58
> To: user@sqoop.apache.org
> Subject: Re: sqoop export - parse exception
> 
> Hi Peleg,
> I'm not sure that using the same value for both --input-fields-terminated-by and --input-escaped-by is correct. Is your input really delimited and at the same time escaped by tabulators?
> 
> Jarcec
> 
> On Sun, Jun 09, 2013 at 11:39:51AM +0000, Peleg, Eyal wrote:
> > My export command is as follows:
> > 
> > 
> > sqoop export --connect 'jdbc:sqlserver://papdb-dev.intel.com:3180;username=epmsysadm;password=s!peruser; DATABASE=AdvancedBIsystem' --table testing --export-dir  /user/eyapeleg/test --input-fields-terminated-by '\t'  --input-escaped-by '\t'  --lines-terminated-by '\n' --username epmsysadm --password 's!peruser'
> > 
> > *note! I used tab delimited format.
> > 
> > I'm able to export the following table:
> > 
> > a              a
> > b             b
> > 
> > but  fail to export the next one:
> > 
> > aa           a
> > bb           a
> > 
> > I get a parse exception:
> > 
> > java.io.IOException: Could not parse record: aa a
> >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:80)
> >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
> >         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> >         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
> >         at org.apache.hadoop.mapred.MapTask.runNewMapper_aroundBody4(MapTask.java:813)
> >         at org.apache.hadoop.mapred.MapTask$AjcClosure5.run(MapTask.java:1)
> >         at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
> > ...
> > 
> > Best Regards,
> > 
> > 
> > 
> > 
> 

RE: sqoop export - parse exception

Posted by "Peleg, Eyal" <ey...@intel.com>.
I tried without using the --input-escaped-by option, and it threw an exception.
Only when I did use this option, even though both had the same value, did the first table I mentioned go through successfully.

-----Original Message-----
From: Jarek Jarcec Cecho [mailto:jarcec@apache.org] 
Sent: Sunday, June 09, 2013 17:58
To: user@sqoop.apache.org
Subject: Re: sqoop export - parse exception

Hi Peleg,
I'm not sure that using the same value for both --input-fields-terminated-by and --input-escaped-by is correct. Is your input really delimited and at the same time escaped by tabulators?

Jarcec

On Sun, Jun 09, 2013 at 11:39:51AM +0000, Peleg, Eyal wrote:
> My export command is as follows:
> 
> 
> sqoop export --connect 'jdbc:sqlserver://papdb-dev.intel.com:3180;username=epmsysadm;password=s!peruser; DATABASE=AdvancedBIsystem' --table testing --export-dir  /user/eyapeleg/test --input-fields-terminated-by '\t'  --input-escaped-by '\t'  --lines-terminated-by '\n' --username epmsysadm --password 's!peruser'
> 
> *note! I used tab delimited format.
> 
> I'm able to export the following table:
> 
> a              a
> b             b
> 
> but  fail to export the next one:
> 
> aa           a
> bb           a
> 
> I get a parse exception:
> 
> java.io.IOException: Could not parse record: aa a
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:80)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper_aroundBody4(MapTask.java:813)
>         at org.apache.hadoop.mapred.MapTask$AjcClosure5.run(MapTask.java:1)
>         at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
> ...
> 
> Best Regards,
> 
> 
> 
> 


Re: sqoop export - parse exception

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Peleg,
I'm not sure that using the same value for both --input-fields-terminated-by and --input-escaped-by is correct. Is your input really delimited and at the same time escaped by tab characters?

Jarcec
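The concern about reusing one character for both options can be illustrated with a toy escape-aware splitter. This is a hypothetical sketch, not Sqoop's actual RecordParser: in a scheme where the escape character is examined before the delimiter, making them the same character means every tab is consumed as an escape and the following character is taken literally, so the record never splits:

```java
import java.util.ArrayList;
import java.util.List;

// Toy escape-aware field splitter (hypothetical; not Sqoop's RecordParser).
// The escape character is examined before the delimiter, so if both are the
// same character the delimiter can never take effect.
public class EscapeSplitSketch {
    public static List<String> split(String record, char delimiter, char escape) {
        List<String> fields = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        for (int i = 0; i < record.length(); i++) {
            char c = record.charAt(i);
            if (c == escape && i + 1 < record.length()) {
                current.append(record.charAt(++i)); // take the next char literally
            } else if (c == delimiter) {
                fields.add(current.toString());     // end the current field
                current.setLength(0);
            } else {
                current.append(c);
            }
        }
        fields.add(current.toString());
        return fields;
    }

    public static void main(String[] args) {
        // Distinct escape ('\\') and delimiter ('\t'): two fields, as intended.
        System.out.println(split("aa\tb", '\t', '\\')); // [aa, b]
        // Escape == delimiter ('\t'): the tab escapes 'b' instead of splitting,
        // leaving a single field "aab" where two were expected.
        System.out.println(split("aa\tb", '\t', '\t')); // [aab]
    }
}
```

Under this reading, each line would parse to a single field, and the export mapper would then see fewer fields than the table has columns.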

On Sun, Jun 09, 2013 at 11:39:51AM +0000, Peleg, Eyal wrote:
> My export command is as follows:
> 
> 
> sqoop export --connect 'jdbc:sqlserver://papdb-dev.intel.com:3180;username=epmsysadm;password=s!peruser; DATABASE=AdvancedBIsystem' --table testing --export-dir  /user/eyapeleg/test --input-fields-terminated-by '\t'  --input-escaped-by '\t'  --lines-terminated-by '\n' --username epmsysadm --password 's!peruser'
> 
> *note! I used tab delimited format.
> 
> I'm able to export the following table:
> 
> a              a
> b             b
> 
> but  fail to export the next one:
> 
> aa           a
> bb           a
> 
> I get a parse exception:
> 
> java.io.IOException: Could not parse record: aa a
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:80)
>         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:36)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:183)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper_aroundBody4(MapTask.java:813)
>         at org.apache.hadoop.mapred.MapTask$AjcClosure5.run(MapTask.java:1)
>         at org.aspectj.runtime.reflect.JoinPointImpl.proceed(JoinPointImpl.java:149)
> ...
> 
> Best Regards,
> 
> 
> 
> 