Posted to user@sqoop.apache.org by Amit <mr...@gmail.com> on 2013/05/06 12:01:43 UTC

Sqoop2 : The type is not supported - BigDecimal

Hi,

I am not able to import MySQL tables containing the DECIMAL datatype. Am I
doing something wrong? Here is the Sqoop log file:

java.lang.Exception: org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0017:Error occurs during extractor run
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:400)
Caused by: org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0017:Error occurs during extractor run
    at org.apache.sqoop.job.mr.SqoopMapper.run(SqoopMapper.java:94)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:725)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:232)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
    at java.util.concurrent.FutureTask.run(FutureTask.java:166)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:722)
Caused by: org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0013:Cannot write to the data writer
    at org.apache.sqoop.job.mr.SqoopMapper$MapDataWriter.writeContent(SqoopMapper.java:142)
    at org.apache.sqoop.job.mr.SqoopMapper$MapDataWriter.writeArrayRecord(SqoopMapper.java:124)
    at org.apache.sqoop.connector.jdbc.GenericJdbcImportExtractor.extract(GenericJdbcImportExtractor.java:60)
    at org.apache.sqoop.connector.jdbc.GenericJdbcImportExtractor.extract(GenericJdbcImportExtractor.java:31)
    at org.apache.sqoop.job.mr.SqoopMapper.run(SqoopMapper.java:89)
    ... 9 more
Caused by: java.io.IOException: org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0012:The type is not supported - java.math.BigDecimal
    at org.apache.sqoop.job.io.Data.writeArray(Data.java:309)
    at org.apache.sqoop.job.io.Data.write(Data.java:171)
    at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:100)
    at org.apache.hadoop.io.serializer.WritableSerialization$WritableSerializer.serialize(WritableSerialization.java:84)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:1075)
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.write(MapTask.java:655)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at org.apache.sqoop.job.mr.SqoopMapper$MapDataWriter.writeContent(SqoopMapper.java:140)
    ... 13 more
Caused by: org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0012:The type is not supported - java.math.BigDecimal
    ... 22 more

-- 
Thanks,
Amit
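
The innermost "Caused by" is the actual failure: the generic JDBC connector hands each fetched row to org.apache.sqoop.job.io.Data, and Data.writeArray() in this Sqoop version only knows a fixed set of Java field types. A MySQL DECIMAL column comes back from the JDBC driver as java.math.BigDecimal, which has no branch, so the write fails as soon as the map output is serialized (the WritableSerialization / MapOutputBuffer frames above). A minimal sketch of that kind of type dispatch, with illustrative names and wire tags rather than the actual Sqoop source:

import java.io.DataOutput;
import java.io.IOException;

public class WriteArraySketch {
    // Hypothetical reconstruction of the failing pattern: each supported
    // field type gets its own branch; anything unrecognized is rejected
    // with the message seen in the log.
    public static void writeArray(DataOutput out, Object[] fields)
            throws IOException {
        out.writeInt(fields.length);
        for (Object field : fields) {
            if (field instanceof Integer) {
                out.writeByte(1);
                out.writeInt((Integer) field);
            } else if (field instanceof Long) {
                out.writeByte(2);
                out.writeLong((Long) field);
            } else if (field instanceof Double) {
                out.writeByte(3);
                out.writeDouble((Double) field);
            } else if (field instanceof String) {
                out.writeByte(4);
                out.writeUTF((String) field);
            } else {
                // No branch for java.math.BigDecimal in this version, so
                // DECIMAL columns end up here as MAPRED_EXEC_0012.
                throw new IOException("The type is not supported - "
                        + field.getClass().getName());
            }
        }
    }
}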

Re: Sqoop2 : The type is not supported - BigDecimal

Posted by Amit <mr...@gmail.com>.
Thanks a lot for your support.


On Tue, May 21, 2013 at 5:33 PM, Jarek Jarcec Cecho <ja...@apache.org> wrote:

> You seem to be using the local job runner (based on the external id
> "job_local"), which does not support retrieving the status of submitted
> jobs. I would advise switching to a real Hadoop cluster.
>
> Jarcec

Re: Sqoop2 : The type is not supported - BigDecimal

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
You seem to be using the local job runner (based on the external id "job_local"), which does not support retrieving the status of submitted jobs. I would advise switching to a real Hadoop cluster.

Jarcec
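
For what it's worth, on Hadoop 2 the client only submits to the cluster when its own configuration says so. A minimal sketch of the relevant mapred-site.xml entry on the client side, assuming a YARN setup like the one shared later in this thread, would be:

<?xml version="1.0"?>
<configuration>
  <!-- Assumption: without this property the client commonly falls back to
       the local job runner, which yields job_local_* ids and no queryable
       status. Setting it routes submissions to the YARN cluster. -->
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>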

On Tue, May 21, 2013 at 05:27:18PM +0530, Amit wrote:
> Thank you, it works now. But I am getting job Status: UNKNOWN. Can you
> please tell me what's wrong?
> 
> 
> sqoop:000> submission status --jid 8
> Submission details
> Job id: 8
> Status: UNKNOWN
> Creation date: 2013-05-21 17:20:24 IST
> Last update date: 2013-05-21 17:20:59 IST
> External Id: job_local_0005
>         http://localhost:8080/

Re: Sqoop2 : The type is not supported - BigDecimal

Posted by Amit <mr...@gmail.com>.
Thank you, it works now. But I am getting job Status: UNKNOWN. Can you
please tell me what's wrong?


sqoop:000> submission status --jid 8
Submission details
Job id: 8
Status: UNKNOWN
Creation date: 2013-05-21 17:20:24 IST
Last update date: 2013-05-21 17:20:59 IST
External Id: job_local_0005
        http://localhost:8080/



On Tue, May 21, 2013 at 5:05 PM, Jarek Jarcec Cecho <ja...@apache.org> wrote:

> Hi Amit,
> it seems that you've set extractors and loaders to 1; would you mind
> removing the values in both fields?
>
> Jarcec


-- 
Thanks,
Amit

Re: Sqoop2 : The type is not supported - BigDecimal

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Amit,
it seems that you've set extractors and loaders to 1; would you mind removing the values in both fields?

Jarcec
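
In the 1.99.x shell the throttling values can be cleared by editing the job in place with "update job" and pressing Enter at the Extractors and Loaders prompts to leave them empty. The session below is a rough sketch from memory rather than captured output, so the exact prompts and the final status line may differ:

sqoop:000> update job --jid 8
...
Throttling resources
Extractors:
Loaders:
Job was successfully updated with status FINE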

On Tue, May 21, 2013 at 04:54:16PM +0530, Amit wrote:
> sqoop:000> show job --jid 8
> 1 job(s) to show:
> Job with id 8 and name First job (Created 5/6/13 3:10 PM, Updated 5/6/13
> 8:21 PM)
> Using Connection id 5 and Connector id 1
>   Database configuration
>     Schema name:
>     Table name: business_entities
>     Table SQL statement:
>     Table column names: Company
>     Partition column name: ID
>     Boundary query:
>   Output configuration
>     Storage type: HDFS
>     Output format: TEXT_FILE
>     Output directory: /landscape/MySQL
>   Throttling resources
>     Extractors: 1
>     Loaders: 1
> 
> 
> 
> -- 
> Thanks,
> Amit

> <?xml version="1.0"?>
> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
> 
> <!-- Put site-specific property overrides in this file. -->
> 
> <configuration>
> 
>   <property>
>     <name>mapred.capacity-scheduler.queue.hive.capacity</name>
>     <value>70</value>
>     <description>Percentage of the number of slots in the cluster that are to be available for jobs in this queue.</description>
>   </property>
>   <property>
>     <name>mapreduce.task.io.sort.mb</name>
>     <value>1</value>
>   </property>
>   <property>
>     <name>mapred.child.java.opts</name>
>     <value>-Xmx1024m -server</value>
>   </property>
> 
> </configuration>

> <?xml version="1.0"?>
> <configuration>
> 
> <!-- Site specific YARN configuration properties -->
>   <property>
>     <name>yarn.resourcemanager.address</name>
>     <value>linux-test-10:8032</value>
>   </property>
> </configuration>
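
A side observation about the mapred-site.xml above, not something raised in the thread: mapreduce.task.io.sort.mb is set to 1, a 1 MB map-side sort buffer, while the stock Hadoop default is 100. The trace fails inside MapTask$MapOutputBuffer.collect, which is exactly the code filling that buffer, so once the type issue is resolved a more typical value is worth restoring:

  <property>
    <name>mapreduce.task.io.sort.mb</name>
    <value>100</value>
  </property>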


Re: Sqoop2 : The type is not supported - BigDecimal

Posted by Amit <mr...@gmail.com>.
sqoop:000> show job --jid 8
1 job(s) to show:
Job with id 8 and name First job (Created 5/6/13 3:10 PM, Updated 5/6/13
8:21 PM)
Using Connection id 5 and Connector id 1
  Database configuration
    Schema name:
    Table name: business_entities
    Table SQL statement:
    Table column names: Company
    Partition column name: ID
    Boundary query:
  Output configuration
    Storage type: HDFS
    Output format: TEXT_FILE
    Output directory: /landscape/MySQL
  Throttling resources
    Extractors: 1
    Loaders: 1
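
One possible workaround on this Sqoop version, an assumption on my part rather than anything confirmed in the thread, is to leave "Table name" empty and put a query in "Table SQL statement" that casts DECIMAL columns to a type the Data writer supports. The column name "amount" below is hypothetical; ${CONDITIONS} is the placeholder the generic JDBC connector substitutes its partition predicate into:

SELECT ID, Company, CAST(amount AS CHAR) AS amount
  FROM business_entities
 WHERE ${CONDITIONS}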





On Tue, May 21, 2013 at 3:17 PM, Jarek Jarcec Cecho <ja...@apache.org> wrote:

> Hi sir,
> would you mind sharing the output of the "show job" command and the
> mapreduce job configuration XML file?
>
> Jarcec



-- 
Thanks,
Amit

Re: Sqoop2 : The type is not supported - BigDecimal

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi sir,
would you mind sharing the output of the "show job" command and the mapreduce job configuration XML file?

Jarcec

On Tue, May 14, 2013 at 01:01:52PM +0530, Amit wrote:
> None, I am using default extractors and loaders.
> 
> -- 
> Thanks,
> Amit

Re: Sqoop2 : The type is not supported - BigDecimal

Posted by Amit <mr...@gmail.com>.
None, I am using default extractors and loaders.

-- 
Thanks,
Amit


On Mon, May 13, 2013 at 10:41 PM, Jarek Jarcec Cecho <ja...@apache.org> wrote:

> Hi Amit,
> how many extractors and loaders do you have configured in this job?
>
> Jarcec

Re: Sqoop2 : The type is not supported - BigDecimal

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Amit,
how many extractors and loaders do you have configured in this job?

Jarcec

On Mon, May 06, 2013 at 03:31:43PM +0530, Amit wrote:
> Hi,
>
> I am not able to import MySQL tables containing the DECIMAL datatype. Am I
> doing something wrong? Here is the Sqoop log file:
>
> [stack trace identical to the original post at the top of this thread]
>
> --
> Thanks,
> Amit
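
To confirm which columns the JDBC driver will hand back as java.math.BigDecimal, it can help to list the DECIMAL columns on the MySQL side first. This is standard information_schema, nothing Sqoop-specific:

SELECT COLUMN_NAME, COLUMN_TYPE
  FROM information_schema.COLUMNS
 WHERE TABLE_SCHEMA = DATABASE()
   AND TABLE_NAME = 'business_entities'
   AND DATA_TYPE = 'decimal';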