Posted to user@sqoop.apache.org by YouPeng Yang <yy...@gmail.com> on 2013/04/15 08:30:10 UTC

WritableName can't load class

Dear ALL

  I have done an import job by running this:
  $/home/sqoop-1.4.1-cdh4.1.2/bin/sqoop import --connect
jdbc:oracle:thin:@10.167.14.225:1521:wxoss  -username XUJINGYU -password
123456  -target-dir sqoop/NMS_CMTS_MEMORY_CDX3 --query "select  a.* from
NMS_CMTS_MEMORY_CDX a where \$CONDITIONS" --split-by a.CMTSID
--as-sequencefile --class-name com.jhel.memoryseq

Note: this is a SequenceFile import, and I set the class name.

After the import job succeeded, I intended to run an MRv2 job [1] (on
Hadoop 2.0.0) to process the imported data files in HDFS. However, I got
the exception [2].

I have already copied memoryseq.class into the bin directory of my
project:
[root@Hadoop01 ~]# cp
 /tmp/sqoop-hadoop/compile/893b75fc25d3ade0272ab8fa1db420ef/com/jhel/memoryseq.class
 /home/hadoop/indigo_workspace/sqoopetl/bin/com/jhetl

The exception still occurs.

Please help me.




[1] The job driver is as follows:
======================================================
public static void main(String[] args) throws IOException,
        InterruptedException, ClassNotFoundException {
    Configuration conf = new Configuration();
    String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
    if (otherArgs.length != 2) {
        System.err.println("Usage: <input path> <output path>");
        System.exit(2);
    }
    @SuppressWarnings("deprecation")
    Job job = new Job(conf, "Data test2");
    job.setMapperClass(MEMMapper.class);
    job.setReducerClass(MEMReducer.class);
    job.setInputFormatClass(SequenceFileAsTextInputFormat.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);
    FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
    FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
}
=====================================================
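One common way to make a Sqoop-generated record class visible to a MapReduce job is to ship its jar with -libjars, which the driver above already supports through GenericOptionsParser. A sketch only, with placeholder jar and class names that are assumptions, not taken from the original post:

```shell
# Placeholder paths and names -- adjust to wherever the Sqoop-generated jar
# actually lives and to the driver's real class name.
# GenericOptionsParser in the driver above understands -libjars, so the jar
# is shipped with the job and added to the task classpath; HADOOP_CLASSPATH
# makes it visible to the client JVM as well.
export HADOOP_CLASSPATH=/path/to/memoryseq.jar
hadoop jar sqoopetl.jar MEMDriver -libjars /path/to/memoryseq.jar \
    sqoop/NMS_CMTS_MEMORY_CDX3 output/NMS_CMTS_MEMORY_CDX3
```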


[2]===================================================
...
2013-04-15 14:10:08,907 WARN  mapred.LocalJobRunner (LocalJobRunner.java:run(479)) - job_local_0001
java.lang.Exception: java.lang.RuntimeException: java.io.IOException: WritableName can't load class: com.jhel.memoryseq
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:400)
Caused by: java.lang.RuntimeException: java.io.IOException: WritableName can't load class: com.jhel.memoryseq
	at org.apache.hadoop.io.SequenceFile$Reader.getValueClass(SequenceFile.java:1966)
	at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1906)
	at org.apache.hadoop.io.SequenceFile$Reader.initialize(SequenceFile.java:1765)
	at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1714)
	at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1728)
	at org.apache.hadoop.mapreduce.lib.input.SequenceFileRecordReader.initialize(SequenceFileRecordReader.java:54)
	at org.apache.hadoop.mapreduce.lib.input.SequenceFileAsTextRecordReader.initialize(SequenceFileAsTextRecordReader.java:56)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:488)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:724)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
	at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:232)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
Caused by: java.io.IOException: WritableName can't load class: com.jhel.memoryseq
	at org.apache.hadoop.io.WritableName.getClass(WritableName.java:77)
	at org.apache.hadoop.io.SequenceFile$Reader.getValueClass(SequenceFile.java:1964)
	... 16 more
Caused by: java.lang.ClassNotFoundException: Class com.jhel.memoryseq not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1477)
	at org.apache.hadoop.io.WritableName.getClass(WritableName.java:75)
	... 17 more
2013-04-15 14:10:09,123 INFO  mapreduce.Job (Job.java:monitorAndPrintJob(1286)) - Job job_local_0001 running in uber mode : false
2013-04-15 14:10:09,124 INFO  mapreduce.Job (Job.java:monitorAndPrintJob(1293)) -  map 0% reduce 0%
2013-04-15 14:10:09,127 INFO  mapreduce.Job (Job.java:monitorAndPrintJob(1306)) - Job job_local_0001 failed with state FAILED due to: NA
2013-04-15 14:10:09,134 INFO  mapreduce.Job (Job.java:monitorAndPrintJob(1311)) - Counters: 0
===========================================================
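For context on the root cause in [2]: a SequenceFile header records the key and value classes by name, and Hadoop's WritableName resolves the recorded name reflectively at read time. The failure can be sketched without any Hadoop dependency (class names here are illustrative only):

```java
// Sketch of the reflective lookup behind the error above: Hadoop's
// WritableName.getClass() resolves the value class *name* stored in the
// SequenceFile header. If that name is not on the reading job's classpath,
// the lookup fails, which Hadoop surfaces as
// "WritableName can't load class: <name>".
public class WritableNameDemo {

    // Returns the resolved class, or null when it cannot be loaded
    // (Hadoop instead wraps the failure in an IOException).
    static Class<?> loadValueClass(String recordedName) {
        try {
            return Class.forName(recordedName);
        } catch (ClassNotFoundException e) {
            return null;
        }
    }

    public static void main(String[] args) {
        // On the JVM classpath: resolves fine.
        System.out.println(loadValueClass("java.lang.String")); // prints: class java.lang.String
        // Not on the classpath (the Sqoop-generated class lives in a
        // separate jar that was never shipped): resolution fails.
        System.out.println(loadValueClass("com.jhel.memoryseq")); // prints: null
    }
}
```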



Regards.

Re: WritableName can't load class

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi YouPeng,
I think that specifying the package name via --class-name might cause some issues, so I would suggest using the --class-name parameter only for the class name itself and --package-name for the desired package name. You can find more details about both parameters in our user guide:

http://sqoop.apache.org/docs/1.4.3/SqoopUserGuide.html

Jarcec
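For reference, the suggestion above would look roughly like this (connection details copied from the original post; the split of com.jhel.memoryseq into --package-name com.jhel plus the class name MemorySeq is an illustrative assumption):

```shell
# Hypothetical corrected import: package and class specified separately.
sqoop import \
  --connect jdbc:oracle:thin:@10.167.14.225:1521:wxoss \
  --username XUJINGYU --password 123456 \
  --target-dir sqoop/NMS_CMTS_MEMORY_CDX3 \
  --query "select a.* from NMS_CMTS_MEMORY_CDX a where \$CONDITIONS" \
  --split-by a.CMTSID \
  --as-sequencefile \
  --package-name com.jhel \
  --class-name MemorySeq
```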

On Mon, Apr 15, 2013 at 02:30:10PM +0800, YouPeng Yang wrote:
> [...]