Posted to user@hadoop.apache.org by Sandgorgon <sa...@myopera.com> on 2013/10/24 01:38:21 UTC

Error on createDatumWriter?

Hello Guys,

Could I ask for some help with an error I am getting? An excerpt of the
errors is below; I have no clue what the error points to or what I
should do to fix it:

Error:
org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter;
13/10/23 17:07:42 INFO mapreduce.Job: Task Id :
attempt_1379090674214_0017_m_000002_1, Status : FAILED
Error:
org.apache.avro.generic.GenericData.createDatumWriter(Lorg/apache/avro/Schema;)Lorg/apache/avro/io/DatumWriter;
Container killed by the ApplicationMaster.
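That one-line "Error:" is the signature of GenericData.createDatumWriter(Schema), which typically surfaces as a NoSuchMethodError when an older Avro jar on the task classpath shadows the 1.7.5 shipped with the job. As an editorial sketch (not from the original post), one way to confirm this is to print which jar a class was actually loaded from; inside the task one would pass org.apache.avro.generic.GenericData.class instead of the placeholder used here:

```java
public class WhichJar {
    // Returns the location a class was loaded from. Classes loaded by the
    // bootstrap classloader (e.g. java.lang.Object) have no CodeSource.
    public static String locationOf(Class<?> c) {
        java.security.CodeSource cs = c.getProtectionDomain().getCodeSource();
        return cs == null ? "bootstrap classloader" : cs.getLocation().toString();
    }

    public static void main(String[] args) {
        // In the MR task, substitute org.apache.avro.generic.GenericData.class
        // to see which Avro jar won on the classpath.
        System.out.println(locationOf(WhichJar.class));
    }
}
```

If the printed path points at a cluster-bundled Avro rather than the 1.7.5 jar shipped with the job, the error is a classpath clash rather than a bug in the job code.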

### The following is additional information that will hopefully be
useful.

Here is the environment I am working in: CDH 4.4.0 on YARN with Avro
1.7.5; I pass the avro/jackson jars along with my MR job via -libjars.

### My Mapper:

public class AttributeFilterMapper extends
        Mapper<AvroKey<GenericData.Record>, NullWritable,
               AvroKey<GenericData.Record>, NullWritable> {

    @Override
    public void map(AvroKey<GenericData.Record> key, NullWritable nil,
                    Context context) throws IOException, InterruptedException {
        GenericData.Record record = key.datum();

        // Avro generic records return Utf8 for string fields, so compare
        // via toString() rather than equals() against a String literal.
        if (record.get("delta_filter").toString().equals("neumann")) {
            context.write(new AvroKey<GenericData.Record>(record), nil);
        }
    }
}

### My Reducer

public class DefaultFormatReducer extends
        Reducer<AvroKey<GenericData.Record>, NullWritable, Text, Text> {

    @Override
    protected void reduce(AvroKey<GenericData.Record> key,
                          Iterable<NullWritable> values, Context context)
            throws IOException, InterruptedException {
        GenericData.Record record = key.datum();

        context.write(new Text("nd"),
                new Text(record.get("delta_filter").toString()));
    }
}

### My Main

    @Override
    public int run(String[] args) throws Exception {
        Job job = Job.getInstance(super.getConf());
        job.setJobName("BigDataHeader");
        job.setJar("bdh.jar");

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        Schema headerSchema = new Parser().parse(
                getClass().getResourceAsStream(
                        "/com/micron/cesoft/hadoop/tte/Header.avsc"));
        AvroJob.setInputKeySchema(job, headerSchema);
        AvroJob.setMapOutputKeySchema(job, headerSchema);

        job.setMapperClass(AttributeFilterMapper.class);
        job.setReducerClass(DefaultFormatReducer.class);

        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);

        return job.waitForCompletion(true) ? 0 : 1;
    }
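If the diagnosis is indeed a clash between the cluster's bundled Avro and the 1.7.5 shipped with the job, one hedged workaround on Hadoop 2 / CDH4 is to ask the framework to put user jars ahead of its own on the task classpath. This is an assumption to verify against the task logs, not a confirmed fix; the driver class name and jar paths below are placeholders:

```shell
# Hypothetical invocation: driver class and jar paths are placeholders.
hadoop jar bdh.jar com.example.BigDataHeaderDriver \
    -D mapreduce.job.user.classpath.first=true \
    -libjars avro-1.7.5.jar,avro-mapred-1.7.5-hadoop2.jar \
    input/ output/
```

Note that avro-mapred for Hadoop 2 clusters is published with the hadoop2 classifier; mixing the hadoop1 build with YARN is another common source of method-signature mismatches.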



Any help would be appreciated, as this is my first encounter with
Avro.

Best Regards,
Mon

Re: Error on createDatumWriter?

Posted by Ravi Prakash <ra...@ymail.com>.
Mon!

Can you see that task attempt's log? What error does it contain?
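For anyone following along, one way to pull those task logs on a YARN cluster (assuming log aggregation is enabled) is the yarn logs command; the application id below is derived from the attempt id in the error excerpt:

```shell
# attempt_1379090674214_0017_m_000002_1 -> application_1379090674214_0017
yarn logs -applicationId application_1379090674214_0017
```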






