Posted to common-user@hadoop.apache.org by Nagaraj K <na...@yahoo-inc.com> on 2009/04/01 22:05:14 UTC

Reducer side output

Hi,

I am trying to write a side-effect output file in addition to the usual output from the reducer.
But when the reducer attempts the side-effect write, I get the following error.

org.apache.hadoop.fs.permission.AccessControlException: org.apache.hadoop.fs.permission.AccessControlException: Permission denied: user=nagarajk, access=WRITE, inode="":hdfs:hdfs:rwxr-xr-x
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:90)
        at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:52)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:2311)
        at org.apache.hadoop.dfs.DFSClient.create(DFSClient.java:477)
        at org.apache.hadoop.dfs.DistributedFileSystem.create(DistributedFileSystem.java:178)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:503)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:484)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:391)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:383)
        at org.yahoo.delphi.DecisionTree$AttStatReducer.reduce(DecisionTree.java:1310)
        at org.yahoo.delphi.DecisionTree$AttStatReducer.reduce(DecisionTree.java:1275)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:319)
        at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2206)

My reducer code:
=============
conf.set("group_stat", "some_path"); // Set on the JobConf while configuring the job

public static class ReducerClass extends MapReduceBase implements Reducer<Text,DoubleWritable,Text,DoubleWritable> {
        FSDataOutputStream part = null;
        JobConf conf;

        // Override MapReduceBase.configure() to capture the JobConf; without
        // this, the conf field stays null and FileSystem.get(conf) throws.
        public void configure(JobConf job) {
            this.conf = job;
        }

        public void reduce(Text key, Iterator<DoubleWritable> values,
                           OutputCollector<Text,DoubleWritable> output,
                           Reporter reporter) throws IOException {
            double i_sum = 0.0;
            while (values.hasNext()) {
                i_sum += values.next().get(); // DoubleWritable exposes its value via get()
            }
            String[] fields = key.toString().split(SEP); // SEP is defined elsewhere in the class
            if (fields.length == 1) {
                if (part == null) {
                    FileSystem fs = FileSystem.get(conf);
                    String jobpart = conf.get("mapred.task.partition");
                    part = fs.create(new Path(conf.get("group_stat"), "/part-000" + jobpart)); // Failing here
                }
                part.writeBytes(fields[0] + "\t" + i_sum + "\n");
            } else {
                output.collect(key, new DoubleWritable(i_sum));
            }
        }

        // Close the side-effect stream when the task finishes so buffered data is flushed.
        public void close() throws IOException {
            if (part != null) {
                part.close();
            }
        }
}

Can you guys let me know what I am doing wrong here?

Thanks
Nagaraj K

Re: Reducer side output

Posted by Rasit OZDAS <ra...@gmail.com>.
I think the problem is that you don't have permission to write to the path you are using; note that the error shows inode="" owned by hdfs:hdfs, i.e. the root directory of HDFS. Did you try it with a path under your user directory?

You can also change the permissions from the console, e.g. with hadoop fs -chmod or hadoop fs -chown.
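
My guess at why the write ends up at the root (based on how org.apache.hadoop.fs.Path resolves its two arguments): in

    new Path(conf.get("group_stat"), "/part-000"+jobpart)

the child string starts with "/", and Path resolves an absolute child against the filesystem root rather than under the parent, so the file is created as /part-000N at the top of HDFS, which only the hdfs user can write to. A minimal sketch of the fix, with the leading slash dropped (the /user/nagarajk location is only an example; any directory your user owns will do):

    // Sketch only: assumes group_stat was set to a user-writable directory in
    // the driver, e.g. conf.set("group_stat", "/user/nagarajk/group_stat").
    FileSystem fs = FileSystem.get(conf);
    String jobpart = conf.get("mapred.task.partition");
    // No leading "/" on the child, so it resolves under group_stat.
    part = fs.create(new Path(conf.get("group_stat"), "part-000" + jobpart));

Also, if I remember correctly, the old API has a mechanism built for side-effect files: FileOutputFormat.getWorkOutputPath(conf) returns a task-specific directory whose contents are promoted next to the regular part files when the task commits, which additionally keeps speculative task attempts from overwriting each other.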

-- 
M. Raşit ÖZDAŞ