Posted to user@pig.apache.org by Prasad GS <gs...@gmail.com> on 2013/10/16 08:29:44 UTC

java.io.IOException: Deserialization error: org.apache.hcatalog.data.schema.HCatSchema

Hi All,

I'm currently integrating Pig with HCatalog and then trying to run the Pig
scripts. I'm using Cloudera CDH 4.4.0 with pig-0.11.0+33, hive-0.10.0+198
and hcatalog-0.5.0+13.

When I use pig -useHCatalog to run my Pig scripts, everything works fine.
But when I try to launch the Pig scripts using PigServer (to run Pig from
Java), I get the exception below. Please note that I don't have any
problems running normal Pig scripts with PigServer; the exception appears
only when I load a table using org.apache.hcatalog.pig.HCatLoader().
Please let me know if anybody has an idea for getting around the issue.
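For context, the pig -useHCatalog wrapper works because it puts the HCatalog and Hive jars (and the Hive configuration) on Pig's classpath before launching. A minimal sketch of the equivalent setup, assuming a CDH 4.4.0 layout (the paths and jar names below are assumptions, not taken from the cluster above):

```shell
# Sketch only: approximate the extra classpath that `pig -useHCatalog`
# sets up. HCAT_HOME/HIVE_HOME defaults and jar names are assumptions
# for a CDH 4.4.0 install; adjust to the actual layout.
HCAT_HOME=${HCAT_HOME:-/usr/lib/hcatalog}
HIVE_HOME=${HIVE_HOME:-/usr/lib/hive}
HCAT_JAR="$HCAT_HOME/share/hcatalog/hcatalog-core.jar"
HIVE_JARS="$HIVE_HOME/lib/hive-metastore.jar:$HIVE_HOME/lib/hive-exec.jar"
# hive-site.xml must also be visible so HCatLoader can find the metastore.
export PIG_CLASSPATH="$HCAT_JAR:$HIVE_JARS:$HIVE_HOME/conf"
echo "$PIG_CLASSPATH"
```

A PigServer-based launcher needs the same jars on its own classpath, and (as the stack trace shows) on the classpath of the backend map tasks as well.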

2013-10-16 11:37:21,588 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
2013-10-16 11:37:21,843 INFO
org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating
symlink: /data2/hadoop/mapred/local/taskTracker/yumecorp/jobcache/job_201310071233_0062/jars/job.jar
<- /data3/hadoop/mapred/local/taskTracker/yumecorp/jobcache/job_201310071233_0062/attempt_201310071233_0062_m_000002_0/work/job.jar
2013-10-16 11:37:21,845 INFO
org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating
symlink: /data2/hadoop/mapred/local/taskTracker/yumecorp/jobcache/job_201310071233_0062/jars/.job.jar.crc
<- /data3/hadoop/mapred/local/taskTracker/yumecorp/jobcache/job_201310071233_0062/attempt_201310071233_0062_m_000002_0/work/.job.jar.crc
2013-10-16 11:37:21,881 WARN org.apache.hadoop.conf.Configuration:
session.id is deprecated. Instead, use dfs.metrics.session-id
2013-10-16 11:37:21,881 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
Initializing JVM Metrics with processName=MAP, sessionId=
2013-10-16 11:37:22,135 INFO
org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs'
truncater with mapRetainSize=-1 and reduceRetainSize=-1
2013-10-16 11:37:22,137 ERROR
org.apache.hadoop.security.UserGroupInformation:
PriviledgedActionException as:yumecorp (auth:SIMPLE)
cause:java.io.IOException: Deserialization error:
org.apache.hcatalog.data.schema.HCatSchema
2013-10-16 11:37:22,137 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException: Deserialization error:
org.apache.hcatalog.data.schema.HCatSchema
	at org.apache.pig.impl.util.ObjectSerializer.deserialize(ObjectSerializer.java:60)
	at org.apache.pig.impl.util.UDFContext.deserialize(UDFContext.java:192)
	at org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil.setupUDFContext(MapRedUtil.java:159)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:229)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getOutputCommitter(PigOutputFormat.java:275)
	at org.apache.hadoop.mapred.Task.initialize(Task.java:526)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:313)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.ClassNotFoundException:
org.apache.hcatalog.data.schema.HCatSchema
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:247)
	at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:603)
	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1574)
	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1495)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1731)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
	at java.util.Hashtable.readObject(Hashtable.java:859)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:969)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1848)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
	at java.util.HashMap.readObject(HashMap.java:1030)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:969)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1848)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
	at org.apache.pig.impl.util.ObjectSerializer.deserialize(ObjectSerializer.java:58)
	... 11 more
2013-10-16 11:37:22,140 INFO org.apache.hadoop.mapred.Task: Runnning
cleanup for the task
2013-10-16 11:37:22,140 INFO org.apache.hadoop.mapred.Child: Error cleaning up
java.lang.NullPointerException
	at org.apache.hadoop.mapred.Task.taskCleanup(Task.java:1050)
	at org.apache.hadoop.mapred.Child$5.run(Child.java:300)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
	at org.apache.hadoop.mapred.Child.main(Child.java:297)


Regards,

Prasad

Re: java.io.IOException: Deserialization error: org.apache.hcatalog.data.schema.HCatSchema

Posted by Prasad GS <gs...@gmail.com>.
Hi Nitin,

All the required jars (Hadoop, Hive, HCatalog) are on the classpath of the
code that submits the Pig script through PigServer.

There is already a JIRA (https://issues.apache.org/jira/browse/PIG-2532) for
this issue, and it is in the "resolved" state for the Pig version I'm
using (Pig 0.11). But I'm still hitting the problem. I looked at the source
code and found that the suggested fix is not present in either 0.11 or
0.12. Any suggestions?
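One possible workaround, sketched here without having been verified on this cluster: the ClassNotFoundException is thrown in the map task, not in the client JVM, so the HCatalog and Hive jars must be shipped to the backend as well as being on the submitter's classpath. Pig's pig.additional.jars property ships extra jars with the job. The jar paths and the launcher class below are hypothetical placeholders:

```shell
# Sketch: compose the launch command for a PigServer-based app so the
# jars are on the client classpath AND in pig.additional.jars (which
# Pig ships to the backend tasks). All paths and the launcher class
# name are placeholders.
HCAT_JAR=/usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar
HIVE_JARS=/usr/lib/hive/lib/hive-metastore.jar:/usr/lib/hive/lib/hive-exec.jar
CMD="java -cp myapp.jar:$HCAT_JAR:$HIVE_JARS -Dpig.additional.jars=$HCAT_JAR:$HIVE_JARS com.example.MyPigLauncher"
echo "$CMD"
```

The same property can also be set on the Properties object passed to the PigServer constructor.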

Regards,
Prasad




On Wed, Oct 16, 2013 at 12:43 PM, Nitin Pawar <ni...@gmail.com> wrote:

> Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.data.schema.HCatSchema
> <--- This suggests that the jar is not on the classpath.
> Can you try registering the jar inside your script?
> Can you try registering the jar inside your script
>
> PS: I am still newbie with pig. I am mainly a hive user.

Re: java.io.IOException: Deserialization error: org.apache.hcatalog.data.schema.HCatSchema

Posted by Nitin Pawar <ni...@gmail.com>.
Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.data.schema.HCatSchema
<--- This suggests that the jar is not on the classpath.
Can you try registering the jar inside your script?
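For example, a REGISTER statement at the top of the script makes Pig ship the jar to the backend tasks; the jar path and table name here are placeholders:

```shell
# Sketch of the suggestion: write a Pig script that REGISTERs the
# HCatalog jar before using HCatLoader. The jar path and table name
# are placeholders.
cat > /tmp/hcat_example.pig <<'EOF'
REGISTER /usr/lib/hcatalog/share/hcatalog/hcatalog-core.jar;
A = LOAD 'default.mytable' USING org.apache.hcatalog.pig.HCatLoader();
DUMP A;
EOF
cat /tmp/hcat_example.pig
```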

PS: I am still a newbie with Pig; I am mainly a Hive user.




-- 
Nitin Pawar