Posted to user@pig.apache.org by Praveen Bysani <pr...@gmail.com> on 2013/03/27 09:29:10 UTC

Unable to typecast fields loaded from HBase

Hi,

I am unable to typecast fields loaded from my HBase table to anything other
than the default bytearray. I tried casting both in the LOAD statement and
after loading; neither works. The script works when I load the data as below:
records = LOAD 'hbase://hantu' USING
org.apache.pig.backend.hadoop.hbase.HBaseStorage('v:member v:guest') as
(member, guest);
records_limit = LIMIT records 10;
DUMP records_limit;
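
For concreteness, the cast-after-load variant mentioned above would look
roughly like the sketch below (the records_cast alias is illustrative); it
fails the same way, presumably because an explicit cast of a loader's
bytearray still goes through the loader's caster:

records_cast = FOREACH records GENERATE (chararray) member AS member,
                                        (chararray) guest AS guest;
DUMP records_cast;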

But when I change the first line of the original script to:
records = LOAD 'hbase://hantu' USING
org.apache.pig.backend.hadoop.hbase.HBaseStorage('v:member v:guest') as
(member:chararray, guest:chararray);

The Pig script fails and the log is as below:
Backend error message
---------------------
Error: java.lang.ClassNotFoundException:
org.apache.hadoop.hbase.mapreduce.TableInputFormat
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
        at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)

Backend error message
---------------------
Error: java.lang.ClassNotFoundException:
org.apache.hadoop.hbase.mapreduce.TableInputFormat
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
        at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)

Error message from task (reduce) task_201303270642_0043_r_000000
----------------------------------------------------------------
ERROR 6015: During execution, encountered a Hadoop error.

org.apache.pig.backend.executionengine.ExecException: ERROR 6015: During
execution, encountered a Hadoop error.
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
        at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
        at java.security.AccessController.doPrivileged(Native Method)
Caused by: java.lang.ClassNotFoundException:
org.apache.hadoop.hbase.mapreduce.TableInputFormat
        ... 14 more
================================================================================
Error message from task (reduce) task_201303270642_0043_r_000000
----------------------------------------------------------------
ERROR 6015: During execution, encountered a Hadoop error.

org.apache.pig.backend.executionengine.ExecException: ERROR 6015: During
execution, encountered a Hadoop error.
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
        at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
        at java.security.AccessController.doPrivileged(Native Method)
Caused by: java.lang.ClassNotFoundException:
org.apache.hadoop.hbase.mapreduce.TableInputFormat
        ... 14 more
================================================================================
Pig Stack Trace
---------------
ERROR 6015: During execution, encountered a Hadoop error.

org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to
open iterator for alias bet_records_float. Backend error : During
execution, encountered a Hadoop error.
        at org.apache.pig.PigServer.openIterator(PigServer.java:826)
        at
org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:696)
        at
org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
        at
org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
        at
org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
        at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
        at org.apache.pig.Main.run(Main.java:604)
        at org.apache.pig.Main.main(Main.java:157)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR
6015: During execution, encountered a Hadoop error.
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
        at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
        at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:197)

The data is valid, simple strings; I am not sure what the problem is.
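
A note on the traces above: the reduce tasks fail to load
org.apache.hadoop.hbase.mapreduce.TableInputFormat only once the script
declares types, which suggests the HBase jars are missing from the task
classpath rather than anything being wrong with the data. A minimal sketch of
one workaround is to register the jars explicitly so Pig ships them with the
job; the paths below are illustrative and vary by install:

REGISTER /usr/lib/hbase/hbase.jar;
REGISTER /usr/lib/zookeeper/zookeeper.jar;
records = LOAD 'hbase://hantu' USING
org.apache.pig.backend.hadoop.hbase.HBaseStorage('v:member v:guest') as
(member:chararray, guest:chararray);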
-- 
Regards,
Praveen Bysani
http://www.praveenbysani.com

Re: Unable to typecast fields loaded from HBase

Posted by Praveen Bysani <pr...@gmail.com>.
On 31 March 2013 20:36, Praveen Bysani <pr...@gmail.com> wrote:

> I have fixed the HBase access on that node, but the error still happens.
>
>
> While executing the script I observed the following error. I checked the
> /etc/hosts file and I have a proper entry for that IP address there. I
> suspect that could be the reason, because sometimes I can still execute
> the script successfully.
>
> 2013-03-30 13:08:37,701 [JobControl] ERROR
> org.apache.hadoop.hbase.mapreduce.TableInputFormatBase - Cannot resolve the
> host name for /180.235.132.126 because of
> javax.naming.NameNotFoundException: DNS name not found [response code 3];
> remaining name '126.132.235.180.in-addr.arpa'
> 2013-03-30 13:08:38,642 [main] INFO  org.apache.pig.backend.hadoop.
> executionengine.mapReduceLayer.MapReduceLauncher - HadoopJobId:
> job_201303280611_0045
> 2013-03-30 13:08:38,642 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - Processing aliases A,F
> 2013-03-30 13:08:38,642 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - detailed locations: M: A[1,4],F[2,4] C:  R:
> 2013-03-30 13:08:38,642 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - More information at:
> http://server.epicoders.com:50030/jobdetails.jsp?jobid=job_201303280611_0045
> 2013-03-30 13:08:47,246 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - 25% complete
> 2013-03-30 13:08:54,308 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - 50% complete
> 2013-03-30 13:08:58,446 [main] INFO
> org.apache.pig.tools.pigstats.ScriptState - Pig script settings are added
> to the job
> 2013-03-30 13:08:58,448 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler
> - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
> 2013-03-30 13:08:58,450 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler
> - Setting Parallelism to 1
> 2013-03-30 13:08:58,450 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler
> - creating jar file Job3676713252658297974.jar
> 2013-03-30 13:09:01,797 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler
> - jar file Job3676713252658297974.jar created
> 2013-03-30 13:09:01,808 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler
> - Setting up single store job
> 2013-03-30 13:09:01,810 [main] INFO
> org.apache.pig.data.SchemaTupleFrontend - Key [pig.schematuple] is false,
> will not generate code.
> 2013-03-30 13:09:01,810 [main] INFO
> org.apache.pig.data.SchemaTupleFrontend - Starting process to move
> generated code to distributed cacche
> 2013-03-30 13:09:01,810 [main] INFO
> org.apache.pig.data.SchemaTupleFrontend - Setting key
> [pig.schematuple.classes] with classes to deserialize []
> 2013-03-30 13:09:01,844 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - 1 map-reduce job(s) waiting for submission.
> 2013-03-30 13:09:01,884 [JobControl] WARN
> org.apache.hadoop.mapred.JobClient - Use GenericOptionsParser for parsing
> the arguments. Applications should implement Tool for the same.
> 2013-03-30 13:09:02,394 [JobControl] INFO
> org.apache.hadoop.mapreduce.lib.input.FileInputFormat - Total input paths
> to process : 1
> 2013-03-30 13:09:02,394 [JobControl] INFO
> org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input
> paths to process : 1
> 2013-03-30 13:09:02,410 [JobControl] INFO
> org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input
> paths (combined) to process : 1
> 2013-03-30 13:09:03,406 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - HadoopJobId: job_201303280611_0046
> 2013-03-30 13:09:03,406 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - Processing aliases A
> 2013-03-30 13:09:03,406 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - detailed locations: M:  C:  R: A[-1,-1]
> 2013-03-30 13:09:03,406 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - More information at:
> http://server.epicoders.com:50030/jobdetails.jsp?jobid=job_201303280611_0046
> 2013-03-30 13:09:13,746 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - 75% complete
> 2013-03-30 13:09:48,246 [main] WARN
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to
> stop immediately on failure.
> 2013-03-30 13:09:48,246 [main] INFO
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher
> - job job_201303280611_0046 has failed! Stop running all dependent jobs
>
>
> After the job was killed, I checked the logs on the JT node, and it has
> the following errors:
>
> 2013-03-30 05:09:22,913 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
> 2013-03-30 05:09:23,769 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /mapred/1/local/taskTracker/hadoopuser/jobcache/job_201303280611_0046/jars/job.jar <- /mapred/2/local/taskTracker/hadoopuser/jobcache/job_201303280611_0046/attempt_201303280611_0046_r_000000_1/work/job.jar
> 2013-03-30 05:09:23,801 INFO org.apache.hadoop.filecache.TrackerDistributedCacheManager: Creating symlink: /mapred/1/local/taskTracker/hadoopuser/jobcache/job_201303280611_0046/jars/.job.jar.crc <- /mapred/2/local/taskTracker/hadoopuser/jobcache/job_201303280611_0046/attempt_201303280611_0046_r_000000_1/work/.job.jar.crc
> 2013-03-30 05:09:23,924 WARN org.apache.hadoop.conf.Configuration: session.id is deprecated. Instead, use dfs.metrics.session-id
> 2013-03-30 05:09:23,926 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=SHUFFLE, sessionId=
> 2013-03-30 05:09:24,637 INFO org.apache.hadoop.util.ProcessTree: setsid exited with exit code 0
> 2013-03-30 05:09:24,713 INFO org.apache.hadoop.mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@5759780d
> 2013-03-30 05:09:24,782 INFO org.apache.hadoop.mapred.ReduceTask: Using ShuffleConsumerPlugin: org.apache.hadoop.mapred.ReduceTask$ReduceCopier
> 2013-03-30 05:09:24,788 INFO org.apache.hadoop.mapred.ReduceTask: ShuffleRamManager: MemoryLimit=324934048, MaxSingleShuffleLimit=81233512
> 2013-03-30 05:09:24,802 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.snappy]
> 2013-03-30 05:09:24,803 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.snappy]
> 2013-03-30 05:09:24,804 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.snappy]
> 2013-03-30 05:09:24,805 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.snappy]
> 2013-03-30 05:09:24,806 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.snappy]
> 2013-03-30 05:09:24,807 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.snappy]
> 2013-03-30 05:09:24,808 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.snappy]
> 2013-03-30 05:09:24,809 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.snappy]
> 2013-03-30 05:09:24,810 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.snappy]
> 2013-03-30 05:09:24,811 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new decompressor [.snappy]
> 2013-03-30 05:09:24,814 INFO org.apache.hadoop.mapred.ReduceTask: attempt_201303280611_0046_r_000000_1 Thread started: Thread for merging on-disk files
> 2013-03-30 05:09:24,814 INFO org.apache.hadoop.mapred.ReduceTask: attempt_201303280611_0046_r_000000_1 Thread waiting: Thread for merging on-disk files
> 2013-03-30 05:09:24,816 INFO org.apache.hadoop.mapred.ReduceTask: attempt_201303280611_0046_r_000000_1 Thread started: Thread for merging in memory files
> 2013-03-30 05:09:24,816 INFO org.apache.hadoop.mapred.ReduceTask: attempt_201303280611_0046_r_000000_1 Need another 1 map output(s) where 0 is already in progress
> 2013-03-30 05:09:24,817 INFO org.apache.hadoop.mapred.ReduceTask: attempt_201303280611_0046_r_000000_1 Scheduled 0 outputs (0 slow hosts and0 dup hosts)
> 2013-03-30 05:09:24,817 INFO org.apache.hadoop.mapred.ReduceTask: attempt_201303280611_0046_r_000000_1 Thread started: Thread for polling Map Completion Events
> 2013-03-30 05:09:24,826 INFO org.apache.hadoop.mapred.ReduceTask: attempt_201303280611_0046_r_000000_1 Scheduled 1 outputs (0 slow hosts and0 dup hosts)
> 2013-03-30 05:09:25,417 INFO org.apache.hadoop.mapred.ReduceTask: GetMapEventsThread exiting
> 2013-03-30 05:09:25,417 INFO org.apache.hadoop.mapred.ReduceTask: getMapsEventsThread joined.
> 2013-03-30 05:09:25,419 INFO org.apache.hadoop.mapred.ReduceTask: Closed ram manager
> 2013-03-30 05:09:25,419 INFO org.apache.hadoop.mapred.ReduceTask: Interleaved on-disk merge complete: 0 files left.
> 2013-03-30 05:09:25,419 INFO org.apache.hadoop.mapred.ReduceTask: In-memory merge complete: 1 files left.
> 2013-03-30 05:09:25,615 INFO org.apache.hadoop.mapred.Merger: Merging 1 sorted segments
> 2013-03-30 05:09:25,615 INFO org.apache.hadoop.mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 10224 bytes
> 2013-03-30 05:09:25,618 INFO org.apache.hadoop.io.compress.CodecPool: Got brand-new compressor [.snappy]
> 2013-03-30 05:09:25,722 INFO org.apache.hadoop.mapred.ReduceTask: Merged 1 segments, 10224 bytes to disk to satisfy reduce memory limit
> 2013-03-30 05:09:25,723 INFO org.apache.hadoop.mapred.ReduceTask: Merging 1 files, 3819 bytes from disk
> 2013-03-30 05:09:25,725 INFO org.apache.hadoop.mapred.ReduceTask: Merging 0 segments, 0 bytes from memory into reduce
> 2013-03-30 05:09:25,725 INFO org.apache.hadoop.mapred.Merger: Merging 1 sorted segments
> 2013-03-30 05:09:25,732 INFO org.apache.hadoop.mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 10222 bytes
> 2013-03-30 05:09:25,874 INFO org.apache.pig.data.SchemaTupleBackend: Key [pig.schematuple] was not set... will not generate code.
> 2013-03-30 05:09:25,944 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
> 2013-03-30 05:09:25,948 FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/mapreduce/TableInputFormat
> 	at java.lang.ClassLoader.defineClass1(Native Method)
> 	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
> 	at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
> 	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
> 	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
> 	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
> 	at java.lang.Class.forName0(Native Method)
> 	at java.lang.Class.forName(Class.java:247)
> 	at org.apache.pig.impl.PigContext.resolveClassName(PigContext.java:505)
> 	at org.apache.pig.impl.PigContext.instantiateFuncFromSpec(PigContext.java:572)
> 	at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POCast.instantiateFunc(POCast.java:85)
> 	at org.apache.pig.backend.hadoop.executionengine.physicalLayer.expressionOperators.POCast.readObject(POCast.java:1725)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:974)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1848)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
> 	at java.util.ArrayList.readObject(ArrayList.java:593)
> 	at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:974)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1848)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
> 	at java.util.HashMap.readObject(HashMap.java:1030)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:974)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1848)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1946)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1870)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1946)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1870)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
> 	at java.util.ArrayList.readObject(ArrayList.java:593)
> 	at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:974)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1848)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1946)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1870)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
> 	at java.util.ArrayList.readObject(ArrayList.java:593)
> 	at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:974)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1848)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
> 	at java.util.HashMap.readObject(HashMap.java:1030)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> 	at java.lang.reflect.Method.invoke(Method.java:597)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:974)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1848)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1946)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1870)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1946)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1870)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
> 	at org.apache.pig.impl.util.ObjectSerializer.deserialize(ObjectSerializer.java:58)
> 	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigGenericMapReduce$Reduce.setup(PigGenericMapReduce.java:322)
> 	at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:162)
> 	at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:610)
> 	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:444)
> 	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
> 	at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.mapreduce.TableInputFormat
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
> 	... 104 more
>
> It would be great if someone could throw some light on this issue.
>
>
> On 29 March 2013 20:30, Praveen Bysani <pr...@gmail.com> wrote:
>
>> Sure, I will do so. Thanks anyway.
>>
>>
>> On 28 March 2013 22:48, Bill Graham <bi...@gmail.com> wrote:
>>
>>> Looks like an issue with either your HBase configs that specify the ZK
>>> quorum being off, or ZK itself not responding. If you keep having problems,
>>> though, I'm sure the hbase users list would be able to help out pretty
>>> quickly. I'd start by checking that the quorum is properly configured and
>>> that you can connect to it manually from a node.
>>>
>>>
>>> On Thu, Mar 28, 2013 at 3:25 AM, Praveen Bysani <praveen.iiith@gmail.com
>>> > wrote:
>>>
>>>> Hi,
>>>>
>>>> I set up all the nodes using Cloudera Manager, so I assume all the
>>>> classpaths and the environment are handled by the framework (the Cloudera
>>>> distro), aren't they? However, after trying to execute on each node, I
>>>> found that one of my nodes has problems connecting to HBase. The IP
>>>> address of this node was recently changed from what it was during
>>>> installation. I updated the /etc/hosts file on all nodes and restarted all
>>>> Hadoop services. The Services tab in Cloudera Manager shows good health
>>>> for all services, which made me believe everything was alright;
>>>> apparently not so.
>>>>
>>>> Trying to access HBase on that particular node gives:
>>>>
>>>> 13/03/28 16:28:14 ERROR zookeeper.RecoverableZooKeeper: ZooKeeper
>>>> exists failed after 3 retries
>>>> 13/03/28 16:28:14 WARN zookeeper.ZKUtil: hconnection Unable to set
>>>> watcher on znode /hbase/master
>>>> org.apache.zookeeper.KeeperException$ConnectionLossException:
>>>> KeeperErrorCode = ConnectionLoss for /hbase/master
>>>>         at
>>>> org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
>>>>         at
>>>> org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
>>>>         at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1041)
>>>>         at
>>>> org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:176)
>>>>         at
>>>> org.apache.hadoop.hbase.zookeeper.ZKUtil.watchAndCheckExists(ZKUtil.java:418)
>>>>         at
>>>> org.apache.hadoop.hbase.zookeeper.ZooKeeperNodeTracker.start(ZooKeeperNodeTracker.java:82)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.ensureZookeeperTrackers(HConnectionManager.java:589)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:648)
>>>>         at
>>>> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:121)
>>>>         at
>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>         at
>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>         at
>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>         at
>>>> java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>>>         at
>>>> org.jruby.javasupport.JavaConstructor.newInstanceDirect(JavaConstructor.java:275)
>>>>         at
>>>> org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:91)
>>>>         at
>>>> org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:178)
>>>>         at
>>>> org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
>>>>         at
>>>> org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
>>>>         at
>>>> org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:182)
>>>>         at
>>>> org.jruby.java.proxies.ConcreteJavaProxy$2.call(ConcreteJavaProxy.java:47)
>>>>         at
>>>> org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
>>>>         at
>>>> org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
>>>>
>>>> I understand this is no longer a Pig issue, but it would be great if
>>>> someone could give some pointers on configuring HBase on the node that
>>>> has a new IP address.
>>>>
>>>> On 28 March 2013 12:54, Bill Graham <bi...@gmail.com> wrote:
>>>>
>>>>> Your initial exception shows ClassNotFoundExceptions for HBase. Are you
>>>>> adding HBase to PIG_CLASSPATH on the client, or do you have it installed
>>>>> on your Hadoop nodes? In the case of the latter, maybe some nodes are
>>>>> different than others?
>>>>>
>>>>>
>>>>> On Wed, Mar 27, 2013 at 9:20 PM, Praveen Bysani <
>>>>> praveen.iiith@gmail.com> wrote:
>>>>>
>>>>> > This is not about casting types. The scripts sometimes work without
>>>>> > any issue and sometimes fail with the error I specified before. I have
>>>>> > no clue what the issue might be; the network, probably? I run my
>>>>> > cluster on VPS machines running CDH 4.2, installed using Cloudera
>>>>> > Manager. I am running Pig version 0.10.1, installed as a parcel.
>>>>> >
>>>>> > On 27 March 2013 16:29, Praveen Bysani <pr...@gmail.com>
>>>>> wrote:
>>>>> >
>>>>> > > Hi,
>>>>> > >
>>>>> > > I am unable to typecast fields loaded from my HBase table to
>>>>> > > anything other than the default bytearray. I tried casting both in
>>>>> > > the LOAD statement and after loading; neither works. The script
>>>>> > > works when I load the data as below:
>>>>> > > records = LOAD 'hbase://hantu' USING
>>>>> > > org.apache.pig.backend.hadoop.hbase.HBaseStorage('v:member
>>>>> v:guest') as
>>>>> > > (member, guest);
>>>>> > > records_limit = LIMIT records 10;
>>>>> > > DUMP records_limit;
>>>>> > >
>>>>> > > But when I change the first line to:
>>>>> > > records = LOAD 'hbase://hantu' USING
>>>>> > > org.apache.pig.backend.hadoop.hbase.HBaseStorage('v:member
>>>>> v:guest') as
>>>>> > > (member:chararray, guest:chararray);
>>>>> > >
>>>>> > > The Pig script fails and the log is as below:
>>>>> > > Backend error message
>>>>> > > ---------------------
>>>>> > > Error: java.lang.ClassNotFoundException:
>>>>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>>>>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>> > >         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> > >         at
>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>>>>> > >         at
>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>>>>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>>>>> > >         at
>>>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
>>>>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
>>>>> > >         at
>>>>> > >
>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>> > >         at
>>>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>> > >         at
>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>> > >         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> > >         at
>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>> > >
>>>>> > > Backend error message
>>>>> > > ---------------------
>>>>> > > Error: java.lang.ClassNotFoundException:
>>>>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>>>>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>> > >         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> > >         at
>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>> > >         at
>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>>>>> > >         at
>>>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>> > >         at
>>>>> > >
>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>> > >         at
>>>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>> > >         at
>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>> > >         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> > >         at
>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>> > >
>>>>> > > Error message from task (reduce) task_201303270642_0043_r_000000
>>>>> > > ----------------------------------------------------------------
>>>>> > > ERROR 6015: During execution, encountered a Hadoop error.
>>>>> > >
>>>>> > > org.apache.pig.backend.executionengine.ExecException: ERROR 6015:
>>>>> During
>>>>> > > execution, encountered a Hadoop error.
>>>>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>> > >         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> > >         at
>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>>>>> > >         at
>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>>>>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>>>>> > >         at
>>>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
>>>>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
>>>>> > >         at
>>>>> > >
>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>> > >         at
>>>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>> > >         at
>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>> > >         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> > > Caused by: java.lang.ClassNotFoundException:
>>>>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>>>>> > >         ... 14 more
>>>>> > >
>>>>> > >
>>>>> >
>>>>> ================================================================================
>>>>> > > Error message from task (reduce) task_201303270642_0043_r_000000
>>>>> > > ----------------------------------------------------------------
>>>>> > > ERROR 6015: During execution, encountered a Hadoop error.
>>>>> > >
>>>>> > > org.apache.pig.backend.executionengine.ExecException: ERROR 6015:
>>>>> During
>>>>> > > execution, encountered a Hadoop error.
>>>>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>> > >         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> > >         at
>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>> > >         at
>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>>>>> > >         at
>>>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>>>>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>>>>> > >         at
>>>>> > >
>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>> > >         at
>>>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>> > >         at
>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>> > >         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> > > Caused by: java.lang.ClassNotFoundException:
>>>>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>>>>> > >         ... 14 more
>>>>> > >
>>>>> > >
>>>>> >
>>>>> ================================================================================
>>>>> > > Pig Stack Trace
>>>>> > > ---------------
>>>>> > > ERROR 6015: During execution, encountered a Hadoop error.
>>>>> > >
>>>>> > > org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066:
>>>>> Unable to
>>>>> > > open iterator for alias bet_records_float. Backend error : During
>>>>> > > execution, encountered a Hadoop error.
>>>>> > >         at
>>>>> org.apache.pig.PigServer.openIterator(PigServer.java:826)
>>>>> > >         at
>>>>> > >
>>>>> org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:696)
>>>>> > >         at
>>>>> > >
>>>>> >
>>>>> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
>>>>> > >         at
>>>>> > >
>>>>> >
>>>>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
>>>>> > >         at
>>>>> > >
>>>>> >
>>>>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
>>>>> > >         at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
>>>>> > >         at org.apache.pig.Main.run(Main.java:604)
>>>>> > >         at org.apache.pig.Main.main(Main.java:157)
>>>>> > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
>>>>> Method)
>>>>> > >         at
>>>>> > >
>>>>> >
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>> > >         at
>>>>> > >
>>>>> >
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>> > >         at java.lang.reflect.Method.invoke(Method.java:601)
>>>>> > >         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>>>>> > > Caused by: org.apache.pig.backend.executionengine.ExecException:
>>>>> ERROR
>>>>> > > 6015: During execution, encountered a Hadoop error.
>>>>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>> > >         at java.security.AccessController.doPrivileged(Native
>>>>> Method)
>>>>> > >         at
>>>>> java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>>>>> > >         at
>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>>>>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>>>>> > >         at
>>>>> java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
>>>>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
>>>>> > >         at
>>>>> > >
>>>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>>>>> > >         at
>>>>> java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>>>>> > >         at
>>>>> java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>>>>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>>>>> > >
>>>>> > > The data is valid, simple strings; I am not sure what the
>>>>> > > problem is.
>>>>> > > --
>>>>> > > Regards,
>>>>> > > Praveen Bysani
>>>>> > > http://www.praveenbysani.com
>>>>> > >
>>>>> >
>>>>> >
>>>>> >
>>>>> > --
>>>>> > Regards,
>>>>> > Praveen Bysani
>>>>> > http://www.praveenbysani.com
>>>>> >
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> *Note that I'm no longer using my Yahoo! email address. Please email
>>>>> me at
>>>>> billgraham@gmail.com going forward.*
>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Regards,
>>>> Praveen Bysani
>>>> http://www.praveenbysani.com
>>>>
>>>
>>>
>>>
>>> --
>>> *Note that I'm no longer using my Yahoo! email address. Please email me
>>> at billgraham@gmail.com going forward.*
>>>
>>
>>
>>
>> --
>> Regards,
>> Praveen Bysani
>> http://www.praveenbysani.com
>>
>
>
>
> --
> Regards,
> Praveen Bysani
> http://www.praveenbysani.com
>



-- 
Regards,
Praveen Bysani
http://www.praveenbysani.com

Re: Unable to typecast fields loaded from HBase

Posted by Bill Graham <bi...@gmail.com>.
Looks like an issue with either your HBase configs that specify the ZK
quorum being off, or ZK itself not responding. If you keep having problems,
though, I'm sure the hbase users list would be able to help out pretty
quickly. I'd start by checking that the quorum is properly configured and
that you can connect to it manually from a node.
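
For reference, the quorum referred to here is set via the
hbase.zookeeper.quorum property in hbase-site.xml; a minimal sketch, with
hostnames that are purely illustrative:

<property>
  <name>hbase.zookeeper.quorum</name>
  <value>zk1.example.com,zk2.example.com,zk3.example.com</value>
</property>

Connecting manually from the affected node, e.g. with the HBase shell, then
shows whether that quorum is actually reachable.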


On Thu, Mar 28, 2013 at 3:25 AM, Praveen Bysani <pr...@gmail.com> wrote:

> Hi,
>
> I set up all the nodes using Cloudera Manager, so I assume all the
> classpaths and the environment are handled by the framework (the Cloudera
> distro), aren't they? However, after trying to execute on each node, I
> found that one of my nodes has problems connecting to HBase. The IP
> address of this node was recently changed from what it was during
> installation. I updated the /etc/hosts file on all nodes and restarted all
> Hadoop services. The Services tab in Cloudera Manager shows good health
> for all services, which made me believe everything was alright;
> apparently not so.
>
> Trying to access HBase on that particular node gives:
>
> 13/03/28 16:28:14 ERROR zookeeper.RecoverableZooKeeper: ZooKeeper exists
> failed after 3 retries
> 13/03/28 16:28:14 WARN zookeeper.ZKUtil: hconnection Unable to set watcher
> on znode /hbase/master
> org.apache.zookeeper.KeeperException$ConnectionLossException:
> KeeperErrorCode = ConnectionLoss for /hbase/master
>         at
> org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
>         at
> org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
>         at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1041)
>         at
> org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:176)
>         at
> org.apache.hadoop.hbase.zookeeper.ZKUtil.watchAndCheckExists(ZKUtil.java:418)
>         at
> org.apache.hadoop.hbase.zookeeper.ZooKeeperNodeTracker.start(ZooKeeperNodeTracker.java:82)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.ensureZookeeperTrackers(HConnectionManager.java:589)
>         at
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:648)
>         at
> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:121)
>         at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>         at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>         at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>         at
> org.jruby.javasupport.JavaConstructor.newInstanceDirect(JavaConstructor.java:275)
>         at
> org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:91)
>         at
> org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:178)
>         at
> org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
>         at
> org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
>         at
> org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:182)
>         at
> org.jruby.java.proxies.ConcreteJavaProxy$2.call(ConcreteJavaProxy.java:47)
>         at
> org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
>         at
> org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
>
> I understand this is no longer a Pig issue, but it would be great if
> someone could give some pointers on configuring HBase on the node that
> has a new IP address.
>
> On 28 March 2013 12:54, Bill Graham <bi...@gmail.com> wrote:
>
>> Your initial exception shows ClassNotFoundExceptions for HBase. Are you
>> adding HBase to PIG_CLASSPATH on the client or do you have it installed on
>> your Hadoop nodes? In the case of the latter, maybe some nodes are
>> different than others?
>>
>>
>> On Wed, Mar 27, 2013 at 9:20 PM, Praveen Bysani <praveen.iiith@gmail.com
>> > wrote:
>>
>> > This is not about casting types. The scripts sometimes work without any
>> > issue and sometimes fail with the error I specified before. I have no
>> > clue what the issue might be; the network, probably? I run my cluster on
>> > VPS machines running CDH 4.2, installed using Cloudera Manager. I am
>> > running Pig version 0.10.1, installed as a parcel.
>> >
>> > On 27 March 2013 16:29, Praveen Bysani <pr...@gmail.com> wrote:
>> >
>> > > Hi,
>> > >
>> > > I am unable to typecast fields loaded from my HBase table to anything
>> > > other than the default bytearray. I tried casting both in the LOAD
>> > > statement and after loading; neither works. The script works when I
>> > > load the data as below:
>> > > records = LOAD 'hbase://hantu' USING
>> > > org.apache.pig.backend.hadoop.hbase.HBaseStorage('v:member v:guest')
>> as
>> > > (member, guest);
>> > > records_limit = LIMIT records 10;
>> > > DUMP records_limit;
>> > >
>> > > But when I change the first line to:
>> > > records = LOAD 'hbase://hantu' USING
>> > > org.apache.pig.backend.hadoop.hbase.HBaseStorage('v:member v:guest')
>> as
>> > > (member:chararray, guest:chararray);
>> > >
>> > > The Pig script fails and the log is as below:
>> > > Backend error message
>> > > ---------------------
>> > > Error: java.lang.ClassNotFoundException:
>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>> > >         at
>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>> > >         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
>> > >         at
>> > >
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> > >         at
>> java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>> > >         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >
>> > > Backend error message
>> > > ---------------------
>> > > Error: java.lang.ClassNotFoundException:
>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> > >         at
>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>> > >         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>> > >         at
>> > >
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> > >         at
>> java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>> > >         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >
>> > > Error message from task (reduce) task_201303270642_0043_r_000000
>> > > ----------------------------------------------------------------
>> > > ERROR 6015: During execution, encountered a Hadoop error.
>> > >
>> > > org.apache.pig.backend.executionengine.ExecException: ERROR 6015:
>> During
>> > > execution, encountered a Hadoop error.
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>> > >         at
>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>> > >         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
>> > >         at
>> > >
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> > >         at
>> java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>> > >         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > > Caused by: java.lang.ClassNotFoundException:
>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>> > >         ... 14 more
>> > >
>> > >
>> >
>> ================================================================================
>> > > Error message from task (reduce) task_201303270642_0043_r_000000
>> > > ----------------------------------------------------------------
>> > > ERROR 6015: During execution, encountered a Hadoop error.
>> > >
>> > > org.apache.pig.backend.executionengine.ExecException: ERROR 6015:
>> During
>> > > execution, encountered a Hadoop error.
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> > >         at
>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>> > >         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
>> > >         at
>> > >
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> > >         at
>> java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>> > >         at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > > Caused by: java.lang.ClassNotFoundException:
>> > > org.apache.hadoop.hbase.mapreduce.TableInputFormat
>> > >         ... 14 more
>> > >
>> > >
>> >
>> ================================================================================
>> > > Pig Stack Trace
>> > > ---------------
>> > > ERROR 6015: During execution, encountered a Hadoop error.
>> > >
>> > > org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066:
>> Unable to
>> > > open iterator for alias bet_records_float. Backend error : During
>> > > execution, encountered a Hadoop error.
>> > >         at org.apache.pig.PigServer.openIterator(PigServer.java:826)
>> > >         at
>> > >
>> org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:696)
>> > >         at
>> > >
>> >
>> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
>> > >         at
>> > >
>> >
>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
>> > >         at
>> > >
>> >
>> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
>> > >         at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
>> > >         at org.apache.pig.Main.run(Main.java:604)
>> > >         at org.apache.pig.Main.main(Main.java:157)
>> > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > >         at
>> > >
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> > >         at
>> > >
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > >         at java.lang.reflect.Method.invoke(Method.java:601)
>> > >         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>> > > Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR
>> > > 6015: During execution, encountered a Hadoop error.
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> > >         at java.security.AccessController.doPrivileged(Native Method)
>> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>> > >         at
>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
>> > >         at java.lang.ClassLoader.defineClass1(Native Method)
>> > >         at java.lang.ClassLoader.defineClassCond(ClassLoader.java:632)
>> > >         at java.lang.ClassLoader.defineClass(ClassLoader.java:616)
>> > >         at
>> > >
>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
>> > >         at
>> java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
>> > >         at java.net.URLClassLoader.access(000(URLClassLoader.java:58)
>> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
>> > >
>> > > The data is valid simple strings, i am not sure what the problem is.
>> > > --
>> > > Regards,
>> > > Praveen Bysani
>> > > http://www.praveenbysani.com
>> > >
>> >
>> >
>> >
>> > --
>> > Regards,
>> > Praveen Bysani
>> > http://www.praveenbysani.com
>> >
>>
>>
>>
>> --
>> *Note that I'm no longer using my Yahoo! email address. Please email me at
>> billgraham@gmail.com going forward.*
>>
>
>
>
> --
> Regards,
> Praveen Bysani
> http://www.praveenbysani.com
>



-- 
*Note that I'm no longer using my Yahoo! email address. Please email me at
billgraham@gmail.com going forward.*

Re: Unable to typecast fields loaded from HBase

Posted by Praveen Bysani <pr...@gmail.com>.
Hi,

I setup all the nodes using Cloudera Manager. So i assume all the
classpaths and the environment is handled by the framework (cloudera
distro), isn't it so ? However after trying to execute on each node, i
found that on one of my node has problems connecting to hbase. The ip
address of this node is recently changed from what it was during
installation. I update the /etc/hosts file on all nodes and restarted all
hadoop services. The services tab in cloudera manager shows good health for
all services which made me believe everything is alright, apparently not so.
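
For reference, the /etc/hosts entries I updated follow the usual format;
the addresses and hostnames below are just placeholders, not my actual
cluster:

10.0.0.11   node1.example.com   node1
10.0.0.12   node2.example.com   node2
10.0.0.13   node3.example.com   node3   # the node whose IP changed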

Trying to access HBase on that particular node gives:

13/03/28 16:28:14 ERROR zookeeper.RecoverableZooKeeper: ZooKeeper exists
failed after 3 retries
13/03/28 16:28:14 WARN zookeeper.ZKUtil: hconnection Unable to set watcher
on znode /hbase/master
org.apache.zookeeper.KeeperException$ConnectionLossException:
KeeperErrorCode = ConnectionLoss for /hbase/master
        at
org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
        at
org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
        at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1041)
        at
org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:176)
        at
org.apache.hadoop.hbase.zookeeper.ZKUtil.watchAndCheckExists(ZKUtil.java:418)
        at
org.apache.hadoop.hbase.zookeeper.ZooKeeperNodeTracker.start(ZooKeeperNodeTracker.java:82)
        at
org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.ensureZookeeperTrackers(HConnectionManager.java:589)
        at
org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:648)
        at
org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:121)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
Method)
        at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
        at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
        at
org.jruby.javasupport.JavaConstructor.newInstanceDirect(JavaConstructor.java:275)
        at
org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:91)
        at
org.jruby.java.invokers.ConstructorInvoker.call(ConstructorInvoker.java:178)
        at
org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
        at
org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)
        at
org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:182)
        at
org.jruby.java.proxies.ConcreteJavaProxy$2.call(ConcreteJavaProxy.java:47)
        at
org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:322)
        at
org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:178)

I understand this is no longer a Pig issue, but it would be great if
someone could give me some pointers on reconfiguring HBase for the node
that has a new IP address.
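
In case it helps anyone else hitting this, the client-side setting I am
double-checking first is the ZooKeeper quorum in hbase-site.xml. As I
understand it, listing hostnames rather than raw IP addresses there means
an address change only has to be corrected in DNS or /etc/hosts. The
hostnames below are placeholders, and the path is the CDH default:

<!-- /etc/hbase/conf/hbase-site.xml -->
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>node1.example.com,node2.example.com,node3.example.com</value>
</property>
<property>
  <name>hbase.zookeeper.property.clientPort</name>
  <value>2181</value>
</property>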

On 28 March 2013 12:54, Bill Graham <bi...@gmail.com> wrote:

> Your initial exception shows ClassNotFoundExceptions for HBase classes. Are
> you adding HBase to PIG_CLASSPATH on the client, or do you have it installed
> on your Hadoop nodes? If the latter, maybe some nodes are configured
> differently from others?
>
>
> On Wed, Mar 27, 2013 at 9:20 PM, Praveen Bysani <praveen.iiith@gmail.com> wrote:
>
> > This is not about casting types. The script sometimes works without any
> > issue and sometimes fails with the error I posted before. I have no clue
> > what the issue might be; the network, probably? I run my cluster on VPS
> > machines running CDH 4.2, installed using Cloudera Manager. I am running
> > Pig version 0.10.1, which is installed as a parcel.



-- 
Regards,
Praveen Bysani
http://www.praveenbysani.com

Re: Unable to typecast fields loaded from HBase

Posted by Bill Graham <bi...@gmail.com>.
Your initial exception shows ClassNotFoundExceptions for HBase classes. Are you
adding HBase to PIG_CLASSPATH on the client, or do you have it installed on
your Hadoop nodes? If the latter, maybe some nodes are configured differently
from others?
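
If it's the client classpath, one option is to register the HBase and
ZooKeeper jars in the script itself, since Pig ships registered jars to the
backend tasks (which is where your TableInputFormat ClassNotFoundException
is being thrown). A rough sketch, with jar paths guessed from a typical CDH
layout rather than your actual install:

-- jar locations below are guesses; point these at your actual jars
REGISTER /usr/lib/hbase/hbase.jar;
REGISTER /usr/lib/zookeeper/zookeeper.jar;
-- then run the LOAD with the typed schema as before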


On Wed, Mar 27, 2013 at 9:20 PM, Praveen Bysani <pr...@gmail.com> wrote:

> This is not about casting types. The script sometimes works without any
> issue and sometimes fails with the error I posted before. I have no clue
> what the issue might be; the network, probably? I run my cluster on VPS
> machines running CDH 4.2, installed using Cloudera Manager. I am running
> Pig version 0.10.1, which is installed as a parcel.



-- 
*Note that I'm no longer using my Yahoo! email address. Please email me at
billgraham@gmail.com going forward.*

Re: Unable to typecast fields loaded from HBase

Posted by Praveen Bysani <pr...@gmail.com>.
This is not about casting types. The script sometimes works without any
issue and sometimes fails with the error I posted before. I have no clue
what the issue might be; the network, probably? I run my cluster on VPS
machines running CDH 4.2, installed using Cloudera Manager. I am running
Pig version 0.10.1, which is installed as a parcel.

-- 
Regards,
Praveen Bysani
http://www.praveenbysani.com