Posted to mapreduce-user@hadoop.apache.org by ch huang <ju...@gmail.com> on 2013/07/22 10:38:33 UTC

test lzo problem in hadoop

Can anyone help?

# sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
com.hadoop.compression.lzo.DistributedLzoIndexer
/alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo
13/07/22 16:33:50 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
13/07/22 16:33:50 INFO lzo.LzoCodec: Successfully loaded & initialized
native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
13/07/22 16:33:50 INFO lzo.DistributedLzoIndexer: Adding LZO file
/alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo to indexing list (no
index currently exists)
13/07/22 16:33:50 ERROR security.UserGroupInformation:
PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException:
Failed on local exception:
com.google.protobuf.InvalidProtocolBufferException: Protocol message
end-group tag did not match expected tag.; Host Details : local host is:
"CH22/192.168.10.22"; destination host is: "CH22":8088;
Exception in thread "main" java.io.IOException: Failed on local exception:
com.google.protobuf.InvalidProtocolBufferException: Protocol message
end-group tag did not match expected tag.; Host Details : local host is:
"CH22/192.168.10.22"; destination host is: "CH22":8088;
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
        at org.apache.hadoop.ipc.Client.call(Client.java:1229)
        at
org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)
        at org.apache.hadoop.mapred.$Proxy10.getStagingAreaDir(Unknown
Source)
        at
org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1324)
        at
org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
        at
org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
        at
com.hadoop.compression.lzo.DistributedLzoIndexer.run(DistributedLzoIndexer.java:111)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at
com.hadoop.compression.lzo.DistributedLzoIndexer.main(DistributedLzoIndexer.java:115)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
message end-group tag did not match expected tag.
        at
com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:73)
        at
com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
        at
com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:213)
        at
com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
        at
com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
        at
com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
        at
com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
        at
com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
        at
com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
        at
org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:985)
        at
org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:938)
        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:836)
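
The "Protocol message end-group tag did not match expected tag" failure above
usually means the MapReduce client spoke Hadoop RPC to something that is not
an RPC endpoint; port 8088 is normally the YARN ResourceManager web UI, and a
later message in this thread shows mapred.job.tracker was indeed set to
CH22:8088. A quick way to check what the job client is pointed at (a sketch,
using the config paths that appear elsewhere in this thread):

  grep -A1 mapred.job.tracker /etc/hadoop/conf/mapred-site.xml
  grep -A1 yarn.resourcemanager /etc/hadoop/conf/yarn-site.xml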

Re: test lzo problem in hadoop

Posted by ch huang <ju...@gmail.com>.
That's OK, but why can't I use
com.hadoop.compression.lzo.DistributedLzoIndexer?

# hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
com.hadoop.compression.lzo.LzoIndexer  /alex/ttt.lzo
13/08/02 09:11:09 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
13/08/02 09:11:09 INFO lzo.LzoCodec: Successfully loaded & initialized
native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
13/08/02 09:11:10 INFO lzo.LzoIndexer: [INDEX] LZO Indexing file
/alex/ttt.lzo, size 0.00 GB...
13/08/02 09:11:10 WARN conf.Configuration: hadoop.native.lib is deprecated.
Instead, use io.native.lib.available
13/08/02 09:11:10 INFO lzo.LzoIndexer: Completed LZO Indexing in 0.19
seconds (0.00 MB/s).  Index size is 0.01 KB.
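
For context: LzoIndexer builds the .index file directly from the client
process, while DistributedLzoIndexer submits a MapReduce job to do the same
work, which is why only the distributed variant hits the job-submission and
hadoop-lzo/Hadoop 2 compatibility errors seen in this thread. Roughly, using
the same jar and path as above:

  # local, in-process indexing (succeeds above)
  hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar \
      com.hadoop.compression.lzo.LzoIndexer /alex/ttt.lzo

  # MapReduce-based indexing; needs a working job client and a
  # Hadoop 2-compatible hadoop-lzo build (fails above)
  hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar \
      com.hadoop.compression.lzo.DistributedLzoIndexer /alex/ttt.lzo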


On Mon, Jul 22, 2013 at 6:07 PM, Sandeep Nemuri <nh...@gmail.com>wrote:

> Try this command:
> hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-cdh4-0.4.15-gplextras.jar
> com.hadoop.compression.lzo.LzoIndexer /user/sample.txt.lzo
>
>
> On Mon, Jul 22, 2013 at 2:08 PM, ch huang <ju...@gmail.com> wrote:
>
>> anyone can help?
>>
>> # sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
>> com.hadoop.compression.lzo.DistributedLzoIndexer
>> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo
>> 13/07/22 16:33:50 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
>> 13/07/22 16:33:50 INFO lzo.LzoCodec: Successfully loaded & initialized
>> native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
>> 13/07/22 16:33:50 INFO lzo.DistributedLzoIndexer: Adding LZO file
>> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo to indexing list (no
>> index currently exists)
>> 13/07/22 16:33:50 ERROR security.UserGroupInformation:
>> PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException:
>> Failed on local exception:
>> com.google.protobuf.InvalidProtocolBufferException: Protocol message
>> end-group tag did not match expected tag.; Host Details : local host is:
>> "CH22/192.168.10.22"; destination host is: "CH22":8088;
>> Exception in thread "main" java.io.IOException: Failed on local
>> exception: com.google.protobuf.InvalidProtocolBufferException: Protocol
>> message end-group tag did not match expected tag.; Host Details : local
>> host is: "CH22/192.168.10.22"; destination host is: "CH22":8088;
>>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
>>         at org.apache.hadoop.ipc.Client.call(Client.java:1229)
>>         at
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)
>>         at org.apache.hadoop.mapred.$Proxy10.getStagingAreaDir(Unknown
>> Source)
>>         at
>> org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1324)
>>         at
>> org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>>         at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>>         at
>> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
>>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
>>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
>>         at
>> com.hadoop.compression.lzo.DistributedLzoIndexer.run(DistributedLzoIndexer.java:111)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>         at
>> com.hadoop.compression.lzo.DistributedLzoIndexer.main(DistributedLzoIndexer.java:115)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
>> message end-group tag did not match expected tag.
>>         at
>> com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:73)
>>         at
>> com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:213)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
>>         at
>> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:985)
>>         at
>> org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:938)
>>         at org.apache.hadoop.ipc.Client$Connection.run(Client.java:836)
>>
>
>
>
> --
> --Regards
>   Sandeep Nemuri
>

Re: test lzo problem in hadoop

Posted by "J. Rottinghuis" <jr...@gmail.com>.
In Hadoop 2.0, some of the MapReduce classes (TaskAttemptContext among them)
changed from abstract classes to interfaces, which is why code compiled
against Hadoop 1 fails at runtime with IncompatibleClassChangeError.
You'll have to recompile, and in addition you need a version of hadoop-lzo
that is compatible with Hadoop 2.0 (YARN).

See: https://github.com/twitter/hadoop-lzo/issues/56
and the announcement of a newer version 0.4.17 that solves this problem:
http://mail-archives.apache.org/mod_mbox/hadoop-common-dev/201306.mbox/%3CCADbBEnt1iiVbZFqQFupEUJ6VVdm5NitVvXqrfuzqugaOyMz+0g@mail.gmail.com%3E
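
A minimal sketch of moving to the fixed release, assuming a Maven build as in
the twitter/hadoop-lzo repository; the exact artifact name, version tag and
native build prerequisites (C toolchain, lzo development headers) are
assumptions here and should be checked against the project's README:

  git clone https://github.com/twitter/hadoop-lzo.git
  cd hadoop-lzo
  mvn clean package -DskipTests
  # drop the rebuilt jar (0.4.17 or later) in place of the 0.4.15 one,
  # then re-run the distributed indexer
  hadoop jar target/hadoop-lzo-0.4.17-SNAPSHOT.jar \
      com.hadoop.compression.lzo.DistributedLzoIndexer /alex/ttt.lzo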

Cheers,

Joep


On Thu, Aug 1, 2013 at 6:44 PM, ch huang <ju...@gmail.com> wrote:

> i use yarn ,and i commented the following option and error is different
>
>
> vi /etc/hadoop/conf/mapred-site.xml
> <!--
>         <property>
>                 <name>mapred.job.tracker</name>
>                 <value>CH22:8088</value>
>         </property>
> -->
>
>
> #  hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
> com.hadoop.compression.lzo.DistributedLzoIndexer /alex/ttt.lzo
> 13/08/02 09:25:51 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
> 13/08/02 09:25:51 INFO lzo.LzoCodec: Successfully loaded & initialized
> native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
> 13/08/02 09:25:52 INFO lzo.DistributedLzoIndexer: Adding LZO file
> /alex/ttt.lzo to indexing list (no index currently exists)
> 13/08/02 09:25:52 WARN conf.Configuration: session.id is deprecated.
> Instead, use dfs.metrics.session-id
> 13/08/02 09:25:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with
> processName=JobTracker, sessionId=
> 13/08/02 09:25:52 WARN conf.Configuration: slave.host.name is deprecated.
> Instead, use dfs.datanode.hostname
> 13/08/02 09:25:52 WARN mapred.JobClient: Use GenericOptionsParser for
> parsing the arguments. Applications should implement Tool for the same.
> 13/08/02 09:25:52 INFO input.FileInputFormat: Total input paths to process
> : 1
> 13/08/02 09:25:53 INFO mapred.LocalJobRunner: OutputCommitter set in
> config null
> 13/08/02 09:25:53 INFO mapred.JobClient: Running job:
> job_local180628093_0001
> 13/08/02 09:25:53 INFO mapred.LocalJobRunner: OutputCommitter is
> com.hadoop.mapreduce.LzoIndexOutputFormat$1
> 13/08/02 09:25:53 INFO mapred.LocalJobRunner: Waiting for map tasks
> 13/08/02 09:25:53 INFO mapred.LocalJobRunner: Starting task:
> attempt_local180628093_0001_m_000000_0
> 13/08/02 09:25:53 WARN mapreduce.Counters: Group
> org.apache.hadoop.mapred.Task$Counter is deprecated. Use
> org.apache.hadoop.mapreduce.TaskCounter instead
> 13/08/02 09:25:53 INFO util.ProcessTree: setsid exited with exit code 0
> 13/08/02 09:25:53 INFO mapred.Task:  Using ResourceCalculatorPlugin :
> org.apache.hadoop.util.LinuxResourceCalculatorPlugin@338e18a3
> 13/08/02 09:25:53 INFO mapred.MapTask: Processing split:
> hdfs://CH22:9000/alex/ttt.lzo:0+306
> 13/08/02 09:25:53 INFO mapred.LocalJobRunner: Map task executor complete.
> 13/08/02 09:25:53 WARN mapred.LocalJobRunner: job_local180628093_0001
> java.lang.Exception: java.lang.IncompatibleClassChangeError: Found
> interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was
> expected
>         at
> org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:404)
> Caused by: java.lang.IncompatibleClassChangeError: Found interface
> org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
>         at
> com.hadoop.mapreduce.LzoSplitRecordReader.initialize(LzoSplitRecordReader.java:47)
>         at
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:478)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:671)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
>         at
> org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:266)
>         at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
>         at
> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>         at
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
>         at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
>         at java.lang.Thread.run(Thread.java:662)
> 13/08/02 09:25:54 INFO mapred.JobClient:  map 0% reduce 0%
> 13/08/02 09:25:54 INFO mapred.JobClient: Job complete:
> job_local180628093_0001
> 13/08/02 09:25:54 INFO mapred.JobClient: Counters: 0
>
> On Mon, Jul 22, 2013 at 6:07 PM, Sandeep Nemuri <nh...@gmail.com>wrote:
>
>> Try this command:
>> hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-cdh4-0.4.15-gplextras.jar
>> com.hadoop.compression.lzo.LzoIndexer /user/sample.txt.lzo
>>
>>
>> On Mon, Jul 22, 2013 at 2:08 PM, ch huang <ju...@gmail.com> wrote:
>>
>>> anyone can help?
>>>
>>> # sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
>>> com.hadoop.compression.lzo.DistributedLzoIndexer
>>> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo
>>> 13/07/22 16:33:50 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
>>> 13/07/22 16:33:50 INFO lzo.LzoCodec: Successfully loaded & initialized
>>> native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
>>> 13/07/22 16:33:50 INFO lzo.DistributedLzoIndexer: Adding LZO file
>>> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo to indexing list (no
>>> index currently exists)
>>> 13/07/22 16:33:50 ERROR security.UserGroupInformation:
>>> PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException:
>>> Failed on local exception:
>>> com.google.protobuf.InvalidProtocolBufferException: Protocol message
>>> end-group tag did not match expected tag.; Host Details : local host is:
>>> "CH22/192.168.10.22"; destination host is: "CH22":8088;
>>> Exception in thread "main" java.io.IOException: Failed on local
>>> exception: com.google.protobuf.InvalidProtocolBufferException: Protocol
>>> message end-group tag did not match expected tag.; Host Details : local
>>> host is: "CH22/192.168.10.22"; destination host is: "CH22":8088;
>>>         at
>>> org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
>>>         at org.apache.hadoop.ipc.Client.call(Client.java:1229)
>>>         at
>>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)
>>>         at org.apache.hadoop.mapred.$Proxy10.getStagingAreaDir(Unknown
>>> Source)
>>>         at
>>> org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1324)
>>>         at
>>> org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
>>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
>>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>>>         at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>>>         at
>>> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
>>>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
>>>         at
>>> org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
>>>         at
>>> com.hadoop.compression.lzo.DistributedLzoIndexer.run(DistributedLzoIndexer.java:111)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>>         at
>>> com.hadoop.compression.lzo.DistributedLzoIndexer.main(DistributedLzoIndexer.java:115)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>         at
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>>> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
>>> message end-group tag did not match expected tag.
>>>         at
>>> com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:73)
>>>         at
>>> com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
>>>         at
>>> com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:213)
>>>         at
>>> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
>>>         at
>>> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
>>>         at
>>> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
>>>         at
>>> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
>>>         at
>>> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
>>>         at
>>> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
>>>         at
>>> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:985)
>>>         at
>>> org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:938)
>>>         at org.apache.hadoop.ipc.Client$Connection.run(Client.java:836)
>>>
>>
>>
>>
>> --
>> --Regards
>>   Sandeep Nemuri
>>
>
>

Re: test lzo problem in hadoop

Posted by ch huang <ju...@gmail.com>.
I use YARN, and after I commented out the following option the error is different:


vi /etc/hadoop/conf/mapred-site.xml
<!--
        <property>
                <name>mapred.job.tracker</name>
                <value>CH22:8088</value>
        </property>
-->
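
On YARN, mapred.job.tracker is not the setting the job client needs;
mapreduce.framework.name selects the YARN submitter and the ResourceManager
addresses come from yarn-site.xml. A sketch of the relevant properties (host
name taken from this thread; 8032 is the usual ResourceManager RPC port,
while 8088 is only its web UI):

<!-- mapred-site.xml -->
        <property>
                <name>mapreduce.framework.name</name>
                <value>yarn</value>
        </property>
<!-- yarn-site.xml -->
        <property>
                <name>yarn.resourcemanager.address</name>
                <value>CH22:8032</value>
        </property>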


#  hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
com.hadoop.compression.lzo.DistributedLzoIndexer /alex/ttt.lzo
13/08/02 09:25:51 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
13/08/02 09:25:51 INFO lzo.LzoCodec: Successfully loaded & initialized
native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
13/08/02 09:25:52 INFO lzo.DistributedLzoIndexer: Adding LZO file
/alex/ttt.lzo to indexing list (no index currently exists)
13/08/02 09:25:52 WARN conf.Configuration: session.id is deprecated.
Instead, use dfs.metrics.session-id
13/08/02 09:25:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with
processName=JobTracker, sessionId=
13/08/02 09:25:52 WARN conf.Configuration: slave.host.name is deprecated.
Instead, use dfs.datanode.hostname
13/08/02 09:25:52 WARN mapred.JobClient: Use GenericOptionsParser for
parsing the arguments. Applications should implement Tool for the same.
13/08/02 09:25:52 INFO input.FileInputFormat: Total input paths to process
: 1
13/08/02 09:25:53 INFO mapred.LocalJobRunner: OutputCommitter set in config
null
13/08/02 09:25:53 INFO mapred.JobClient: Running job:
job_local180628093_0001
13/08/02 09:25:53 INFO mapred.LocalJobRunner: OutputCommitter is
com.hadoop.mapreduce.LzoIndexOutputFormat$1
13/08/02 09:25:53 INFO mapred.LocalJobRunner: Waiting for map tasks
13/08/02 09:25:53 INFO mapred.LocalJobRunner: Starting task:
attempt_local180628093_0001_m_000000_0
13/08/02 09:25:53 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
13/08/02 09:25:53 INFO util.ProcessTree: setsid exited with exit code 0
13/08/02 09:25:53 INFO mapred.Task:  Using ResourceCalculatorPlugin :
org.apache.hadoop.util.LinuxResourceCalculatorPlugin@338e18a3
13/08/02 09:25:53 INFO mapred.MapTask: Processing split:
hdfs://CH22:9000/alex/ttt.lzo:0+306
13/08/02 09:25:53 INFO mapred.LocalJobRunner: Map task executor complete.
13/08/02 09:25:53 WARN mapred.LocalJobRunner: job_local180628093_0001
java.lang.Exception: java.lang.IncompatibleClassChangeError: Found
interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was
expected
        at
org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:404)
Caused by: java.lang.IncompatibleClassChangeError: Found interface
org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
        at
com.hadoop.mapreduce.LzoSplitRecordReader.initialize(LzoSplitRecordReader.java:47)
        at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:478)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:671)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
        at
org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:266)
        at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
        at
java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
        at java.util.concurrent.FutureTask.run(FutureTask.java:138)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
13/08/02 09:25:54 INFO mapred.JobClient:  map 0% reduce 0%
13/08/02 09:25:54 INFO mapred.JobClient: Job complete:
job_local180628093_0001
13/08/02 09:25:54 INFO mapred.JobClient: Counters: 0

On Mon, Jul 22, 2013 at 6:07 PM, Sandeep Nemuri <nh...@gmail.com>wrote:

> Try this command:
> hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-cdh4-0.4.15-gplextras.jar
> com.hadoop.compression.lzo.LzoIndexer /user/sample.txt.lzo
>
>
> On Mon, Jul 22, 2013 at 2:08 PM, ch huang <ju...@gmail.com> wrote:
>
>> anyone can help?
>>
>> # sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
>> com.hadoop.compression.lzo.DistributedLzoIndexer
>> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo
>> 13/07/22 16:33:50 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
>> 13/07/22 16:33:50 INFO lzo.LzoCodec: Successfully loaded & initialized
>> native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
>> 13/07/22 16:33:50 INFO lzo.DistributedLzoIndexer: Adding LZO file
>> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo to indexing list (no
>> index currently exists)
>> 13/07/22 16:33:50 ERROR security.UserGroupInformation:
>> PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException:
>> Failed on local exception:
>> com.google.protobuf.InvalidProtocolBufferException: Protocol message
>> end-group tag did not match expected tag.; Host Details : local host is:
>> "CH22/192.168.10.22"; destination host is: "CH22":8088;
>> Exception in thread "main" java.io.IOException: Failed on local
>> exception: com.google.protobuf.InvalidProtocolBufferException: Protocol
>> message end-group tag did not match expected tag.; Host Details : local
>> host is: "CH22/192.168.10.22"; destination host is: "CH22":8088;
>>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
>>         at org.apache.hadoop.ipc.Client.call(Client.java:1229)
>>         at
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)
>>         at org.apache.hadoop.mapred.$Proxy10.getStagingAreaDir(Unknown
>> Source)
>>         at
>> org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1324)
>>         at
>> org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>>         at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>>         at
>> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
>>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
>>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
>>         at
>> com.hadoop.compression.lzo.DistributedLzoIndexer.run(DistributedLzoIndexer.java:111)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>         at
>> com.hadoop.compression.lzo.DistributedLzoIndexer.main(DistributedLzoIndexer.java:115)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
>> message end-group tag did not match expected tag.
>>         at
>> com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:73)
>>         at
>> com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:213)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
>>         at
>> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:985)
>>         at
>> org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:938)
>>         at org.apache.hadoop.ipc.Client$Connection.run(Client.java:836)
>>
>
>
>
> --
> --Regards
>   Sandeep Nemuri
>

Re: test lzo problem in hadoop

Posted by ch huang <ju...@gmail.com>.
I use YARN, and after I commented out the following option the error is different:


vi /etc/hadoop/conf/mapred-site.xml
<!--
        <property>
                <name>mapred.job.tracker</name>
                <value>CH22:8088</value>
        </property>
-->
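
For context: 8088 is normally the YARN ResourceManager web UI (HTTP) port, so pointing the old mapred.job.tracker setting at CH22:8088 makes the MRv1 job client speak Hadoop RPC to an HTTP endpoint, which is a likely cause of the protobuf "end-group tag did not match expected tag" error in the original post. On a YARN cluster the MapReduce client is normally told to use YARN instead. A minimal mapred-site.xml sketch, assuming stock Hadoop 2.x property names rather than anything taken from this cluster:

        <property>
                <name>mapreduce.framework.name</name>
                <value>yarn</value>
        </property>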


#  hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
com.hadoop.compression.lzo.DistributedLzoIndexer /alex/ttt.lzo
13/08/02 09:25:51 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
13/08/02 09:25:51 INFO lzo.LzoCodec: Successfully loaded & initialized
native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
13/08/02 09:25:52 INFO lzo.DistributedLzoIndexer: Adding LZO file
/alex/ttt.lzo to indexing list (no index currently exists)
13/08/02 09:25:52 WARN conf.Configuration: session.id is deprecated.
Instead, use dfs.metrics.session-id
13/08/02 09:25:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with
processName=JobTracker, sessionId=
13/08/02 09:25:52 WARN conf.Configuration: slave.host.name is deprecated.
Instead, use dfs.datanode.hostname
13/08/02 09:25:52 WARN mapred.JobClient: Use GenericOptionsParser for
parsing the arguments. Applications should implement Tool for the same.
13/08/02 09:25:52 INFO input.FileInputFormat: Total input paths to process
: 1
13/08/02 09:25:53 INFO mapred.LocalJobRunner: OutputCommitter set in config
null
13/08/02 09:25:53 INFO mapred.JobClient: Running job:
job_local180628093_0001
13/08/02 09:25:53 INFO mapred.LocalJobRunner: OutputCommitter is
com.hadoop.mapreduce.LzoIndexOutputFormat$1
13/08/02 09:25:53 INFO mapred.LocalJobRunner: Waiting for map tasks
13/08/02 09:25:53 INFO mapred.LocalJobRunner: Starting task:
attempt_local180628093_0001_m_000000_0
13/08/02 09:25:53 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
13/08/02 09:25:53 INFO util.ProcessTree: setsid exited with exit code 0
13/08/02 09:25:53 INFO mapred.Task:  Using ResourceCalculatorPlugin :
org.apache.hadoop.util.LinuxResourceCalculatorPlugin@338e18a3
13/08/02 09:25:53 INFO mapred.MapTask: Processing split:
hdfs://CH22:9000/alex/ttt.lzo:0+306
13/08/02 09:25:53 INFO mapred.LocalJobRunner: Map task executor complete.
13/08/02 09:25:53 WARN mapred.LocalJobRunner: job_local180628093_0001
java.lang.Exception: java.lang.IncompatibleClassChangeError: Found
interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was
expected
        at
org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:404)
Caused by: java.lang.IncompatibleClassChangeError: Found interface
org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
        at
com.hadoop.mapreduce.LzoSplitRecordReader.initialize(LzoSplitRecordReader.java:47)
        at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:478)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:671)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
        at
org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:266)
        at
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
        at
java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
        at java.util.concurrent.FutureTask.run(FutureTask.java:138)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
13/08/02 09:25:54 INFO mapred.JobClient:  map 0% reduce 0%
13/08/02 09:25:54 INFO mapred.JobClient: Job complete:
job_local180628093_0001
13/08/02 09:25:54 INFO mapred.JobClient: Counters: 0
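
The IncompatibleClassChangeError usually means the hadoop-lzo jar was built against the old MRv1 API, where org.apache.hadoop.mapreduce.TaskAttemptContext was a class; in Hadoop 2 it is an interface, so LzoSplitRecordReader fails to initialize. A hadoop-lzo build that matches this Hadoop version should avoid that. As a sketch only, assuming the CDH4 gplextras jar suggested by Sandeep is actually installed at that path:

# hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-cdh4-0.4.15-gplextras.jar \
  com.hadoop.compression.lzo.DistributedLzoIndexer /alex/ttt.lzo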

On Mon, Jul 22, 2013 at 6:07 PM, Sandeep Nemuri <nh...@gmail.com>wrote:

> Try  this Ccommand
> hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-cdh4-0.4.15-gplextras.jar
> com.hadoop.compression.lzo.LzoIndexer /user/sample.txt.lzo
>
>
> On Mon, Jul 22, 2013 at 2:08 PM, ch huang <ju...@gmail.com> wrote:
>
>> anyone can help?
>>
>> # sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
>> com.hadoop.compression.lzo.DistributedLzoIndexer
>> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo
>> 13/07/22 16:33:50 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
>> 13/07/22 16:33:50 INFO lzo.LzoCodec: Successfully loaded & initialized
>> native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
>> 13/07/22 16:33:50 INFO lzo.DistributedLzoIndexer: Adding LZO file
>> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo to indexing list (no
>> index currently exists)
>> 13/07/22 16:33:50 ERROR security.UserGroupInformation:
>> PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException:
>> Failed on local exception:
>> com.google.protobuf.InvalidProtocolBufferException: Protocol message
>> end-group tag did not match expected tag.; Host Details : local host is:
>> "CH22/192.168.10.22"; destination host is: "CH22":8088;
>> Exception in thread "main" java.io.IOException: Failed on local
>> exception: com.google.protobuf.InvalidProtocolBufferException: Protocol
>> message end-group tag did not match expected tag.; Host Details : local
>> host is: "CH22/192.168.10.22"; destination host is: "CH22":8088;
>>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
>>         at org.apache.hadoop.ipc.Client.call(Client.java:1229)
>>         at
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)
>>         at org.apache.hadoop.mapred.$Proxy10.getStagingAreaDir(Unknown
>> Source)
>>         at
>> org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1324)
>>         at
>> org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>>         at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>>         at
>> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
>>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
>>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
>>         at
>> com.hadoop.compression.lzo.DistributedLzoIndexer.run(DistributedLzoIndexer.java:111)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>         at
>> com.hadoop.compression.lzo.DistributedLzoIndexer.main(DistributedLzoIndexer.java:115)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
>> message end-group tag did not match expected tag.
>>         at
>> com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:73)
>>         at
>> com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:213)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
>>         at
>> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:985)
>>         at
>> org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:938)
>>         at org.apache.hadoop.ipc.Client$Connection.run(Client.java:836)
>>
>
>
>
> --
> --Regards
>   Sandeep Nemuri
>

Re: test lzo problem in hadoop

Posted by ch huang <ju...@gmail.com>.
That's OK, but why can I not use com.hadoop.compression.lzo.DistributedLzoIndexer?

# hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
com.hadoop.compression.lzo.LzoIndexer  /alex/ttt.lzo
13/08/02 09:11:09 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
13/08/02 09:11:09 INFO lzo.LzoCodec: Successfully loaded & initialized
native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
13/08/02 09:11:10 INFO lzo.LzoIndexer: [INDEX] LZO Indexing file
/alex/ttt.lzo, size 0.00 GB...
13/08/02 09:11:10 WARN conf.Configuration: hadoop.native.lib is deprecated.
Instead, use io.native.lib.available
13/08/02 09:11:10 INFO lzo.LzoIndexer: Completed LZO Indexing in 0.19
seconds (0.00 MB/s).  Index size is 0.01 KB.
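
The difference is that LzoIndexer builds the index locally in the client JVM, while DistributedLzoIndexer submits a MapReduce job, so only the latter depends on the job-runner configuration and on which mapreduce API the hadoop-lzo jar was compiled against. Both should leave an index file next to the input; as a quick check, assuming the usual <file>.lzo.index naming:

# hadoop fs -ls /alex/ttt.lzo.index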


On Mon, Jul 22, 2013 at 6:07 PM, Sandeep Nemuri <nh...@gmail.com>wrote:

> Try  this Ccommand
> hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-cdh4-0.4.15-gplextras.jar
> com.hadoop.compression.lzo.LzoIndexer /user/sample.txt.lzo
>
>
> On Mon, Jul 22, 2013 at 2:08 PM, ch huang <ju...@gmail.com> wrote:
>
>> anyone can help?
>>
>> # sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
>> com.hadoop.compression.lzo.DistributedLzoIndexer
>> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo
>> 13/07/22 16:33:50 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
>> 13/07/22 16:33:50 INFO lzo.LzoCodec: Successfully loaded & initialized
>> native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
>> 13/07/22 16:33:50 INFO lzo.DistributedLzoIndexer: Adding LZO file
>> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo to indexing list (no
>> index currently exists)
>> 13/07/22 16:33:50 ERROR security.UserGroupInformation:
>> PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException:
>> Failed on local exception:
>> com.google.protobuf.InvalidProtocolBufferException: Protocol message
>> end-group tag did not match expected tag.; Host Details : local host is:
>> "CH22/192.168.10.22"; destination host is: "CH22":8088;
>> Exception in thread "main" java.io.IOException: Failed on local
>> exception: com.google.protobuf.InvalidProtocolBufferException: Protocol
>> message end-group tag did not match expected tag.; Host Details : local
>> host is: "CH22/192.168.10.22"; destination host is: "CH22":8088;
>>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
>>         at org.apache.hadoop.ipc.Client.call(Client.java:1229)
>>         at
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)
>>         at org.apache.hadoop.mapred.$Proxy10.getStagingAreaDir(Unknown
>> Source)
>>         at
>> org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1324)
>>         at
>> org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>>         at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>>         at
>> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
>>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
>>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
>>         at
>> com.hadoop.compression.lzo.DistributedLzoIndexer.run(DistributedLzoIndexer.java:111)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>>         at
>> com.hadoop.compression.lzo.DistributedLzoIndexer.main(DistributedLzoIndexer.java:115)
>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>         at
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>         at
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
>> message end-group tag did not match expected tag.
>>         at
>> com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:73)
>>         at
>> com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:213)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
>>         at
>> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
>>         at
>> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
>>         at
>> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:985)
>>         at
>> org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:938)
>>         at org.apache.hadoop.ipc.Client$Connection.run(Client.java:836)
>>
>
>
>
> --
> --Regards
>   Sandeep Nemuri
>

Re: test lzo problem in hadoop

Posted by Sandeep Nemuri <nh...@gmail.com>.
Try this command:
hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-cdh4-0.4.15-gplextras.jar
com.hadoop.compression.lzo.LzoIndexer /user/sample.txt.lzo
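
For the indexed files to actually be read as splittable LZO input, the LZO codecs also need to be registered. A typical core-site.xml fragment, shown only as a sketch with stock hadoop-lzo class names and not copied from this cluster:

        <property>
                <name>io.compression.codecs</name>
                <value>org.apache.hadoop.io.compress.DefaultCodec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec</value>
        </property>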


On Mon, Jul 22, 2013 at 2:08 PM, ch huang <ju...@gmail.com> wrote:

> anyone can help?
>
> # sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
> com.hadoop.compression.lzo.DistributedLzoIndexer
> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo
> 13/07/22 16:33:50 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
> 13/07/22 16:33:50 INFO lzo.LzoCodec: Successfully loaded & initialized
> native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
> 13/07/22 16:33:50 INFO lzo.DistributedLzoIndexer: Adding LZO file
> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo to indexing list (no
> index currently exists)
> 13/07/22 16:33:50 ERROR security.UserGroupInformation:
> PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException:
> Failed on local exception:
> com.google.protobuf.InvalidProtocolBufferException: Protocol message
> end-group tag did not match expected tag.; Host Details : local host is:
> "CH22/192.168.10.22"; destination host is: "CH22":8088;
> Exception in thread "main" java.io.IOException: Failed on local exception:
> com.google.protobuf.InvalidProtocolBufferException: Protocol message
> end-group tag did not match expected tag.; Host Details : local host is:
> "CH22/192.168.10.22"; destination host is: "CH22":8088;
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1229)
>         at
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)
>         at org.apache.hadoop.mapred.$Proxy10.getStagingAreaDir(Unknown
> Source)
>         at
> org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1324)
>         at
> org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>         at
> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
>         at
> com.hadoop.compression.lzo.DistributedLzoIndexer.run(DistributedLzoIndexer.java:111)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>         at
> com.hadoop.compression.lzo.DistributedLzoIndexer.main(DistributedLzoIndexer.java:115)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
> message end-group tag did not match expected tag.
>         at
> com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:73)
>         at
> com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
>         at
> com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:213)
>         at
> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
>         at
> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
>         at
> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
>         at
> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
>         at
> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
>         at
> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
>         at
> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:985)
>         at
> org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:938)
>         at org.apache.hadoop.ipc.Client$Connection.run(Client.java:836)
>



-- 
--Regards
  Sandeep Nemuri

Re: test lzo problem in hadoop

Posted by Sandeep Nemuri <nh...@gmail.com>.
Try  this Ccommand
hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-cdh4-0.4.15-gplextras.jar
com.hadoop.compression.lzo.LzoIndexer /user/sample.txt.lzo


On Mon, Jul 22, 2013 at 2:08 PM, ch huang <ju...@gmail.com> wrote:

> anyone can help?
>
> # sudo -u hdfs hadoop jar /usr/lib/hadoop/lib/hadoop-lzo-0.4.15.jar
> com.hadoop.compression.lzo.DistributedLzoIndexer
> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo
> 13/07/22 16:33:50 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
> 13/07/22 16:33:50 INFO lzo.LzoCodec: Successfully loaded & initialized
> native-lzo library [hadoop-lzo rev 6bb1b7f8b9044d8df9b4d2b6641db7658aab3cf8]
> 13/07/22 16:33:50 INFO lzo.DistributedLzoIndexer: Adding LZO file
> /alex/test_lzo/sqoop-1.99.2-bin-hadoop200.tar.gz.lzo to indexing list (no
> index currently exists)
> 13/07/22 16:33:50 ERROR security.UserGroupInformation:
> PriviledgedActionException as:hdfs (auth:SIMPLE) cause:java.io.IOException:
> Failed on local exception:
> com.google.protobuf.InvalidProtocolBufferException: Protocol message
> end-group tag did not match expected tag.; Host Details : local host is:
> "CH22/192.168.10.22"; destination host is: "CH22":8088;
> Exception in thread "main" java.io.IOException: Failed on local exception:
> com.google.protobuf.InvalidProtocolBufferException: Protocol message
> end-group tag did not match expected tag.; Host Details : local host is:
> "CH22/192.168.10.22"; destination host is: "CH22":8088;
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1229)
>         at
> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:225)
>         at org.apache.hadoop.mapred.$Proxy10.getStagingAreaDir(Unknown
> Source)
>         at
> org.apache.hadoop.mapred.JobClient.getStagingAreaDir(JobClient.java:1324)
>         at
> org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:102)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:951)
>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
>         at
> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
>         at
> com.hadoop.compression.lzo.DistributedLzoIndexer.run(DistributedLzoIndexer.java:111)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
>         at
> com.hadoop.compression.lzo.DistributedLzoIndexer.main(DistributedLzoIndexer.java:115)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol
> message end-group tag did not match expected tag.
>         at
> com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:73)
>         at
> com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
>         at
> com.google.protobuf.AbstractMessageLite$Builder.mergeFrom(AbstractMessageLite.java:213)
>         at
> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:746)
>         at
> com.google.protobuf.AbstractMessage$Builder.mergeFrom(AbstractMessage.java:238)
>         at
> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:282)
>         at
> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:760)
>         at
> com.google.protobuf.AbstractMessageLite$Builder.mergeDelimitedFrom(AbstractMessageLite.java:288)
>         at
> com.google.protobuf.AbstractMessage$Builder.mergeDelimitedFrom(AbstractMessage.java:752)
>         at
> org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:985)
>         at
> org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:938)
>         at org.apache.hadoop.ipc.Client$Connection.run(Client.java:836)
>



-- 
--Regards
  Sandeep Nemuri
