Posted to user@flume.apache.org by "Kumar, Deepak8 " <de...@citi.com> on 2012/10/03 08:26:28 UTC

Flume 0.9.4 with CDH4

Hi Hari,

I am trying to use Flume 0.9.4 with CDH4, and I am getting the following exceptions.



1. Without the HBase 0.92 jar file, Flume throws the following error:

2012-10-01 13:29:13,958 ERROR com.cloudera.flume.core.connector.DirectDriver: Exiting driver logicalNode vm-ccf5-9333.nam.nsroot.net_synch-56 in error state KeywordSynchSource | ConsoleEventSink because Not a host:port pair: �^@^@^@!26322@vm-3733-9C.nam.nsroot.netvm-3733-969C.nam.nsroot.net,60020,1348696066287

This is the version-conflict error you get when an HBase 0.90.4 client tries to access HBase 0.92.
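For what it's worth, a quick way to spot this kind of mismatch is to list the HBase jars Flume actually has on its classpath; a mixed client/server version pair is commonly reported as the cause of "Not a host:port pair" errors, because HBase 0.92 changed the format of the region server location that 0.90 clients try to parse. A minimal sketch (the directory and jar names below are illustrative stand-ins, not from this setup):

```shell
# Illustrative sketch: detect mixed HBase jar versions on a lib path.
# The directory and jar names are made-up stand-ins, not real paths.
FLUME_LIB=$(mktemp -d)                  # stand-in for flume's lib/ directory
touch "$FLUME_LIB/hbase-0.90.4.jar"     # old client jar
touch "$FLUME_LIB/hbase-0.92.0.jar"     # newer jar pulled in alongside it
# Two different hbase-* versions side by side is the red flag.
ls "$FLUME_LIB" | grep '^hbase-' | sort
```

If the listing shows more than one hbase-* version, removing the stale jar (or rebuilding against the server's version) is usually the first thing to try.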

2. With the HBase 0.92 + Hadoop 0.20.2 jar files, Flume throws the following error:

Exception in thread "Heartbeat" java.lang.NoClassDefFoundError: org/apache/hadoop/net/SocketInputWrapper
        at org.apache.hadoop.hbase.ipc.HBaseClient.getConnection(HBaseClient.java:1035)
        at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:897)
        at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:150)
        at $Proxy7.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.hbase.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:183)
        at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:303)
        at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:280)
        at org.apache.hadoop.hbase.ipc.HBaseRPC.getProxy(HBaseRPC.java:332)
        at org.apache.hadoop.hbase.ipc.HBaseRPC.waitForProxy(HBaseRPC.java:236)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1278)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1235)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getHRegionConnection(HConnectionManager.java:1222)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:918)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:814)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:782)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:915)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:818)
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:782)
        at org.apache.hadoop.hbase.client.HTable.finishSetup(HTable.java:249)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:213)
        at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:171)
        at com.citi.sponge.flume.sink.ELFNotifyCacheSink$1.build(ELFNotifyCacheSink.java:117)
        at com.cloudera.flume.conf.SinkFactory$SinkBuilder.create(SinkFactory.java:42)
        at com.cloudera.flume.conf.SinkFactoryImpl.createSink(SinkFactoryImpl.java:318)
        at com.cloudera.flume.conf.FlumeBuilder.buildEventSink(FlumeBuilder.java:500)
        at com.cloudera.flume.conf.FlumeBuilder.buildEventSink(FlumeBuilder.java:531)
        at com.cloudera.flume.conf.FlumeBuilder.buildEventSink(FlumeBuilder.java:513)
        at com.cloudera.flume.conf.FlumeBuilder.buildEventSink(FlumeBuilder.java:544)
        at com.cloudera.flume.conf.FlumeBuilder.buildEventSink(FlumeBuilder.java:544)
        at com.cloudera.flume.conf.FlumeBuilder.buildSink(FlumeBuilder.java:302)
        at com.cloudera.flume.agent.LogicalNode.loadConfig(LogicalNode.java:215)
        at com.cloudera.flume.agent.LogicalNodeManager.spawn(LogicalNodeManager.java:104)
        at com.cloudera.flume.agent.LivenessManager.checkLogicalNodes(LivenessManager.java:141)
        at com.cloudera.flume.agent.LivenessManager.heartbeatChecks(LivenessManager.java:183)
        at com.cloudera.flume.agent.LivenessManager$HeartbeatThread.run(LivenessManager.java:233)

3. HBase 0.92 + Hadoop 2.0 + Flume 0.9.4:

The flume startup script complains that hadoop-core.jar is not found. Flume 0.9.4 hardcodes the name of the Hadoop jar, and the jar file names changed in Hadoop 2.0.
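One possible workaround for the hardcoded name is to symlink the renamed Hadoop 2.0 artifact to the file name the flume script looks for. This is only a sketch with illustrative paths, and as item 4 shows it merely moves the failure later: the jar lookup succeeds, but the runtime API mismatch remains.

```shell
# Hypothetical workaround: Flume 0.9.4's launcher expects a jar literally
# named hadoop-core.jar, but Hadoop 2.0 renamed its artifacts (for example
# hadoop-common-<version>.jar). A symlink can satisfy the name lookup.
# All paths below are illustrative stand-ins, not real install locations.
HADOOP_LIB=$(mktemp -d)                      # stand-in for the hadoop lib dir
touch "$HADOOP_LIB/hadoop-common-2.0.0.jar"  # stand-in for the real 2.0 jar
ln -s "$HADOOP_LIB/hadoop-common-2.0.0.jar" \
      "$HADOOP_LIB/hadoop-core.jar"          # the name Flume's script expects
readlink "$HADOOP_LIB/hadoop-core.jar"       # confirm where the link points
```

Even with the name resolved, classes compiled against the 0.20.x API can still fail to link against the 2.0 jar, which is exactly the NoSuchMethodError in item 4.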

4. HBase 0.92 + Hadoop 2.0 + Flume 0.9.4, with the Hadoop jar location modified in the script file:

Exception in thread "logicalNode vm-93ff-5f31.nam.nsroot.net_sa56647_1345135162189-22" java.lang.NoSuchMethodError: org.apache.hadoop.io.SequenceFile$Writer: method <init>()V not found
        at org.apache.hadoop.io.RawSequenceFileWriter.<init>(RawSequenceFileWriter.java:70)
        at org.apache.hadoop.io.RawSequenceFileWriter.createWriter(RawSequenceFileWriter.java:61)
        at com.cloudera.flume.handlers.hdfs.SeqfileEventSink.open(SeqfileEventSink.java:79)
        at com.cloudera.flume.core.EventSinkDecorator.open(EventSinkDecorator.java:75)
        at com.cloudera.flume.handlers.rolling.RollSink.open(RollSink.java:381)
        at com.cloudera.flume.agent.diskfailover.DiskFailoverDeco.open(DiskFailoverDeco.java:196)
        at com.cloudera.flume.core.BackOffFailOverSink.open(BackOffFailOverSink.java:203)
        at com.cloudera.flume.core.EventSinkDecorator.open(EventSinkDecorator.java:75)
        at com.cloudera.flume.core.connector.DirectDriver$PumperThread.run(DirectDriver.java:88)

It seems Flume ships its own class, RawSequenceFileWriter, in the org.apache.hadoop.io package. That class calls a Hadoop API (the org.apache.hadoop.io.SequenceFile$Writer no-arg constructor), and this API's signature changed in Hadoop 2.0.

Could you please advise whether we can use Flume 0.9.4 with CDH4 or not?

Regards,
Deepak



Re: Flume 0.9.4 with CDH4

Posted by Hari Shreedharan <hs...@cloudera.com>.
Hi Deepak,  

Hadoop made some significant, incompatible changes between 0.20.2 and 2.0.0. As a result, something built against Hadoop 0.20.x/1.0 will not run on Hadoop 2.0, and vice versa, so you need a Flume build that targets the correct version of Hadoop, or you will hit these issues. Are you running the Flume from CDH3 on CDH4? Unfortunately, that will not work, for exactly this reason.

I'd recommend switching to Flume NG; flume-1.2.0 is the current version.

Thanks,
Hari

--  
Hari Shreedharan


On Tuesday, October 2, 2012 at 11:26 PM, Kumar, Deepak8 wrote:
