Posted to user@flume.apache.org by Nikhil Gs <gs...@gmail.com> on 2015/08/06 22:37:07 UTC
Flume ERROR
Hello Team,
I keep hitting the error below intermittently, even after trying different
port numbers. I have pasted my Flume config file along with the error.
Thanks in advance.
Below is my flume configuration
#################################################
# Please paste flume.conf here. Example:
# Sources, channels, and sinks are defined per
# agent name, in this case 'pnmtest2'.
pnmtest2.sources = SPOOL
pnmtest2.channels = MemChanneltest2
pnmtest2.sinks = AVRO
# For each source, channel, and sink, set
# standard properties.
pnmtest2.sources.SPOOL.type = spooldir
pnmtest2.sources.SPOOL.spoolDir = /home/a.nikhill/pnm
pnmtest2.sources.SPOOL.ignorePattern = \.*tmp$
pnmtest2.sources.SPOOL.channels = MemChanneltest2
pnmtest2.sources.SPOOL.fileHeader = true
pnmtest2.sources.SPOOL.deletePolicy = immediate
pnmtest2.sources.SPOOL.consumeOrder = oldest
pnmtest2.sources.SPOOL.batchSize = 100
pnmtest2.sources.SPOOL.interceptors = time
pnmtest2.sources.SPOOL.interceptors.time.type = org.apache.flume.interceptor.TimestampInterceptor$Builder
pnmtest2.sources.SPOOL.deserializer = com.sudnline.flume.WholeFileDeserializer$Builder
pnmtest2.sinks.AVRO.type = avro
pnmtest2.sinks.AVRO.channel = MemChanneltest2
pnmtest2.sinks.AVRO.hostname = sdldalplhdw02.sudnline.cequel3.com
pnmtest2.sinks.AVRO.port = 40002
pnmtest2.sinks.AVRO.batch-size = 100
pnmtest2.sinks.AVRO.connect-timeout = 40000
# pnmtest2.sinks.HDFS.type = hdfs
# pnmtest2.sinks.HDFS.channel = MemChannel2
# pnmtest2.sinks.HDFS.hdfs.path = /user/flume/poll/%Y/%m/%d/%H/
# pnmtest2.sinks.HDFS.hdfs.fileType = DataStream
# pnmtest2.sinks.HDFS.hdfs.writeFormat = Text
# pnmtest2.sinks.HDFS.hdfs.batchSize = 100
# pnmtest2.sinks.HDFS.hdfs.rollSize = 0
# pnmtest2.sinks.HDFS.hdfs.rollCount = 1000
# pnmtest2.sinks.HDFS.hdfs.rollInterval = 600
# Other properties are specific to each type of
# source, channel, or sink. In this case, we
# specify the capacity of the memory channel.
#pnmtest2.channels.MemChanneltest1.capacity = 10000
#pnmtest2.channels.MemChanneltest1.type = memory
pnmtest2.channels.MemChanneltest2.capacity = 1000000
pnmtest2.channels.MemChanneltest2.type = memory
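For context, an Avro sink like the one above is only a client: "Connection refused" means nothing is accepting connections on port 40002, so a second Flume agent must be running an Avro source on that host and port. A minimal sketch of that receiving side (the agent name `collector` and the logger sink are illustrative assumptions, not taken from this setup):

```properties
collector.sources = AVROIN
collector.channels = MemChannel
collector.sinks = LOGGER

# Avro source listening on the host/port the AVRO sink above dials.
collector.sources.AVROIN.type = avro
collector.sources.AVROIN.bind = 0.0.0.0
collector.sources.AVROIN.port = 40002
collector.sources.AVROIN.channels = MemChannel

collector.channels.MemChannel.type = memory
collector.channels.MemChannel.capacity = 10000

# Logger sink just to verify delivery; replace with the real sink.
collector.sinks.LOGGER.type = logger
collector.sinks.LOGGER.channel = MemChannel
```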
Below is my error.
> ERROR org.apache.flume.SinkRunner
>>
>> Unable to deliver event. Exception follows.
>> org.apache.flume.EventDeliveryException: Failed to send events
>> at org.apache.flume.sink.AbstractRpcSink.process(AbstractRpcSink.java:392)
>> at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
>> at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
>> at java.lang.Thread.run(Thread.java:745)
>> Caused by: org.apache.flume.FlumeException: NettyAvroRpcClient { host: sdldalplhdw02.suddenlink.cequel3.com, port: 40002 }: RPC connection error
>> at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:182)
>> at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:121)
>> at org.apache.flume.api.NettyAvroRpcClient.configure(NettyAvroRpcClient.java:638)
>> at org.apache.flume.api.RpcClientFactory.getInstance(RpcClientFactory.java:89)
>> at org.apache.flume.sink.AvroSink.initializeRpcClient(AvroSink.java:127)
>> at org.apache.flume.sink.AbstractRpcSink.createConnection(AbstractRpcSink.java:211)
>> at org.apache.flume.sink.AbstractRpcSink.verifyConnection(AbstractRpcSink.java:272)
>> at org.apache.flume.sink.AbstractRpcSink.process(AbstractRpcSink.java:349)
>> ... 3 more
>> Caused by: java.io.IOException: Error connecting to sdldalplhdw02.suddenlink.cequel3.com/10.48.210.244:40002
>> at org.apache.avro.ipc.NettyTransceiver.getChannel(NettyTransceiver.java:292)
>> at org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:206)
>> at org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:155)
>> at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:168)
>> ... 10 more
>> Caused by: java.net.ConnectException: Connection refused
>> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>> at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
>> at org.jboss.netty.channel.socket.nio.NioClientBoss.connect(NioClientBoss.java:148)
>> at org.jboss.netty.channel.socket.nio.NioClientBoss.processSelectedKeys(NioClientBoss.java:104)
>> at org.jboss.netty.channel.socket.nio.NioClientBoss.process(NioClientBoss.java:78)
>> at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:312)
>> at org.jboss.netty.channel.socket.nio.NioClientBoss.run(NioClientBoss.java:41)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
Regards,
Nik.
Re: Flume ERROR
Posted by Anandkumar Lakshmanan <an...@orzota.com>.
Hi,
Yes, flushing the firewall rules is admin work.
You can flush all firewall rules by running:
sudo iptables -F
Or, less drastically, open just the specific port Flume needs (for
example: iptables -I INPUT -p tcp --dport 40002 -j ACCEPT on the
destination host).
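Before touching firewall rules, it can help to check whether the port is reachable at all from the agent host. A minimal sketch of a TCP probe (the hostname and port in the comment are the ones from the stack trace in this thread; substitute your own):

```python
import socket

def port_open(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers "Connection refused", timeouts, and DNS failures.
        return False

# e.g. port_open("sdldalplhdw02.suddenlink.cequel3.com", 40002)
```

If this returns False from the agent host while the Avro source is running on the destination, a firewall or the source's bind address is the likely culprit.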
Thanks
Anand.
On 08/07/2015 07:50 PM, Nikhil Gs wrote:
>
> Thanks AnandKumar.
>
> Flushing the firewall rules is admin-related work, right?
>
> Redhat 4.7 is our OS.
>
> Linux version 2.6.32-504.3.3.el6.x86_64
> (mockbuild@x86-028.build.eng.bos.redhat.com) (gcc version
> 4.4.7 20120313 (Red Hat 4.4.7-9) (GCC) ) #1 SMP Fri Dec 12 16:05:43
> EST 2014
>
> Regards,
> Nik.
>
> On Fri, Aug 7, 2015 at 9:21 AM, Anandkumar Lakshmanan
> <anand@orzota.com> wrote:
>
>
> Hi,
>
> Flush the firewall rules and start flume.
>
> What OS are you using?
>
> Thanks
> Anand.
>
>
> On 08/07/2015 07:24 PM, Nikhil Gs wrote:
>> Hello AnandKumar,
>>
>> The issue is happening again. Yesterday the files were being deleted
>> from the spool directory but the data never arrived in the destination
>> HBase tables, and this morning I see the same error in the log.
>>
>> Earlier you suggested, "verify the firewall settings. It blocks the
>> connection it seems."
>> Can you be more specific? How can I verify the firewall settings?
>> Any suggestions?
>>
>> Thanks in advance.
>>
>> 8:45:01.040 AM ERROR org.apache.flume.SinkRunner
>> Unable to deliver event. Exception follows.
>> org.apache.flume.EventDeliveryException: Failed to send events
>> at org.apache.flume.sink.AbstractRpcSink.process(AbstractRpcSink.java:392)
>> at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
>> at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
>> at java.lang.Thread.run(Thread.java:745)
>> Caused by: org.apache.flume.FlumeException: NettyAvroRpcClient { host: sdldalplhdw02.suddenlink.cequel3.com, port: 40002 }: RPC connection error
>> at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:182)
>> at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:121)
>> at org.apache.flume.api.NettyAvroRpcClient.configure(NettyAvroRpcClient.java:638)
>> at org.apache.flume.api.RpcClientFactory.getInstance(RpcClientFactory.java:89)
>> at org.apache.flume.sink.AvroSink.initializeRpcClient(AvroSink.java:127)
>> at org.apache.flume.sink.AbstractRpcSink.createConnection(AbstractRpcSink.java:211)
>> at org.apache.flume.sink.AbstractRpcSink.verifyConnection(AbstractRpcSink.java:272)
>> at org.apache.flume.sink.AbstractRpcSink.process(AbstractRpcSink.java:349)
>> ... 3 more
>> Caused by: java.io.IOException: Error connecting to sdldalplhdw02.suddenlink.cequel3.com/10.48.210.244:40002
>> at org.apache.avro.ipc.NettyTransceiver.getChannel(NettyTransceiver.java:292)
>> at org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:206)
>> at org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:155)
>> at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:168)
>> ... 10 more
>> Caused by: java.net.ConnectException: Connection refused
>> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>> at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
>> at org.jboss.netty.channel.socket.nio.NioClientBoss.connect(NioClientBoss.java:148)
>> at org.jboss.netty.channel.socket.nio.NioClientBoss.processSelectedKeys(NioClientBoss.java:104)
>> at org.jboss.netty.channel.socket.nio.NioClientBoss.process(NioClientBoss.java:78)
>> at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:312)
>> at org.jboss.netty.channel.socket.nio.NioClientBoss.run(NioClientBoss.java:41)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> ... 1 more
>>
>>
>> Regards,
>> Nik.
>>
>> On Fri, Aug 7, 2015 at 8:28 AM, Nikhil Gs <gsnikhil1432010@gmail.com> wrote:
>>
>> Hello Anandkumar,
>>
>> Thank you for your time and also for the reply.
>>
>> Now, Flume behaves differently. I don't see any errors in the log; in
>> fact it now looks like this:
>>
>> 2015-08-06 22:51:14,426 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/GVLLCMTK03-67579944.pf
>> 2015-08-06 23:12:15,026 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/KNTNCMTK01.SUDDENLINK.NET-7682.pf
>> 2015-08-06 23:12:22,030 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/KNTNCMTK01.SUDDENLINK.NET-7682.pf
>> 2015-08-06 23:13:47,570 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/NBRNCMTK01-34017296.pf.filepart
>> 2015-08-06 23:13:57,076 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/NBRNCMTK01-67584072.pf
>> 2015-08-06 23:14:03,581 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/NBRNCMTK01-67579944.pf
>> 2015-08-06 23:23:23,348 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/PMLECMTK01.SUDDENLINK.NET-8426.pf.filepart
>>
>>
>> So the files are being deleted. But when I check my destination, i.e.
>> the HBase tables, I don't see any data. A couple of days ago I could
>> clearly see data leaving the spool directory and landing in my HBase
>> table, but now nothing reaches the destination tables, and I don't
>> know why. Any suggestions?
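One likely explanation, offered as a hedged aside: with `deletePolicy = immediate` and a memory channel, the spooling source deletes each file as soon as its events are committed to the channel, so if the sink then fails to deliver (as in the RPC errors earlier in this thread), those events can be lost without the file ever reaching HBase. A safer setting while debugging, assuming the config posted earlier in this thread:

```properties
# Keep ingested files instead of deleting them; Flume renames them
# with a .COMPLETED suffix once they are fully consumed.
pnmtest2.sources.SPOOL.deletePolicy = never
```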
>>
>> Thanks in advance for your time and reply.
>>
>> Regards,
>> Nik.
>>
>> On Thu, Aug 6, 2015 at 11:43 PM, Anandkumar Lakshmanan
>> <anand@orzota.com> wrote:
>>
>> Hi Nik,
>>
>> Please verify the firewall settings. It blocks the
>> connection it seems.
>>
>> Thanks
>> Anand.
>>
>>
>>
Re: Flume ERROR
Posted by Nikhil Gs <gs...@gmail.com>.
Thanks AnandKumar.
Flushing the firewall rules is admin-related work, right?
Redhat 4.7 is our OS.
Linux version 2.6.32-504.3.3.el6.x86_64 (
mockbuild@x86-028.build.eng.bos.redhat.com) (gcc version 4.4.7 20120313
(Red Hat 4.4.7-9) (GCC) ) #1 SMP Fri Dec 12 16:05:43 EST 2014
Regards,
Nik.
Re: Flume ERROR
Posted by Anandkumar Lakshmanan <an...@orzota.com>.
Hi,
Flush the firewall rules and start flume.
What OS are you using?
Thanks
Anand.
On 08/07/2015 07:24 PM, Nikhil Gs wrote:
> Hello AnandKumar,
>
> The issue is again repeating. Yesterday the files were deleting from
> spool but the data was not available in the destination Hbase tables,
> but now this morning what I have noticed is the log is same.
>
> Earlier, as you have suggested, "verify the firewall settings. It
> blocks the connection it seems."
> Can you be more specific, I mean how can verify the firewall settings.
> Any suggestion?
>
> Thanks in advance.
>
> 8:45:01.040 AM ERROR org.apa
> che.flume.SinkRunner
> Unable to deliver event. Exception follows.
> org.apache.flume.EventDeliveryException: Failed to send events
> at org.apache.flume.sink.AbstractRpcSink.process(AbstractRpcSink.java:392)
> at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
> at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.flume.FlumeException: NettyAvroRpcClient { host:sdldalplhdw02.suddenlink.cequel3.com <http://sdldalplhdw02.suddenlink.cequel3.com>, port: 40002 }: RPC connection error
> at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:182)
> at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:121)
> at org.apache.flume.api.NettyAvroRpcClient.configure(NettyAvroRpcClient.java:638)
> at org.apache.flume.api.RpcClientFactory.getInstance(RpcClientFactory.java:89)
> at org.apache.flume.sink.AvroSink.initializeRpcClient(AvroSink.java:127)
> at org.apache.flume.sink.AbstractRpcSink.createConnection(AbstractRpcSink.java:211)
> at org.apache.flume.sink.AbstractRpcSink.verifyConnection(AbstractRpcSink.java:272)
> at org.apache.flume.sink.AbstractRpcSink.process(AbstractRpcSink.java:349)
> ... 3 more
> Caused by: java.io.IOException: Error connecting tosdldalplhdw02.suddenlink.cequel3.com/10.48.210.244:40002 <http://sdldalplhdw02.suddenlink.cequel3.com/10.48.210.244:40002>
> at org.apache.avro.ipc.NettyTransceiver.getChannel(NettyTransceiver.java:292)
> at org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:206)
> at org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:155)
> at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:168)
> ... 10 more
> Caused by: java.net.ConnectException: Connection refused
> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
> at org.jboss.netty.channel.socket.nio.NioClientBoss.connect(NioClientBoss.java:148)
> at org.jboss.netty.channel.socket.nio.NioClientBoss.processSelectedKeys(NioClientBoss.java:104)
> at org.jboss.netty.channel.socket.nio.NioClientBoss.process(NioClientBoss.java:78)
> at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:312)
> at org.jboss.netty.channel.socket.nio.NioClientBoss.run(NioClientBoss.java:41)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> ... 1 more
>
>
> Regards,
> Nik.
Re: Flume ERROR
Posted by Nikhil Gs <gs...@gmail.com>.
Hello AnandKumar,
The issue is happening again. Yesterday the files were being deleted from the
spool directory but the data never reached the destination HBase tables, and
this morning I noticed the log shows the same error.
Earlier you suggested: "verify the firewall settings. It blocks
the connection it seems."
Can you be more specific? How can I verify the firewall settings? Any
suggestions?
Thanks in advance.
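A few standard checks can answer that question; the commands below are one way to do it (the hostname and port are the ones from the sink config, and `check_port` is just a helper sketch, not part of Flume):

```shell
#!/usr/bin/env bash
# check_port HOST PORT -> prints "open" if a TCP connection succeeds,
# otherwise "closed or filtered". A fast failure usually means nothing is
# listening there (Connection refused); a slow timeout points at a
# firewall silently dropping packets.
check_port() {
  local host=$1 port=$2
  if timeout 5 bash -c "cat < /dev/null > /dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "open"
  else
    echo "closed or filtered"
  fi
}

# From the agent host (host/port taken from the AVRO sink config):
#   check_port sdldalplhdw02.suddenlink.cequel3.com 40002
# On the collector host, also confirm something is listening on the port
# and inspect the firewall rules:
#   netstat -tlnp | grep 40002
#   sudo iptables -L -n
check_port 127.0.0.1 40002   # local example invocation
```

If the port reports open from the agent host but Flume still fails, the problem is unlikely to be the firewall.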
8:45:01.040 AM ERROR org.apache.flume.SinkRunner
Unable to deliver event. Exception follows.
org.apache.flume.EventDeliveryException: Failed to send events
at org.apache.flume.sink.AbstractRpcSink.process(AbstractRpcSink.java:392)
at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:68)
at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:147)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.flume.FlumeException: NettyAvroRpcClient { host:
sdldalplhdw02.suddenlink.cequel3.com, port: 40002 }: RPC connection
error
at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:182)
at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:121)
at org.apache.flume.api.NettyAvroRpcClient.configure(NettyAvroRpcClient.java:638)
at org.apache.flume.api.RpcClientFactory.getInstance(RpcClientFactory.java:89)
at org.apache.flume.sink.AvroSink.initializeRpcClient(AvroSink.java:127)
at org.apache.flume.sink.AbstractRpcSink.createConnection(AbstractRpcSink.java:211)
at org.apache.flume.sink.AbstractRpcSink.verifyConnection(AbstractRpcSink.java:272)
at org.apache.flume.sink.AbstractRpcSink.process(AbstractRpcSink.java:349)
... 3 more
Caused by: java.io.IOException: Error connecting to
sdldalplhdw02.suddenlink.cequel3.com/10.48.210.244:40002
at org.apache.avro.ipc.NettyTransceiver.getChannel(NettyTransceiver.java:292)
at org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:206)
at org.apache.avro.ipc.NettyTransceiver.<init>(NettyTransceiver.java:155)
at org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:168)
... 10 more
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
at org.jboss.netty.channel.socket.nio.NioClientBoss.connect(NioClientBoss.java:148)
at org.jboss.netty.channel.socket.nio.NioClientBoss.processSelectedKeys(NioClientBoss.java:104)
at org.jboss.netty.channel.socket.nio.NioClientBoss.process(NioClientBoss.java:78)
at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:312)
at org.jboss.netty.channel.socket.nio.NioClientBoss.run(NioClientBoss.java:41)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
... 1 more
Regards,
Nik.
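For later readers: "Connection refused" means nothing was accepting connections on that host and port. The Avro sink is only a client, so a second-tier agent with an Avro *source* must already be running and listening there. A minimal sketch of such a collector agent (all names below are hypothetical; a logger sink stands in for the real HBase sink configuration):

```properties
# Hypothetical second-tier agent "collector": the AVRO sink in the first
# tier can only connect if an agent like this is up on port 40002.
collector.sources = AVROIN
collector.channels = ch1
collector.sinks = LOG

collector.sources.AVROIN.type = avro
collector.sources.AVROIN.bind = 0.0.0.0
collector.sources.AVROIN.port = 40002
collector.sources.AVROIN.channels = ch1

collector.channels.ch1.type = memory
collector.channels.ch1.capacity = 10000

# Placeholder sink; replace with the actual HBase sink configuration.
collector.sinks.LOG.type = logger
collector.sinks.LOG.channel = ch1
```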
Re: Flume ERROR
Posted by Nikhil Gs <gs...@gmail.com>.
Hello Anandkumar,
Thank you for your time and also for the reply.
Now Flume behaves differently. I don't see any errors in the log; in fact it
now looks like this:
2015-08-06 22:51:14,426 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/GVLLCMTK03-67579944.pf
2015-08-06 23:12:15,026 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/KNTNCMTK01.SUDDENLINK.NET-7682.pf
2015-08-06 23:12:22,030 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/KNTNCMTK01.SUDDENLINK.NET-7682.pf
2015-08-06 23:13:47,570 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/NBRNCMTK01-34017296.pf.filepart
2015-08-06 23:13:57,076 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/NBRNCMTK01-67584072.pf
2015-08-06 23:14:03,581 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/NBRNCMTK01-67579944.pf
2015-08-06 23:23:23,348 INFO org.apache.flume.client.avro.ReliableSpoolingFileEventReader: Preparing to delete file /home/a_nikhil.gopishetti/pnm/PMLECMTK01.SUDDENLINK.NET-8426.pf.filepart
So the files are being deleted, but when I check my destination, i.e. the
HBase tables, I don't see any data. A couple of days ago I could clearly see
files being deleted from the spool directory and the data arriving in my
HBase table. Now I don't see any data in the destination tables, and I don't
know why. Any suggestions?
Thanks in advance for your time and reply.
Regards,
Nik.
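One detail worth flagging here: with `deletePolicy = immediate`, the spooling directory source deletes a file as soon as its events are committed to the channel, not when the sink has actually delivered them. If the Avro sink cannot connect, events accumulate in the memory channel and are lost when the agent restarts, which would look exactly like "files deleted but nothing in HBase". A more conservative setup along these lines could help (same component names as in the original flume.conf; the directory paths are placeholders):

```properties
# Keep consumed files on disk instead of deleting them immediately,
# and use a durable file channel so buffered events survive a restart.
pnmtest2.sources.SPOOL.deletePolicy = never
pnmtest2.channels.MemChanneltest2.type = file
pnmtest2.channels.MemChanneltest2.checkpointDir = /var/lib/flume/checkpoint
pnmtest2.channels.MemChanneltest2.dataDirs = /var/lib/flume/data
```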
Re: Flume ERROR
Posted by Anandkumar Lakshmanan <an...@orzota.com>.
Hi Nik,
Please verify the firewall settings; it seems they are blocking the connection.
Thanks
Anand.