Posted to user@flume.apache.org by Shekhar Sharma <sh...@gmail.com> on 2013/04/24 16:15:59 UTC

Flume Agent not able to write to HDFS Sink

Hello All,


I am using an HDFS sink, but it is throwing the following error:

Exception in thread "SinkRunner-PollingRunner-DefaultSinkProcessor" java.lang.NoSuchMethodError: org.apache.flume.formatter.output.BucketPath.escapeString(Ljava/lang/String;Ljava/util/Map;ZII)Ljava/lang/String;
    at org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:389)
    at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:65)
    at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:148)
    at java.lang.Thread.run(Thread.java:722)


I have seen a similar error reported here:
https://issues.apache.org/jira/browse/FLUME-2019
But I don't think that is the cause, since I built the Flume source code on
my own system, so there shouldn't be any version mismatch.

The classpath it is picking up at runtime is as follows:

exec /usr/java/jdk1.7.0_10/bin/java -Xmx20m -cp
'/home/training/flume-1.2.0-incubating-SNAPSHOT/conf:/home/training/flume-1.2.0-incubating-SNAPSHOT/lib/*:/usr/local/hadoop/libexec/../conf:/usr/java/jdk1.7.0_10/lib/tools.jar:/usr/local/hadoop/libexec/..:/usr/local/hadoop/libexec/../hadoop-core-1.0.3.jar:/usr/local/hadoop/libexec/../lib/asm-3.2.jar:/usr/local/hadoop/libexec/../lib/aspectjrt-1.6.5.jar:/usr/local/hadoop/libexec/../lib/aspectjtools-1.6.5.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-1.4.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.0.3.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/snappy-java-1.0.3.2.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar'
-Djava.library.path=:/usr/local/hadoop/libexec/../lib/native/Linux-i386-32
org.apache.flume.node.Application -f ../conf/rpc_to_hdfs_node.js.properties
-n sample
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/training/flume-1.2.0-incubating-SNAPSHOT/lib/FlumeEsperSink-1.0.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/training/flume-1.2.0-incubating-SNAPSHOT/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Exception in thread "SinkRunner-PollingRunner-DefaultSinkProcessor" java.lang.NoSuchMethodError: org.apache.flume.formatter.output.BucketPath.escapeString(Ljava/lang/String;Ljava/util/Map;ZII)Ljava/lang/String;
    at org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:389)
    at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:65)
    at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:148)
    at java.lang.Thread.run(Thread.java:722)
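
A quick way to confirm which jar the JVM is actually resolving BucketPath
from is to rerun the same launch command with class-loading verbosity turned
on. A rough sketch, reusing the classpath and config shown above (the
'<classpath as above>' placeholder stands in for the long -cp value):

/usr/java/jdk1.7.0_10/bin/java -verbose:class -Xmx20m \
  -cp '<classpath as above>' \
  org.apache.flume.node.Application -f ../conf/rpc_to_hdfs_node.js.properties \
  -n sample 2>&1 | grep 'org.apache.flume.formatter.output.BucketPath'

# -verbose:class makes the JVM print a "[Loaded <class> from file:/<jar>]"
# line for every class it loads, so the grep shows exactly which jar wins.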





My agent file is as follows:

sample.sources = avroSource
sample.channels = memoryChannel2
sample.sinks = hdfsSink

#Source configuration

sample.sources.avroSource.type = avro
sample.sources.avroSource.bind = localhost
sample.sources.avroSource.port = 41416
sample.sources.avroSource.channels =  memoryChannel2



#Channels


sample.channels.memoryChannel2.type = memory


#Sinks



sample.sinks.hdfsSink.type = hdfs
sample.sinks.hdfsSink.hdfs.path=hdfs://localhost:54310/user/training/FlumeFanOut
sample.sinks.hdfsSink.hdfs.fileType=DataStream
#Number of seconds to wait before rolling current file (0 = never roll based on time interval)
sample.sinks.hdfsSink.hdfs.rollInterval=0
#File size to trigger roll, in bytes (0: never roll based on file size)
sample.sinks.hdfsSink.hdfs.rollSize=0
#Number of events written to file before it is rolled (0 = never roll based on number of events)
sample.sinks.hdfsSink.hdfs.rollCount=0
sample.sinks.hdfsSink.channel = memoryChannel2
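
Once the agent is up, a quick end-to-end sanity check is to push a few
events through the Avro source and then list the sink directory. A sketch,
assuming a stock flume-ng script is on hand and using an arbitrary local
text file (/tmp/test-events.txt is just a placeholder):

bin/flume-ng avro-client -H localhost -p 41416 -F /tmp/test-events.txt

# Each line of the file becomes one event; they should show up under the
# configured HDFS path:
hadoop fs -ls /user/training/FlumeFanOut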


Regards,
Som Shekhar Sharma

Re: Flume Agent not able to write to HDFS Sink

Posted by Shekhar Sharma <sh...@gmail.com>.
Can you tell me why this error generally comes up? I have checked the code,
and that method is present.

I think the JVM is looking up the method in some other jar (maybe an older
one) which doesn't have that method.

By the way, it was working previously, but then I created a custom sink and
put its jar in the lib folder of Flume, and that is creating the problem.
I will update if I find something.
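
If it helps, one quick way to check for duplicate copies of the class is to
scan every jar in Flume's lib folder (a sketch, assuming unzip is
available):

cd /home/training/flume-1.2.0-incubating-SNAPSHOT/lib
# Print every jar that bundles its own copy of BucketPath.class
for j in *.jar; do
  unzip -l "$j" 2>/dev/null \
    | grep -q 'org/apache/flume/formatter/output/BucketPath.class' \
    && echo "$j"
done

If more than one jar shows up, the custom sink jar was probably built as a
fat jar with older Flume classes inside (the duplicate SLF4J binding
reported from FlumeEsperSink-1.0.0.jar hints at the same thing), and
rebuilding it without the bundled Flume dependencies should fix this.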


Regards,
Som Shekhar Sharma
+91-8197243810


On Wed, Apr 24, 2013 at 9:04 PM, Israel Ekpo <is...@aicer.org> wrote:

> If you are just starting out, I think you should grab the latest version
> of Flume (1.3.1)
>
> http://flume.apache.org/download.html
>
> Version 1.2.0 is an older release, so working with the latest version is
> better.
>
> Try it out and send your feedback.

Re: Flume Agent not able to write to HDFS Sink

Posted by Israel Ekpo <is...@aicer.org>.
If you are just starting out, I think you should grab the latest version of
Flume (1.3.1)

http://flume.apache.org/download.html

Version 1.2.0 is an older release, so working with the latest version is
better.

Try it out and send your feedback.

